Tools

Web crawling and web scraping

ECOPER has incorporated two new artificial intelligence tools for big data analysis into its analysis and evaluation work: web crawling and web scraping. Web crawling is a process by which a ‘robot’ (in this case, an algorithm) called a crawler systematically navigates the internet. The crawler is provided with an initial…
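The basic loop of a crawler can be illustrated with a short script. The sketch below is only a minimal illustration, not ECOPER's actual tool: it assumes a hypothetical seed URL and the widely used requests and BeautifulSoup libraries, follows links breadth-first within one domain (the crawling step), and scrapes each page's title along the way (the scraping step).

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import deque

SEED_URL = "https://example.org"  # hypothetical seed page; real seeds depend on the study
MAX_PAGES = 20                    # keep the example small and polite

def crawl(seed, max_pages=MAX_PAGES):
    """Breadth-first crawl that scrapes the <title> of each page visited."""
    queue, seen, titles = deque([seed]), {seed}, {}
    domain = urlparse(seed).netloc
    while queue and len(titles) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=5)
        except requests.RequestException:
            continue  # skip pages that fail to load
        soup = BeautifulSoup(response.text, "html.parser")
        # Scraping step: extract structured information (here, just the page title).
        titles[url] = soup.title.string.strip() if soup.title and soup.title.string else ""
        # Crawling step: enqueue new links found on the same domain.
        for link in soup.find_all("a", href=True):
            absolute = urljoin(url, link["href"])
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return titles

if __name__ == "__main__":
    for url, title in crawl(SEED_URL).items():
        print(url, "->", title)
```

A production crawler would add politeness controls (robots.txt, rate limiting) and persist what it scrapes, but the visit-extract-enqueue cycle is the same.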

Internet metrics and the evaluation of projects

In project evaluation, it is common to use techniques such as document reviews, semi-structured interviews, focus groups, and surveys. Increasingly, however, development projects leave their mark on the internet, which is turning internet metrics into a new evaluation technique. ECOPER regularly evaluates United Nations projects that aim to implement improvements in public policies in developing…
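As a rough illustration of what such a metric might look like, the sketch below counts how often a project's keywords appear in the visible text of a few web pages. The project name and URLs are placeholders, not data from an actual evaluation, and it assumes the requests and BeautifulSoup libraries.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical inputs: in a real evaluation these would come from the project's documentation.
PROJECT_KEYWORDS = ["Project X", "public policy reform"]
PAGES_TO_CHECK = [
    "https://example.org/news",
    "https://example.org/ministry/press-releases",
]

def mention_count(url, keywords):
    """Return how many times any keyword appears in the visible text of a page."""
    try:
        html = requests.get(url, timeout=5).text
    except requests.RequestException:
        return 0  # unreachable pages contribute nothing to the metric
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    return sum(text.count(k.lower()) for k in keywords)

if __name__ == "__main__":
    for page in PAGES_TO_CHECK:
        print(page, mention_count(page, PROJECT_KEYWORDS))
```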

Following the activity of city networks

Cities often form networks in order to exchange good practices and cooperate through working groups. Examples include the Network of Intercultural Cities (RECI), the World Organization of United Cities and Local Governments, and the Network of Port Cities. ECOPER has begun a study to follow the cooperation of city networks and analyze their alignment with…
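One way to follow that cooperation quantitatively is to treat cities as nodes and shared working groups as links between them. The sketch below, using the networkx library and invented example data, shows how such a cooperation graph could be built and the most connected cities identified; it illustrates the general approach, not ECOPER's study.

```python
import networkx as nx
from itertools import combinations

# Invented example data: each working group lists the cities that take part in it.
working_groups = {
    "intercultural dialogue": ["Bilbao", "Barcelona", "Lisbon"],
    "port logistics": ["Rotterdam", "Barcelona", "Valencia"],
    "local SDG reporting": ["Bilbao", "Rotterdam", "Lisbon"],
}

# Build an undirected graph: an edge means two cities share at least one working group.
G = nx.Graph()
for group, cities in working_groups.items():
    for a, b in combinations(cities, 2):
        if G.has_edge(a, b):
            G[a][b]["groups"].append(group)
        else:
            G.add_edge(a, b, groups=[group])

# Degree centrality highlights the cities most active in the network's cooperation.
for city, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{city}: {score:.2f}")
```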

Following the 2030 Agenda on Twitter

ECOPER’s analysis and evaluation area is adapting artificial intelligence tools to gather monitoring indicators for the Sustainable Development Agenda. The aim is to determine to what extent the ideas of the 2030 Agenda permeate various public policies at the national level and how the activity of certain international actors, such…
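A simple indicator of this kind is how often 2030 Agenda-related hashtags appear in a collection of tweets. The sketch below is a minimal illustration under stated assumptions: it assumes tweets have already been collected into a hypothetical JSON-lines file with a "text" field (collection via the Twitter/X API is out of scope here), and it uses a loosely defined hashtag pattern rather than a curated keyword list.

```python
import json
import re
from collections import Counter

# Hypothetical input: one JSON object per line, each with a "text" field.
TWEETS_FILE = "tweets.jsonl"

# Hashtags loosely associated with the 2030 Agenda; a real study would curate this list.
SDG_PATTERN = re.compile(r"#(sdg\d{1,2}|agenda2030|globalgoals|ods\d{1,2})", re.IGNORECASE)

def sdg_hashtag_counts(path):
    """Count 2030 Agenda-related hashtags across a file of previously collected tweets."""
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            tweet = json.loads(line)
            for tag in SDG_PATTERN.findall(tweet.get("text", "")):
                counts[tag.lower()] += 1
    return counts

if __name__ == "__main__":
    for tag, n in sdg_hashtag_counts(TWEETS_FILE).most_common(10):
        print(f"#{tag}: {n}")
```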