Web crawling and web scraping
ECOPER has incorporated two new artificial intelligence tools for big data analysis into its analysis and evaluation work: web crawling and web scraping. Web crawling is a process in which a ‘robot’ (in this case, an algorithm) called a crawler systematically navigates the internet. The crawler is provided with an initial…
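The crawling loop described above can be sketched in a few lines. This is a minimal, generic illustration (not ECOPER's implementation): the crawler starts from a list of seed URLs, fetches each page, extracts its links, and queues any unseen ones breadth-first. The `fetch` function is injected so the sketch stays self-contained; a real crawler would issue HTTP requests, resolve relative URLs, and honour robots.txt.

```python
import re
from collections import deque
from typing import Callable, Dict, List, Set

def crawl(seeds: List[str], fetch: Callable[[str], str], limit: int = 100) -> List[str]:
    """Breadth-first crawl starting from seed URLs; returns pages in visit order."""
    visited: Set[str] = set()
    queue = deque(seeds)
    order: List[str] = []
    while queue and len(order) < limit:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        order.append(url)
        html = fetch(url)
        # Extract hyperlinks from the page and queue the ones not yet seen.
        for link in re.findall(r'href="([^"]+)"', html):
            if link not in visited:
                queue.append(link)
    return order

# Demo against a tiny in-memory "web" of two pages linking to each other.
web: Dict[str, str] = {
    "https://example.org/": '<a href="https://example.org/a">A</a>',
    "https://example.org/a": '<a href="https://example.org/">home</a>',
}
print(crawl(["https://example.org/"], lambda u: web.get(u, "")))
# → ['https://example.org/', 'https://example.org/a']
```

The `limit` parameter caps the number of pages visited, which is how real crawlers keep a systematic traversal of a practically unbounded link graph tractable.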
What has become of the graduates? The tracer study
Tracer studies are used to assess the levels of preparation and satisfaction of graduates of educational programmes. The approach originated with the International Programme on the Elimination of Child Labour of the International Labour Organization (ILO) in 2003, though their use has spread since 2016 as a result of the joint work of the ILO and…
Internet metrics and the evaluation of projects
In project evaluation, it is common to use techniques such as document reviews, semi-structured interviews, focus groups, and surveys. However, development projects increasingly leave their mark on the internet, and internet metrics have emerged as a new evaluation technique. ECOPER regularly evaluates United Nations projects that aim to implement improvements in public policies in developing…
Following the activity of city networks
Cities often form networks to exchange good practices and cooperate through working groups. Examples include the Network of Intercultural Cities (RECI), the World Organization of United Cities and Local Governments, and the Network of Port Cities. ECOPER has begun a study to follow the cooperation of city networks and analyze their alignment with…
Following the 2030 Agenda on Twitter
ECOPER’s analysis and evaluation area is adapting artificial intelligence tools to gather monitoring indicators for the Sustainable Development Agenda. The aim is to determine to what extent the ideas of the 2030 Agenda permeate various public policies at the national level and how the activity of certain international actors, such…