Syrian refugee in Lebanon. Photo: Espen Røst

Evaluation in a time of crisis

The main finding from last year’s evaluations is that the development assistance system needs to improve its learning capability.

‘This finding is not new or even surprising,’ says Håvard Mokleiv Nygård, Evaluation Department Director. ‘The weak learning capability of the system has unfortunately been a recurring feature of several previous annual reports from the Evaluation Department. This year’s annual report reiterates the need to continue focusing on this. Ensuring that lessons from previous evaluations are incorporated into the ongoing work to improve Norwegian development cooperation is the responsibility of all those involved.’

Main lessons learned

The annual report contains three key lessons:

  1. First, the knowledge base for Norwegian development cooperation must be improved. Norwegian development cooperation is often perceived as flexible and not overly risk-averse, but this flexibility also raises questions about whether the system can provide sufficient knowledge for sound decision-making.
  2. Second, to improve its results-based management and management by objectives, Norwegian development cooperation must strengthen its ability to collect data, analyse and evaluate on an ongoing basis. At present, we often do not know whether initiatives achieve their expected results and objectives.
  3. Third, we need to work more systematically with countries that ‘move up’ from low-income to middle-income status.

Independence, credibility and use

‘Independence, credibility and use are fundamental principles that must be upheld in all evaluation work,’ says Nygård.

‘Last year, five major evaluations were completed,’ Nygård tells us. One of these assessed the quality of decentralised evaluations, that is, the evaluations and reviews that the aid agencies themselves are responsible for undertaking as part of their management of funds and agreements. The evaluation showed that their quality is weak and needs improvement. A considerable quality upgrade is required for decentralised evaluations to become a useful tool for improving development cooperation. Around half of these evaluations have such serious weaknesses that they raise the question of whether they provide credible information on the effects and results of Norwegian development assistance.

Another evaluation reviewed Norway’s anti-corruption work in the area of development policy and aid, and showed that Norway lacks an overarching, coherent strategy as a basis for its anti-corruption efforts.

A third evaluation reviewed Norway’s International Climate and Forest Initiative, showing that the programme has brought greater international attention to issues such as the importance of reducing and reversing the loss of tropical forests and of developing sectoral policies that can facilitate dialogue between key actors. The evaluation also showed that the programme has granted project funding primarily to civil society organisations and has acted as a flexible donor with a focus on pilot projects. It finds, however, that the scope and size of these projects are insufficient to contribute to change at an aggregate level, and that a poverty orientation needs to be incorporated more explicitly.

High output despite the pandemic

‘Last year was marked by the pandemic, but the department nevertheless produced a large number of evaluations, reports and events,’ Nygård says. ‘Using virtual and digital tools and methods, we could do more than we thought was possible, but they are not always sufficient. Not least, the pandemic showed us how crucial it is to have good local partners.’

In addition to the five major evaluations published over the year, the department produced a number of reports summarising international evaluations of Norway’s partner countries; a total of 17 such partner country reports have been completed in recent years. The department also prepared worksheets on various topics, including lessons learned from cash transfers, a separate review of initiatives related to COVID-19, and an evaluability study of the multi-donor UN fund for COVID-19 (UN COVID-19 MPTF). Several new large, independent evaluations were also initiated, including one of Norway’s efforts in the area of women, peace and security, which will be completed next year, Nygård explains.

Knowledge and public discourse

A prominent part of the Evaluation Department’s work consists of disseminating knowledge and informing public discourse. The department arranged eleven webinars on a range of topics with different lecturers. This was not only a process of rapid learning for the department, but also a way to involve lecturers from a wider range of countries digitally. Over the year, the department also produced eight newsletters and contributed to twenty feature articles and opinion pieces in the media. The department had a total of eleven full-time and part-time staff.

About the evaluation process

The Evaluation Department’s remit is to help development policy actors learn from experience and to hold them accountable. The department is governed by a separate set of instructions and reports directly to the Secretaries General of the Ministry of Foreign Affairs and the Ministry of Climate and Environment.

Evaluations shall:

  • be undertaken independently of those who are responsible for administration and implementation
  • be undertaken in accordance with recognised evaluation norms and standards
  • elucidate relevant issues
  • produce feasible recommendations that can be applied in the design of budgets and further development of the activities being evaluated
  • help promote a constructive and open public discourse

Each year, the department decides what is to be evaluated through a three-year, rolling evaluation programme. To ensure the relevance and use of the evaluations, the programme is designed in consultation with actors within and outside the development aid administration. Provisions are also made for dialogue with stakeholders during the evaluation processes.

Published 19.08.2021