Guidelines for the evaluation process and for preparing reports

These are the guidelines from the Evaluation Department on the evaluation process, the inception report and the main report.

The evaluation process

All phases of the evaluation shall adhere to the DAC Quality Standards for Development Evaluation (OECD, 2011). While those standards apply both to the client (the Evaluation Department) and to the evaluation team, the team is responsible for all aspects of those standards that fall within its scope of work, whether specified and facilitated by Norad or not.

The approach and methodology shall be in accordance with recognised professional standards in the field of evaluation or in relevant social science disciplines. For Impact Evaluations, specific guidelines apply.

Since stakeholders on the donor side of the aid relationship normally have better access to information and often interact closely with the evaluation team, the team shall actively try to balance the input and influence of different categories of stakeholders over the evaluation process and its results. Particular attention should be given to stakeholders who otherwise have little influence over strategy and decision-making in aid, including marginalised groups.

All parts of the evaluation process shall be carried out in accordance with recognised ethical standards. The rights and welfare of all participants in the evaluation shall be protected and informed consent obtained.

When interacting with stakeholders, the team shall behave professionally and respectfully, strive to limit the time and other demands placed on stakeholders, and actively manage expectations so as not to create unjustified expectations of continued assistance.

The evaluation team shall show sensitivity to gender, beliefs, manners and customs of all stakeholders and act with integrity and honesty. The anonymity and confidentiality of individual informants shall be protected when requested and/or as required by law, the context or ethical considerations. Direct references to informants’ statements in reports shall be done in ways that do not make it possible to trace statements to individuals, unless agreed with the informant concerned or unless the statements were made in public.

If the evaluation team during implementation finds any reason to suspect corruption or misuse of Norwegian aid funds, the team shall immediately inform the Evaluation Department or use the ‘whistleblowing’ procedures described on Norad’s website. In such cases, members of the evaluation team shall not consult anyone suspected of involvement in corrupt behaviour.

The inception report

Purpose and basic principles

The purpose of the Inception report is to establish a shared understanding between the evaluation team and the Evaluation Department regarding all parts of the planned evaluation process. It is a crucial step in which the evaluation team clarifies what it has learned about the evaluation object so far, how it understands the scope and the purpose of the evaluation, and how it plans to solve the task.

This will in turn enable the Evaluation Department to correct any misunderstandings if needed, assess whether the evaluation design, choice of methods and other key components constitute an appropriate response to the Terms of Reference, and make a decision regarding approval of the plans. For the team, it will allow proper implementation, monitoring and quality assurance in the remaining stages of the evaluation process.

Process

Upon submission of the Inception report, the Evaluation Department makes a preliminary assessment. If it finds that the report can be shared with stakeholders, it will forward the report to them for consultation. The Evaluation Department will make its own assessment of the Inception report and prepare detailed feedback, and will also compile and forward feedback from stakeholders.

The Evaluation Department may approve the Inception report, request a revised version, approve the report subject to specific conditions, or reject it.

Key data collection activities, including country visits, surveys and/or in-depth structured interviews, should not begin until the Inception report is approved, unless explicitly agreed with the Evaluation Department.

Content and format

The inception report shall describe all key parts of the planned evaluation, including:

  • A brief overview of the information collected to date, identifying information gaps and a strategy to fill those gaps.

  • The overall analytical and methodological approach (evaluation design). An evaluation matrix may be used to show how the evaluation design responds to the Terms of Reference.

  • All methods and techniques planned for data collection, and how they relate to each other, including how triangulation of methods and multiple information sources will be used to substantiate findings and assessments.

    This will include search and selection strategies; sample selection and other key choices made with regard to data collection; and an outline of tools (e.g. surveys, structured interviews).

    These strategies and options shall be presented in sufficient detail for the Evaluation Department to assess their appropriateness. If not all details can be presented at this stage, the report may present examples, conceptual models or a ‘prototype’ of parts of the data collection process or analytical approach instead of a detailed description.

  • Strategies for analysing the collected data.

  • An explicit discussion on the degree to which the proposed methodology will enable conclusions on contribution/attribution of identified results to aid, if relevant to the Terms of Reference.

  • An explicit elaboration and justification of any deviation from the tender document. 

  • A discussion on constraints and limitations, including aspects regarding the independence of the evaluation.

  • Plans for country/project visits when relevant, specifying time period, list of potential informants to be contacted or strategies for selection, and a preliminary schedule for the visit.

    If plans for country visits cannot be laid out in detail at this early stage, the Evaluation Department may ask for a separate submission on such plans at a later stage.

  • A discussion of any particularly sensitive issues, for instance regarding confidentiality, ethical aspects or expectation management, including a plan for how these issues are to be managed.

  • A detailed work plan, specifying the roles and responsibilities of each evaluation/study team member, and a preliminary outline of the final report (see below).

The evaluation report

Basic principles

The evaluation report(s) shall present all information necessary to substantiate the findings, conclusions and recommendations even for readers who are not familiar with the evaluation object. This means, inter alia, that all findings must be traceable to the supporting evidence.

The report(s) shall convey insights in an informative, clear and concise manner, to the extent possible in a form that is understandable even for readers not familiar with the field. Language shall be concise, and the use of abbreviations and acronyms, footnotes and professional terminology kept to a minimum.

The report shall be based on the principles of academic freedom, within the framework of the topics and methods set out in the agreement. This entails that the Evaluation Department cannot instruct the team on any specific formulation, and that the influence of stakeholders on the final product should be minimised and considered only when the evaluation team finds stakeholders’ concerns justifiable.

Still, the Evaluation Department can instruct re-writing of the report when justified by the Terms of Reference, OECD DAC’s Quality Standards, or recognised evaluation or academic principles.

Process

The report shall be submitted in the form of a draft report and a final draft report. Upon approval of the final draft report, the Evaluation Department will produce a final report for publishing.

The draft report shall preferably include numbered paragraphs for ease of reference in the Evaluation Department’s feedback.

The Evaluation Department will, upon a preliminary assessment that finds that the draft report can be shared, distribute the draft report to selected stakeholders. The Evaluation Department will provide its own feedback, and will also compile and forward feedback from stakeholders to the evaluation team.

The team shall consider all feedback on the draft report and prepare a revised draft (the final draft report). Upon verification that comments have been heeded in a satisfactory way, the Evaluation Department will approve a final version.

When the final draft report submitted by the evaluation team has been approved by the Evaluation Department, it will be made available to the public. The report may be given a layout decided by the Evaluation Department, and a preface prepared by the Evaluation Department may be added.

If requested, stakeholder comments shall be included in the final report, for instance in the form of Annexes or footnotes.

Content and format

The report, both in draft and in final draft version, shall include the following if not otherwise agreed:

  • A presentation of the team and the process, including the name(s) of the firm(s) responsible for the report, team leader and team members, and division of labour between the team members. Any unresolved differences of opinion within the team should be acknowledged in the report.

  • A declaration stating, “This report is the product of its authors, and responsibility for the accuracy of data included in this report rests with the authors. The findings, interpretations, and conclusions presented in this report do not necessarily reflect the views of the Evaluation Department”. 

  • Table of Contents.

  • A list of acronyms and abbreviations. 

  • An Executive Summary of no more than four pages, presenting the purpose, the methodology with emphasis on limitations, and key findings and recommendations. The summary shall be easy to understand for non-experts in the field, avoiding acronyms, abbreviations and technical terms. It shall function as a stand-alone text that can be read and understood independently of the rest of the report.

  • Relevant literature, previous research and evaluations.

  • A description of the evaluation object, elaborating (when relevant) the programme/intervention logic and the underlying assumptions (the theory of change), emphasising significant, untested assumptions. 

  • A description of relevant aspects of data collection and analysis (cf. the Inception Report), emphasising constraints and limitations. Specific information such as data collection instruments may be presented in Annexes. Evaluations addressing results (outcomes, impact) shall explicitly address the degree to which it has been possible to establish a causal link between interventions and the assumed outcomes, and shall elaborate which methods have been used. 

  • Explicit elaboration of any evaluation questions for which it has not been possible to reach conclusions or make meaningful assessments, and why. The team shall list the sources sought but not found, and describe what information it would have needed to make an assessment.

  • Any issues relating to the independence of the evaluators from the policy, operations and management functions of the commissioning agent, implementers and beneficiaries. Substantive disagreements expressed by stakeholders with the findings, conclusions, recommendations and lessons learned should be acknowledged.

  • All relevant findings presented in a way that specifies to the readers the supporting evidence behind all findings. It is particularly important to specify if some findings are relatively weaker than others in terms of supporting evidence. Observations or considerations that are not substantiated according to recognised evaluation methods shall not be presented as findings, but may be framed as hypotheses or questions for further study. 

  • Conclusions and recommendations presented in a way that establishes an explicit, logical chain where conclusions build directly on the evidence-supported findings, and recommendations derive logically from the conclusions. If the report includes numerous conclusions, they should be prioritised and/or categorised, for example organised according to institutions that have a natural responsibility for follow-up.

  • Annexes including the Terms of Reference, data collection instruments, a complete list of institutions and persons consulted, literature references, and other information if relevant. If needed in order to protect the privacy and confidentiality of informants, alternative ways of providing lists of informants shall be included.

Literature references, the presentation and numbering of illustrations, and other key elements shall be consistent throughout the report. To ease final layout: refer to illustrations by name rather than by their position on a page, use bold/italics sparingly, and keep to four heading levels in the structure.

Before submission of the final report, the evaluation team shall use the following checklist: 
  • Table of contents is complete

  • All acronyms are explained

  • Executive summary is accessible to non-expert readers

  • Method
    • Clear statement of the analytical framework
    • All assumptions and limitations clearly stated 

  • Data 
    • Clear documentation of the data collection procedures
    • All relevant findings presented and summarised
    • All calculations clearly documented and checked
    • All data sources clearly referenced
    • All bibliographical references complete
    • All tables and figures are referred to by number/title, not by physical placement (e.g. a page number or ‘above’)

  • Analysis 
    • All conclusions supported by well-documented data and evidence
    • Clear and complete statement of the limitations
    • Sensitivity of the conclusions to the assumptions is clarified

  • The report
    • Responds to the TOR, including all evaluation questions
    • Responds to the inception report
    • Responds to stakeholder comments as per the DAC quality evaluation standards
    • Is language-vetted, proofread and formatted according to the guidance given by the Evaluation Department, to facilitate typesetting and a better end result
    • Quality assurance is complete, with any deviations explained

The structure of the report may be as follows:

  • Front page/title page

  • Preface (to be prepared/added by the Evaluation Department)

  • Acknowledgements, including presentation of the firm, the team and a declaration that the report is the product of the authors and not the Evaluation Department 

  • Table of Contents 

  • Acronyms and abbreviations 

  • Executive Summary

  • Introduction or background, including a brief presentation and discussion of the evaluation object and the Terms of Reference

  • Literature review/previous research and evaluations

  • Methodology, giving an overall presentation of the evaluation design, conceptual approaches and key choices regarding data collection and analysis, with a discussion of limitations. More details and data collection tools should be given in Annexes. 

  • Findings presented in one or more chapters

  • Conclusions and recommendations, clearly distinguishing between the two

  • Annexes, including
    • Terms of Reference (TOR)
    • List of institutions and persons consulted including details of the field work if relevant
    • Definitions, data and survey instruments
    • Other information 
    • References
Published 17.02.2016
Last updated 06.03.2017