Monitoring & Evaluation

M&E approach of the NAMA Facility


Monitoring

Monitoring provides a way of measuring how well projects and programmes are being implemented. It generally analyses the efficiency and effectiveness of a project or programme, i.e. it measures actual outputs or outcomes against planned outputs or outcomes. Monitoring is a systematic management activity that analyses data on a continuous basis so that action can be taken if implementation deviates from the expected results. Because it is continuous and allows swift corrective action, monitoring is of key importance for improving performance.


The OECD defines monitoring as ‘a continuing function that uses systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing development intervention with indications of the extent of progress and achievement of objectives and progress in the use of allocated funds’ (OECD, 2002).
Within the NAMA Facility, the Technical Support Unit (TSU) monitors how the overall programme (i.e. the totality of the supported NAMA Support Projects) performs. Individual project results feed into the monitoring of the NAMA Facility project portfolio and are aggregated at the level of the overall NAMA Facility outcome.
The information and insights generated by the TSU’s monitoring form one component of the continuous learning and improvement of the NAMA Facility. Monitoring and reporting is one of the TSU’s core responsibilities. 
NAMA Facility monitoring seeks to answer the key questions outlined below. It is based on NAMA Support Project reporting on the five mandatory core indicators of the NAMA Facility.

  • Are the outputs described in the NAMA Facility logframe being delivered as planned?
  • Will the planned and delivered outputs of individual NAMA Support Projects contribute to attaining the NAMA Facility outcome?
  • What issues, risks and challenges faced by NAMA Support Projects need to be taken into account in the future?
  • What lessons for ongoing and future projects can be learned from the implementation of the NAMA Facility?

 

The mandatory core indicators of the NAMA Facility
The NAMA Facility makes use of five mandatory core indicators. 

Category | Indicator
Mitigation | Reduced greenhouse gas emissions in NAMA Support Projects
Co-benefits | Number of people directly benefitting from NAMA Support Projects
Transformational change | Degree to which the supported activities catalyse impact beyond NAMA Support Projects (potential for scaling-up, replication and transformation)
Public finance | Volume of public finance mobilised for low-carbon investment and development
Private finance | Volume of private finance mobilised for low-carbon investment and development
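
To illustrate how project-level reporting on these core indicators might be aggregated across the portfolio, the sketch below uses hypothetical data structures, project names and figures. The field names, the 1-5 rating for transformational change and the numbers are purely illustrative assumptions and are not prescribed by the NAMA Facility.

from dataclasses import dataclass

@dataclass
class CoreIndicatorReport:
    # One project's report on the five mandatory core indicators (illustrative fields).
    project: str
    ghg_reduction_t_co2e: float         # Mitigation
    people_benefitting: int             # Co-benefits
    transformational_change_score: int  # hypothetical qualitative 1-5 rating
    public_finance_eur: float           # Public finance mobilised
    private_finance_eur: float          # Private finance mobilised

def aggregate(reports):
    # Sum the quantitative indicators across the portfolio; the qualitative
    # transformational change rating is simply averaged here for illustration.
    return {
        "ghg_reduction_t_co2e": sum(r.ghg_reduction_t_co2e for r in reports),
        "people_benefitting": sum(r.people_benefitting for r in reports),
        "public_finance_eur": sum(r.public_finance_eur for r in reports),
        "private_finance_eur": sum(r.private_finance_eur for r in reports),
        "avg_transformational_change_score":
            sum(r.transformational_change_score for r in reports) / len(reports),
    }

portfolio = [
    CoreIndicatorReport("Project A", 120_000, 5_000, 4, 2.0e6, 3.5e6),
    CoreIndicatorReport("Project B", 80_000, 12_000, 3, 1.5e6, 0.8e6),
]
print(aggregate(portfolio))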

Evaluation
The OECD defines evaluation as a ‘systematic and objective assessment of an ongoing or completed project, program or policy, its design, implementation and results. The aim is to determine the relevance and fulfilment of objectives, development efficiency, effectiveness, impact and sustainability’ (OECD, 2002). It is, above all, a learning exercise.
NAMA Support Projects may be evaluated upon request by the Donors.
The NAMA Facility will be subject to at least a mid-term and an ex-post evaluation.

M&E requirements for NAMA Support Projects
All NAMA Support Projects need to present a monitoring plan. In addition to the mandatory core indicators, sector- and project-specific indicators for output, outcome and impact (taken from the project logframe) are tracked for the purposes of measuring progress and reporting. The monitoring plan contains detailed information on the monitoring tasks relevant to a particular project, including the frequency of and responsibility for data collection. Monitoring activities are an integral component of project management, and all monitoring costs must be included in the project budget.
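
As a purely illustrative sketch of the kind of information a monitoring plan entry might capture (indicator, data source, collection frequency, responsibility), the following structure uses hypothetical field names and values; the actual format and content of the monitoring plan are defined by each project, not by this example.

# Hypothetical monitoring plan entries; fields and values are illustrative only.
monitoring_plan = [
    {
        "indicator": "Reduced greenhouse gas emissions (t CO2e)",
        "level": "outcome",
        "data_source": "metered energy use of participating facilities",
        "collection_frequency": "semi-annual",
        "responsible": "Delivery Organisation M&E officer",
    },
    {
        "indicator": "Number of people directly benefitting",
        "level": "impact",
        "data_source": "beneficiary registry",
        "collection_frequency": "annual",
        "responsible": "implementing partner",
    },
]

for entry in monitoring_plan:
    print(f"{entry['indicator']}: collected {entry['collection_frequency']} "
          f"by {entry['responsible']}")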

Project monitoring seeks to answer the key questions outlined below, drawing on the data collected for the project indicators.

  • Are the listed project outputs being delivered efficiently and as planned?
  • What issues, risks and challenges faced by or expected in the project need to be taken into account to ensure the expected results are achieved?
  • What decisions need to be made to adjust the project planning?
  • Will the planned and delivered outputs of the individual NAMA Support Project contribute to attaining the NAMA Facility outcome?
  • Are the envisaged outcomes still relevant and effective in terms of achieving the overall project goal and the desired impact? 
  • What lessons can be learned?

Delivery Organisations are required to send semi-annual and annual reports to the TSU. A final report must be submitted within six months of the end of the project. 


Evaluation at project level
It is expected that at least one evaluation per NAMA Support Project will be carried out. This provides an assessment of the overall project performance and its contribution to the overarching objective of the NAMA Facility. Results from the project evaluations provide another input into the continuous learning process of the NAMA Facility facilitated by the TSU.

Source: OECD. (2002). Glossary of Key Terms in Evaluation and Results Based Management. ISBN 92-64-08527-0. OECD Publications, Paris.
