Terms of Reference

Mid-Term Evaluation of the 1st Call for Proposals of the Energy Facility under the 9th EDF

1          BACKGROUND INFORMATION

1.1         The EU Energy Initiative and the Energy Facility

In August 2002, at the World Summit on Sustainable Development, the European Union launched the EU Energy Initiative for Poverty Eradication and Sustainable Energy (EUEI) to contribute to the achievement of the MDGs, in particular the goal of halving the proportion of people living in extreme poverty by 2015. This is to be achieved through the provision of adequate, affordable and sustainable energy services to the poor.

In response to these objectives in the ACP region, the European Commission presented a Communication[1] in October 2004 on the development of the EUEI and the modalities for establishing an Energy Facility for ACP countries. On 25 June 2005 the ACP-EU Council formally approved the creation of a €220 million ACP-EC Energy Facility.

The long-term overall objective of the ACP-EC Energy Facility (EF) is to contribute to the achievement of the MDGs, in particular the goal on poverty, through increased access to energy services by the poor rural population. In accordance with the broad objectives and areas of action, the three specific objectives are:

  1. Improved access to modern energy services by poor rural people, with priority for the un-served population living in scattered settlements, villages, rural towns, peri-urban areas and remote islands, using the grant funds to leverage additional investment;
  2. Improved governance and management in the energy sector, by strengthening poverty-related policy making in the energy sector and across sectors, the institutional and legal framework, and the capacity of stakeholders; and
  3. Facilitation of future large-scale investment programmes in cross-border interconnections, grid extensions and rural distribution.

To achieve the three specific objectives, three main components can be distinguished:

  • Component 1: actions to increase access to energy services in rural areas;
  • Component 2: actions to improve energy management and governance; and
  • Component 3: actions to improve cross-border cooperation in the energy sector.

1.2         Management of the Energy Facility and justification of a mid-term evaluation

The lion’s share of the EF funds, €196 million, was channelled through a Call for Proposals (CfP) launched in June 2006 and closed in October of the same year. In 2007, after the selection process, 74 projects were awarded a contract for an EF grant. Of these 74 contracts, 3 have been cancelled and 2 have already been terminated. All the others began between one and three years ago and are still being implemented.

In addition to the projects selected through the CfP, the Energy Facility devoted €10 million to activities in preparation for the Africa-EU Infrastructure Partnership. In this context, five technical assistance projects in support of four Sub-Saharan African Power Pools and the African Forum for Utility Regulators (AFUR) were launched in 2008-2009 and are still being implemented.

The Action Fiche of the Energy Facility clearly specifies that a mid-term evaluation of the programme will take place. Given the above-mentioned progress in the implementation of the activities, it is considered appropriate to proceed with a mid-term evaluation. Its conclusions and recommendations would enhance the implementation of the 1st CfP projects and of the upcoming 2nd CfP projects and, ultimately, would provide the information necessary for a possible renewal of the EF's mandate.

This mid-term evaluation will not include the activities undertaken in the context of the Africa-EU Infrastructure Partnership (i.e. technical assistance in support of the African Power Pools and AFUR). These activities will be evaluated later and independently, in collaboration with the ACP Secretariat, and the results of their evaluation will be integrated into the final general evaluation of the EF.

A so-called ‘Mid-Term Evaluation’ of the Water and Energy Facilities was already carried out in 2007. However, since the EF CfP was at an early implementation stage at the time, that evaluation could only cover the selection phase, up to the establishment of the short list of 91 projects. The present evaluation should complete that exercise so that it becomes genuinely instructive, as mentioned above.

2          Objectives and expected results of the evaluation

2.1         Principal objective and purpose of the evaluation

The objective of the mid-term evaluation is to analyse the extent to which the results attained through the implementation of the 1st EF CfP are in line with the general and specific objectives defined in the Action Fiche.

More specifically, the purpose of the evaluation is to draw key lessons in order to improve the relevance, efficiency, effectiveness, sustainability and impact of:

  • The implementation of the 1st CfP and the commencing implementation of the 2nd CfP by DEVCO headquarters.
  • The decision-making process involving the EU Delegations, the European Commission and the ACP Secretariat for energy sector policies, programmes and financing instruments.
  • The continued follow-up of the implementation of the EF project portfolio by the EU Delegations.

The evaluation should cover the operational aspects of the 1st CfP implementation, as well as its programming and management aspects. It will also assess how effectively the recommendations and conclusions of the 2007 ‘Mid-Term Evaluation’ have been integrated into the 1st CfP implementation.

At the end of this study, the Evaluation Team should be able to provide precise and complete conclusions, recommendations and lessons learnt. The conclusions and recommendations should be sufficiently informative so that they can be translated into operational terms. In any case, the conclusions and recommendations should not duplicate those of the 2007 ‘Mid-Term Evaluation’.

In order to ensure that these conclusions and recommendations can be effectively translated into operational terms and take into account the reality as perceived by the different project stakeholders (EU Delegations, applicants, partners and beneficiaries), field missions will be organised.

2.2         Specific objectives of the evaluation linked to the strategic principles for the implementation phase

In the Action Fiche (Annex 1 – modalities), the general and specific objectives of the Energy Facility are complemented by strategic implementation principles intended to optimise the impact of the EF. The EF and the supported projects have therefore been designed according to these principles. The principles are divided into two broad categories:

i) the dominant principles: focus, effectiveness, sustainability and ownership;

ii) the auxiliary principles: efficiency, leverage and scaling-up, innovation.

An analysis of the implementation of these principles will be integrated into the evaluation. It will complement the evaluation, always keeping in mind that conclusions and recommendations should be informative enough to be translated into operational terms.

Finally, the cross-cutting issues mentioned in the Action Fiche (poverty, environmental sustainability, gender equality and good governance) will be tackled in the evaluation, provided that information is already available.

3          Scope of the work

3.1         General overview of the activities scope and of related available information

The mid-term evaluation will focus on the implementation of the 74 projects of the 1st EF CfP financed under the 9th EDF. A list of financed projects is available on the EF website:

http://ec.europa.eu/europeaid/where/acp/regional-cooperation/energy/energy-facility-former-calls/former_calls_en.htm

It will resume the evaluation process where the 2007 ‘Mid-Term Evaluation’ left off, i.e. from the establishment of the short list of 91 projects in the framework of the 1st CfP.

The evaluation will be based on all available relevant documentation on the EF. Key documentation will include the official documents establishing the EF and defining its objectives, organisational set-up and implementation modalities. It will also include project reports, the yearly and quarterly project performance sheets prepared under the current TA contract for monitoring the EF projects, the Results-Oriented Monitoring conducted on 22 EF projects[2], the report of the 2007 ‘Mid-Term Evaluation’, and information contained in the JRC database[3] or on the websites of the EF and the EUEI, including the website on the monitoring of the 1st CfP projects (cf. Annexe 1).

In order to complete the available information, and more specifically information on the quality of the implementation and management of the CfP, questionnaires may be sent to the project managers and/or partners and/or beneficiaries of the EF projects. Other evaluation tools could also be used.

The information shall also be supplemented by field missions. The projects to be visited will be carefully selected through a sampling process.

A Reference Group will be established to follow up this mid-term evaluation (cf. section 5). Particular focus will be placed on sections 3.2 and 3.3 and on the quality of the conclusions and recommendations.

3.2         Evaluation Questions

The evaluation will be based on a set of questions. These questions are intended to give a more precise and accessible form to the evaluation criteria and to articulate the key issues of concern to stakeholders, thus optimising the focus and utility of this mid-term evaluation.

The implementation phase of the 1st EF CfP is still ongoing, and some aspects cannot yet be analysed as thoroughly as others. For instance, it seems too early for a sound measurement of the impact of the activities, even though their potential could be assessed. The relevance of the activities has already been studied in part. Instead, the evaluation should focus on the efficiency and effectiveness that characterise the implementation phase. Another important aspect is to examine whether sustainability can be guaranteed in the light of the ongoing advancement of the projects.

At the management level, the evaluation should also cover the last steps of the selection process of the 1st CfP (such as contract negotiations) that were not assessed by the 2007 ‘Mid-Term Evaluation’, as well as an analysis of the framework set up for the monitoring and assessment of the projects.

Given this variety of potential issues, the Evaluation Questions will allow the study to focus on the most relevant issues at this point in the CfP process.

As a first step, a provisional list of Evaluation Questions will be established by the Reference Group. For each question, Judgement Criteria will be defined.

The selected consultant will receive the provisional list of Evaluation Questions (10 maximum) together with the Judgement Criteria. The consultant will revise the list and propose modifications according to the available information. For each Judgement Criterion, the consultant will define indicators. The indicators should be measurable with existing information or with complementary information that could easily be collected.

For each Evaluation Question, a Design table (cf. Annexe 2) will be drafted.

3.3         Field phase for project analysis

A wide variety of subjects could be addressed during the field phase (e.g. why is one project performing well while others are severely challenged? How do beneficiaries perceive the outputs of the projects? Are the project partners really engaged and involved in the field? Are the reports in line with reality?). The choice of projects to be evaluated will therefore be made after the selection of the Evaluation Questions, once it is known exactly what kind of information is needed from the field to complete the desk studies. This will allow a detailed examination of the relevant elements in line with the evaluation goals.

At this stage, the purpose and expected outputs of the field study will be defined, and a sampling methodology for the projects will be set up and clearly described. The sampling methodology shall be adapted to the purpose of the field study[4]. The methodology may be based on a classification of the projects (e.g. by ROM performance, geographical zone, theme or stakeholders). This classification would highlight selected characteristics of the projects before the sample is defined. The sample should provide a fair representation of the EF's intervention portfolio in the energy sector. The EU Delegations should be consulted on the selection of the projects and agree to their evaluation.
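As a minimal illustrative sketch only, the classification-then-sampling approach described above can be expressed as a stratified draw: projects are grouped by a chosen characteristic (here region of intervention) and the sample takes an equal number from each stratum, guaranteeing that every region is represented. The project identifiers, regions and ROM ratings below are hypothetical placeholders, not actual EF portfolio data.

```python
import random
from collections import defaultdict

# Hypothetical project records: (project id, region, ROM rating).
# These are illustrative placeholders, not real EF projects.
PROJECTS = [
    ("P01", "Western Africa", "good"), ("P02", "Western Africa", "weak"),
    ("P03", "Central Africa", "good"), ("P04", "Central Africa", "good"),
    ("P05", "Eastern Africa", "weak"), ("P06", "Eastern Africa", "good"),
    ("P07", "Southern Africa", "good"), ("P08", "Southern Africa", "weak"),
    ("P09", "Caribbean", "good"), ("P10", "Caribbean", "weak"),
]

def stratified_sample(projects, per_stratum=1, seed=0):
    """Group projects by region, then draw the same number from each
    stratum so every region of intervention is represented."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for proj in projects:
        strata[proj[1]].append(proj)  # classify by region
    sample = []
    for region in sorted(strata):
        sample.extend(rng.sample(strata[region], per_stratum))
    return sample

sample = stratified_sample(PROJECTS, per_stratum=1)
regions = {p[1] for p in sample}
print(len(sample), len(regions))  # one project drawn per region
```

The same pattern works with any other classification (ROM performance, theme, stakeholders) by changing the grouping key; a real sampling plan would of course also weight strata by portfolio size to keep the sample a fair representation.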

The final selection of the projects will be reviewed and validated by the Reference Group.


4          Specific activities: Evaluation Phases and Reporting

The evaluation will be divided into four stages: i) structuring of the evaluation; ii) data collection; iii) analysis; iv) judgement.

These stages will be implemented within four phases: i) desk phase; ii) field phase; iii) report writing phase; iv) communication phase.

4.1         Desk Phase

4.1.1        Starting the evaluation

Prior to embarking on the structuring stage of the evaluation, the team leader will participate in a one-day briefing meeting with representatives of DEVCO headquarters in Brussels, during which the TORs and the structure of the evaluation components will be reviewed and agreed upon with the consultant.

4.1.2        Drafting a Launch note

After the above-mentioned meeting, and prior to embarking on the structuring stage, the consultant should submit a Launch Note to DEVCO headquarters within 5 working days of the launch briefing. The Launch Note will set out: i) the team's understanding of the TORs; ii) the proposed methodology; iii) the proposed allocation of resources (including budget and team CVs); and iv) the proposed activities timetable.

The DEVCO headquarters approval of the Launch Note will mark the official start of the evaluation. Copies of the Launch Note will be addressed to the Reference Group.

4.1.3        Structuring of the study and drafting of desk phase report

After approval of the Launch Note, the Evaluation Team will proceed with the structuring stage of the evaluation in two steps. The first step consists mainly of the following work:

  • Reviewing the institutional and political context of the EF including its origin and its administrative management.
  • Reviewing available information for the evaluation.
  • Preparing a final list of Evaluation Questions and Judgement Criteria.
  • Defining the purpose and expected outputs of the field study.
  • Defining a methodology for the selection and analysis of projects for the field study.
  • Proposing the use of complementary evaluation tools (interviews, surveys, questionnaires, objective diagrams, etc.) in order to complement the information. At least two complementary evaluation tools will be developed.

A first intermediate desk phase report, with explanatory comments, will be sent to DEVCO headquarters and reviewed by the Reference Group. Comments will be provided to the Evaluation Team within 6 working days.

Then the Evaluation Team will proceed with the second step of the structuring phase:

  • Completing Design tables for each Evaluation Question, including methods of data and information collection and analysis.
  • Proposing the first elements to be used in responding to the Evaluation Questions and the first hypotheses to be tested in the field.
  • Proposing a set of projects for the field study, together with a calendar for the missions and a project evaluation sheet.
  • Proposing modifications to the time schedule if necessary.
  • Proposing a table of contents for the final report.
  • Defining the methodology, purpose and operational mode for the complementary evaluation tools.

At the end of this work, a Final draft desk phase report (in English) shall set out the results of this structuring stage, together with explanatory comments. At this stage, a change in the composition of the Evaluation Team may be proposed according to the findings of the desk phase report. The desk phase report will be sent to DEVCO headquarters. Within 10 days of reception of the document, a briefing meeting will be held in Brussels with the Reference Group. The aim of the meeting will be to discuss and comment on the proposed Design tables for each Evaluation Question, the final project selection and any other relevant point.

After the meeting, the Evaluation Team will incorporate comments and within 5 working days from the meeting, submit the final desk phase report for endorsement by DEVCO headquarters.

4.1.4        Desk data collection

Once the desk phase report has been agreed upon, the desk data collection will continue according to the adopted methodologies, evaluation tools and analysis framework.

4.2         Field Phase

Following satisfactory completion of the desk phase report, the Evaluation Team will proceed with the field missions. The field work shall be undertaken on the basis set out in the desk phase report and on the general recommendations made by the Reference Group. The mission calendar should follow the one submitted in the Final desk phase report. The duration of this phase shall not exceed one and a half months. The duration of each mission will be agreed together with the EU Delegations involved and the EF team. The field missions will cover at least 12 projects in 6 countries, spanning the 5 regions of intervention of the EF: Western Africa, Central Africa, Eastern Africa, Southern Africa and the Caribbean. If, during the fieldwork, any significant deviations from the agreed methodology or schedule are perceived as necessary, these shall first be negotiated with the EF. Contact must be made and liaison organised with the EU Delegations prior to each field visit.

Once the field missions are concluded, the team will: (i) give a detailed on-the-spot de-briefing to each EU Delegation on its provisional findings; and (ii) give a general de-briefing on all missions to DEVCO headquarters, with project evaluation sheets, within two weeks of return from the last mission (the exact dates may be changed according to staff availability during the summer period). A PowerPoint presentation will be prepared for the meeting.

4.3         Report writing Phase

The Evaluation Team will deliver a Draft Final Report to DEVCO headquarters according to the agreed calendar, following the table of contents approved in the final desk phase report. Upon acceptance, the report will be circulated for comments to the Reference Group, which will convene to discuss it about 10 days after circulation, in the presence of the Team Leader. A PowerPoint presentation of the report will be given. Based on the comments received from the EF team and the Reference Group, the Evaluation Team will make the appropriate final amendments and submit the final report to the EF team within 10 working days.

Comments requesting methodological quality improvements shall be taken into account, except where this is demonstrably impossible, in which case full justification should be provided by the Evaluation Team. Comments on the substance of the report may be either accepted or rejected; in the latter case, the Evaluation Team must justify its position in writing.

The report shall be of outstanding quality: well written, concise and to the point. The recommendations shall be operational and supported by solid conclusions based on clear judgement criteria, solid and concrete information, and rational argumentation. The conclusions of the evaluation must rest on a rigorous demonstration. If the EF is already reorienting its work to address issues underlined in the evaluation, the report will acknowledge it. The report shall include, as appropriate, tables, maps and graphs in annexes (including illustrations of the information collected on the indicators). The draft and final reports will be provided in English and French. The report will be accompanied by a 3-page summary for decision makers (also in English and French). All final documentation (final report and executive summary) should be submitted in 5 hard copies plus one Word and one PDF electronic copy. All the primary data collected for the indicators of the Evaluation Questions and used to prepare statistics, tables, maps and graphs should be compiled in a clear manner, with sources specified, and delivered on 2 DVDs to the EF team.

4.4         Communication and dissemination phase

After approval of the final report, the EF team will proceed with the dissemination of the results (conclusions and recommendations) contained within the report. The summary for decision makers and the final report will be made available on the website by the EF team.

A short seminar will be organised by the EF team in Brussels, according to the Commission's interest. The Evaluation Team will be invited to present the evaluation's findings, conclusions and recommendations to the EC services and relevant stakeholders. A PowerPoint presentation will be prepared for the meeting.

The EF team, together with the Reference Group, will assess the quality of the Mid-Term Evaluation using the Quality Grid (see Annexe 3). A “fiche contradictoire” (to collect the services' opinions on each recommendation and follow them up) will be elaborated and circulated as necessary.

An article about the evaluation and its main findings will be included in the Newsletter by the EF team. It will include Reference Group’s comments.

5          Management and supervision of the evaluation

Responsibility for the management of this Mid Term Evaluation will lie with the DEVCO headquarters. A project officer within the EF team will be designated as the focal point for management purposes.

The progress of the evaluation will be followed by a Reference Group. The Reference Group will be composed of staff of the Commission and a delegate of the ACP Secretariat. The principal functions of this Reference Group will be to:

  • Establish the provisional list of evaluation questions and follow up the revisions proposed by the Evaluation Team;
  • Comment on the Launch Note of the Evaluation Team;
  • Discuss and provide comments on the desk phase report prepared by the Evaluation Team;
  • Make sure that the Evaluation Team has access to all relevant information sources and documentation of the Commission;
  • Assess the final report and assist in incorporating the conclusions and recommendations of the Mid-Term Evaluation into ongoing and future programme design and implementation.

6          Evaluation Team

The contractor is required to provide a team composition corresponding to the team profile and experience requested below. The team will be guided by a Team Leader who will be responsible for team management and overall quality control with regard to the technical, financial and linguistic contribution by the experts.

The Evaluation Team shall possess expertise in the following areas:

1) Proven knowledge and experience in external development co-operation in African and Caribbean states (ACP group) at the levels of policy, programming and implementation, with a particular focus on energy, in particular energy access/access to modern energy services.

2) Good knowledge of EC management procedures under EDF rules (including procurement issues and call for proposals procedures).

3) Proven expertise and capacity in conducting evaluations of EU external development co-operation, if possible at a sector level.

4) Expertise and capacity in stakeholder analysis.

5) Expertise and capacity in project evaluation methods in field situations.

A demonstrated capacity to analyse the cross-cutting issues of gender, environment, governance and human rights would be an asset.

The team must be prepared to work in English and French, and possess excellent drafting skills. Additional knowledge of Portuguese and Spanish would be a strong asset.

Team leader – Senior Expert – 46 days – Maximum 1 person

Minimum Requirements:

The Team Leader shall be a senior specialist in development cooperation in ACP countries, with at least 5 years of experience in energy access/access to modern energy services in rural and peri-urban zones of developing countries. The Team Leader must also cover areas of expertise 2 and 3. Working knowledge of French and English is required, as is demonstrated previous experience in team leadership or team management.

Additional requirement

Expertise in areas 4 and 5 would be an asset.

The team leader will be responsible for ensuring the outstanding quality of the team's outputs and will pay special attention to the quality of the drafting of the final report.

 

Field phase experts – Senior Experts – 84 days in total (18 for the preparation and 66 days for the field phase) – Maximum 5 experts – Minimum 3 experts.

Minimum requirements:

For the field phase, 3 to 5 senior experts shall be appointed. Each expert shall be a senior specialist in the field of cooperation, with a minimum of 5 years of specific experience in energy access/access to modern energy services in rural and peri-urban zones of developing countries, at the levels of programming and implementation.

Each expert must have a very good knowledge of at least one of the following ACP regions of intervention: Western Africa, Central Africa, Eastern Africa, Southern Africa and/or the Caribbean.

Together, the experts must cover all 5 regions. Each expert will be assigned to one or two regions of intervention (the ones they know best) and must speak the main languages of those regions.

Additional requirements:

Knowledge of project evaluation methods in field situations and of EC management procedures and evaluation methods would be a strong asset.

Junior Experts – 100 days – Maximum 2 persons

The team leader and the field experts will be assisted for the desk work by junior experts.

Minimum requirements:

Each junior expert must cover area of expertise 1 and/or 3 and speak English and/or French.

The two junior experts together must cover areas of expertise 1 and 3.

The team composition will initially be agreed between the contractor and the EF team but may be subsequently adjusted, if necessary, due to the findings of the desk phase report (including the final selection of countries for the field phase).

Categories are defined as in the Global terms of reference. All experts should have at least a Master's degree relevant to the assignment; otherwise, four additional years of relevant experience are required on top of the number of years requested for each category of expert.

To avoid any conflict of interest, experts who have been involved in the implementation, monitoring or evaluation of projects covered by this evaluation are excluded from this assignment.

A declaration of absence of conflict of interest should be signed by each consultant and annexed to the launch note.

7          Timing

The evaluation will start at the beginning of June 2011, with completion of the Final Report scheduled for the end of October 2011. The following indicative schedule applies (the dates mentioned in the table may only be changed in view of optimising the evaluation performance, and with the agreement of all concerned):

| Evaluation Phases and Stages | Notes and Reports | Dates | Meetings or Comments |
| --- | --- | --- | --- |
| Selection process | | April | RG comments |
| Starting Stage | Launch Note | Beginning of June | Kick-off meeting – EF with the team leader – HQ |
| Desk Phase | | Starts June | |
| Structuring Stage | | June | |
| Intermediate desk phase report | Draft | Mid June | RG and EF comments |
| Draft Final desk phase report | Draft | End of June | RG meeting with the team leader – HQ |
| Final desk phase report | Final | Beginning of July | |
| Desk compilation of data | | June to August | |
| Field Phase | | July and August | |
| Preparation of the field phase | | Until mid July | |
| Field Missions | | Beginning of July to end of August | |
| Presentation | PowerPoint | Beginning of September | RG meeting with the team leader and one field evaluator – HQ |
| Final Report-Writing Phase | Draft Final Report | August and September | |
| 1st draft Final Report | Draft | Fourth week of September | RG meeting with the team leader – HQ |
| Final Report | Final | Mid October | |
| Seminar | PowerPoint | End of October | Team leader – HQ |

8          Administrative information

The present contract is a Global Price Based Commission Framework contract 2011, under lot 1. The cost of the evaluation will not exceed 199 999 €.

The offer prepared by the contractor shall include the methodology foreseen for the implementation of the present contract (5 pages maximum).

Reimbursable items authorized include per diem, visas, international travelling, translation and local travelling.

Payment modalities shall be as follows: 40% on acceptance of the desk phase report; 40% on acceptance of the draft final report; and 20% on acceptance of the final report.

Annexe 1 – Key documentation

 

The following list of documents and websites is indicative and by no means exhaustive. The Evaluation Team is requested to take into account any other documents relevant to the present evaluation.


 

Annexe 2 – Structure of a design table

 

 

 

| Question | Text of the question |
| --- | --- |
| Comments | Why is the question asked? |
| Scope | What does the question cover? |
| Judgement criterion / criteria | How will the merits and success be assessed? |
| Indicator(s) | Which data will help assess the merits and success? |
| Target level(s) | Which level or threshold can be considered a success? |
| Chain of reasoning | Steps of reasoning planned to answer the question: quantifying/qualifying the indicators; analysing the information; formulating the value judgement |
| Analysis strategy | Type(s) of analysis to be applied |
| Investigation areas | Areas where data are to be collected and analysed |
| Information sources and tools | What will be the origin of the data? |

Annexe 3 – Quality Grid

The draft and final versions of the Final Report will be assessed using the below “quality grid”.

For each of these criteria, the evaluation report is rated: Unacceptable / Poor / Good / Very Good / Excellent.

1. Meeting needs: Does the evaluation adequately address the information needs of the commissioning body and fit the terms of reference?
2. Relevant scope: Is the rationale of the policy examined and its set of outputs, results and outcomes/impacts examined fully, including both intended and unexpected policy interactions and consequences?
3. Defensible design: Is the evaluation design appropriate and adequate to ensure that the full set of findings, along with methodological limitations, is made accessible for answering the main Evaluation Questions?
4. Reliable data: To what extent are the selected primary and secondary data adequate? Are they sufficiently reliable for their intended use?
5. Sound analysis: Is the quantitative and qualitative information appropriately and systematically analysed according to the state of the art so that Evaluation Questions are answered in a valid way?
6. Credible findings: Do findings follow logically from, and are they justified by, the data analysis and interpretations based on carefully described assumptions and rationale?
7. Validity of the conclusions: Does the report provide clear conclusions? Are conclusions based on credible findings?
8. Usefulness of the recommendations: Are recommendations fair, unbiased by personal or stakeholders’ views, and sufficiently detailed to be operationally applicable?
9. Clearly reported: Does the report clearly describe the policy being evaluated, including its context and purpose, together with the procedures and findings of the evaluation, so that information provided can easily be understood?
Taking into account the contextual constraints on the evaluation, the overall quality rating of the report is considered:


[1] COM (2004)711

[2] ROM is an external monitoring exercise performed each year on a number of selected projects according to a precise methodology. It is commissioned by Headquarters with a view to enhancing the follow-up of the projects and the quality of the work.

[3] The JRC was appointed by the EF to implement a database on the characteristics of the projects financed by the Facility.

[4] For instance, we should not use the same sampling methodology if we want to compare projects or to have an overview of the CfP projects on a specific issue.