This article was written by Dr. Belay Hagos, Associate Professor of Education and Director of the Institute of Educational Research at Addis Ababa University, and Fitsum Mulugeta, Education Economist at The World Bank. It was originally published on the Research Technical Assistance Center’s Together for Early Childhood Evidence site on 1 April 2021.
Enrollment in Early Childhood Education (ECE) in Ethiopia expanded ten-fold over the past decade as the Ethiopian government and development partners increased their investment in the sub-sector. To complement this expansion in access with improvements in quality, stakeholders need data to understand where they started and to track progress. This is the story of how, through a partnership with the government, we helped develop an early childhood measurement system that will soon be scaled across the country.
Early childhood education in Ethiopia consists of O-Class, a one-year preparatory program for six-year-old children in the year before they enter grade 1. In 2015, Ethiopia’s Education Sector Development Programme V prioritized government-led pre-primary education. As of 2020, Ethiopia’s gross enrollment rate for pre-primary was 45 percent, a massive increase from less than 2 percent in 2000. Yet quality pre-primary education remains inaccessible for many Ethiopian children.
In 2018, we began working as a consortium of researchers from Addis Ababa University and the REAL Centre at the University of Cambridge to collect data on the quality and outcomes of early childhood education in Ethiopia. With funding from the World Bank’s Early Learning Partnership (ELP) Systems Research Initiative, we adapted the Measuring Early Learning Quality and Outcomes (MELQO) tools to the Ethiopian context. We collected baseline data in 2019 and recently finished our second, end-line round of data collection. Readers will have to wait a few more weeks for the results, but we share here some reflections on the process of building Ethiopia’s first national assessment of pre-primary education in eight local languages (Amharic, Afan Oromo, Af Somali, Berta, Hadiysa, Sidamu Afu, Tigrinya, and Wolaittato).
Adapting the MELQO tools to the Ethiopian context
The MELQO tools were selected because they have shown good results in other countries with contexts similar to Ethiopia’s, and they were already being used by countries engaged in the ELP Systems Research Initiative.
The leadership of the National Educational Assessment and Examinations Agency (NEAEA) appreciated the introduction of the MELQO tools in Ethiopia. They identified a gap in national assessments for the early years (with the Early Grade Reading Assessment (EGRA) starting at Grade 2 and national learning assessments at Grade 4) and found the MELQO modules to be suitable for adaptation. We collaborated with the NEAEA on every step of the process, from tool development and adaptation to collecting and analyzing the MELQO data. We held an adaptation workshop with the Ministry of Education (MoE) and Regional Education Bureaus and enlisted their help with translation. The participants included experts from the curriculum, assessment, and ECE departments of the MoE.
Some examples of how this adaptation process worked:
- We revised some of the items in the MELQO tools to align with the minimum learning competencies (MLC) expected from O-Class. For instance, the Measure of Development and Early Learning (MODEL) originally included an item that asked children to count to 30. The curriculum team suggested revising the count to 20, as that is what was required in the MLC.
- Identifying high-frequency letters (‘fidel’) in all the languages would have taken a lot of time. Fortunately, the USAID-supported EGRA had already identified these, so we were able to adapt them directly.
- Initially, the MELQO tools were piloted in six regional states (Amhara, Benishangul-Gumuz, Oromia, Southern Nations, Nationalities, and Peoples (SNNP), Somali, and Tigray) and, therefore, in six local languages. We held validation workshops with representatives of the NEAEA and MoE and shared the results of the MELQO pilot study, which indicated that the direct assessment component of MELQO was reliable and valid in the Ethiopian context.
After adapting and translating the modules, we conducted the initial pilot in 2018 with more than 1,000 children in a random sample of schools. We found the MELQO tools to be relevant and valid, and thus we were ready to launch the nationally representative study.
Conducting Ethiopia’s first study of early childhood quality and outcomes
From November to December 2019, we collected baseline data on 3,214 children in the six regional states. We included both children enrolled in O-Class and children in the same communities who were not enrolled in any ECE program. To identify the non-O-Class children, the data collectors went into the communities where the randomly selected schools were located and asked whether there were children under age 7 who did not go to school. We made a list of names and phone numbers and then took a random sample from each village. About one-third of our sample was not enrolled in O-Class. Our end-line data collection took place after schools reopened in October 2020, following the six-month shutdown that began in March 2020 due to the COVID-19 pandemic.
Our partners in the MoE and NEAEA have been very receptive to assessing early learning with the piloted MELQO tools and are now preparing to collect data from a nationally representative sample starting in the next academic year (2021/22). This will take the NEAEA another step forward in educational assessment in Ethiopia.
How did we take a small pilot and turn it into a government-led, national assessment?
First, it’s important to acknowledge our longstanding and smooth relationship with the government. Addis Ababa University and Dr. Belay have been working with the government since 2010 to develop a national policy framework for ECE. This continuous engagement between the government and the local university over many years was the foundation of this effort.
Second, the NEAEA and MoE were our partners from the very start. They were involved in decisions on the study design, tools, and languages. They recruited data collectors and took part in the training themselves. As a result of this partnership, they trust and accept our research results and are willing to incorporate the findings into their policy design.
Our work has not been without challenges. With the outbreak of violence in the Tigray region, we could not collect end-line data there. Early on, we also faced pushback from senior officials who said, “We know the quality is very poor. Why do we need to measure it?” We countered that we need to know where we started in order to improve our system. We can use the data from this work to inform teacher training and curriculum engagement. If we only start measuring two or three years down the road, how will we know whether we have made progress?
Increasing access and improving quality in a national ECE system are sometimes considered competing priorities, where focusing on one comes at the expense of the other. We have shown that through partnerships between the government, national universities, international research centers and development partners, it is possible to work on both at the same time!
Acknowledgments: We would like to thank the MELQO global team in general, and Dr. Abbie Raikes, Rebecca Sayre, Dr. Tricia Kariger, and Dr. Dawn Davis in particular, for their significant contributions to building the capacity of the assessment team in Ethiopia. We would also like to thank the leadership and the assessment team of the National Educational Assessment and Examinations Agency (NEAEA), with particular appreciation to Yilkal Wondimeneh, Abiy Kefyalew, Asefa Leta, and Aregawi Gidey for their commitment from day one, their active involvement in the capacity-building process, and ultimately for instituting the assessment of early learning using the MELQO tools within the NEAEA’s assessment system, with the full support of the leadership. Kate Anderson provided editorial support for this blog.