Using Evaluation to Inform Education Policy in South Africa


By Leticia Taimo, Margie Roper, and Zamokuhle Thwala

Khulisa is pleased to announce that our post about using evaluation to inform education policy in South Africa was recently published as part of the #Eval4Action “Walk the Talk” blog series, featuring lessons and reflections that inspire greater action for influential evaluation in the Decade of Action. The full text of the blog post appears below.

Even before the COVID-19 pandemic struck, South Africa’s education system was facing tremendous challenges in providing quality education in the majority of the country’s schools. As Khulisa has reported in early grade reading evaluation reports, the basic education system in South Africa (Grades 1 to 12) consistently performs poorly in international assessments. In 2016, the Progress in International Reading Literacy Study (PIRLS) found that 78% of South African Grade 4 learners were not reading for meaning. In other words, roughly 8 out of 10 South African children do not learn to read for meaning in the early years of school. [1]

The COVID-19 pandemic, with its successive lockdowns, school closures, rotational timetables for learners, and strict social distancing protocols, has intensified these challenges tremendously. In the wake of the pandemic, evaluating teaching methods and learner outcomes has become more critical than ever to address learning losses and build back better. At the same time, the pandemic has made evaluations more difficult to conduct; more than ever, they must be flexible and adaptable to be effective.

Conducting actionable evaluations during a pandemic: What we’ve learned

Khulisa and its partners undertook an assignment focused on evaluating early grade reading; creating language benchmarks for learners in two languages; and researching the social-emotional effects of COVID-19 on early grade reading, learning, and teaching. The circumstances of the pandemic, while challenging, have created unique opportunities for innovation and accelerated the demand for, and immediate use of, evaluation data.

The bulk of the data collection took place in September 2021, between two devastating waves of COVID-19. Conducting this work during such a difficult time for the South African education system taught us several important lessons about how to make our evaluations accessible and immediately useful to policymakers:

  1. While conducting a high-stakes evaluation, it is crucial to build and maintain a strong relationship with the client/partner, in this case, the South African Department of Basic Education (DBE) and the United States Agency for International Development (USAID). The DBE was very interested in the results of these evaluations and played a central role in evaluation design, instrument development, and the selection and training of fieldworkers. Constant communication and joint planning with the DBE built trust and buy-in, which increased the likelihood that the DBE would adopt the recommendations emerging from our evaluations. For example, in February 2022, informed by the recommendations from Khulisa’s research and other studies, the DBE and the Government of South Africa changed COVID-19 regulations to allow learners to return to school full-time, ending rotational learning.

  2. Evaluators must be agile and flexible in choosing data collection methods that are responsive to their clients’ needs. At the proposal-writing stage, since it was uncertain when and how schools would reopen, Khulisa intentionally included a data collection method that did not require a physical presence in schools: we contracted GeoPoll to conduct Computer-Assisted Telephonic Interviews (CATI) with school management teams, teachers, and parents, which ensured we could collect the data we needed amidst the uncertainty of pandemic school closures.

  3. Breaking data collection into multiple points within the span of the evaluation helps increase the data’s usability. In our case, we broke data collection into three phases: 1) collecting data from school leaders; 2) collecting data in schools (whenever we were allowed back in); and 3) collecting data from parents. This approach helped us be strategic about which questions to ask, when, and of whom, avoiding duplication of effort and maximizing our evaluation insights (see the sketch after this list). Our phased approach also allowed the client to receive evidence in a timely fashion and act in the moments that mattered most.
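
To make the point about avoiding duplication concrete, here is a minimal sketch in Python of how a phased data collection plan might be represented and checked so that no question is fielded more than once across phases. The phase names, question identifiers, and the check itself are hypothetical illustrations, not the actual instruments or tooling used in this evaluation.

    from collections import Counter

    # Hypothetical phase plan: each phase maps to the set of question IDs
    # fielded to that respondent group.
    phases = {
        "phase_1_school_leaders": {"Q01_enrolment", "Q02_rotation_schedule"},
        "phase_2_school_visits": {"Q03_reading_assessment", "Q04_observation"},
        "phase_3_parents": {"Q05_home_reading", "Q06_learner_wellbeing"},
    }

    # Count how often each question ID appears across all phases.
    counts = Counter(q for questions in phases.values() for q in questions)

    # Any question assigned to more than one phase signals duplicated effort.
    duplicates = [q for q, n in counts.items() if n > 1]
    if duplicates:
        raise ValueError(f"Questions fielded in more than one phase: {duplicates}")

A plan structured this way also makes it easy to see, at a glance, which respondent group answers which questions in which phase.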

Moving forward

This project was implemented during the COVID-19 pandemic, but the lessons learned apply to all evaluations, not only those conducted during a crisis. Constant engagement with key stakeholders and flexibility in project implementation are important for dealing with unexpected challenges in every project. It is therefore essential to build sufficient time and flexibility into the evaluation budget and schedule, and to be intentional about creating opportunities for key stakeholders to be involved in the evaluation process and to act on the results.

The COVID-19 research portion of Khulisa’s project with the DBE and USAID has now concluded, and our team succeeded in accelerating policymakers’ demand for, and use of, the resulting evidence.

We also learned that, as evaluators, we should always consider innovative ways to be responsive to client needs and provide timely data; doing so increases interest in the evaluation findings and, ultimately, the use of the evidence. We will carry these lessons forward into our future evaluations.

The other components of this project (the language benchmarks and two impact evaluations) are still underway, and we look forward to seeing how evidence emerging from this phase is used for action.

[1] Spaull, N. (2017). The unfolding reading crisis: The new PIRLS 2016 results. Available from: https://nicspaull.com/2017/12/05/the-unfolding-reading-crisis-the-new-pirls-2016-results/
