by Evidence Action’s Sasha Gallant and Julie Wang’ombe
In late 2018, our Evidence Action team attended the inaugural Teaching at the Right Level conference in South Africa, hosted by pioneers in the field, Pratham and J-PAL. The conference brought together policymakers, researchers, and implementing partners from across the continent to exchange lessons and ideas to advance what is now one of the most rigorously evaluated, proven approaches to improving foundational literacy and numeracy – “Teaching at the Right Level” (TaRL). It was a great opportunity to learn from and contribute to a range of rich discussions on developing, implementing, and monitoring TaRL models, and it left us even more excited to be part of the growing TaRL community working to equip children with the basic skills they need to succeed.
Through our Beta incubator, we’ve now spent several years working with the Government of Kenya to co-design and implement a youth volunteer-led Teaching at the Right Level program, which can be expanded to other countries under our broader education initiative, Winning Start. Our experience in Kenya has provided many lessons on how to target and recruit highly-motivated, high-performing youth volunteers and keep them engaged and motivated throughout the volunteering period.
On a panel with Young 1ove, After School Game Changer, and IPA Ghana – organizations piloting variations of youth- or volunteer-led TaRL models across Africa – our Program Coordinator, Fred Abungu, shared what we’ve learned from working with the Government of Kenya to effectively and sustainably recruit, retain, and motivate volunteers to deliver remedial support at steadily increasing scale.
In this post, we’ll explore some of the insights he offered, briefly discussing why the youth volunteer-led TaRL model was the right fit for Kenya before delving into how we’ve worked with the government to target and recruit steadily growing numbers of volunteers.
Why youth volunteers?
Several models have been tested and shown to be effective in delivering TaRL. In some, government teachers are trained to apply the pedagogy, while others equip volunteers to support students in a way that complements the work of teachers. The success of each approach is highly context-dependent and driven by the opportunities for, and constraints of, implementation at scale.
In an upcoming blog, we’ll elaborate on the case for using youth volunteers to deliver TaRL remedial sessions. In brief, Winning Start is designed around evidence showing that the volunteer-led, out-of-school support model is the most effective at improving learning outcomes, producing average learning gains of over 22 percentage points in reading in India and test score increases of over 6% in Ghana, for example. In Kenya, the volunteer-led model offered other potential gains for the government, including an opportunity to improve the work-readiness of underemployed, post-university youth and to enhance their cultural tolerance by deploying them to live and serve outside their home counties.
Despite its potential for impact, the volunteer-led model isn’t without its challenges, chief among them the question of how to recruit and retain volunteers. It’s a critical issue: low recruitment rates limit the potential for scale, while high volunteer turnover can reduce a program’s cost-effectiveness by forcing continuous recruitment and retraining of new volunteers. Meanwhile, recruiting volunteers who lack motivation and commitment can dampen program impact if they engage children inconsistently or ineffectively. Well-targeted recruitment and retention efforts are therefore central to the model’s cost-effectiveness at scale.
How do we attract and retain the right candidates?
Over the last four years, our TaRL collaboration with the Government of Kenya’s national youth service initiative – G-United – has deployed over 2,000 youth volunteers who have reached over 40,000 learners. Each year, we’ve deployed more volunteers across the country, from 150 at the program’s inception to over 1,200 in 2018. The program expects to deploy another 1,600 volunteers to reach an additional 40,000 learners in 2019 alone.
Since 2014, we’ve tested several platforms for targeting and recruiting youth volunteers. With initial cohorts, the program focused primarily on advertising through more traditional channels – print and broadcast media. In cohort three, we combined traditional advertising with digital media marketing to better reach tech-savvy youth who typically have a smartphone, an internet connection, and an active social media presence, and who consume news far more through digital media than through newspapers. The move was also driven by aspirations for greater cost-effectiveness and impact. If digital media marketing attracted large numbers of quality applicants, it could allow us to “economize without compromise” – one of our core values at Evidence Action – by substituting for more costly traditional media investments. Further, we expected that sharing the recruitment opportunity online, with a direct link to the application page, would increase the likelihood of youth following through on their application, rather than having to remember to apply after encountering a print advert (we’ve since introduced the ability to “save and return” to an application, with the same goal of retaining promising applicants at the recruitment stage).
Social media campaigns on Instagram, Twitter, and Facebook allowed us to target messages to our best prospects based on age, geography, and interests, and to leverage our volunteers’ own social networks to attract applicants. Following these low-cost investments in digital marketing, we saw applications rise by 75% in one year and more than double the next – from 2,000 in cohort two to 3,500 in cohort three and 8,000 in cohort four.
Building on our digital marketing efforts, we’re now integrating behavioral science evidence into our recruitment processes, using it to inform our messaging. Earlier this year, one of our partners, the Busara Centre for Behavioral Economics, helped us understand volunteers’ key motivations for joining G-United. The exploratory analysis showed that most applicants’ motivations fell into three categories: “prosocial” (seeking to serve and “give back”), “career-oriented” (seeking professional advancement), or “adventure-seeking” (seeking exploration and adventure). In our last recruitment drive, we developed key messages highlighting each theme for use in our recruitment materials (see above for a sample advertisement aligned to the “prosocial” motivation). In total, the drive attracted 13,500 applicants for cohort five, which will deploy in early 2019.
We’ve also learned the importance of keeping lines of communication with applicants open across multiple channels throughout the recruitment process. Joining G-United requires ongoing, demonstrated commitment from volunteers – but they also expect transparency and regular updates from the program to keep them informed and engaged. Applicants who aren’t sure of next steps can easily lose morale and discontinue the process; similarly, they may start an application with enthusiasm but need a nudge to follow it through to completion. Following submission of an application, we stay in regular contact with applicants through email and SMS updates. Because most applicants don’t regularly check their email, SMS reminders offer an easy, low-cost way to keep candidates connected and up to date regardless of their internet access (which is also why we regularly use SMS-based platforms not just for communicating with potential volunteers, but also for collecting data from volunteers once they begin their service). Applicants can also call a hotline through which the program team offers a more personal touch, answering questions and solving issues in real time.
Given most applicants are active on social media, we use these platforms – particularly closed, targeted Facebook groups – to connect applicants to one another early in the process and to share updates with them. This gives applicants a forum to begin bonding with their potential cohort, allowing them to connect with and learn about others interested in the program and to develop a sense of community before coming on board. However, we’re cognizant that there may be costs to this approach: social media also provides a platform where negativity, rumors, and misinformation can spread quickly, and it needs to be constantly monitored so that emerging issues are addressed promptly. Managing volunteers’ expectations may also prove tricky, particularly as some candidates move forward in the recruitment process while others do not. In the spirit of “iterate, again” – another core Evidence Action value – our experience using social media in the cohort five recruitment process will inform whether and how we continue to integrate digital platforms into recruitment for future cohorts.
Overall, we’ve substantially improved our targeting efforts through a series of small tweaks, increasing both the number and quality of applications to the G-United program. Iterative improvements have also enabled us to design a cost-effective recruitment process: we now spend just $0.69 per eligible applicant. We’re still working to better understand our target youths’ motivations and aspirations, and exploring how to leverage these insights to improve not only recruitment outcomes but also retention and performance among deployed volunteers. In future blog posts, we’ll share more lessons from this ongoing process, which we hope will continue to drive cost-effectiveness for the program while priming it for scale. Stay tuned!
You can read more about Evidence Action here and donate to support their work here.