
Lessons from Cape Town: 5 tips to recruit study participants from vulnerable populations

This post follows up on our previous post presenting the results of a long-term follow-up evaluation of the SNHU-Kepler program in Rwanda. The evaluation demonstrated that a blended learning program with a curriculum tailored to local labour market needs, plus in-person career coaching, may help youth in sub-Saharan Africa bridge the skills gap and improve labour market outcomes. In this post, we share what we learned about recruiting vulnerable groups while constructing our comparison group for an evaluation of a blended learning model in Cape Town, South Africa. In the next post, we will describe how we were still able to learn about the program in South Africa despite differences between the treatment and comparison groups.

— — — — — — — — —

Southern New Hampshire University (SNHU) aims to provide high-quality, low-cost university education to students who may otherwise not have access to tertiary education, using a blended learning model that combines online coursework with in-person instruction. The model was established in Rwanda in 2013 in partnership with Kepler.

From 2013 to 2019, we evaluated the program in Rwanda and found that SNHU-Kepler students performed better on academic skills tests and had better employment prospects than a matched comparison group of students at other local universities. In 2018, SNHU expanded its program to refugee and asylum seeker communities, including in Cape Town, South Africa, which has a diverse migrant population. IDinsight worked with SNHU and the Scalabrini Centre of Cape Town to evaluate the program’s impact on student skill development during the first year of implementation. This research will contribute to a broader understanding of whether such a program can have a similar impact in different contexts, and aims to help raise further funding for effective education programs for vulnerable populations in Cape Town.

How we recruited a comparison group in Rwanda

In Rwanda, we ran an impact evaluation with a matched comparison group design to test the program’s effects on student learning. We went to university campuses to sign up first-year students who had not heard of the SNHU-Kepler program. Offering a mobile top-up as a participation incentive, we then asked these students to take a demographic survey and write baseline academic tests simulating the SNHU-Kepler admissions process. Our final matched comparison group consisted of students who had similar demographics and backgrounds (including age, gender, previous education, and employment) and similar baseline learning levels (based on test scores) to SNHU-Kepler students. In effect, the comparison group included students who were similar to SNHU-Kepler students and would have been accepted to SNHU-Kepler had they heard about the program and applied. In Rwanda, many local university students met these criteria.
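To make the matching step concrete, here is a minimal sketch of one-to-one nearest-neighbour matching on standardized covariates. The column names and data structures are hypothetical, chosen to mirror the characteristics described above; this illustrates the general technique, not IDinsight’s exact procedure.

```python
# A minimal sketch of one-to-one nearest-neighbour matching on standardized
# covariates. Column names are hypothetical (age, gender, education,
# employment, and a baseline test score), not IDinsight's actual variables.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

COVARIATES = ["age", "female", "years_education", "employed", "baseline_score"]

def match_comparison_group(treated: pd.DataFrame, pool: pd.DataFrame) -> pd.DataFrame:
    """For each treated student, pick the most similar student from the pool."""
    # Standardize covariates with the pooled mean and standard deviation so
    # that no single covariate dominates the distance metric.
    combined = pd.concat([treated[COVARIATES], pool[COVARIATES]])
    mu, sd = combined.mean(), combined.std()
    t = (treated[COVARIATES] - mu) / sd
    p = (pool[COVARIATES] - mu) / sd

    # For each treated student, find the nearest pool student by Euclidean distance.
    nn = NearestNeighbors(n_neighbors=1).fit(p.values)
    _, idx = nn.kneighbors(t.values)
    return pool.iloc[idx.ravel()]
```

In practice, a matching design also involves choices this sketch glosses over, such as matching with or without replacement, distance calipers on baseline scores, and balance checks after matching.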

Given the success of this strategy in Rwanda, we planned to conduct the same process in Cape Town.1 However, appropriate comparison students from refugee and asylum seeker communities in the Cape Town area were scarcer and harder to find, which required us to adjust our recruitment approach.

How we recruited a comparison group in Cape Town

We found that students from these communities differed from typical local university students in two ways: they were less connected to the university community, and they placed less trust in external studies than their peers. These students generally face more obstacles to integration on campus and in the wider community because of their immigration status. IDinsight also did not have pre-existing relationships with Cape Town universities, which made building networks and trust more difficult.

Although students with refugee and asylum seeker status were listed on university registers, we had a hard time identifying enough students through traditional university recruitment methods such as flyering and advertising at student events, even when offering monetary participation incentives. Many students were also unwilling to provide basic information about themselves because they did not trust how their personal information would be used. We had to improvise and iterate on our approach to recruit enough participants. Though we failed to recruit an entirely suitable comparison group (as an upcoming post will describe), we were still able to generate insights on the program’s effect on learning. This post covers five lessons we learned through this process.

Our recommendations for recruiting vulnerable populations:

1. Connect with influential community members

Our first successful recruitment strategy was to identify prominent community members who could act as liaisons between us and hard-to-reach students. We approached SNHU, Scalabrini, and registered campus groups for recommendations, since they had more knowledge of university ecosystems. For example, we contacted presidents of international student societies, who helped refer students to the study. In some instances, community leaders who saw us at recruitment events approached us and volunteered to assist: at one recruitment fair we met a pastor from a local church who referred a number of eligible students to the study. Critically, these “connectors” not only referred students to the study but also explained its aims and helped build trust with students.

Relying on such highly networked individuals and locations to identify eligible participants can dramatically improve recruitment efficiency as well as researchers’ credibility. At the same time, this approach may mean missing out on certain types of students. Students who are involved in extracurricular activities or community groups might differ from less-involved students in ways that are related to learning outcomes (for example, they may be more connected to people who can help them through academic challenges, and thus learn new material more easily on average). As such, it is worth complementing this strategy with other approaches aimed at including people with fewer community connections.

2. Use word-of-mouth networks to build trust

We initially recruited students using methods with little personal interaction: sign-ups at orientation fairs, posting on social media, disseminating fliers, putting up posters, and advertising through university email blasts. We found that even when students were aware of the study, many with refugee or asylum seeker status were reluctant to participate because they perceived a higher personal risk of being negatively targeted over their immigration status by peers, institutions, or the public. Some students mentioned they had experienced xenophobic interactions before.

Establishing a greater sense of credibility with these students was important to counteract this sense of risk. At the end of testing and during breaks, we encouraged participants to refer friends and family who were local first-year university students. We found that students who heard about the study from a friend or family member were more inclined to participate. In fact, word-of-mouth proved to be the most effective recruitment method when paired with monetary participation incentives. In several instances, participants who had already completed the study brought their friends to participate in the next day’s testing.

3. Adjust and frame incentives appropriately

Initially, we offered a R200 (~US$13.40 at the time) cash incentive plus an 8GB flash drive to students who completed the academic testing to become part of the evaluation, but too few students signed up. Students with refugee or asylum seeker status in the Cape Town area, where the job market is relatively strong, generally came from low-income households and held part-time jobs or other commitments to support their families and their education. Their participation depended on an incentive amount that matched the cost of forgoing these other money-earning activities. We ultimately offered R350 (~US$23.50) for participation and a R25 (~US$1.70) incentive if a student referred a friend or acquaintance to the study. We increased the referral incentive to R40 (~US$2.70) after the first few days of baseline testing. We found these amounts were sufficient to attract participants. In fact, many participants were referred to the study by one individual who was motivated to take advantage of the referral incentives to pay for her university accommodation.
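As a quick sanity check on the currency figures above, this small sketch reproduces the dollar amounts from the rand amounts, assuming an exchange rate of roughly R14.9 per US dollar at the time (the rate is inferred from the figures in this post, not stated in it).

```python
# Back-of-the-envelope check of the rand-to-dollar conversions quoted above,
# assuming roughly R14.9 per US dollar at the time (an inferred rate, not one
# stated in the post).
ZAR_PER_USD = 14.9

incentives_zar = {
    "initial participation incentive": 200,
    "final participation incentive": 350,
    "initial referral incentive": 25,
    "raised referral incentive": 40,
}
for label, amount in incentives_zar.items():
    print(f"{label}: R{amount} ≈ US${amount / ZAR_PER_USD:.2f}")
# e.g. initial participation incentive: R200 ≈ US$13.42
```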

4. Beware of fraudulent information

The high perceived risk of sharing personal information with strangers also increased the likelihood that individuals would provide false information. Immigration information is sensitive, and these students did not believe their information would be used and stored securely. Separately, we encountered students who provided false information in hopes of receiving the incentive even though they were ineligible for the study. These challenges reduced both our recruitment efficiency and our ability to match comparison students with treatment group students.

We were generally able to detect false information by asking for identifying documents such as university cards or residence permits. Students who gave false information were not able to complete the pretesting questionnaires and exited the process. We recommend not admitting students to a study unless they can verify their information with at least one official document.
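As a minimal illustration of that recommendation, the sketch below admits a sign-up only when at least one official document has been verified in person. The record structure and document names are hypothetical, not taken from our actual intake forms.

```python
# Illustrative intake check: admit a sign-up to the study only if at least one
# official document was verified in person. Record structure and document
# names are hypothetical, not from the study's actual intake forms.
OFFICIAL_DOCUMENTS = {"university_card", "residence_permit", "asylum_seeker_permit"}

def admit_to_study(signup: dict) -> bool:
    """Admit only sign-ups backed by at least one verified official document."""
    return bool(set(signup.get("verified_documents", [])) & OFFICIAL_DOCUMENTS)

signups = [
    {"id": 1, "verified_documents": ["university_card"]},
    {"id": 2, "verified_documents": []},  # nothing verified: excluded
]
admitted = [s for s in signups if admit_to_study(s)]
print([s["id"] for s in admitted])  # -> [1]
```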

5. Keep in contact to improve trust

The final step in recruitment was ensuring that students showed up to write academic tests. In the initial week of baseline testing, we had a 50% attendance rate for evaluation testing among our comparison students, which is quite low compared with the 90%+ participation rate among SNHU-Scalabrini students. Many students who did not participate said that, even after signing up for the study, they did not trust external researchers to handle their information properly.

We conducted additional follow-ups with these students to overcome this distrust and convince them not to drop out before testing. We found that calling or texting students the day before testing to remind them to attend was most effective for increasing attendance rates. Many students had concerns about the testing process, and speaking directly to us helped address those concerns. During these calls, we provided clarification on time, venue, and compensation to increase the chances of attendance. Consistent, personalized follow-ups can remind participants about the intentions of the study, humanize the researchers, and facilitate two-way trust.

Finally, collecting alternate contact numbers up-front and following up with the people who had referred or connected students to the study helped us stay in touch with students who were challenging to reach.

Conclusion

In the end, we were able to recruit a comparison group of students with refugee or asylum seeker status in their first year of university. Participants were motivated to take part once they understood that they would be compensated for their time, that friends and family trusted our research team, and that the information they provided was secure and would be used towards a positive end (ultimately, to help obtain funding for education programs for vulnerable populations in Cape Town). We grew our sample size by seeking out connected individuals and building positive rapport with participants, who then recommended other eligible participants. These strategies ensured we had enough students for each day of testing.

In the next blog post, we discuss how we adapted our analytical approach when we discovered differences in initial learning levels between the two groups.

  1. The ideal way to measure program impact would have been to randomly select students to receive an offer to join the SNHU-Scalabrini program. This approach was not possible because there was a limited pool of qualified applicants to the program. Thus, random selection would have reduced the quality of program entrants. Additionally, randomization would have created a less useful comparison than a matched comparison group from local universities: many people in the randomized control group, who were denied program access, would likely not have accessed other tertiary education opportunities due to the timing of program and university entrance pipelines. Therefore, the study would have compared SNHU-Scalabrini students to students not in school, which would not tell us about the effectiveness of SNHU-Scalabrini relative to other local tertiary education options.