
I just called to say I want to interview you

This post is aimed at research practitioners experienced at administering in-person surveys and interested in beginning phone surveys.

Interviewer Suale conducts backchecks in Northern Ghana © heather lanthorn

To continue collecting data despite COVID-19 disruptions, many governments and organizations are increasingly relying on remote methods. At IDinsight, we’re using phone surveys and interviews to help our clients track the effects of COVID-19 and the various policy responses taken by governments in low- and middle-income countries (LMICs). We’re also switching from in-person to phone interviews for other, non-COVID-specific projects.

As valuable as it is to understand our respondents’ perspectives during these difficult times, there are certainly challenges, both technical and personal. In many contexts in which we work (and live), people are not accustomed to strangers calling and having lengthy conversations about serious topics even during normal times. This means building rapport and seeking informed consent from respondents is even more vital.

In this post, we’ll share practical insights from our recent surveys on what has and hasn’t worked to build rapport with respondents over the phone, and to seek their well-informed, well-considered consent.

We hope these insights help other researchers build rapport and seek informed consent over the phone. As these are still early days for us, we would also love to hear what is and isn’t working for you in phone surveys and interviews. Please share your experiences and lessons in the comments!

Recently, a number of organizations have shared crucial advice for carrying out phone surveys in LMICs. Mathematica’s overview of switching from in-person surveys to phone surveys, and the World Bank and J-PAL’s guidance on phone surveys all emphasize the importance of carefully crafting concise introduction scripts for respondents. In this post, we explain in practical and granular detail how we have developed introduction scripts for our recent phone surveys and interviews. We also explain how we’ve sought consent, and discuss ways to build rapport with our respondents.

This post is largely based on our experience with three projects: one, a structured, quantitative COVID-19 survey in northern and eastern India;1 two, semi-structured qualitative interviews with banking agents, their supervisors, and their customers in southern India;2 and three, structured qualitative interviews with refugee and host communities about cash transfers in Uganda3.

Piloting how to initiate the call

Before seeking consent and building rapport, we needed to anticipate how respondents might react to the request for a phone conversation. Before the first round of our structured COVID-19 survey in India, we piloted a few open-ended interviews led by some of our most experienced field managers. As part of this pilot, we tried to understand the respondents’ main apprehensions and developed a script to address them. For example, we found people were wondering, “why are you calling me on the phone?” So, we included these sentences:

वैसे तो हम ये सवाल आपके घर आकर, आपसे मिलकर, पूछना चाहते थे। लेकिन करोना-वायरस के चलते जो लॉक -डाउन हुआ है, उस वजह से हम आपसे मिलने नहीं आ सकते| हालाँकि, आपकी राय और विचार हमारे लिए बहुत ज़रूरी हैं, इसलिए हम आपसे फ़ोन पर ही सवाल पूछ रहे हैं। आशा है सब कुछ जल्दी ही ठीक हो जाएगा, और हम आपसे दुबारा मिलने के लिए आएँगे।

Ideally, we wanted to come to your house, meet you, and ask these questions in person. But we cannot visit you because of the Coronavirus lockdown. However, your views and opinions are still very important for us, hence we are asking these questions over the phone. We hope that things will go back to normal very soon, and we will come and meet you in person.

Initiating the call and introducing ourselves

Respondents are more likely to trust us with their information if we can establish a connection, including making sure they know who we are and why we are asking questions.

Before giving a full introduction of IDinsight, we found it helpful to quickly develop interpersonal rapport with respondents to build credibility. Establishing rapport looked different depending on whether we had communicated with the respondents previously or not. For our qualitative research with customers of banking agents, we used phone numbers that the customers had given to agents during the program. We had never met these customers, but they had signed a sheet indicating that they were willing to speak about the program on the phone. During the call, we explicitly mentioned the names of the customer’s village and their banking agent. We developed this protocol in response to initial interviews with customers, who seemed uncomfortable when the interviewer didn’t know information about their locality. We then introduced IDinsight, explained the purpose of our interview, and described what research and policy decisions the respondent’s answers might inform. Finally, we gave the respondent an opportunity to ask questions.

In Uganda, we continued working with respondents we’d already interviewed in person twice. In this situation, we found it helpful to build on and link back to the previous interviews (e.g., “Last time you mentioned that your husband went to Kampala for treatment. How is his health now? Has he returned yet?”).4 This may help the respondent to remember the previous conversations, and show that we are interested in and actively listening to what they shared with us.

Reaching the correct respondent and obtaining informed consent

We went into each of our surveys and interviews with a specific person or type of person we wanted to reach (e.g., the person we had previously interviewed, or the household’s primary female decision-maker). Reaching the correct respondent over the phone is understandably more difficult than in person. As Mathematica suggested, interviewers should be prepared for multiple scenarios, including other family members answering the phone.

We often encountered this problem while interviewing customers of banking agents. We wanted to interview equal numbers of male and female customers. However, most female customers had provided the phone numbers of male relatives. Interviewers often had to seek some form of consent from the male relative before they would pass the phone to the female respondent. These conversations were sometimes tense at first and involved more suspicion and questioning than when female respondents answered the phone directly. Since men are more likely to have access to the phone, we will often need to engage in such persuasion. We’re experimenting in various contexts with different message framings to persuade the male household member to pass the phone to the selected female respondent.

In some cases, once connected with the correct respondent, interviewers might only need a short consent script. In other cases, due to IRBs, a country’s legal requirements, or our own ethical judgment, interviewers might have a longer list of consent criteria that need to be covered. Reading out a long consent script can bore and exhaust the respondent, despite its good intentions. To avoid this, interviewers can try to make the consent process more conversational. In our qualitative phone interviews in India and Uganda, this means establishing a list of all necessary consent topics to cover in a more conversational way, rather than focusing on reading the script verbatim. Pausing to ask questions has also helped to make the consent process more conversational.

Emphasizing flexibility, given the respondents’ other commitments, is more difficult over the phone than in person, yet still central to promoting genuine consent. In India, several banking agents whom we called requested that we try again later in the day. In some cases, they even called us back. However, we later heard from our partner organization that some agents who completed interviews found the timing of our calls inconvenient because they were dealing with customers at that time. These respondents had still felt compelled to participate, even though the interview was voluntary. We responded by shifting our calls with subsequent agents to later in the day, when they were less likely to have customers. We also started to explicitly ask agents if they were currently serving customers, to make sure we did not interrupt their work. Finally, we reaffirmed their ability to reschedule for a time most convenient for them if they were interested.

Keeping respondents engaged throughout the call

From the beginning to the end of the call, it is important for interviewers to remember how much verbal and auditory cues matter on the phone, especially in the absence of the visual, non-verbal cues on which we so often train interviewers. In Uganda, we re-trained interviewers to switch from in-person to phone-based data collection. Interviewers were encouraged to pay attention to the mood of the respondents, as well as any background sounds that might detract from the interview or provide additional context, such as a child crying, a crowd of people, or outdoor noise.

Signals of active listening and verbal encouragement from the interviewer also play a role in developing rapport that sustains an interview on the phone. As part of the same training, interviewers were encouraged to habitually use brief affirmative words that do not interrupt the respondent, as well as to repeat statements back to respondents. Moreover, interviewers may need more concrete phrasing to reassure respondents. In our COVID-19 survey, when we wanted to ask about their preferences for future government action related to COVID-19, we tried to preface our question with a message of hope: “we sincerely hope that things will be back to normal again and all of us will be safe . . . but in case the government has to . . . ” Whatever the question, more reassurance than normal is needed given the highly stressful circumstances.

Finally, we’ve tried to be flexible with the content we plan to cover with the respondent. Given our earlier findings that long phone interviews can distress respondents, we make sure to keep the interviews short and only ask questions that we think will be essential to policymakers. Given the tighter time constraints of phone interviews compared to in-person interviews, we divided our qualitative interview content into different informational buckets for different respondents. We knew that some respondents would prefer to talk about certain topics over others. For this reason, we were always ready to change the subject of the interview if we felt that the respondent was disengaged or losing interest in the conversation.

Overall, ensuring care for respondents over the phone may require more intensive interviewer training or scripted language than for in-person surveys and interviews. As we collect data that will hopefully improve health and economic policy, we have found this extra effort is well worth it to ensure true informed consent and rapport with respondents during a difficult time.

Steps to consider when obtaining consent/building rapport during a phone survey:
  1. Have I piloted my survey with people similar to my intended respondents?
  2. Have I created a protocol/script to address potential respondent apprehensions before beginning the survey?
  3. Have I ensured my survey is short and only asks important, decision-relevant questions, so as not to trouble or inconvenience respondents?
  4. Have I given a complete explanation of the survey, including:
     - (if my team has NOT met them in person before) information about how we got their number, a clear explanation of who my organization is, and the purpose of the survey?
     - (if my team has met them in person before) a reminder of who my organization is, what we have talked to them about before, when we last spoke to them, and an explanation of the purpose of the survey?
  5. Have I created different protocols/message framings around reaching the correct respondent, particularly if my team is aiming to speak to women?
  6. Am I making the consent process conversational?
  7. Am I emphasizing flexibility and calling at times of day that are likely to be convenient for my respondents?
  8. Am I giving signs of active listening and reassurance throughout the call?
  9. Am I being flexible with time and the questions I ask if the respondent is losing interest?
  1. For our COVID-19 phone survey, we are collecting four rounds of data on health awareness and practices, as well as relief benefits and economic effects. We had met the respondents and collected their phone numbers during previous in-person survey rounds.
  2. We conducted qualitative phone interviews of banking agents, their supervisors, and their customers in the Indian states of Andhra Pradesh and Telangana. The interviews were part of an impact evaluation for a household savings calendar and also included conversations about COVID-19. We had met agents and their supervisors previously, but not the customers of those agents. Agent and supervisor phone numbers were sourced from the client. As part of the program, customers reported their phone numbers to agents several weeks or months prior, with consent to be contacted in the future.
  3. In Uganda, we are implementing an ongoing longitudinal qualitative study as part of an impact evaluation of GiveDirectly’s cash transfer program for people in protracted displacement. Households have experienced multiple overlapping shocks: health shocks from COVID-19, economic and food security shocks from the lockdown measures, the locust infestation, and the reduced food and cash aid provided by the World Food Programme. It is important to keep hearing and learning from their experiences, and whether and how cash transfers mitigate those shocks. We thus decided to move to phone-based qualitative interviewing. We had met the respondents and collected their phone numbers during previous in-person survey rounds.
  4. See page 22 here: