IDinsight-ing IDinsight: applying data to our internal operations decisions

We have an internal maxim to “IDinsight IDinsight”: to use data to inform our own decisions the same way we encourage our clients to do. Our Operations and Technical teams regularly collaborate to put this maxim into practice and make our own systems as data-driven as possible. This blog post is the first in a series highlighting some of the ways we use data to inform internal decision-making at IDinsight. We hope you can borrow ideas from this series to make your own organization run more effectively.

One tool that we use to make decisions (and help our clients make decisions) is “A/B testing.” A/B testing is a randomized controlled trial that compares outcomes between two versions, or “arms,” of a program or initiative. You can use it to figure out which version is more successful at accomplishing your goals: roll out version “A” to one randomly selected group of people and version “B” to another, then compare the two groups’ outcomes at the end of the initiative.
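As a concrete illustration, here is a minimal sketch of how that random assignment might be done in Python. The teammate names are placeholders, and the fixed seed is just a convenience so the assignment can be reproduced and audited later.

```python
import random

# Hypothetical teammate list (names are placeholders)
teammates = ["Amara", "Ben", "Chen", "Divya", "Emeka", "Farah"]

# Shuffle with a fixed seed so the assignment is reproducible
random.seed(42)
shuffled = random.sample(teammates, k=len(teammates))

# First half gets version "A", second half gets version "B"
midpoint = len(shuffled) // 2
assignments = {name: ("A" if i < midpoint else "B")
               for i, name in enumerate(shuffled)}

for name, arm in sorted(assignments.items()):
    print(f"{name}: version {arm}")
```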

We recently conducted an A/B test to answer a simple question: how can we improve completion rates on internal surveys? High response rates are crucial for ensuring that survey results and recommendations represent the needs and preferences of all teammates, yet Operations teams regularly struggle to achieve them. As we grew, we found it harder and harder to get more than three in four teammates to respond to these important surveys.

Two possible solutions jumped out at us from the behavioural economics literature: appeal to a person’s sense of values and duty (an ‘intrinsic’ motivation), or offer them a carrot for doing what you want (an ‘extrinsic’ motivation). But carrots can be expensive and may backfire if they undermine intrinsic motivation. So when the next internal survey rolled around, we decided to collect data to figure out which approach was better. Half of our teammates were randomly assigned to an “intrinsic motivation only” group, where they received an email introducing the survey with the following text:

Several new systems are rolling out soon, and we want to know what is working and what gaps remain. As the end-users and beneficiaries of these systems, your voices are crucial to making sure that we get this right.

The other half of the organization (the “intrinsic + extrinsic motivation” group) received an email that included the following additional text:

As a token of appreciation, you will find a separate link at the end of the survey where you can enter your email address for a chance to win one of three $20 Amazon gift cards.

(For fairness, after submitting the survey, everyone had the option to enter their name in a separate form to be part of the raffle.) After the survey, we compared responses to investigate the relative effect of each approach. In total, 49 employees responded to the survey.

So which approach increased survey completion rates more? In the end, they were pretty similar: 23 of the 35 teammates (66 per cent) in the intrinsic motivation arm completed the survey, compared to 26 of the 34 teammates (76 per cent) in the extrinsic motivation arm. Given the sample size, we can’t say whether this relatively small difference is due to the incentive or due to chance. However, this didn’t prevent us from using the data to make a decision: even if the higher response rate were due to the incentive, the improvement was not large enough to justify future raffles, or to justify creating a culture of external incentives for giving constructive feedback about the organization. Upon seeing the results, our leadership team decided that, in most cases, we will not offer monetary incentives for survey completion.
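For readers who want to check the arithmetic, here is a minimal sketch of a two-sided two-proportion z-test on the counts reported above (it assumes scipy is available). This is an illustration, not necessarily the test we ran; it yields a p-value well above 0.05, consistent with the difference being plausibly due to chance.

```python
from math import sqrt
from scipy.stats import norm

# Completion counts reported above: 23 of 35 teammates in the
# intrinsic-only arm, 26 of 34 in the intrinsic + extrinsic arm
x1, n1, x2, n2 = 23, 35, 26, 34

# Two-proportion z-test (no continuity correction), two-sided
p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))

print(f"z = {z:.2f}, p = {p_value:.2f}")  # p ≈ 0.32, well above 0.05
```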

At the same time, the incentive did increase how quickly people responded: 50 per cent of respondents who were prompted about the Amazon raffle responded on the day the survey was launched, compared with 22 per cent in the no-incentive group (p < 0.05). So for urgent surveys, we may consider using incentives to speed up responses.
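The same calculation can be applied to the speed result. The day-one counts below are back-calculated from the reported percentages (roughly 13 of the 26 respondents in the incentive arm and 5 of the 23 in the no-incentive arm), so they are an assumption rather than the raw data.

```python
from math import sqrt
from scipy.stats import norm

# Day-one response counts, back-calculated from the reported percentages
# (an assumption; the post reports only percentages):
# ~13 of 26 respondents in the incentive arm, ~5 of 23 in the other
x1, n1, x2, n2 = 13, 26, 5, 23

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))  # two-sided

print(f"z = {z:.2f}, p = {p_value:.3f}")  # p ≈ 0.04, below 0.05
```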

One of the most promising aspects of this experiment is that it cost almost nothing to implement. Beyond the money we were already going to spend on a few gift cards, we just needed to divide people into two groups and track who responded to the survey. In return, it gave us hard data and valuable insights into how to run a core Operations function more effectively. A/B tests can be one of the most cost-effective and powerful tools in the arsenal of a data-driven Operations team.