What’s the counterfactual? What happens to people who don’t get to participate in your programme?
We’ve all faced this dreaded question. It frequently comes from a donor or a board member (though hopefully, increasingly, from the team itself) with a genuine interest in better understanding the impact (or lack thereof) of the interventions they work so hard to design and deliver. In many cases, the question comes from all three.
However, a good counterfactual is hard to come by. Establishing one typically involves conducting a randomised controlled trial (RCT). The general perception is that RCTs are always exorbitantly expensive, can only be done if you are operating at a significant scale, and consume the entire organisation’s energy.
We’re here to bust that myth.
Since 2018, Medha has been conducting an RCT with J-PAL to measure our programme’s impact on young women’s career preparation and progression in the Hindi heartland. (To learn more about the findings, check out the midline report.)
We conducted this study on a shoestring budget for RCTs (USD 60,000/INR 45 lakh) and in a way that minimised its impact on our operations (less than 15 percent of the team was involved). We thought it would be helpful to share how we did this for all the other resource-constrained organisations out there. It is certainly not a miracle or something that hasn’t been done before, but we didn’t come across many examples from the Indian nonprofit world when we were looking.
1. Find a researcher who is passionate about what you do and the problem you are trying to solve
This is a non-negotiable. Don’t move forward unless you feel confident about this. In our case, it took many years to make this happen. We networked hard and reached out to many research firms. We shared (pitched) our work and potential research questions. Without a large grant behind us, the conversations didn’t go very far.
However, after multiple discussions with J-PAL, our partnership development form serendipitously found its way to Lori Beaman (I like to think of it as the equivalent of Remy finding Gusteau’s in Ratatouille). Only after having conversations with Lori, and gaining the confidence that both of us were equally excited and passionate about pursuing this, did we move forward.
2. Leverage your existing team as much as possible
There is no reason you need countless third-party enumerators (who are costly!) to collect the baseline data. In our case, we utilised our student relationship managers (SRMs) to hold the lottery for randomisation and facilitate the collection of baseline data through smartphones and tablets we provided (which enabled cost savings). Since SRMs work directly with students, it was tough for them to accept that they could not work with every interested student that year (argh, randomisation). While it doesn’t necessarily make it any easier, we held extensive training sessions with the team so they understood the ‘why’ behind an RCT design.
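For readers who want to picture the mechanics, here is a minimal sketch of what a stratified randomisation lottery might look like when run on a laptop or tablet. The student names, centres, and 50:50 split are illustrative assumptions, not a description of our actual procedure.

```python
import random

# Hypothetical roster: names and centres are illustrative, not real data.
interested_students = [
    {"name": "Student A", "centre": "Lucknow"},
    {"name": "Student B", "centre": "Lucknow"},
    {"name": "Student C", "centre": "Varanasi"},
    {"name": "Student D", "centre": "Varanasi"},
]

random.seed(2018)  # a fixed seed lets the draw be reproduced and audited later


def run_lottery(students):
    """Assign roughly half of each centre's interested students to treatment."""
    by_centre = {}
    for s in students:
        by_centre.setdefault(s["centre"], []).append(s)

    assignments = []
    for centre, group in by_centre.items():
        random.shuffle(group)
        cutoff = len(group) // 2  # first half treated, the rest held as control
        for i, s in enumerate(group):
            assignments.append({**s, "arm": "treatment" if i < cutoff else "control"})
    return assignments


for row in run_lottery(interested_students):
    print(row["name"], row["centre"], row["arm"])
```

One advantage of scripting the draw rather than pulling chits from a box is that the assignment can be re-run and verified later.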
3. Reduce the sample size
While there are reasons to have a big sample (researchers will frequently cite power calculations), it is in your best interest to reduce the sample size as much as possible (of course, without sacrificing the integrity of the research). In our case, we did a pilot sample of 600 students, followed by a second sample the following year of 1,500 students (treatment and control). This outreach was approximately 15 percent of our total students that year.
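If you’re wondering what a power calculation actually involves, it boils down to working out the smallest sample that can reliably detect the effect you expect. Below is a minimal sketch using Python’s statsmodels library; the effect size, significance level, and power target are illustrative assumptions, not the parameters used in our study.

```python
from statsmodels.stats.power import NormalIndPower

# Illustrative inputs, not the study's actual parameters:
# a standardised effect size of 0.2 (a 'small' effect), 5 percent significance,
# and an 80 percent chance of detecting the effect if it is really there.
analysis = NormalIndPower()
n_per_arm = analysis.solve_power(
    effect_size=0.2,
    alpha=0.05,
    power=0.8,
    ratio=1.0,  # equal-sized treatment and control arms
    alternative="two-sided",
)

print(f"Roughly {round(n_per_arm)} students per arm, "
      f"or about {2 * round(n_per_arm)} in total")
# Smaller expected effects, clustering, or attrition all push this number up,
# which is why researchers are reluctant to shrink samples too far.
```

With these illustrative numbers, the calculation lands at roughly 400 students per arm, which is the kind of arithmetic that sits behind a researcher’s reluctance to cut the sample.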
Reducing the sample size enabled us to insulate the rest of the organisation and team from the study and minimise its impact on them. RCTs can be disruptive. They limit the number of people you can work with. And telling users that you are denying them service is never easy. If you can minimise this as much as possible, it will make life simpler.
So, while this is easier to accomplish if you have extensive operations, it’s still achievable, as you can see, at a scale of a couple of thousand users a year. In the Indian context, this is small.
4. Explore pilot/small grant opportunities and existing funds your researcher may have access to
Researchers often have discretionary funds that can be used for your study (granted, the amounts are not huge). In our case, Lori could apply some of these funds to recharge SIM cards for study participants and even hire a research associate when required.
If these funds are unavailable, you can explore pilot or smaller grants that do not require a significant scale. In most cases, the researcher will do the heavy lifting on these applications, sparing you the time and expertise needed to secure them. In our case, we got a pilot grant from J-PAL’s PPE Initiative and also explored small grants from the Spencer Foundation. There will likely be many such opportunities in your networks.
5. Position it as a learning exercise, and not as an existential question of whether your programme works
RCTs, like other empirical research methods, are just another tool in the MEL (monitoring, evaluation, and learning) toolbox to understand more about your programme, users, and impact. They are neither perfect nor a silver bullet. Sometimes they give you minimal information about what you are doing.
One example from our experience: Our RCT told us that, compared to their peers, students who went through the Medha programme are more likely to have a CV. While that’s nice to know, it doesn’t tell us anything about how good that CV was or whether it helped them secure the job they wanted. The point is, if we approach RCTs as a learning opportunity, it reduces the pressure on the whole team and makes us focus on what we can learn from them, and not on someone sitting in Chicago telling us whether the last 10 years of our lives have been successful.
Happy randomising
So, there you have it, five ways to pull off an RCT in a low-cost and low-impact way. With a bit of persistence (working hard to be resourceful and scraping together what limited funds exist), luck (finding the right researcher), and collaborative spirit (getting the whole team behind the learning objective), you too can make it happen.
This article has been edited for IDR. It was originally published on Medha’s blog. The author is related to a team member at IDR.
Know more
- Read more about low-cost randomised controlled trials in this report by the Brookings Institution
- Learn about how to conduct rigorous program evaluations on a budget in this report by the Coalition for Evidence-Based Policy