(Audio) The role of data and evidence in advancing gender equity: A conversation with funders

The global community has made huge strides in achieving gender equity. But we still have a long way to go: data shows that women and girls still experience poverty at higher levels than men and there is still a notable discrepancy between the number of men and women in leadership positions. In this podcast, we speak with data-driven funders to explore the role of data and evidence in addressing gender inequity, and where there is room to grow.

Full transcript of the interview:

Sarah: Hello and welcome everyone to this podcast. My name is Sarah Lucas. I am the Global Lead for Data on Demand at IDinsight. IDinsight works with social sector partners in Africa and Asia to use data and evidence to improve decision-making and impact. This Women’s Month, IDinsight is focusing on the importance of gender data.

We’re going to be sharing insights and highlights from our work on closing the gender gap throughout this whole month. We’ll have a series of blogs, videos, in-person events across our region, and today’s podcast. We are talking today about the role of data and evidence in advancing gender equity. I am delighted to have two amazing women with us who have been putting their minds, hearts and funding into gender equity for many years.

Alfonsina Penaloza is the Director of Programs at Co-Impact. Co-Impact is a global philanthropic collaborative, and she has been instrumental in standing up Co-Impact’s Gender Fund, which aims to raise a billion dollars to provide women-led, locally rooted organizations in Africa, Asia and Latin America with large-scale, long-term, flexible funding. It’s like a dream come true. Alfonsina, we can’t wait to hear more about that.

Dana Schmidt is a Program Director at Echidna Giving. Echidna is laser-focused on advancing girls’ education in countries with limited resources and modest education budgets. Echidna is deeply committed to the use of data, evidence and learning to advance all of its institutional goals. Echidna plans to fund 500 million to 700 million dollars worth of grants in this area of girls’ education over the next 40 years.

Both Co-Impact and Echidna have really ambitious goals and sizable budgets around gender equity. So today, we’re going to dive into how data and evidence fit into those goals and also into those budgets. Let’s start by getting to know Alfonsina and Dana as people and professionals, and then we’ll turn to hearing about their grantmaking goals and strategies.

Alfonsina, I’ll turn to you first and then to Dana. And I would love it if you would just give us a quick tour through your career and professional interests as it relates to gender and data. When did you come to believe that data is a central part of advancing gender equity? And tell us about your journey of learning and exploration on that set of topics. Over to you, Alfonsina.

Alfonsina: First, thank you so much for having me, Sarah. Especially this lovely reunion, and for saying my name so beautifully. It’s become quite a gift these days. Just delighted to be here. So, a quick tour – I was born and raised in Mexico City. I studied International Relations, and that’s when the feminist seed was probably planted in me and when some of these musings, I think, started happening. I then moved to do a Master’s in Gender and Development at the London School of Economics, where that seed fully blossomed, and I found my tribe and decided that this was what I wanted to do. And in the meantime, and before or after, I did a combination of professional opportunities at both think tanks, civil society and government.

And, I think that’s been really critical because it’s given me an insight into different parts of what we’re hoping to discuss today, including my latest experience, which has been in philanthropy, where I’ve been shockingly, to me, for ten years now. So I think I’ve had a different insight into the different parts of policymaking and how gender and data play a critical role in either of these.

Regarding your question around when I first came to believe that data was central for advancing gender equity, the first thing that came to mind is actually less related to gender but definitely to equity, which is my first econ class. We were studying some model, and I raised my hand and said, “well, what happens if you’re poor?” This was in Mexico, where 60% of the population lives under the poverty line. And this is a school that produces most ministers of Finance and Economic Development, and fills lots of government roles. And the answer I got was, “Well, if you’re poor, you don’t enter the model.” And that, to me, was a first sort of red flag, of being like, “Oh, okay, well, there’s something inherently wrong about the economics and the models that are being created where more than half the population don’t fit.”

And that was without even thinking about gender. And I think this ties nicely into maybe the second time I realized how key gender data was, which was when I started learning about feminist economics – how really key data is for the creation of those models, and why those models don’t incorporate, in this case, not only poor people but, when you think intersectionally, women who might live in rural areas and who might not be educated – all of these intersectional identities that really impact how women experience different areas of their life, and where data on their experiences might really inform the type of policies and models that are produced.

Sarah: That’s such a powerful quote – if you’re poor, you don’t enter into the model. And the thing that’s so unbelievable to me, and we’ll get to this a little bit further in the podcast, is how can it be that we are still talking about whether women should show up in data and economic modelling when we are 50% of the population of the world?

It just makes no sense and can only be explained by power dynamics that are out of our control. So we’ll get to that. But first, over to you, Dana, to share a little bit about your own professional journey and when you came to see the light on data as a key piece of advancing gender equity.

Dana: Great, thanks, Sarah and thanks, Alfonsina. So fun to be in this conversation with both of you. I have spent most of my career in philanthropy, so Alfonsina, I think you were shocked at your number, mine is even higher. I think I’m almost at two decades at this point, but always with a focus on education and equity.

I think that I really came to see the power of gender data actually by seeing how much energy and attention girls’ education has gotten as a result of data around gaps in education outcomes for girls. I think back to when I was at the Hewlett Foundation with both of you, when they launched the Girl Effect video, and think about Larry Summers’ famous quote that “investment in girls’ education may well be the highest return on investment in the developing world”.

I really think that there’s been some key kind of data points that have galvanized a lot of energy and interest in girls’ education. We can get to some of the limitations later in the conversation too so it’s not all kind of a rosy picture. But, I think just seeing the power of some simple data and evidence really galvanizing interest in this issue is one of the things that made me a believer.

And I think because there’s been so much talk about the importance of girls’ enrollment in school, we’ve also seen major progress on that front. Still a lot of gender inequities in education systems and beyond, but I think just in the last two decades, we’ve cut in half the number of out-of-school girls from 60 million to 30 million.

So that’s just huge momentum in closing gaps where we know they exist and already have the data to show what those gaps are. I think that’s really kind of what has made me a believer.

Sarah: I love that Dana, it’s like data as a galvanizing mechanism, but also a source of optimism and hope and pride. Right? To be able to put out a number like that, like the number of girls out of school that’s been cut in half. And we know that because of data and measurement, that can be a very powerful force to realize what change is possible. So I love that you shared that example. 

Let’s switch now to learn a little bit about your grantmaking strategies and goals in each of your institutions. I’ll just go back and forth. I will start with you, Dana, this time and then come to you, Alfonsina. Just give us a sense of what change are you seeking for girls and women with your current strategic portfolios and grantmaking portfolios. And how do investments in data and evidence play into achieving those goals?

Dana: At Echidna Giving, like you said Sarah, we are really laser-focused on gender equity in education, and I would say increasingly also through education. So not just how do we have gender equity in the number of girls going to school, completing school, achieving in school, but also how do we use education as a force for advancing more equitable gender norms, for example.

So that’s really what we’ve been focusing on. And I think we use data – I’ll point out two key ways, although it shows up in a lot of our work – but I think one key way we use data in the work that we do is to surface what the barriers are, what the gendered barriers in education are, so that we understand what those are and we can bring those to the attention of others in this space.

So that’s kind of one piece. Just understanding what are the problems, and what are the barriers. And then I think the second piece is really in understanding and generating knowledge about what actually works to address those barriers. We work to fund organizations who are really running programs that aim to address some of those barriers and fund research and analysis to understand what is and isn’t working.

And, we also fund work that’s more on just surfacing what the barriers are to begin with.

Sarah: Before I turn to you Alfonsina, Dana I want to ask you to give a couple of examples here because I have long believed and learned from wise people that work in the evidence-to-policy space, that the kinds of data and evidence that you need to galvanize attention and raise awareness on something is actually quite different than the kinds of data and evidence you need to know what to do to fix the problem.

So, I’m curious if you have some examples about the methods or data types or even data points that you would describe as being particularly powerful for galvanizing action versus particularly powerful knowing what to do to solve a problem.

Dana: Yeah, great question! And I feel like I want to hear your answer too, Sarah. I’m sure you have a lot to say on that. 

I think, on galvanizing attention, often it is simple data that’s very easy to understand and paints a very clear and stark picture about what the situation is. One example of that is what I started with – that there’s this gap in how many girls access school versus how many boys access school, historically (still in some countries, but decreasingly so). I think that’s partly because it’s such a clear, simple problem and (for many people) feels like a problem we should solve. Right?

I think another example in the education space that you’re both familiar with, probably from our shared experience at the Hewlett Foundation, is clear data on whether or not kids are able to read and do math. Those are skills that most of us would agree education systems should at a minimum be equipping children with – literacy and simple numeracy.

We’ve been able to fund organizations like Pratham in India and Uwezo in East Africa, who are collecting household-based survey data on whether or not kids can read and do math. That has really, I think, shifted attention in the education space to say it’s not enough to just look at whether or not kids are in school, we should also be looking at whether they are learning. I think the clear, simple data that tells a story of where we have gaps that we need to fill, can help catalyze action. 

So those are a couple of examples I would give on that first point. On the second point, around what helps us know what to do about it – my sense, and again, I feel like you should be able to weigh in on this too, but my sense is that there we need more work that has implementing organizations working alongside research partners and/or building that capacity internally – it doesn’t always have to be an external partner – but really being able to iterate on what is and isn’t working and try new things and test it out.

A much more iterative process, I think, gets much more into the weeds sometimes, around questions like: does this modality work or does that modality work? For example, we are funding this organization, Youth Impact, that does a lot of work in Botswana and has increasingly been embedding A/B testing within their efforts. They’re doing work on teaching foundational literacy and numeracy, but they’ve been doing some A/B testing to say: does it work better if we have single-gender groups – girls in one group, boys in another group? Do they learn more that way, or do they learn more if they’re in mixed-gender groups? And really testing those things, understanding from their implementation teams what the questions are that they have, testing those questions and then iterating from there. So I think it’s much more about getting into the nitty-gritty details of the process and what is and isn’t working, and doing that in an ongoing way.

Sarah: Thanks so much for that. And for listeners who aren’t familiar with A/B testing, it’s exactly as Dana described. You want to change an outcome, and you don’t know whether initiative ‘A’ (having all girls in the classroom) or initiative ‘B’ (having mixed-gender classrooms) gets you a better outcome in terms of literacy or numeracy or whatever it is that you’re measuring. So you implement both and then test outcomes for both, to know whether approach ‘A’ or approach ‘B’ works better. Thank you so much for that, Dana. Over to you, Alfonsina, tell us, what change are you seeking for women and women’s lives and how does data play a role in advancing that goal that you’re seeking?
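[Editor’s note: for readers who want to see the arithmetic behind an A/B comparison like the one described above, here is a minimal sketch. The counts and the `two_proportion_ztest` helper are purely hypothetical illustrations – they are not Youth Impact’s actual data or analysis method – but the calculation itself is a standard two-proportion z-test.]

```python
import math

def two_proportion_ztest(pass_a, n_a, pass_b, n_b):
    """Compare pass rates between two arms of an A/B test.

    Returns the z statistic and a two-sided p-value, using the
    normal approximation with a pooled variance under the null
    hypothesis of no difference between the arms.
    """
    p_a = pass_a / n_a
    p_b = pass_b / n_b
    # Pooled proportion under the null hypothesis
    p_pool = (pass_a + pass_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 120 of 200 learners in single-gender groups
# passed a literacy check, versus 100 of 200 in mixed-gender groups.
z, p = two_proportion_ztest(120, 200, 100, 200)
```

In practice an implementer would also randomize assignment to the two arms and decide on the comparison before collecting data; statistics libraries such as statsmodels offer equivalent ready-made tests.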

Alfonsina: I’ll just echo, Dana. I’d really love to hear what your insights are on this. I think it would be really interesting, given your experience and expertise. So going back to the first two things that you said really resonated with me. 

One was that you kind of find it incredible that, at this point, we’re still questioning why women are not included. And that really is the ethos behind the Gender Fund. Co-Impact has already been funding systems change, and I’ll briefly explain what we’re looking for. It very quickly became obvious that in order for systems to serve women, there needs to be a gender analysis of why these systems aren’t working for women and girls. And that’s sort of the reason behind the Gender Fund, which seeks to support organizations, usually in coalitions, to help find the root causes of why systems – particularly public systems in education, health and economic opportunity, though we do work with private market systems in some cases – are not working for women and girls. And often, what we have seen is that these systems don’t work because they discriminate. And partly that discrimination is baked into the design of those systems, often unconsciously.

But it is part of how those education systems and health systems are designed that they, by design, exclude and discriminate against women and girls. And that is what we call systems change at Co-Impact, which is really encouraging and supporting program partners to think through what some of those root causes are. And this is the link to where the second part of your question to Dana really resonated with me, which is the difference between data that serves to galvanize and maybe raise awareness, and the data that is actually needed to change these systems.

That is something that is really key and I think unique to Co-Impact’s model, which is that we really support and encourage our program partners to get into the weeds of that system. So is it that you need a law, for example – is that what is required? In many cases – I work in Latin America – we have very advanced legal systems; the problem is those laws aren’t implemented. So we encourage our program partners to then dig deep into why these laws are not implemented. Is it because, for example, they don’t have a budget, or is it because folks at the ground level have no idea that this law exists or how to implement it? Is it that there’s a lack of rule of law and there are no accountability measures? All of these things that really get into the technicalities.

Those are part of the reasons why data, in our experience, really plays a key role in figuring out why it is that these systems don’t work. And again, as we said, often these are unconscious; they are unknown unknowns. And so it’s really difficult to fix the system if you don’t know why something isn’t working. And one thing that I’ll add that I think is particularly important to Co-Impact’s approach: our main goal is actually to have systems change, but in a slightly different way to how traditional philanthropy approaches it, particularly in development. We don’t just stop at the advocacy level; we actually want to see how that particular change actually changed the lived experience of a woman or a girl. So it’s not just that the law passed, but then what actually happened to women as a result of that law?

Or if you got more budget, or you added capacity or skill building for policy workers, how was that actually implemented and what effect did it have on the women? And part of that for us also includes representation. We think that one of the main reasons why these systems are designed the way they are is because women’s experiences are not being reflected in their design. So as part of Co-Impact’s approach to systems change, we also encourage program partners to figure out – and this includes working with the government, by the way, which can be very challenging – how do we actually build not only representation but also the diverse leadership of women at all levels, from the community level all the way to the executive, to make sure that these experiences are reflected and are taken into consideration in the design.

And this, of course, has a backdrop of data because that data is often missing. In our experience, it’s not only the gendered lens or the gender analysis of those systems, but then when you break it down into the data, there’s actually very little knowledge about how those systems affect those women, and why, so data becomes a key point in that element too.

Sarah: Oh my gosh, I am loving this conversation Alfonsina, and I am being carried back to, I don’t know, five or six years ago with the Hewlett Foundation, where I credit you with helping me understand the trajectory of women’s issues in the global development field, from seeing women as individual actors who just need to be empowered to overcome their conditions versus what you have so beautifully both articulated, which is the orientation around what are the systems and structures that are routinely and historically holding women down to this day. And understanding what you need for a single woman to be an individual actor versus what you need to change that system – that’s a fundamentally different and deeper demand on data and information. And I just hadn’t thought about that link, and I love the way you just drew that. And, you actually have helped me answer my own question back because I think in the difference between knowing what the problem is in galvanizing data versus knowing how to solve the problem, you need at least three things, right?

You need causal information, which is some of the A/B testing that you started to talk about, but also a much deeper dive into how different people are affected by a given intervention. You need much more granular data. So you need causal information, and you need information differentiated across subgroups. And you can’t differentiate across subgroups unless you have large enough data sets. And that’s one of the main missions behind Data on Demand, which is the initiative I’m leading now at IDinsight: large-scale household survey data collection that allows you to begin to pull apart the different experiences by gender, by caste, by religion, by geography, by income. And only when you can see people in subgroups that are large enough to be representative can you truly understand those discrete experiences.
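[Editor’s note: a quick illustration of why subgroup analysis demands such large samples. The sketch below uses the standard sample-size formula for estimating a proportion; the margin of error and the number of subgroups are illustrative assumptions, not Data on Demand’s actual survey design parameters.]

```python
import math

def sample_size_per_subgroup(margin_of_error, z=1.96, p=0.5):
    """Minimum respondents needed in one subgroup to estimate a
    proportion within +/- margin_of_error at roughly 95% confidence.

    Uses the conservative p = 0.5, which maximizes the variance
    p * (1 - p) and therefore the required sample size.
    """
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

# About 385 respondents are needed *per subgroup* for a margin of
# error of +/- 5 percentage points.
per_group = sample_size_per_subgroup(0.05)

# Crossing gender with, say, four regions gives 2 x 4 = 8 cells,
# so the total sample must be roughly eight times larger.
total = per_group * 2 * 4
```

This is why disaggregating by gender, caste, religion, geography and income together quickly pushes a survey toward very large samples: each additional dimension multiplies the number of cells that must each be large enough to be representative.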

And, then the third is just the obvious one is just the root cause analysis. Like why are we seeing what we’re seeing? And that takes the kind of digging that you just described, Alfonsina. I love how all of this is coming together. 

Well, we are all evangelists and excited about data. We’ve all just beautifully made the case for why it’s so essential. But it isn’t the only ingredient in advancing gender equity. So I’m interested in hearing from you a little bit about some of the other tactics that your foundations support and promote, whether those are advocacy or social movements or policy research or training programs or whatever they are. And talk a little bit about the interplay, if you can, between data and evidence and some of those other pieces, including advocacy or social movements or protest, if that’s part of your portfolios. How do those different pieces fit together?

I’ll start with Alfonsina this time and then come back to you, Dana.

Alfonsina: I think it ties really nicely with what Dana was talking about and this distinction that you made between really thinking about the data for what and for whom. And to me, that is a key question that we always encourage our program partners to think about. We ourselves also ask, because so often in philanthropy, particularly, the data that is produced, is produced for the donor. And that I think, is a huge sort of driver of the problem that there isn’t an encouragement to ask, “Well who are you collecting this data for and for what?” And I think those two questions are really key. And, that helps you then to differentiate the different types of data and the things that you’re looking to do.

So I think there’s one question behind even the collection of the data, which is – what are you looking to do? And in our case, all of the tactics that you mentioned are funded because we actually don’t have a strategy or an approach other than that we encourage our program partners to think about systems in a systemic way. But how you go about that strategy is really up to our program partners, and it’s so dependent on the context and even the issue that you’re trying to tackle that it really differs across geographies. 

I think the question is first, what is the problem that you’re trying to solve? Then, what are some of the pathways or strategies that you’ll need? And the data sort of runs through all of those. The problem analysis is the first one, where – particularly for systems change as Co-Impact defines it – we are encouraging our program partners to think about solving a problem at scale. So what we mean by that is not scaling up, but rather saying, for example, in education: 2 million children in Kenya go through school but actually they don’t eat properly, and so their learning outcomes are affected by that. That is the scale of your problem. It’s 2 million kids! You have to have that data. If you don’t have that data, then you can’t have an idea and a dimension of what the problem is.

We are currently supporting program partners in Brazil, for example, that are trying to come up with solutions to curb femicide – which, sadly, is the highest in the world. You require data to be able to give a dimension to the challenge. How many women are survivors of gender-based violence, and how many of those end up being victims of femicide? That is required even if you don’t have the capacity to provide the service directly to those 2 million kids or those 5,000 women – it’s the duty of the government to do so. But in order to build a strategy that gets to that problem, you have to have that data. And if you don’t, well, then that’s the first strategy: you have to get the data.

And, then you arrive at the different strategies. If you want to mobilize feminist movements, you probably don’t necessarily need to feed 2 million kids. You have to figure out who are you mobilizing, what do they care about? Why would they actually go out to the streets? And maybe that’s just a case of one person, of one kid, of one woman that galvanizes. But that probably won’t work with a finance minister who says, listen, I’m all for curbing femicide. I don’t have the budget. Help me figure out how do I finance this? 

So there are different needs to different pieces of really complex issues where data, I think, plays a part in all of them. But what part it plays in how to use the data is completely different, and that’s part of what we encourage our program partners to do, which again, I think has to make sense to them and to the folks who are using the data, not to the donor. And I think that’s one of the biggest mistakes that philanthropy has made, is coming up with numbers, asking for numbers for either your own fundraising or to say, here’s the impact that we’ve had, rather than coming up with numbers that are helpful for the projects or the problems that you’re trying to address.

Sarah: I mean, that is a problem in philanthropy, and it’s a 40-fold problem in official development assistance – collecting data that meets your own needs. I want to underscore something you’ve said here, Alfonsina, because I’m fond of saying that there’s no one method or type of data and evidence that answers every policy and program question. That’s one of the things I love about IDinsight: we’re kind of method-diverse, because we know that partners need different things to answer different types of questions.

But you’ve given me the analogy from a funder’s perspective or from a real-world perspective, which is there is no one tactic that solves a social problem. You need this whole range of tactics working in concert or sometimes working in cacophony to really make social change. But I love how you’ve illustrated the data actually runs through almost all of those.

So thank you for that. I love these conversations because it expands my own thinking even when I’m mostly asking the questions. Dana, over to you, in terms of other tactics that you support and how data fit into those.

Dana: Well, first of all, I sort of just want to add a plus one to everything that Alfonsina has already said. That absolutely resonates with how we are thinking about things as well. So there’s no one size fits all, or only one piece that we do, and we’re often encouraging organizations to figure out in their context, what is it that makes sense? What are the barriers and opportunities that they can pursue and how do different tactics fit into that? 

We sometimes sort of bucket our grantmaking into categories of implementation, research and advocacy. We often find that with any one grantee, you might be working actually across all of those, although they’re often working more intensively in one than the other. And we think kind of the three of them are really important. Data alone is not very helpful if you don’t have organizations out there innovating new practices and trying things out and figuring out what solutions might be. So that’s where the implementation side kicks in. 

If that happens, sort of absent research, of understanding whether or not those implementation activities are effective, you’re going to be in trouble. And if any of that happens, absent sort of pushing on systems change into what Alfonsina was talking about – advocacy for effective practices being taken up in policy or by other implementing organizations. I think you’re also going to limit yourself. So that’s why we see all three of those as important. 

And another piece that we have a bit of in our portfolio is understanding that sometimes change is very relationship-driven. So data is a part of it. But also what are the kind of relationships and who’s out there sort of convincing other people to take up the data and use it? So we also have parts of our portfolio in which we support leadership development and trying to support those who are champions of gender equity and education to be able to do that work effectively. For example, we fund a program at the Brookings Institution for four scholars each year to have a chance to do deep-dive research on girls’ education and hopefully also influence Brookings’s own agenda with their contextual understanding from a number of countries about what challenges are there. 

We also fund an organization called Rise Up that also kind of works more with leaders in their own contexts and supports them. So I think that’s another kind of piece supporting the leaders who can make change happen as well would be another sort of dimension to bring into the conversation. 

One of the things that we are also trying to do is link up more across those three buckets. So we funded the Population Council to develop what’s called the Evidence on Gender and Education Resource and what they have kind of brought together is three things and they think of it in sort of a Venn diagram.

So one is who is out there and what is the work that they’re doing – kind of what implementation is on the ground. Another is what are the needs – so where are there gender gaps in education, access or learning or the like? And the third is what does the research and evidence say about different practices? And one of the really interesting things has been to see where those things overlap – where implementation is meeting the need and we have research behind it – and where they do not.

I think what’s really striking is that there is a lot of implementation work out there in the field of gender equity and education that we have almost no evidence about. And so we’ve been really trying to push to prioritize. Implementers are probably doing those things for a reason. They are working in communities, seeing needs and trying to meet those needs.

So surely we should prioritize building evidence about the things implementers feel are important. And we’ve been trying to kind of do more mapping in the space and filling some of those data and evidence gaps.

Sarah: I love that framing because it reminds me of how much we focus data collection on where the gaps are and less on where the solutions are. This notion of positive deviance, like there’s people out there doing really different things that might be incredibly powerful and we don’t even know. So how do you find those innovators and really try and capture the learning from that so that it’s not just isolated to a particular initiative or individual effort. It is really something we can all learn from. I love that. 

Well, speaking of data gaps, I feel like a lot of the discussion around gender data focuses on gaps, and where we have glaring gaps in information about women is a galvanizing data point in and of itself. But we’ve been talking about gender data gaps for at least solidly a decade, if not longer. There’s been a huge amount of funding. If you look at the two institutions that you work at, the Bill and Melinda Gates Foundation, the Hewlett Foundation, not to mention bilateral donors, multilateral donors, there’s been a huge amount of investment in trying to close those gender gaps.

So I’d like you to each share what is some point of progress and optimism that you see from that decade of work and attention to these issues around gaps in gender data. What do you think we should be really proud of, and where are you just kind of disappointed or flummoxed about why it is that we’re still struggling with something? Give us both kind of a highlight and sort of the missed opportunity from the past decade. I can’t remember who I’m starting with now. Dana, is it you?

Dana: Oh, thank you. Although I wouldn’t be offended if you mixed it up. So where we have progress – starting with the rosy side of the equation – I do think that, at least in the education space, there is more disaggregation of simple statistics by gender. You know, I brought up an early example of school enrollment and the systematic tracking of that by organizations and by governments, and most countries have actually achieved gender parity, at least at the primary school level. So that’s great. And I think the success there has shown up in terms of progress on closing those gaps.

I’m also encouraged that, although our focus is on gender equity in education, we’ve found a lot of organizations that work with both boys and girls. We think it’s actually important for everyone to succeed in education. And I’m encouraged by examples of organizations we work with that are working with both boys and girls, but doing it in really smart, gender-informed ways.

I brought up the example of Youth Impact already. Another organization we’re funding in East Africa is Educate, and they have been supporting youth with life skills and entrepreneurship skills and have really taken gender very seriously in their own data systems. And we found during COVID, when they went to a remote model, that there were far fewer girls participating than boys. Part of that was that girls had less access to the mobile phone technology that was being used to deliver the programs remotely. And part of that was that it was simply harder to do the outreach. But because they knew that data, they worked really hard to close those gaps: they worked with more female mentors and did more intensive community outreach to actually get girls into the program. I think by the end, they were finding they had more girls enrolled than boys, but that was only because they had embedded this gendered approach in their ongoing data and were trying new things to close the gap. So I am encouraged that it is increasingly happening. I think there was probably a time when that wasn’t front and center for organizations, and they weren’t necessarily even aware of the fact that more boys were enrolled than girls. So I think that’s all good news.

On the other side of the equation, there are still shocking ways in which gender is not included in studies. We funded, a few years back, David Evans and some co-authors who did some research looking at all of the impact evaluation data in education and trying to say: yes, there are some interventions that are specifically targeting girls, but if we look at any intervention in the education space, how well does it do on closing gender gaps in education? And so they gathered something like 200 evaluations total, 200 different interventions, all of which targeted both boys and girls, and I think only half of them actually reported what the gender-differentiated impact was. How are we at a point where just half even report that? So they went back painstakingly to authors to try to get the data and do some of that analysis. Often the answer was: well, there was no difference by gender, so they didn’t report it. But I think we’re at a point where we should at least report that there was no difference. So I think there are just ways in which we’re frustratingly missing the very basics in some of this. So I’ll leave it at that. I’m sure Alfonsina will have more to add.

Sarah: I’m curious whether there was a secondary line of questioning in that study about the gender of the evaluator…

Dana: I should look at that. That would be really interesting. That was not part of the study to my knowledge, but we should go back and look at it.

Sarah: I mean, it’s so hilarious that the RCT, on the one hand, is the quote-unquote gold standard for causal evidence and, on the other hand, can’t get this basic piece right. Like, I mean…

Okay, Alfonsina, for hope and optimism and also the disappointment.

Alfonsina: I am not sure I will be able to deliver on that first one. And there are so many pieces of what both you and Dana have spoken about that I want to dig deeper into.

Now, there has been progress for sure. In the last decade, I would say the biggest change has been putting some topics on the agenda and then getting funding and policy thinking around them: sexual and reproductive rights, and gender-based violence, which is probably the biggest one in the last ten years. And then very recently, as one of the silver linings of COVID, care work and reproductive work have been placed on the agenda. And for the major data gaps that exist, there has been progress in the sense of putting them on the agenda and, in some cases, filling in some of those gaps.

But I also think that progress has been made because the bar was really low. There wasn’t really much. So progress has been made in terms of getting the very basics of a baseline in. And I think that’s part of the progress. But there are still a lot of data gaps. I loved your point, Sarah, because I can’t think of a single issue, not one, that doesn’t have a gendered impact.

And to Dana’s point, you cannot have any intervention that does not have a gender analysis to it. And I agree with Dana. This happens even with some Co-Impact partners that were funded prior to the Gender Fund. When we start asking and thinking about, what is the gendered impact of your initiative, the most basic baselines of data aren’t even there.

So I think the starting point should be that we should at least have the baselines, right? But then all of the progress that is required has to do with both filling the gaps and, as you were saying, methodologies and actual models for digging a little bit deeper into the nuances of that data. I think it’s very tempting, particularly for those who seek to mainstream gender equality, to stop at gender-disaggregated data. Like, here’s the data we have, we will just separate women and men. And there’s so much more richness in there, particularly if you’re a data geek, that you should be wanting to dig deeper into. Like, which women? Women are just this massive category that needs to be broken down into all the different identities, which for sure impact how you experience a particular service or right.

That can give you insights into why a particular policy might not be working. There’s plenty of evidence in the feminist research around, for example, economic empowerment and financial inclusion: initiatives that haven’t worked because they don’t get at one of the things that you mentioned up front, Sarah. The main reason, the only reason, we might be able to explain why women are missing is power dynamics. And those have been missing.

Do we have data on those power dynamics? What does that data even look like? I think a lot of progress also needs to be made in thinking about what other types of data there are. To your point, RCTs are this gold standard, but they have actually been shown not to work very well for a lot of gender issues, because of the complexity and the nuance of certain things that don’t lend themselves well to an RCT: they are contextually dependent, and the context is imbued with power dynamics that play a role in everything.

I think there’s a lot of promise as well. And I’m super excited about figuring out what the new methodologies are, and what data even means for so many people. Does it mean just numbers, or can we think of other ways? Are there stories, qualitative or anthropological work? There’s so much richness to different types of information that can certainly still inform policymaking and design, but that doesn’t have to look the way that traditional development data has looked, including things like what you’ve mentioned.

Does the gender of a surveyor affect the type of questions that they ask? This has been documented widely in sociology and anthropology, particularly for issues like sexual reproductive health and gender-based violence. How you ask the question and who asks the question are just as important as the actual instrument and the policy that’s being designed. So I think there’s a lot of promise as well.

And even for those places where we have progressed in terms of the data, we then need to dig deeper and think about methodologies and models. I think for policymakers, particularly those who’ve already sort of drunk the Kool-Aid, if you will, the biggest challenge is: what do I do? Like, tell me basically what to do with the data – I want to make a change. How do I actually turn this into a policy that is doable, given all the restrictions that policymakers often have?

I don’t know if I’m optimistic, but I’m excited because I’m a geek, and I love these things, and I love these questions, and I’m happy to be in a podcast where I share that excitement with you all.

Sarah: Well, there’s so much in both of your answers. Did you want to jump in, Dana, with something?

Dana: Yes, I’ll just jump in because, Alfonsina, I so appreciated your point about the missing power analysis piece of this. And I don’t know if this is optimistic or pessimistic – I think it’s both – but I think in some ways, at least in the girls’ education space, some of our overly simplified data is almost coming back to bite us. Because you had this simple analysis of girls are not in school and boys are, and now we’ve solved that problem.

And actually, in some countries and contexts, it’s the opposite: boys are less likely to be in school than girls. And so people are saying, oh, we solved the gender problem, without the deeper analysis. Just because we’ve solved the access-to-school problem does not mean that we’ve solved these power dynamics.

So I think, in some ways, some of this can almost be a risk: we oversimplify the messages, and we miss that power analysis. I think we need even more resistance and pushback. The optimistic side of this, I think, is that it is increasingly a part of the conversation. And actually, Alfonsina, Co-Impact and Echidna Giving share a grantee in India, Breakthrough, and Sarah, you’re doing work with them through Data on Demand as well, I think, so there may be a shared experience across all of us. But they’re really asking: how do we, within education systems, begin to talk about those power dynamics and begin to shift them?

So I think the hope is that we will start to have more of that sophisticated analysis as well going forward.

Sarah: I love that. I feel like it’s the next-generation tagline for the data-for-decision-making community. In the beginning, we just had to articulate the risks of not using data to inform decisions. And now, for the next generation, we have to articulate the risks of using oversimplified data to inform our decisions. I love that.

Let’s close with a question I didn’t preview, so I apologize in advance. I’ll give you a second to think about it. But hearing everything you’re talking about has made me want to ask the following: If you were not bounded by your institutional strategies, if you were not bounded by the practicalities of grant-making, if you were not bounded by your budgets, what is something you would love to spend a million dollars on?

Like what is something you would love to be doing to really advance gender equity? And ideally, your answer includes gender equity and data, but it doesn’t have to. Really, I just want to know, dream big, what would you really want to do to advance gender equity bound by nothing? Alfonsina, do you mind if I start with you?

Alfonsina: No, I don’t mind, but my mind is going like in three thousand different directions of what I would do with a million dollars. But I’m trying to rein it into the topic of today. So I think this will become a little bit Mexican in style and maybe a little bit incoherent with some sort of links in between that.

I think one thing that has really surprised me in conversations, particularly with feminist organizations, is that there’s a lot of trauma around data use and evaluation, particularly because of who has done it, right? If you ask communities who are often part of our studies or evaluations: ‘X’ organization, usually Global North-based, comes in, they do some survey, they leave, and the community never hears back from them. They don’t know what happened. And so there’s an assumption that data and evaluation don’t really work, that they don’t serve their purposes.

But when you actually start having the conversation of, well, what is your strategy and what are you trying to do, and how would you know if you’re reaching it, right? And how would you know if you’re going in the right direction? These are questions that they have asked themselves, and that they have answers for. And so one aha moment that I’ve had, and one thing I would love to spend part of that $1 million on, is: how do we reframe these conversations so that they make sense for folks without necessarily using the frames that we use, like M&E or monitoring or evaluation or data use?

It’s almost like a translator of what makes sense, because all of these different groups, regardless of what work you do, whether it’s community-level service or think tanks, are thinking about these things. It’s just that you name them differently, and the way that you use them is different. So part of that would be figuring out what language we can use to work with different audiences to convey the importance of data, but in a way that resonates with them, without sounding like an imposition or something that’s been named and used by others in probably very colonial and extractive ways.

So that’s one way. I also think there’s just very little funding on data and data use. And yes, we have filled data gaps, but we, for example, work with women in leadership and in the field of law under the same assumption about representation of women, because economics and law are two fields from which so many leaders come and make decisions about women’s lives, and where women, particularly in the upper echelons, are not represented.

And in every conversation we’ve had with every program partner, what they say is: we don’t know. There’s not enough data on how many women judges we have, where they are, or how courts and international bodies are selecting candidates. So there are still a lot of data gaps, and they’re not funded. So I think a million would help, even if it’s a drop in the bucket of what would be needed to fill those data gaps.

And then a third element, which I think is really important, is around the question you were asking about what is needed for this to actually get implemented. I don’t know what to call it – a politicization of data, maybe. I’m very thankful for my serendipitous experience of having spent some time in government, because I think that gave me an insight into some of the challenges that government officials face, which civil society in particular is often not empathetic to. And I understand why – government is often the party responsible for violating rights or not providing services. But there are sometimes really real constraints that people are operating under, and I think understanding those and then working together is important. So I don’t know if this sounds super kumbaya, but it’s a sort of meeting moment where you can bring civil society together with government officials – the actual people who get the work done, not the ministers and not the folks who end up in the pictures, but the actual operators and mentors – and figure out whether there are overlaps where these things could actually work.

So I don’t know what that would look like. A million dollars would fall really short, but those are some initial thoughts that come to mind, given our conversation.

Sarah: Okay, I’m upping the ante to definitely $500 million. I am so excited about all those things you just laid out. And I want to observe, Alfonsina, how you so beautifully did exactly what your first or second priority was about: using language to convey the importance of data and evidence in a way that resonates with people.

Because this last point that you’ve made – really understanding the constraints that governments are working in and bringing governments together with civil society – I call those things ‘institutionalizing evidence use’ and ‘communities of practice.’ I mean, it just doesn’t mean anything. And so you’re already leaning in, using language that is more responsive. It makes me think so strongly of the Dignity Initiative at IDinsight, which really tries to center dignity and respect for the communities and the people that we are collecting data from and around, and that we are ultimately trying to serve.

It seems so fundamental. But you’re right, it’s a glaring gap. And to have you use the word trauma associated with that is really a call to action. So thank you for putting such a fine point on it. Okay, Dana, how would you spend $500 million?

Dana: 500? Oh, wow. I was thinking only of a million, Sarah.

Sarah: Well, Alfonsina convinced me that a million wasn’t enough.

Dana: Well, where my head went is sort of an amalgamation, actually, of the first point you made, Alfonsina, about organizations who maybe have historically felt left out of, or diminished by, some of the data and evidence picture, and the point you were just making, Sarah, about the Dignity Initiative. I feel like we have probably not done as much as we should at understanding things from the perspective of our end targets themselves.

You know, adolescent girls, for example, or even younger children: what is it that they are actually experiencing? How do we not overly simplify what data we’re gathering based on what we think is important, but actually understand what outcomes they are seeking, and use that as the starting point for our data efforts?

I think organizations we are supporting are getting closer to that, actually defining the questions from those who are in schools implementing programs, saying, wait, these are the things that we think might be the most important questions to answer. Can we take that even further and ask the children they’re serving: what are your experiences? How do we think, from the bottom up, about what data we should be gathering, what gaps we should be analyzing and bringing to light, and what kinds of problems we should be solving?

So we’re headed in the direction of thinking about this much more from the bottom up, with data demands that come from an understanding of communities and are directed back at what they are looking for solutions around.

Sarah: Thank you so much. I think together you have done a few things in this conversation. One of them is a really powerful call to action for all of us in the gender equity space to really center data, and all of us in the data and evidence space to really center gender. Which is a beautiful call to action, and one that you have delivered so compellingly.

The other thing I really appreciated about this conversation is that it was a deeper and more sophisticated discussion about what gender data gaps actually mean in practice. We’ve called these things gender data gaps for a decade or two, but what that means has really shifted, and what we expect to fill in terms of gaps has really evolved and become more sophisticated.

So I appreciate how you’ve elevated the conversation and then also illustrated it with so many beautiful examples and closed us out with this vision of how many more exciting ways there are to make the world a better place for women and girls, and to use data and evidence in doing that. I come away from this conversation incredibly energized and thrilled by your brilliance and thoughtfulness.

Thank you both so much for taking the time. We’re going to close it here.