Edited transcript of the episode:
1:03
Devanshi: We wake up and reach for our phones. They’re with us all day, and most of us probably fall asleep scrolling as well. That sounds familiar, right? Day in and day out, technology is woven into our lives in ways we barely notice. But here’s the thing. More often than not, tech is not built for the people it’s meant to serve. Apps and platforms are designed for growth, clicks, and engagement, not necessarily for well-being, meaningful connection, or real impact. To explore some of the promises and pitfalls of technology, we’re joined by Grace Clapham and Priya Goswami. Grace has worked at the intersection of tech, systems, and communities for more than two decades, and was formerly the director of community partnerships, product, and programmes at Meta. She’s also the founder of the Worth Club, a movement reimagining leadership, value, and self-worth. Priya is a National Award–winning film-maker and technologist and co-founder of the Mumkin app, an ethical tech initiative focused on feminist technologies. Drawing on over a decade of advocacy with survivors of gender-based violence, Priya has spent her career thinking about how tech can actually serve people, not just reach millions.

Together, on this episode of ‘On the Contrary by IDR’, they’ll discuss what it means to design technology intentionally, scale deeply, prioritise value over vanity metrics, and build products that genuinely meet the needs of the communities they reach.
Priya, since your work is at the intersection of tech and mental health, from your vantage point, what do you think are some of the most promising ways that technology has improved well-being? And what are the ways in which you think it’s falling completely short?
2:52
Priya: Technology has made it super accessible for anyone to access what they couldn’t earlier. Everything is a finger-click away. I’m going to indulge you in a little bit of nostalgia. Imagine the web of the ‘90s. Remember what that was like. When my father [first] gave me a PC, he brought it to our house and said, “You can learn as many languages as you want with the PC.” And I was a [child], so I said, “Wow, does that mean I can learn Spanish and German and all of that?” And he said, “No, I mean C++ and Java and those kinds of languages.”
So that is my core memory of technology and connectivity: bringing the world onto a common platform [through a] common language, even if someone is speaking Spanish, Italian, Hindi, or Bengali. I believe it was Joana Varon, one of the leading feminist researchers on technology, who described the web of the ‘90s as sounding like [a] tool of revolution, precisely because it could be anything to anyone. For me, a teenager who wanted to learn a lot of languages, it became my gaming console. It became my way to understand the world. It became my way to consume knowledge.

However, how we deploy technology and who gets to deploy technology are the big questions I foreground as a tech creator now. Because often, it is still perpetuating neocolonial patterns of the White man ruling. Technology has percolated down to us through White men from Stanford and Harvard, who are dictating how our world view should [take] shape. So how can we make this an egalitarian platform and bring back the promise of the ‘90s, that [technology] can be anything for anyone?
4:51
Devanshi: Grace, from your experience in big tech, what do you think is fundamentally going wrong in how technology is designed and deployed?
4:59
Grace: Many people on these big tech platforms are coming from emerging markets [and have] a very different level of understanding. We’re often designing products from a Western perspective [and not an] Eastern one. And it’s not that it’s East versus West, but I’ve seen [that when] products designed in the West come into Asia, [they do not take into account] behavioural and cultural nuances. And algorithms aren’t able to pick a lot of that up because we’re still learning. So I think [the solution is] being able to design locally while staying globally attuned. Something that we did at Meta, and I think we need to do more of it, is supporting community leaders; some of the largest communities are in markets like India or Indonesia. And oftentimes the products that we were designing didn’t necessarily work on a 2G phone.
So that’s a very simple example [of] where information that’s being shared may work on a 5G phone but doesn’t work on a 2G [phone]. Again, that’s a very simple way of looking at how we design for local [markets]. The other [thing] is thinking about behavioural usage in markets like Africa, Indonesia, and India. [For example,] the way payments are made in these countries is so different. Again, I’m not a payment gateway expert, but people in Indonesia have GoPay, [and other regions have many] different apps for making payments, [while] the US has Stripe or PayPal. That’s not necessarily how money is transferred [in other countries]. [Another thing is] taking [words] into consideration. Certain local or indigenous words may show up differently algorithmically, or mean something different when they show up in English [translation]. And that is also because there are so many languages, and so many nuances to languages. Algorithms right now are still not at that stage because a lot of what’s being fed to them is still [coming] from the West, because [of] accessibility and all these other things.
7:25
Devanshi: Why do you think that is?
7:28
Grace: I think a lot of products right now are [being] designed [by] the people who are in the [tech] organisations, [but] not often designed [in collaboration with] the people we want to be serving. So often products are designed with the intention of monetising and scaling—which is not a bad thing [because] we need to be able to sustain ourselves—but often we’re driven by VCs funding the platform or investors wanting to see a certain hockey-stick growth. Instead, we need to start thinking [about] what the need is. Where is the gap? What should we be looking at? What problem are we trying to solve? And [how can we] bring the humans, the individuals, the communities around us in at the start of that product design so that we’re co-creating rather than just building something without thinking of others around us.
8:28
Devanshi: Grace, you’ve touched upon something important: scale versus impact. Tech is often seen as a tool to reach millions, driven by growth metrics that investors care about. But this approach can actually widen the very gap that it sets out to close. Priya, with the Mumkin app, the goal was to create a safe, foolproof digital space where users could interact with an AI chatbot to navigate extremely difficult conversations about gender-based violence. The idea was to use tech to support mental well-being and facilitate difficult conversations safely. But well-being is personal. It’s not about reaching millions. In fact, it’s actually about reaching one person effectively. How do you reconcile both of these realities in the work that you do?
9:11
Priya: You don’t. You miserably fail. And that is okay. I’m going to talk about community first in this question so that I foreground what community-centred tech should [look like]. What is a community? Think of a brunch date with your friends. It doesn’t matter who’s what [or what our identities are]. We will commiserate openly. We will laugh openly. We’ll cry openly. We’ll cheer each other [up] openly. We will also perhaps share and confide [our] wins and failures. That is a community. Community is where you can be yourself. And I think technology has the potential to unite everyone. But the kind of technology we are governed by…and I don’t use the word ‘govern’ lightly, because we are currently being governed by tech created by Western sensibilities, with Western capital centres at play…does not allow room for us to fail, or to not be addicted to something, or to not tune into it if we don’t want to. The goal they are operating by is that you plug in [to social media]; community is where you unplug. So, there is no universe in which these are reconcilable concepts. They are not. And that is okay.

And as a feminist tech creator, that was the first lesson I needed to learn. If people are not taking to my app, if people are not comfortable using it, if people are getting angry at it, if the emotion is more anger than catharsis, that is okay. Because it is a survivor’s journey, and it is not going to be envisaged by anybody in the design room who sat and designed that experience for the survivor. That survivor has been through beats of their own life, and a machine can’t predict what their journey would look like. And that is okay. Which is why, for me, the first and foremost question is: technology can unite and construct and offer so much, but who’s building it and with what intention? That is the question that needs to be foregrounded in all conversations.
If the technology is built to scale, every feminist pilot will fail. And that is okay.
11:55
Devanshi: Priya, with the Mumkin app then, how do you reconcile both, right? Keep people at the heart of it, keep survivors at the heart of it, keep your feminist principles at the heart of it, but also make it an app that reaches more people, that gets out there.
12:10
Priya: Mumkin’s pilot was live between 2020 and 2023, in the pre-ChatGPT era. So, people were still not very used to apps talking to them and asking them what they want to talk about today. It was still a novel concept. If I were to say what would be a good example of [reconciling both], it would be tech that is designed with the idea that one size does not fit all. It’s designed with the idea that if I’m talking to survivors of gender-based violence [and if they are] from South Asia, I may alienate people from Australia or from somewhere else in the world, but that would be okay. My app will not have a monthly recurring revenue [of] millions of dollars. And my users will not [grow at a monumental rate]. My journey and goals will not be that. But my journey and goal is that five survivors enjoyed having that conversation. They felt very frustrated, and then they reached out to a therapist in real time, an actual human being, and got resolution to their trauma; that number five is more significant than 50 million. And as a feminist tech creator, I’m genuinely okay with that number five [being] a growth metric; that is where that trade-off can happen. To me, being able to tell you that I reached five survivors, and that they sought help, and that it was a very important process for them, means my job is done.
And it’s not just Mumkin. There’s another app that I learned about [from Indonesia] [that focuses] on cyberbullying. In fact, it crossed 1 million+ users because there were so many women who were being cyberbullied. There are enough examples of good tech led by feminist or ethical technologists who put their community at the forefront. And what the needs of their community are has to be uniquely understood and tailored to by the creator. It may not look like the overarching goal of multiplying users, but it should be something effective.
14:35
Devanshi: And, Grace, then what do you think it would take for more technologists to centre well-being when they are designing technology?
14:43
Grace: I’m not saying this is worth-centred leadership, but I think what we’re seeing now is a movement of people wanting to build products with more purpose, intention, and consciousness. But I just want to emphasise that I think worth-centred leadership is about how one leads. So, if we’re leading from that place of worth, we aren’t validating ourselves through what the people around us think or what the media thinks about us. When we’re coming from that place, we’re looking at it as, “Am I designing this with purpose and with intention? Am I designing this for value or am I designing this for extraction?”
15:23
Devanshi: I think I have to follow this up with you. You talk about worth-centred leadership. Can you explain what that is and how that ties into using products for good or how it ties into designing them?
15:36
Grace: So worth-centred leadership is a framework I’m still building. [It explores what happens when a] leader leads from a place of inherent self-worth [as opposed to] a place of self-esteem. We’re not leading from a place of external validation and ego. We’re leading from our inherent worth. It [asks leaders big questions such as] whether this does or does not enable human or societal flourishing. [In the process, it moves beyond] scale. [I can break down what each letter in worth stands for in this framework.] W is for worth beyond metrics: [understanding] what value we’re creating and why we are creating it. Who are we creating it for? What is the impact that we want to have? O is for ownership of story. [This means] owning our story, being okay with [it] and embracing [it], and letting our individual selves shine in the story. [It’s about] reclaiming our narrative. R is for relationships and reciprocity, and it’s really about value flowing mutually. I see it as an infinite loop where we need to ebb and flow with ourselves [as well as the] rest of society. T is for transformation and tenacity. It’s [about] knowing that change is the only constant. How do we embrace change? How do we thrive in change? How do we remain resilient? Deep change is where we can really make big things happen for ourselves and society at large. And then, finally, H is [about] looking at life and success holistically. [Not just] financially, but looking at all the different dimensions of what success could mean for you, [which includes the] social, financial, personal, spiritual, or relational. That’s how I’m looking at it at the moment.
17:39
Devanshi: Thank you for breaking that down. I think it makes it a lot clearer. Grace, do you have any example of tech that centres well-being or self-worth? Or do you know what it could look like tangibly? And I ask this so that we can help people picture it.
17:53
Grace: One is thinking about designing against the infinite scroll. Maybe thinking about how [to move beyond just] keeping someone on the platform [for] as long as possible, and instead intentionally designing pause options, or questions before you post: Why are you posting this? Are you posting this for validation? Are you posting this because you feel you have to?
One UAE-based company that I’ve been advising right now is called Pipp. They are looking at designing their product, which connects creators, brands, and individuals, [with] moments of pause in the app. They are looking at designing AI integrations [that ask the user if they have] been on the platform for too long and [if they] want to take a break. So we’re not trying to get as many eyeballs for as long as possible. That’s one example. [Another example] is transparency in terms and conditions [as well as] privacy and data. I mean, how many times do you go on an app [and realise that you] need to consent [to] this or [that]? And there’s just no clarity on [how to understand these]. So [an] ethical design should [ask the users:] Do you see this? Do you acknowledge this, in plain and simple terms? I think those are things we should consider.
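[Editor’s note: To make the pause idea concrete, here is a minimal sketch of what such a nudge might look like in code, assuming a hypothetical feed app. The threshold, function names, and dialog copy below are illustrative placeholders, not Pipp’s actual implementation.]

```typescript
// Sketch of a session "pause nudge": instead of maximising time-on-app,
// the feed interrupts itself after a continuous stretch of use.
// All names and thresholds here are hypothetical, not any real product's API.

const PAUSE_THRESHOLD_MS = 20 * 60 * 1000; // nudge after 20 minutes (illustrative)

interface PauseDialog {
  message: string;
  onTakeBreak: () => void;
  onContinue: () => void;
}

let sessionStart = Date.now();
let nudged = false;

// Stand-in for the app's real UI layer.
function showPauseDialog(dialog: PauseDialog): void {
  console.log(dialog.message);
  dialog.onTakeBreak(); // in a real app, the user would choose
}

function pauseFeed(): void {
  console.log("Feed paused. See you later.");
}

// Called on every scroll or interaction event.
function onInteraction(): void {
  if (nudged || Date.now() - sessionStart < PAUSE_THRESHOLD_MS) return;
  nudged = true;
  showPauseDialog({
    message: "You've been here a while. Want to take a break?",
    onTakeBreak: pauseFeed,
    onContinue: () => {
      sessionStart = Date.now(); // reset the clock, but ask again later
      nudged = false;
    },
  });
}

onInteraction(); // no-op here, since the session has just started
```

[The design choice worth noting is that the nudge interrupts the engagement loop by default, rather than burying a break option in settings.]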
19:39
Priya: Very interestingly, just this morning I was [reading] a paper on emotional manipulation by AI. One of the things it touches upon is the tactic where, when you [are about to log off], AI companion apps throw out things like, “But before you log off, I have just one more thing to say.” And then that curiosity becomes the trigger that [keeps you on the app]. The paper is by Harvard researchers, and they classify the types of emotional manipulation by AI, one of them being delaying the farewell [for] as long as possible. Look at that, delaying the farewell. That means you get to spend more time with that AI companion, which is anthropomorphised by choice, by design, so that it makes you feel like a sentient bot is at the other end, which is a whole other conversation we can go into.
20:35
Devanshi: Okay, that’s both fascinating and a little scary to think about. I love how you’ve both landed on the mechanics of design and how tiny choices can have huge consequences. The idea of deliberate pause moments feels powerful, but Priya’s example also shows how easily design can be weaponised. Let’s zoom out for a second. Both of you have said that we need to scale deeply and not just widely. What does that mean for you in terms of how we build tech, lead teams, and work with communities? What does scaling deep mean? Grace?
21:10
Grace: Scaling deep [is a] term [that] came from Tatiana Fraser. She talks about transformation that is systemic and longer term. For me, scaling deep [works on] three levels. One is the individual: How do we as individuals scale deeply? That is the foundation of everything: us, ourselves. Second is, how do we scale within societies or communities? And third, how do we scale deeply when it comes down to the things we create that have a ripple effect? So, it’s sort of these three concentric circles. But, to me, scaling deep means I’d rather work with 1,000 people and go really deep into transformation and change, or have a product that has that deeper impact, than [reach] a million people who forget about you tomorrow or move on because there’s a better platform. I want my work to leave an imprint and a long-lasting effect that is generational, and that has a ripple effect.
22:44
Devanshi: But let’s talk about donors and investors, right? The metrics that they want to see are time spent on the app or the number of users the app is reaching. So what alternatives can we offer them instead? If we’re talking about tech that actually centres people, that centres community, that is feminist, that is intersectional, what are the metrics we should be talking about? What would that look like? Priya?
23:12
Priya: From the perspective of donors and funders, the whole model needs a thorough evaluation. I say tear up the paper, rubbish it, and then start afresh. Because those billions of dollars have to be put to good use somewhere. Community-driven technologists, whether they are feminists or have some other goal in mind, need to have that conversation with the donor. And they need to very clearly specify that time spent on the app is not the end goal. Monthly recurring users multiplying—that is not the end goal. Maybe a good business model could be a freemium membership, where if the user finds some kind of benefit in the app, they subscribe to a paid model. And there have been good examples of paid membership in other industries as well, ones that were traditionally thought not to be viable. For example, Newslaundry. They came at a time when nobody thought anyone would subscribe for news. But there are paying audiences who are very interested in a certain kind of news.
24:23
Devanshi: Priya, looking ahead, if you had a blank slate to build the next generation of well-being and mental health technology, what are the principles that you would centre?
24:33
Priya: I would ensure the team has a socio-technical lens. I would not just invest in the brightest coders from the shiniest institutes. I would be more interested in where they come from [and] what their values are. And, as a founder, that is how I screen whether I should collaborate with anyone or any company or organisation: What are their values? So, first and foremost, centring a value-driven approach, not an efficiency-driven approach. I’m not interested in how quickly you can show me the results. I’m more interested in [knowing,] if you are to hold a conversation with a person in an aggravated scenario, would you be able to cue in language tools that are sensitive to XYZ parameters? I don’t care if the screen goes blank and says processing for 30 seconds. I don’t need the fastest result in the room. So, centring a value-driven approach from top to bottom, from bottom to top, and making sure everyone in the room [has] a shared value. That shared value can be feminism; that shared value can be [a] community-first approach. The shared value can be anything, but having that crystal-clear vision of what that value is and whom we are serving [is imperative]. We are not there to extract something from them; we are there to serve them. Can we bring that approach back?
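[Editor’s note: As a rough illustration of that speed-versus-sensitivity trade-off, here is a minimal sketch of a chat pipeline that always runs a sensitivity review before replying, deliberately accepting latency. The function names, toy rules, and fallback copy are hypothetical placeholders, not Mumkin’s actual implementation.]

```typescript
// Sketch of a value-driven chat pipeline: the reply is held back until a
// sensitivity review passes, trading speed for safety by design.
// Everything here is a hypothetical placeholder, not a real product's code.

interface Review {
  safe: boolean;
  reason?: string;
}

// Stand-in for a slower, context-aware sensitivity model or rule set.
async function checkSensitivity(draft: string): Promise<Review> {
  const triggeringPhrases = ["you should just", "calm down"]; // toy rules
  for (const phrase of triggeringPhrases) {
    if (draft.toLowerCase().includes(phrase)) {
      return { safe: false, reason: `contains "${phrase}"` };
    }
  }
  return { safe: true };
}

// Stand-in for whatever generates a candidate reply.
async function generateReply(userMessage: string): Promise<string> {
  return `I hear you. Would you like to talk more about "${userMessage}"?`;
}

async function respond(userMessage: string): Promise<string> {
  const draft = await generateReply(userMessage);
  // This review may take many seconds; in this design, that is acceptable.
  const review = await checkSensitivity(draft);
  if (!review.safe) {
    // Fall back to a neutral, non-directive message rather than ship
    // the fastest answer in the room.
    return "Take your time. I'm here when you're ready to continue.";
  }
  return draft;
}

respond("I had a hard day").then(console.log);
```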
25:55
Devanshi: Grace, as we wrap up, I’d love for you to take us a step back. You’ve spoken earlier about worth-centred leadership and designing with intention. If we think about the future of technology and well-being, what does it look like when we’re designing not just for attention, but also for belonging and real value?
26:14
Grace: I want to add to that question: it’s not just about belonging but about designing [in a way] that creates positive and reciprocal value for ourselves, for the people around us, for communities, for society, and for the world at large. What [would enable] human flourishing? Belonging is essential; it has been the backbone of society, and there is an innate need for humans to feel a sense of belonging. But when we look at it through the lens of moving from just algorithms to belonging and value, it’s really about how we design for humans instead of just vanity metrics, instead of just likes or shares or reposts or followers. We need to start thinking about what value this product or platform or tool brings to an individual that has deeper impact and deeper meaning. How are we driving meaning? How are we driving meaningful conversations? How are we driving and co-creating meaningful engagements and interactions with those who are on the platform, so that it creates more of a ripple effect?
—
Read more
- The Future Is TransFeminist: from imagination to action
- Emotional manipulation by AI companions
- The art of scaling deep
- It’s surprisingly easy to stumble into a relationship with an AI chatbot
- Transnational AI and corporate imperialism
- OpenAI models are steeped in caste bias
- The limits of AI in social change






