In 2020, CFI and Women’s World Banking began workstreams on algorithmic bias. This conversation between Alex Rizzi (CFI) and Sonja Kelly (Women’s World Banking) builds on recent reports by CFI and Women’s World Banking. Below are their answers to some important questions about the future of AI and machine learning as they apply to financial inclusion.

Where are we now in the conversation on algorithmic bias? Is it too late to address this?

Alex: It’s just the right time! While it may feel like global conversations around responsible tech have been going on for years, they haven’t been grounded squarely in our field. For instance, there hasn’t been widespread testing of de-biasing tools in inclusive finance (though Sonja, we’re excited to hear about the results of your upcoming work on that front!) or mechanisms akin to credit guarantees to incentivize digital lenders to expand the pool of applicants their algorithms deem creditworthy. At the same time, there are a bunch of data protection frameworks being passed in emerging markets that are modeled on the European GDPR and give consumers data rights related to automated decisions, for example. These frameworks are very new, and it’s still unclear whether and how they might bring more algorithmic accountability. So it’s absolutely not too late to address this issue.

Sonja: I completely agree that now is the time, Alex. Just a few weeks ago we saw a request for information here in the U.S. on how financial service providers use artificial intelligence and machine learning. It’s clear there is interest on the policymaking and regulatory side in better understanding and addressing the challenges posed by these technologies, which makes it an ideal time for financial service providers to be proactive about building guardrails that keep bias out of their algorithms. I also think that technology enables us to do much more about the issue of bias: we can actually turn algorithms around to audit and mitigate bias with very little effort. We now have both the motivation and the tools to address this issue in a big way.

What are some trends leading us down the wrong path when it comes to algorithmic bias?

Sonja: At the risk of being too broad, I think the biggest trend is a lack of awareness. Like I said before, fixing algorithmic bias doesn’t have to be hard, but it does require everyone, at every level and in every role, to understand and track progress on mitigating it. The biggest red flag in the interviews we conducted for our report was an executive saying that bias isn’t an issue in their organization. My co-author Mehrdad Mirpourian and I found that bias is always an issue. It can emerge from biased or unbalanced data, from the code of the algorithm itself, or from the final decision about who gets credit and who does not. No company can meet all definitions of fairness for all groups simultaneously. Admitting the possibility of bias costs nothing, and fixing it is not that difficult. Somehow it still slips off the agenda, and that is very discouraging to me.
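The kind of audit Sonja describes can start very simply: compare the same lending decisions across demographic groups under more than one definition of fairness. The sketch below uses invented loan data and two common metrics (approval rate, which underlies demographic parity, and true positive rate, which underlies equal opportunity); all names and numbers are hypothetical, and a real audit would use the provider's own decision and repayment records.

```python
# Minimal fairness-audit sketch on hypothetical loan decisions.
# 1 = approved / repaid, 0 = denied / defaulted. All data is invented.

def approval_rate(decisions):
    """Share of applicants in a group who were approved."""
    return sum(decisions) / len(decisions)

def true_positive_rate(decisions, outcomes):
    """Share of actual repayers in a group whom the model approved."""
    approved_repayers = [d for d, o in zip(decisions, outcomes) if o == 1]
    return sum(approved_repayers) / len(approved_repayers)

# Hypothetical audit data for two demographic groups.
groups = {
    "A": {"decisions": [1, 1, 1, 0, 1, 0], "outcomes": [1, 1, 0, 0, 1, 1]},
    "B": {"decisions": [1, 0, 0, 0, 1, 0], "outcomes": [1, 1, 0, 1, 1, 0]},
}

for name, g in groups.items():
    print(name,
          "approval:", round(approval_rate(g["decisions"]), 2),
          "TPR:", round(true_positive_rate(g["decisions"], g["outcomes"]), 2))
```

In this toy data, group B is approved less often and its genuine repayers are approved at a lower rate, so the model fails both fairness checks at once. In general the two metrics can also disagree with each other, which is why, as the report notes, no single model satisfies every fairness definition for every group.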

Alex: One of the concepts we’ve been thinking a lot about is how digital data trails may reflect or further encode existing societal inequities. For instance, we know that women are less likely than men to own phones, and less likely to use mobile internet or certain apps; these differences create disparate data trails that might not tell a provider the full story about a woman’s economic potential. And what about the myriad other marginalized groups whose disparate data trails are not clearly articulated?

Who else needs to be here in this conversation as we move forward?

Alex: For my colleague Alex Kessler and me, a huge takeaway from the exploratory work was that there are plenty of entry points to these conversations for non-data scientists, and it’s crucial for a range of voices to be at the table. We originally assumed we needed fluency in code creation and machine learning models to contribute, but these conversations should be interdisciplinary and should reflect a strong understanding of the contexts in which the algorithms are deployed.

Sonja: I love that. It’s exactly right. I would also like to see more media attention on this issue. We know from other industries that peer learning accelerates innovation. If sharing both the promise and the pitfalls of AI and machine learning becomes the norm, we can all learn from it. Media attention would help us get there.

What are the immediate next steps here? What are you focused on changing tomorrow?

Sonja: When I share our report with external audiences, I first hear shock and concern about the very idea of using machines to make predictions about people’s repayment behavior. But our technology-enabled future doesn’t have to look like a dystopian sci-fi novel. Technology can increase financial inclusion when deployed well. Our next step should be to start piloting and proof-testing approaches to mitigating algorithmic bias. Women’s World Banking is doing this over the next couple of years in partnership with the University of Zurich and with a number of our network members, and we’ll share our insights as we go along. Assembling some basic resources and proving what works will get us closer to fairness.

Alex: These are early days. We don’t expect there to be universal alignment on de-biasing tools anytime soon, or best practices available on how to enforce data protection frameworks in emerging markets. Right now, it’s important to simply get this issue on the radar of those who are in a position to influence and engage with providers, regulators, and investors. Only with that awareness can we start to advance good practice, peer exchange, and capacity building.

Follow CFI (newsletter, LinkedIn, and Twitter) and Women’s World Banking over the coming months to stay up-to-date on algorithmic bias and financial inclusion.


Alex Rizzi

Senior Research Director, Consumer Data Opportunities and Risks

Since joining CFI in 2012, Alex has been an advocate for consumer protection and evidence-based inclusive finance practices.

She manages the responsible data practices research portfolio at CFI which focuses on opportunities and risks for low-income consumers from data-driven financial services. Previously, she was a senior director for CFI’s Smart Campaign initiative, where she managed consumer-protection learning agendas and helped launch the Client Protection Certification Program.

Prior to joining CFI, Alex was a project development manager at Innovations for Poverty Action, where she helped create its microsavings and payments innovation initiative (now part of its global financial inclusion initiative), a research fund focused on impact evaluations of savings products and payment systems. She also worked at the Centre for Microfinance at IFMR Research (now Krea University) in Chennai, India, where she was the head of knowledge management and dissemination.

She has participated in multiple industry working groups on responsible finance, including an advisory group to GSMA’s mobile money certification program and the Microfinance Center’s social performance fund steering committee.

Alex is a graduate of Princeton University and holds a master’s degree from Georgetown’s foreign service school, as well as a certificate in digital money from the Digital Frontiers Institute and The Fletcher School at Tufts University. She speaks conversational French and needs to work on her Italian.

Sonja Kelly

Director of Research and Advocacy, Women’s World Banking

Sonja Kelly is the global lead for Women’s World Banking research and was the research director at CFI from 2011 to 2018. Through research on the financial sector, policy trends, financial services providers, and end users, Sonja and her team advocate for women’s financial inclusion. Before joining Women’s World Banking, she advised the U.S. Department of State on strategy for U.S. Embassy engagement in digital finance around the world. She has also held consulting roles at the World Bank and the Consultative Group to Assist the Poor (CGAP), and has worked in microfinance at Opportunity International. Sonja holds a doctorate in international relations from American University where she researched financial inclusion policy and regulation.
