The role of data is increasingly crucial as the financial services industry shifts to digital delivery, alternative analytics, targeted marketing, and data-driven customer segmentation. As outlined in the recent Accion report, Unlocking the Promise of Big Data to Promote Financial Inclusion, the future of financial inclusion will involve larger volumes of higher-quality, more wide-ranging data to expand access, lower prices, reduce bias, and drive innovation. However, the use of big and alternative data in financial inclusion is not a value-neutral trend, nor should it be.
In fact, the explosion of data has generated a great deal of thought, and maybe even a little anxiety, among the financial services community. This was one of the convictions that emerged last week in Berlin at the Responsible Finance Forum, which I attended with CFI Fellow John Owens and my colleague Ana Ruth Medina from the Smart Campaign.
Consider the variety of projects CFI has been involved in to understand the risks associated with using data for financial inclusion:
- CFI Fellow John Owens is conducting research on the landscape of standards for responsible digital financial services. His early findings reveal that we cannot simply import existing standards for responsible traditional financial services onto digital channels—new standards are needed. John will continue to assess what new elements need to be in place in a client protection framework for digital services.
- CFI Fellow Patrick Traynor is conducting a teardown of digital credit apps to expose where they mishandle data. Patrick's previous research shows that digital credit companies using apps tend to care more about acquiring and retaining customers than about handling their data responsibly.
- JUMO’s digital lending platform utilizes data to predict which mobile customers are suitable candidates for credit, and then begins lending amounts as small as US$2. In 2016, JUMO and its investor LeapFrog (via LeapFrog Labs) commissioned adapted Smart Assessments of JUMO’s lending models in Tanzania and Kenya. After this assessment of everything from the design, marketing, and piloting of the lending algorithm to the customer service model to support queries/complaints, JUMO used the insights to strengthen client protection within its operations.
Client protection in digital financial services (DFS), including mobile-based lending, makes business sense. Research by MicroSave found that while 85 percent of DFS customers said they would recommend DFS to others, they thought of it as a "Plan B" because of a lack of trust, stemming from concerns that include data protection and security.
Government approaches to the collection, use, and storage of data vary, in part due to differences in views about the impact on ordinary people. The German government, for example, is extraordinarily cautious in its approach to data. Companies operating in Germany must be able to explain the relevance and purpose of the data they use. After use, the data must be discarded. Websites in Germany are even required to tell users that their online activity is being monitored, in readable font within a pop-up on the site.
The Germans need only look to their history to be worried about the collection, use, and storage of data. Less than a century ago, millions of people's personal data were used against them, with catastrophic consequences. That historical precedent contributes to Germans' view of data as a potentially dangerous form of social control.
China, on the other end of the spectrum, is generally less circumspect about collecting and using data to define opportunity and change behavior. We’ve talked previously about China’s social credit score, which bases capacity to repay on behavior ranging from spending habits to filial piety to turnstile violations and beyond. In addition to collecting data on non-financial activity, China is allowing the use of such data to influence financial opportunity for individuals. For financial service providers, this means a fairly open playing field on which data can be deployed to make credit and other decisions.
Participants in the Responsible Finance Forum expressed their concern that the use of personal data in China is poised to support a dangerous level of social control. The extent to which the financial services industry is complicit in the use of data for harm (either intentional or unintentional) is a question we should all be concerned about.
The collection, use, and storage of data in financial inclusion are not, and should not be, value neutral. Whether countries implementing data policies choose the China model, the Germany model, or, like the United States, something in between has direct implications for the amount of power that governments, corporations, and individuals wield. If the financial inclusion community is to be responsible in its work with underserved customers, it must actively work to create a world in which information about people does not lead to their harm.
Stay tuned for part two on what CFI is doing about this.