> Posted by Sonja Kelly, Director, CFI
A recent Facebook promotion by a U.K. coffee shop offered, “Like us on Facebook and get a free coffee!” This line would totally get me. Wait… all I have to do is click one little button, and I can save $2? Sign me up!
A free cup of coffee, however, was not the only thing that customers received when they liked the coffee shop’s Facebook page. They also got a very “personalized” experience, complete with the barista at the coffee shop rattling off their job, religion, birthdate, address, mother’s maiden name, and more.
Check out the video that documented the customers’ experiences:
(My favorite part is when the barista says to the customer, “Oh, we know everything about you, Martin.”)
As part of the CFI Fellows Program, one of our fellows, AJ Mowl, has been looking at some of the pros and cons of leveraging consumer data for financial inclusion. As she has relayed to me some of the basic facts about big data, I have become more and more aware of just how big big data is—and what the consequences are when I trade access to my data for services.
LinkedIn, for example, learns all about my professional network, and in exchange it provides me with access to a place to exchange ideas and connect with colleagues. I was totally fine with that until I heard that Microsoft was buying LinkedIn in part to sell its products. According to Microsoft Chief Executive Satya Nadella, LinkedIn will help Microsoft as it goes into sales meetings—arming them with the full bios of customers—data that would usually only be available to a person’s accepted network. Or it could super-charge Microsoft’s customer relationship management, allowing them to systematically and individually target consumers who are in particular industries or have particular jobs. Or it could integrate LinkedIn into Office products like Outlook so those I correspond with could have more data on me, my job description, my work history, and my references (to be frank, I would find this helpful as a user of the data, but I’d like to have a little more control over how my own data is viewed by others).
As another example, consider how our internet service providers use our browsing data. A few weeks ago, I searched for a “Pack ‘n Play” (portable crib) for my baby daughter. All of a sudden, the ads on the sides of my browser were full of Pack ‘n Plays. I ended up clicking through to one that was lower priced and slightly better than the original one I had searched for. Thanks, big data!
When I shop at the grocery store, I get coupons on the back of my receipt highlighting items that the store thinks I should buy, based on the purchase history on my credit card. Because I bought cold medicine they assume I’ll want tissues, and I can take $1 off a pack of four! Where this becomes more controversial is when Target knows your daughter is pregnant before you do.
Big data is indeed big, it knows me way too well, and it is here to stay. There is only so much tinkering I can do with my privacy settings on Facebook, only so much shopping I can do “off the grid” using just cash, only so much censoring I can do of my LinkedIn profile before it hampers my career.
In light of the reality of big data, and thinking about financial services, two important questions emerge for me. First, are the institutions I engage with handling my data in a responsible way? Second, is the use of big data making me, as a customer, feel uncomfortable?
Responsible use of data by financial institutions is a serious issue. The White House recently issued a report questioning whether the use of big data in credit decisions could lead to discrimination. The algorithms that comb through those big piles of data are often assumed to be objective. But in reality, they directly mirror the subjective inputs of their creators. In other words, in credit decisions, it is the coders who ultimately determine who is a credit risk and who isn’t. Hopefully the criteria they choose are not ones like race, religion, or sexual preference. But they may include close proxies with the same result. Who in the institution is responsible for thinking about discrimination? Who is going to hold the financial institution accountable? How can I know if I am being discriminated against, and what are my recourse mechanisms?
The second issue is more about consumer comfort. When I go into a store, I don’t love overbearing sales clerks who constantly watch me and repeatedly ask, “Is there anything I can help you with?” In the same way, consumers of financial services shouldn’t have to feel like they’re constantly being monitored. As a customer, I want to take the lead—I don’t want to feel that the institution is manipulating me into particular products or decisions. I want help, but I don’t want an institution to “hover” over me.
Sure, big data is big. And it’s here to stay. But I’m eager to know how the industry can be reflective as big data becomes more integrated into our financial services.
Stay tuned—CFI Fellow AJ Mowl will be releasing her report on big data this fall.