Should banks be in the business of ‘surveillance capitalism’?

Google’s tracking of credit card purchases and linking them to users’ online profiles and search patterns raises a number of knotty questions for banks.

The tech giant says it just wants to show advertisers that the ads they placed led to sales, and there’s no reason to doubt the company’s intention. But if consumers understood that their card transaction data was being sold to Google, would they sanction this? Or would they ask the banks and card issuers that collect and store their transaction data to think carefully and perhaps ask their consent before passing this information over to third parties?

These questions are relevant to banks because they are complicit in the march toward “surveillance capitalism” — a world where consumers’ every move is recorded without their knowledge and the information is monetized.

Although it is seldom talked about, some banks sell customer data to third parties. Banks also feed customer data to data aggregators such as Yodlee, which anonymize and sell that information to third parties such as hedge funds. The hedge funds use it to predict company performance and make trading decisions. Mastercard and Visa also sell card transaction data to third parties.

Most banks have a vested interest in making sure customer data is not misused in any way, since they incur a lot of the costs of fraud, such as card reissuance and credit monitoring, said Al Raymond, specialist leader, privacy and data protection at Deloitte and former head of U.S. privacy at TD Bank.

“They want to stay close to the data, as the ounce of surveillance prevention more than outweighs the pound of cure,” Raymond said. “Staying close to the customer is a very real, and recent goal, since large banks are trying to fight off the threat from smaller, nimbler fintech players that are chipping away at bank customers, particularly the millennial segment.”

U.S. banks can’t sell raw consumer data to third parties unless they provide the customer with a notice and an opportunity to opt out. In some states, customers have to opt in. They can sell anonymized data, with all personally identifiable information stripped out or hashed. Companies often use such data sets to come up with rules they can apply to their customer base, such as: a 42-year-old living on the Upper West Side of Manhattan is likely to purchase certain items at a particular time of day.
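To make the mechanics concrete, here is a minimal Python sketch of that kind of pseudonymization, using a hypothetical record layout and secret key rather than any bank’s actual pipeline. Note what survives: quasi-identifiers such as ZIP code, merchant and amount pass through untouched.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would be managed and rotated
# by the data owner, not hard-coded.
PEPPER = b"example-secret-key"

def pseudonymize(record: dict, pii_fields: set) -> dict:
    """Replace direct identifiers with keyed hashes; pass other fields through."""
    out = {}
    for key, value in record.items():
        if key in pii_fields:
            out[key] = hmac.new(PEPPER, str(value).encode(), hashlib.sha256).hexdigest()
        else:
            out[key] = value
    return out

# Illustrative transaction record, not a real schema.
txn = {"card_number": "4111111111111111", "name": "Jane Doe",
       "merchant": "J. Crew", "zip": "10024", "amount": 79.50}
print(pseudonymize(txn, {"card_number", "name"}))
```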

But even anonymized data raises at least three privacy issues.

1. Anonymized data can be de-anonymized

In a report published in April, Stanford and Princeton researchers described how they linked de-identified web browsing histories to social media profiles using only publicly available data.

“This was a case study in how much people share on the internet without even realizing it and how uniquely identifiable that is,” said Jessica Su, computer science Ph.D. student at Stanford University and an author of the report. “We believe that the set of things that are recommended to somebody or the set of websites somebody uses on the internet very much uniquely identify them.”

[Image: a laptop being watched by an office security camera, conveying “big brother” surveillance. Credit: Brian Jackson, stock.adobe.com]

The team asked Twitter users to donate their browsing histories and then looked at the links they clicked on while visiting Twitter. They mapped the newsfeeds to the browsing histories, and where there were many similarities they made a match. They successfully identified people around 70% of the time. If they had used the time stamps on the browsing histories and Twitter posts, that rate would probably have been much higher, Su said.
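In rough outline, the linkage works like the following toy Python sketch. The handles, links and scoring rule are illustrative, and by the authors’ description their actual model gives extra weight to rarely shared links, which this simple overlap score does not.

```python
# Toy illustration of the matching idea: score each candidate Twitter feed
# by how much of it overlaps the anonymous browsing history, and return the
# best-scoring handle.
def best_match(history_links: set, candidate_feeds: dict) -> str:
    def score(feed: set) -> float:
        return len(history_links & feed) / len(feed) if feed else 0.0
    return max(candidate_feeds, key=lambda user: score(candidate_feeds[user]))

feeds = {  # hypothetical handles and links
    "@alice": {"nytimes.com/a1", "github.com/proj", "nicheblog.io/post"},
    "@bob":   {"nytimes.com/a1", "espn.com/score"},
}
history = {"nicheblog.io/post", "github.com/proj", "weather.com"}
print(best_match(history, feeds))  # -> @alice
```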

Su said she isn’t concerned that Google or any other large company would take advantage of this ability to identify people, and thus know that the individual who bought a pink cardigan from Jcrew.com at 6:45 a.m. on June 7 was me.

Companies like Google and Facebook “usually have clear policies on what they can do with user data,” Su said. “When I was at Microsoft Research, there were very strict controls on what could be done with so-called personally identifiable data. The moral of this story is that a lot of information is personally identifiable.”

Aggregate information used to find patterns and formulas is far more valuable than the activities of individuals, said Boris Segalis, co-chair of data protection, privacy and cybersecurity at Norton Rose Fulbright, a New York City-based law firm.

“They don’t care that you bought the cardigan,” Segalis said. “That’s low-value information.”

What Su does worry about is “the sketchiest small companies,” she said.

“If somebody releases some data set that’s very private and sensitive but anonymous, and somebody else goes and de-anonymizes it using statistical methods, that could be pretty bad.”

Stuart Lacey, founder and CEO of the data privacy technology company Trunomi, is alarmed that companies can connect these dots.

“The extent to which this is being done and what we’re just finding out now is a gulf that will be filled with alligators and surprises,” he said. “I don’t think many people realize just how much is being done.”

2. Even if it remains anonymized, consumers don’t know how their data is being used

Do consumers have the right to know where their data is being sent even if it’s anonymized?

“That’s a question of policy,” Segalis said. “You can certainly have the view that you don’t want someone in the commercial space to figure out your shopping patterns. Ultimately you as a consumer generate that data, even if it’s not associated with you.”

On the other hand, the use of more and better data and analytics could benefit customers as well as companies, Segalis argued.

“It could be annoying that someone can predict your shopping patterns,” he said. “But the same data analytics tools are used to predict traffic patterns and make driving safer, to help with pharmacological research. … It’s probably hard to stop data because it drives so much business today.”

3. If you try to explain to consumers how their data is being used, they probably won’t read the explanation

Banks have to provide privacy notices that disclose what they do with customer data, but often the useful information is buried in legalese.

Lacey pointed out that Apple’s iTunes agreement runs 3,600 words across 27 pages. “No one reads it,” he said.

He’s in favor of using a consent widget that would clearly state what information is being shared with whom for how long. (The European Union’s General Data Protection Regulation calls this “informed consent.”)
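A consent record behind such a widget might look something like the following sketch. The class and field names are assumptions for illustration, not Trunomi’s product or a format the GDPR prescribes.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsentGrant:
    data_category: str  # what is shared, e.g. "card transactions"
    recipient: str      # with whom it is shared
    purpose: str        # why it is shared
    expires: date       # for how long the grant lasts
    granted: bool       # the customer's explicit choice

# A customer declines to share transaction data for ad attribution.
grant = ConsentGrant(
    data_category="card transactions",
    recipient="ExampleAds Inc.",  # hypothetical third party
    purpose="ad attribution",
    expires=date(2018, 12, 31),
    granted=False,
)
print(grant)
```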

“That’s the way the customers and the banks we talk to see this going — judicious and appropriate, measured use of data,” Lacey said.

He pointed to Twitter’s new privacy policy and the consent form it recently pushed to users as a good example.

“They had eight clean sentences with little sliders beside every one,” he said. “It was well run, they asked relevant questions, they explained it in understandable language and I opted out of everything.”

He wonders, however, how many among Twitter’s 328 million monthly users bothered to look at the consent agreement.

“People become so blinded by, ‘Yes, just get me on my Twitter, I just need to share this thing,’ ” he said.

Segalis also bemoaned ultralong privacy notices. “They try to describe everything a company does on that piece of paper because of plaintiffs’ lawyers and consumer advocacy groups,” he said. “It’s easier for them to be super detailed than to think about it like Twitter thought about it.”

The worst-case scenario

Lacey paints a dark picture of the future if data continues to be shared thoughtlessly.

“The more a few parties have more information about any one thing and they can control the flow of relevancy of information and what you see, what you do, it starts to become a little Orwellian,” he said.

One day the pink cardigan I buy from J. Crew will have a near-field communication chip in it, Lacey said. The chip will be designed to be read by my washing machine, which will warn me not to put it in with certain other fabrics.

“That seems like a good use case,” he said. But NFC-enabled clothing, he warned, could also become an identifier used to locate people.

“Now we’ve got a whole mechanism for mass surveillance globally and now, all these companies will be trading off that information to not just figure out who you are and what you buy, but locate you and figure out your habits,” Lacey said.

“What I worry about is we’re not taking anywhere near enough time to understand the way in which we’re collecting data about people, what it’s being used for, and by whom for what reason,” Lacey said.

Not everyone shares this dire view.

“My experience in working with legitimate, large companies is they have chief privacy officers and they try to do the right thing,” Segalis said. “They use this data for their own purposes, because in many cases it opens opportunities and makes them more profitable. I don’t think we can roll that back.”

But Lacey said he sees the EU’s data protection regulation as a gift that could guide companies back to the light, “like Luke in 'Star Wars,' when he’s balancing on the ship and you don’t know if he’s going to go bad or good. Suddenly, if guided the right way, you can make good choices and the result can be quite compelling.”

Editor at Large Penny Crosman welcomes feedback at penny.crosman@sourcemedia.com.
