As consumers, we all have “secret scores”: hidden ratings that determine how long each of us waits on hold when calling a business, whether we can return items at a store, and what type of service we receive. A low score sends you to the back of the queue; high scores get you elite treatment.
Every so often, journalists lament these systems’ inaccessibility. They’re “largely invisible to the public,” The New York Times wrote in 2012. “Most people have no inkling they even exist,” The Wall Street Journal said in 2018. Most recently, in April, The Journal’s Christopher Mims looked at a company called Sift, whose proprietary scoring system tracks 16,000 factors for companies like Airbnb and OkCupid. “Sift judges whether or not you can be trusted,” he wrote, “yet there’s no file with your name that it can produce upon request.”
As of this summer, though, Sift does have a file on you, which it can produce upon request. I got mine, and I found it shocking: More than 400 pages long, it contained all the messages I’d ever sent to hosts on Airbnb; years of Yelp delivery orders; a log of every time I’d opened the Coinbase app on my iPhone. Many entries included detailed information about the device I used to do these things, including my IP address at the time.
Sift knew, for example, that I’d used my iPhone to order chicken tikka masala, vegetable samosas and garlic naan on a Saturday night in April three years ago. It knew I used my Apple laptop to sign into Coinbase in January 2017 to change my password. Sift knew about a nightmare Thanksgiving I had in California’s wine country, as captured in my messages to the Airbnb host of a rental called “Cloud 9.”
“The heater in the room with the big couch has been running since we got here and we’re not sure how to turn it off,” I wrote on Wednesday afternoon.
“The air in the main house is really musty, like maybe there’s a mildew or mold issue,” I wrote on Thursday, then added apologetically, “Sorry to be bothering you on Thanksgiving!”
“The bathroom flooded during the rain storm. The carpet outside the bathroom is very wet,” I wrote on Friday. “Ants are coming in from the interior wall of the house.”
This may sound somewhat comical, but the companies gathering and paying for this data find it extremely valuable for rooting out fraud and increasing the revenue they can collect from big spenders. Sift has this data because the company has been hired by Airbnb, Yelp, and Coinbase to identify stolen credit cards and help spot identity thieves and abusive behavior. Still, the fact that obscure companies are accumulating information about years of our online and offline behavior is unsettling, and at a minimum it creates the potential for abuse or discrimination — particularly when those companies decide we don’t stack up.
How to get your data
There are many companies in the business of scoring consumers. The challenge is to identify them. Once you do, the instructions on getting your data will probably be buried in their privacy policies. Ctrl-F “request” is a good way to find it. Most of these companies will also require you to send a photo of your driver’s license to verify your identity. Here are five that say they’ll share the data they have on you.
- Sift, which determines consumer trustworthiness, asks you to email privacy@sift.com. You’ll then have to fill out a Google form.
- Zeta Global, which identifies people with a lot of money to spend, lets you request your data via an online form.
- Retail Equation, which helps companies such as Best Buy and Sephora decide whether to accept or reject a product return, will send you a report if you email returnactivityreport@theretailequation.com.
- Riskified, which develops fraud scores, will tell you what data it has gathered on your possible crookedness if you contact privacy@riskified.com.
- Kustomer, a database company that provides what it calls “unprecedented insight into a customer’s past experiences and current sentiment,” tells people to email privacy@kustomer.com.
Just because the companies say they’ll provide your data doesn’t mean they actually will.
Kustomer, for example, gave me the runaround. When I first contacted the company from my personal email address, a representative wrote back that I would have the report by the end of the week. After a couple of weeks passed, I emailed again and was told the company was “instituting a new process” and had “hit a few snags.” I never got the report. When I contacted a company spokeswoman, I was told that I would need to get my data instead from the companies that used Kustomer to analyze me.
Thanks, California
Most of the companies only recently started honoring these requests in response to the California Consumer Privacy Act. Set to go into effect in 2020, the law will grant Californians the right to see what data a company holds on them. It follows a 2018 European privacy law, the General Data Protection Regulation, which lets Europeans gain access to and delete their online data. Some companies have decided to honor the laws’ transparency requirements even for those of us who are not lucky enough to live in Europe or the Golden State.
“We expect these are the first of many laws,” said Jason Tan, the chief executive of Sift. The company, founded in 2011, started making files available to “all end users” this June, even where not legally required to do so — such as in New York, where I live. “We’re trying to be more privacy conscious. We want to be good citizens and stewards of the internet. That includes transparency.”
I was inspired to chase down my data files by a June report from the Consumer Education Foundation, which wants the Federal Trade Commission to investigate secret surveillance scores “generated by a shadowy group of privacy-busting firms that operate in the dark recesses of the American marketplace.” The report named 11 firms that rate shoppers, potential renters and prospective employees. I pursued data from the firms most likely to have information on me.
One of the co-authors of the report was Laura Antonini, the policy director at the Consumer Education Foundation. At my suggestion, she sought out her own data. She got a voluminous report from Sift and, like me, had several companies come up empty-handed despite their claims to have information on hundreds of millions of people. Retail Equation, the company that helps decide whether customers should be allowed to make a return, had nothing on me and one entry for Ms. Antonini: a return of three items worth $78 to Victoria’s Secret in 2009.
“I don’t really care that these data analytics companies know I made a return to Victoria’s Secret in 2009, or that I had chicken kebabs delivered to my apartment, but how is this information being used against me when you generate scores for your clients?” Ms. Antonini said. “That is what consumers deserve to know. The lack of the information I received back is the most alarming part of this.”
In other words, most of these companies are just showing you the data they used to make decisions about you, not how they analyzed that data or what their decision was.
‘It’s incredible what machines can do when they can look under every stone’
My Sift file didn’t come with a credit-score-type number at the top, but many of the entries included a percentage rating of whether the behavior was “abuse” or “not abuse,” “normal” or “fraud,” or “account takeover” versus “not account takeover.”
When I told Mr. Tan that I was alarmed to see my Airbnb messages and Yelp orders in the hands of a company I’d never heard of before, he responded by saying that Sift doesn’t sell or share any of the data it has with third parties.
“We are in the business of predicting risks for particular events at particular times, for particular fraud,” he said. Sift is looking at all my online activity to make sure it’s me, and not someone trying to impersonate or hack me.
“Behind the scenes, we’re trying to create connections between fraudulent accounts,” Mr. Tan said. To score risk, the more data Sift has, the better. It’s able to use what it knows across the accounts of all its clients, so if a certain device has been used to make an order on Yelp with a stolen credit card, Sift can flag that device when it shows up on Airbnb.
“We’re not looking at the data. It’s just machines and algorithms doing this work,” said Mr. Tan. “But it’s incredible what machines can do when they can look under every stone.”
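Sift hasn’t published how this cross-merchant matching works, but the idea Mr. Tan describes can be sketched in a few lines of code. Everything below is hypothetical: the DeviceRegistry class, the flag_device and is_risky names, and the example fingerprint are invented for illustration and do not reflect Sift’s actual systems or API.

```python
# A minimal, hypothetical sketch of cross-merchant device flagging.
# None of these names come from Sift; they are invented for illustration.

class DeviceRegistry:
    """Shared registry of device fingerprints seen across all client merchants."""

    def __init__(self):
        # device fingerprint -> set of reasons it was flagged
        self.flags = {}

    def flag_device(self, fingerprint, reason):
        # Record that this device was tied to suspicious activity
        # (e.g. a stolen credit card) at any one merchant.
        self.flags.setdefault(fingerprint, set()).add(reason)

    def is_risky(self, fingerprint):
        # Any other merchant in the network can check the same fingerprint.
        return fingerprint in self.flags


registry = DeviceRegistry()

# A delivery order placed with a stolen card flags the device...
registry.flag_device("iphone-abc123", "stolen_credit_card")

# ...so a later booking attempt from the same device elsewhere gets held.
if registry.is_risky("iphone-abc123"):
    print("Hold this booking for review")
```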
I asked Mr. Tan how many people had requested their data from Sift since the company introduced the option to get it.
“Honestly, we haven’t seen much of a response,” he said.
A spokeswoman from Zeta Global, which created a portal for data requests in August, told me that 10 people had requested their data so far. “There was only data on two people,” she said. (I was one of them; the company had a record of all the comments I had made on a blog a decade ago.)
This may be because most people have no idea Sift, Zeta and the other secret scorers exist. But now you do, and you know how you can get your files.
If you submit a request to any of these companies and get back something weird, please share your experience with me at kashmir.hill@nytimes.com.
"can" - Google News
November 04, 2019 at 05:00PM
https://ift.tt/33hWc2s
I Got Access to My Secret Consumer Score. Now You Can, Too. - The New York Times
"can" - Google News
https://ift.tt/2NE2i6G
Shoes Man Tutorial
Pos News Update
Meme Update
Korean Entertainment News
Japan News Update
No comments:
Post a Comment