Since the start of the Privacy Project, the most common response I have gotten from readers is a request for some kind of solution. They’re slightly freaked out and hoping for tips to shore up their digital hygiene or for a guide that might help them navigate the internet without giving away their personal data. For months I’ve included a Tip of the Week feature for this very reason, but I’ve always felt conflicted about it, because the hard truth is that our data is leaking and being traded all the time, in places we might not even know to look. Each privacy tip you follow is undoubtedly helpful (and you should follow them!), but it’s a bit like a single sandbag in a hurricane: You need to amass so, so many and be extremely vigilant to make a difference.
I have argued previously that the personal-responsibility frame for privacy is unfair. And I believe that the only way to fully transform privacy is for data protection to move from individuals to institutions. That said, I was moved over the holiday weekend by an argument that gave me a bit of hope that there are small ways we individuals can make a difference.
The idea came from the writer Dave Eggers. In an interview with Vox’s Ezra Klein, Eggers — who doesn’t have Wi-Fi in his home and still uses a flip phone — makes the argument that our public demand for more and more information plays a meaningful role in the privacy discussion:
“We can’t just blame the Big Five [Apple, Google, Facebook, Microsoft and Amazon] and the surveillance they do and the N.S.A., because we are constantly using these tools on each other and thinking it’s O.K. Whether it’s getting email receipts, whether it’s parents surveilling their kids, even at college. Whether it’s spouses surveilling each other through their smartphones — all the spying people do on each other. People surreptitiously taking photos of each other because it’s so easy now and you always have a high-level camera in your hands. I think that we don’t necessarily realize how quickly we’ve evolved and how quickly we have superseded our idea of our right to privacy by our right to know.”
He continues:
“We’ve evolved to the point where our ideas of privacy have evolved or our value of it is almost completely gone. I think there’s a few square feet and our skulls that we still retain. There’s the bathroom, the bedroom after a certain hour and there’s the space in our brain. But nowhere else do we expect privacy. And I think that’s a radical shift in evolution, and it happened in a few years.”
There are bits in the larger conversation that I disagree with Eggers on, mostly because I think it offers too much cover for Big Tech. I think his argument that there’s a “public market” for privacy-invading services and that tech companies are merely responding to it and building products is a backward interpretation. I’d argue that it’s human behavior that’s responding to powerful, addictive products and well-crafted marketing campaigns. Regardless, I think there’s something poignant about this line: “We have superseded our idea of our right to privacy by our right to know.”
Writing about technology for roughly a decade, I’ve felt this strongly at times. I noticed it first watching Reddit threads after a mass shooting in 2012 inside a movie theater in Aurora, Colo. — a kind of online vigilante detective culture emerged, powered by the idea that almost any piece of information could be found and that, by virtue of being online, we were entitled to it. Since then, the behavior has embedded itself into the dark soul of the internet. The hunt for the Boston bombers, Gamergate, the 4chan culture of doxing — some of it is predicated on a desire for information that, 15 years ago, we might not have felt entitled to.
These are extreme examples, of course. But there are countless, mundane ways in which we use technology to demand information from one another, as Eggers points out. We track emails and snap pictures of unknowing individuals in public. Even a small act like sending a late-night work email that could wait until morning is an invasion of someone else’s private time. We rarely think about it that way because it’s so easy to fire off a quick message and we’re so eager for an immediate response — for more information. It’s not that we don’t value privacy (we care about it more than we think); it’s that we no longer expect it for ourselves or for others.
Which brings us back to that quest for solutions. Without a comprehensive privacy bill or some meaningful regulation of Big Tech, we’re not going to change the way our data is siphoned away. But we can hope to change what we demand from ourselves and others when it comes to those small, everyday privacy invasions. We can respect others’ desire to unplug by giving them the space to do so. We can stop monitoring friends and family just because we can. We can make small choices not to demand information solely because we feel entitled to it. Perhaps most important, we can give ourselves both the room to disconnect and the permission to say “no” when others demand more information from us (from read receipts to social network prompts).
This might feel like a marginal change, but shifting our expectations can have a profound impact on ourselves and others. Nothing about technological change is inevitable. As my Opinion colleague Annalee Newitz wrote last week, “a better internet is waiting for us.” That we’ll eventually find it isn’t certain, and the process will be extremely difficult, but it will start with us reimagining what we expect of one another.
What Google Knew:
If you like this newsletter, I suggest you also subscribe to Big, a newsletter by Open Markets’ Matt Stoller. It’s about monopoly power and frequently gets into fascinating issues about Big Tech and privacy. Last week, Stoller shared a striking tidbit from the Financial Times columnist Rana Foroohar about how Google’s founders, Sergey Brin and Larry Page, predicted the disruption that Google would cause to the advertising ecosystem. Here’s what she told Stoller:
The most surprising thing I learned while researching the book was that the founders of Google, Sergei and Larry, had basically predicted the key problems with surveillance capitalism and where they would lead us back in their original paper on search, written while they were Stanford grad students. At the very end, in the appendix, there’s a paragraph where they admit that the targeted advertising business model could be misused by companies or other entities in ways that would hurt users. This is kind of a bombshell revelation given that search engines say everything they do is for users. The fact that this paper hasn’t gotten more attention makes me think people aren’t reading.
I decided to look up the paper, which has the sexy title “The Anatomy of a Large-Scale Hypertextual Web Search Engine.” Foroohar is right — it’s remarkable reading from a 2019 perspective, in which Google has a near-monopoly on search:
The goals of the advertising business model do not always correspond to providing quality search to users. … We expect that advertising funded search engines will be inherently biased toward the advertisers and away from the needs of the consumers.
Then Page and Brin raise the issue of search engine bias. “Since it is very difficult even for experts to evaluate search engines, search engine bias is particularly insidious,” they write. The paper continues with an example involving the now-defunct search engine OpenText: “This type of bias is much more insidious than advertising, because it is not clear who ‘deserves’ to be there, and who is willing to pay money to be listed.”
Page and Brin most likely included this because they believed that their search engine would be an exception to the rule. But the conclusion reads as somewhat prophetic:
In general, it could be argued from the consumer point of view that the better the search engine is, the fewer advertisements will be needed for the consumer to find what they want. This of course erodes the advertising supported business model of the existing search engines. However, there will always be money from advertisers who want a customer to switch products, or have something that is genuinely new. But we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.
The whole paper is worth your time, but you can read everything I quoted here in full by scrolling down to Appendix A.
What I’m Reading:
“Activists Build a Grass-Roots Alliance Against Amazon”
“Ring Doesn’t Have Facial Recognition — Some Police Want to Add Their Own”
“Top Senate Democrats Unveil New Online Privacy Bill, Promising Tough Penalties for Data Abuse”
“They See You When You’re Shopping”
Like other media companies, The Times collects data on its visitors when they read stories like this one. For more detail please see our privacy policy and our publisher's description of The Times's practices and continued steps to increase transparency and protections.
Follow @privacyproject on Twitter and The New York Times Opinion Section on Facebook and Instagram.