Tuesday, April 10, 2018

Understanding Facebook's data crisis: 5 essential reads


What will Mark Zuckerberg say to Congress? AP Photo/Noah Berger

Most of Facebook’s 2 billion users have likely had their data collected by third parties, the company revealed April 4. That follows reports that 87 million users’ data were used to target online political advertising in the run-up to the 2016 U.S. presidential election.

As company CEO Mark Zuckerberg prepares to testify before Congress, Facebook is beginning to respond to international public and government criticism of its data-harvesting and data-sharing policies. Many scholars around the U.S. are discussing what happened, what’s at stake, how to fix it, and what could come next. Here we spotlight five examples from our recent coverage.

1. What actually happened?

Much of the concern arose from reports that Cambridge Analytica’s analysis profiled people’s personalities, drawing on the work of Cambridge University researcher Aleksandr Kogan.

Media scholar Matthew Hindman asked Kogan directly what he had done. As Hindman explained, “Information on users’ personalities or ‘psychographics’ was just a modest part of how the model targeted citizens. It was not a personality model strictly speaking, but rather one that boiled down demographics, social influences, personality and everything else into a big correlated lump.”

2. What were the effects of what happened?

On a personal level, this scale of data collection – particularly for the tens of millions of Facebook users, initially estimated at 50 million and later revised to 87 million, who had never consented to having their data collected by Kogan or Cambridge Analytica – was distressing. Ethical hacker Timothy Summers noted that democracy itself is at stake:
“What used to be a public exchange of information and democratic dialogue is now a customized whisper campaign: Groups both ethical and malicious can divide Americans, whispering into the ear of each and every user, nudging them based on their fears and encouraging them to whisper to others who share those fears.”

3. What should I do in response?

The backlash has been significant, with most Facebook users expressing some level of concern over what might be done with personal data Facebook has on them. As sociologists Denise Anthony and Luke Stark explain, people shouldn’t trust Facebook or other companies that collect massive amounts of user data: “Neither regulations nor third-party institutions currently exist to ensure that social media companies are trustworthy.”

4. What if I want to quit Facebook?

Many people have thought about, and talked about, deleting their Facebook accounts. But actually doing so is harder than most people expect. A communications research group at the University of Pennsylvania discussed the psychological hooks that keep people on social media, including Facebook’s own overt protests when a user tries to leave:
“When one of us tried deactivating her account, she was told how huge the loss would be – profile disabled, all the memories evaporating, losing touch with over 500 friends.”

5. Should I be worried about future data-using manipulation?

If Facebook is that hard to leave, just think about what will happen as virtual reality becomes more popular. The powerful algorithms that manipulate Facebook users are not nearly as effective as VR will be, with its full immersion, writes user-experience scholar Elissa Redmiles:
“A person who uses virtual reality is, often willingly, being controlled to far greater extents than were ever possible before. Everything a person sees and hears – and perhaps even feels or smells – is totally created by another person.”
And people are concerned now that they’re too trusting.
Jeff Inglis, Science + Technology Editor, The Conversation
This article was originally published on The Conversation.
