What Facebook and Cambridge Analytica should teach us about Big Brother and Big Data


Recent news about Facebook and Cambridge Analytica has placed these companies in the media spotlight. The way they use data harvesting to gather data about us, and then use that data to target advertising and manipulate public opinion is troubling, to say the least. This spotlight on Facebook will soon fade. Cambridge Analytica will probably be operating under a different name before the end of next week. But this kind of scandal – and there will be many more to come – increases the relevance and urgency of learning lessons about how our data is being collected and used to manipulate us.

If you use Facebook, you give them your personal data, your searches and interests. With every click, new data on you is harvested. Facebook is able to sell your data to others – third parties – who want to sell you things through targeted advertising. These third parties are often businesses, but they are also political parties, non-governmental organisations, governments, or any kind of organisation.

Politics and policies are just products in the Facebook business model.

Facebook is not alone.

Google knows everything you search for. And those who pay for Google AdWords are more likely to be found via a search. Since 2014, Twitter has collected a list of every app you have installed on your phone. After it changed its privacy policy last year, Twitter also collects data about your interests. The company uses this data to show you more relevant content and also sells it to third parties. Netflix builds preference profiles based on what you watch. Amazon keeps track not only of what you buy but also of what you search for. All these companies use Big Data.

Whether it is an event, service or product, politician or policy, or even religious ideas and beliefs, everything is data for sale. The emails you receive, the posts in your news feed, the tweets on your homepage, the suggestions of websites to visit or articles to read are all customised to your preferences, past searches, likes, and choices.

Big Brother is in your pocket

Your smartphone is probably the best mass surveillance device ever invented. It records who you call, who calls you, and how long you speak. It records all your online activity. Because your phone tracks your location continuously, it knows where you live, work, and travel, and where you are (or at least where your phone is) at any point during the day. And it records your address book, photographs, and messages.

Your phone is like George Orwell’s Big Brother – except we all volunteer our information to it. You give your consent every time you scroll through a service or user agreement and click your assent to the terms and conditions. Your data now belongs to the company, which tracks, packages, and sells your movements, preferences, and consumer choices.

We have entered the era of surveillance capitalism.

Surveillance capitalism

John Bellamy Foster and Robert W. McChesney first introduced the term ‘surveillance capitalism’ in 2014. Shoshana Zuboff later popularised it in her book The Age of Surveillance Capitalism. In this new phase of capitalism, Big Data is the new resource. Companies use data from millions of people to sell products and services. They also use it to make decisions that will let them grow faster than the competition. Big Data is needed for lean production and supply, for predicting consumer behaviour, and for targeting advertisements.

Surveillance capitalism and data harvesting — the trade in data acquired through surveillance — are fundamental components of the business model of giants such as Microsoft, Google, Amazon, Facebook, and Twitter. Every time you scroll on Facebook, hit the heart button on Twitter or Instagram, or watch a video on YouTube, you are taking part in surveillance capitalism.

The drive for profits and market dominance demands more and more data extraction. Increasingly sophisticated artificial intelligence is extracting and analysing data from us every day. And companies like Facebook must continually experiment on us to extract more and better data.

Facebook and mood manipulation 

In its infamous 2012 study, Facebook discovered that it could influence the emotions of a large population (689,003 Facebook users). It almost goes without saying that there are ethical issues with this study. There are also social issues. Researchers performed the experiment in secret, without users’ consent.

Facebook manipulated almost 700,000 users’ news feeds for one week. For half of the users, the experiment reduced their exposure to their friends’ “positive emotional content”; those users then made fewer positive posts. For the other half, Facebook reduced exposure to “negative emotional content”, and the opposite happened. This confirms that emotional states can be transferred to others – an effect the researchers called ‘emotional contagion’.

Exploiting a vulnerability in human psychology

Facebook’s founders knew they were creating something addictive that exploited “a vulnerability in human psychology”, according to the company’s founding president Sean Parker. He said:

The thought process that went into building these applications, Facebook being the first of them, … was all about: How do we consume as much of your time and conscious attention as possible?

And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.

It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.

Chamath Palihapitiya, who was vice-president for user growth at Facebook before he left the company in 2011, said:

The short-term, dopamine-driven feedback loops that we have created are destroying how society works… No civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.

Palihapitiya added:

…we get rewarded in these short term signals: Hearts, likes, thumbs up.

We conflate that with value and we conflate it with truth, and instead what it really is is fake, brittle popularity that’s short term and leaves you even more vacant and empty before you did it.

You don’t realise it, but you are being programmed.

Facebook and fake news

At its best, Facebook has provided an important platform for alternatives to the mainstream media. It has allowed people to spread the word about grassroots events and protests and given a democratic voice to people. At its worst, it lets bots and clickbait articles capture attention and direct it towards advertising and fake news, and it even manipulates the mood and viewpoints of its users. When the concern is reach and clicks, content matters less than ‘engagement’ – and every second you spend ‘engaged’ equals profit.

Watch the following YouTube video “How Your Brain is Getting Hacked”. It might not change everything you know, but it might make you think twice next time you are scrolling through your Facebook news feed:

https://www.youtube.com/watch?v=GufhzLSmqMs&feature=youtu.be

Facebook and Cambridge Analytica

Cambridge Analytica, according to whistle-blower Chris Wylie, used psychological profiling to work out which messages people would be most likely to believe. Someone prone to dislike immigrants, for example, is shown ads that play to their prejudices and fears. Companies like Cambridge Analytica target people who are more likely to believe attack ads about political opponents. In this way, politicians and parties can use Facebook data and audience reach to manipulate public perception of a candidate for public office or the leader of the opposition. They can also influence a national election or a referendum such as Brexit. It is a tool for targeting propaganda at the audience most susceptible to it.

The problem, however, is deeper than Facebook or Cambridge Analytica. The problem is within us, and there are lessons for us to learn. We need to take a good hard look at ourselves, warts and all. How do we use the internet? What is it within us that makes us so easily manipulated? If we live in “data bubbles” – only interacting with people who share our prejudices and world views – we are susceptible to manipulation and targeting. When we seek out only stories or media outlets that reinforce our worldview, we inhabit an echo chamber and can be fed fake news. To stop companies like Facebook and Cambridge Analytica from manipulating us, we need to burst the data bubble. We need to face our psychological vulnerabilities and overcome our addictions.

So, let’s overcome those addictions. Leave the phone at home. Breathe. Go for a walk and feel the sun on your face.

Get Involved!

–  Join The Canary, so we can keep holding the powerful to account.

–  How to delete your Facebook account

– Read and support other independent media outlets:

Media Diversified, Novara Media, Corporate Watch, Red Pepper, New Internationalist, Common Space, Media Lens, Bella Caledonia, Vox Political, Evolve Politics, Real Media, Reel News, STRIKE! magazine, The Bristol Cable, The Meteor, The Skwawkbox, Salford Star, The Ferret.

Featured image via Minette Lontsie/Wikimedia Commons
