In the wake of the Cambridge Analytica scandal, many people are questioning whether to delete their Facebook accounts. It’s not the first time this has happened. I have lost count of how many calls to boycott Facebook – on Facebook – I have seen since I started using the platform.
But many people (myself included) simply click the “Like” button on the call, then carry on using Facebook. The latest #DeleteFacebook campaign could well be another ineffectual boycott. And the irony is that the energy and emotion spent on the #DeleteFacebook campaign can itself be harvested and used to target us as consumers in one way or another.
This is because Facebook – and many other digital platforms – are built on our data. The platform is one of many companies that are part of a heavily personalised, data-intensive economy that exploits the digital labour of its user base. You might not think of your interactions on Facebook (or Instagram or Twitter) as labour, but the way you produce and manage various emotions on these platforms gives them reams of information about your likes and dislikes.
These platforms are free to use. They make their money by selling your data to advertisers. So whether it’s reading and interacting with #middleclassproblems or reacting to a cat video, people register all sorts of emotions as they go about their daily routines – from frustration at operating technological tools (like apps) to anger about train delays.
The terabytes of data we generate through our interactions on these platforms allow companies to “datafy” us – to quantify, track, monitor and profile us – and to sell targeted adverts that haunt us. This is an economic system that has been dubbed “surveillance capitalism”. And it is fuelled by “affective labour” – a concept borrowed from feminist studies to describe the invisible, yet intense, work involved in producing and managing our emotions.
The #DeleteFacebook campaign is an emotional response to the Cambridge Analytica scandal. This anger will be recorded and exploited, because again it’s encouraging people to react on Facebook or other social media.
Slacktivism – where people show support for the trending #DeleteFacebook campaign but do little about it in real life – will probably feed into this surveillance capitalist machine. While I think we ought to hold Facebook accountable, I am not sure that supporting #DeleteFacebook without any further action is an effective form of sabotage.
So, as conscious consumers, what can we do? There are ways to subvert this surveillance capitalism, and fairer alternative platforms have been attempted.
For example, people have developed decentralised social media projects such as Freedom Box (a system for personal publishing), Diaspora (a decentralised social network), Mastodon (a non-commercial Twitter-like service) – to name just a few – to replace Facebook.
But these projects, though creative and unique in their own ways of addressing data privacy and transparency, have not been effective in removing the big data “platform monopolies”, a term coined by researchers at MIT’s Digital Currency Initiative.
This is because, while it may be technologically viable to design projects that decentralise the storage of data and allow users to take back control of it, they cannot save users from being socially locked in by platforms like Facebook. The technology is there, but the human factor appears to be the hardest to address: people need to want to leave, and to see the benefit of doing so.
That said, it does not mean that consumers should be left powerless in a data-driven economy. The new General Data Protection Regulation (GDPR), which becomes enforceable on May 25 2018, mandates the right to data portability in the EU. This is particularly important because it will give users choice and control, helping to balance the currently asymmetrical relationship between them and providers.
The GDPR affects all companies operating in the EU. It says people have the right to receive their personal data:
In a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided.
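To make the idea of a “structured, commonly used and machine-readable format” concrete, here is a minimal, hypothetical sketch in Python. The field names and data are invented for illustration – this is not Facebook’s actual export schema – but JSON is exactly the kind of commonly used format the regulation envisages:

```python
import json

# Hypothetical example: a user's exported profile data in a structured,
# machine-readable format (JSON). The schema here is invented for illustration.
export = {
    "profile": {"name": "Alex Example", "joined": "2012-06-01"},
    "posts": [
        {"date": "2018-03-20", "text": "Thinking about #DeleteFacebook"},
    ],
    "likes": ["cat videos", "#middleclassproblems"],
}

# Serialise the export so it can be transmitted to another controller...
portable = json.dumps(export, indent=2)

# ...and a receiving platform can parse it back without hindrance.
imported = json.loads(portable)
print(imported["profile"]["name"])
```

The point is that a structured format lets any receiving platform reconstruct the data without reverse-engineering a proprietary dump – which is what makes portability between services practical.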
This gives a strong basis for compelling big data companies like Facebook to invest their profits in technology that enables people to transmit their data directly from one platform to another.
By mobilising the flow of data, we should see greater mobility between different social networks. This should assuage the fear of leaving Facebook and losing all the information you have gathered on it, and it should lead to a fairer economic ecosystem.
Plus, it is imperative to increase data literacy, raise awareness of how transparent companies’ data policies are, and improve consumer education in the digital age. How many people are aware, for example, of the extent to which Facebook tracks their online activity and sells their data to advertisers?
The “data journey” framework, developed by Jo Bates, Paula Goodale and myself, helps us understand data trajectories from production through to the various contexts of big data collation, distribution and reuse. It asks how our values and actions are transformed as we interact with data over the course of its journey. This framework can be used to enhance engagement with, and education around, data literacy.
Only when we understand how data are made, collected, used and exploited can we appreciate the value of our data and the importance of transparency and privacy in what is done with it. While there is no quick fix to these problems, I hope that by raising awareness of the exploitation of our emotional labour and investigating “data journeys”, the discussion can be taken to a new level.
Cover image credit: Alexander Nix, CEO at Cambridge Analytica, explaining how they use psychographics as part of their individual profiling and behavioral analytics for advising on election campaigns.