Monday, June 22, 2020

The tyranny of algorithms - VI

Facebook was founded by an undergraduate with good intentions but a flawed understanding of human nature. While it has broadly been beneficial for individuals – improving communication with friends, relatives, and even people we would never have hoped to keep in touch with before its arrival – Facebook has done significant damage to society as a whole. People use it for all kinds of things, many of them innocuous, but some of them absolutely pernicious. They use it to try to influence democratic elections, to threaten and harass others, to spread fake news, to publish revenge porn and to perform a host of other antisocial acts.

It has no effective competitors, so it’s a monopoly – and a global one at that. Facebook's strategy has been to buy potential rivals before they can grow too big. Among social networking apps, Facebook owns the top three: Facebook, Instagram and WhatsApp. Mark Zuckerberg has said that there is a breakdown in global community and that Facebook’s mission is to help build communities and make the world a better place. A few months later, the Cambridge Analytica scandal broke, showing that the personal data of Facebook users could be leaked to third parties and used to influence elections around the world. When your business model is built on harvesting your users’ data and selling advertisers targeted access to them, you cannot build lasting communities.

Facebook derives its revenues solely by monetizing the data provided by its users – the photographs they upload, the status updates they post, the things they “like”, their friendship groups, the pages they follow, and so on. This enables it to build a detailed profile of each user, which can then be used for ever more precisely targeted advertising. Thus the more “user engagement” there is, the better. Facebook is optimized to push our emotional buttons and so increase the number of ‘engagements’. This design ensures that the most inflammatory and sensational items circulate the most, because they generate the maximum engagement. It thus concentrates and amplifies our prejudices. Sober, balanced, well-researched reports don’t stand a chance (a toy illustration of this ranking logic follows the quotation below). Siva Vaidhyanathan says in Antisocial Media:
If you wanted to build a machine that would distribute propaganda to millions of people, distract them from important issues, energise hatred and bigotry, erode social trust, undermine journalism, foster doubts about science, and engage in massive surveillance all at once, you would build something a lot like Facebook.
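Facebook’s actual ranking system is proprietary and far more elaborate, but the incentive described above can be caricatured in a few lines of Python. Everything here – the headlines and the ‘predicted engagement’ scores – is invented purely for illustration; the point is only that a feed sorted on a single engagement number will always push the most provocative item to the top.

# Toy sketch, not Facebook's code: rank feed items purely by predicted engagement.
feed_items = [
    {"headline": "Committee publishes 200-page budget analysis", "predicted_engagement": 0.02},
    {"headline": "Local charity thanks its volunteers",          "predicted_engagement": 0.05},
    {"headline": "THEY are lying to you about vaccines!!!",      "predicted_engagement": 0.61},
    {"headline": "Outrage as politician insults your town",      "predicted_engagement": 0.48},
]

# An engagement-maximising feed simply sorts on that one number,
# so the most inflammatory headlines float to the top.
ranked = sorted(feed_items, key=lambda item: item["predicted_engagement"], reverse=True)
for item in ranked:
    print(f'{item["predicted_engagement"]:.2f}  {item["headline"]}')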
The precise targeting of ads by Facebook (and Google), using massive surveillance to create elaborate personal dossiers, is something that other media companies cannot match. A firm with a small advertising budget is therefore likely to shift its ad spend towards Facebook and Google, forcing reputable news organisations to lay off staff and thereby degrading the quality of their work. The editors and publishers of these organisations spend much of their time trying to design their content to be picked up by Facebook’s algorithms. To stay alive, they have to feed the very monster that is killing them.

When we visit the site, we scroll through updates from our friends, and the machine appears to be only a neutral go-between. We do not see that Facebook's engineers can tweak its algorithms to change what we see – whether text or photos are prioritized, which news sources appear in our feeds, and so on. It also runs psychological experiments on its users without their being aware of it. For example, it once sought to discover whether emotions are contagious: for one group of users it removed the positive words from the posts in their news feed, while for another group it removed the negative words. It concluded that each group wrote posts reflecting the mood of the posts it had been exposed to.

Facebook’s success, Mr. Vaidhyanathan argues, is based on two elements. The first is that Facebook is deliberately engineered to be addictive, rewarding interactions, likes and shares in much the same way that casinos keep their guests playing. The second is that it has become 'one of the most effective advertising machines in history.' Facebook knows so much about us, and offers advertisers a degree of targeting never before dreamed of, that it is unparalleled as a sales tool.

If you frequently click on certain sites, friends or pages, the Facebook algorithm learns that you are highly engaged with them, so it gives you more of the material you are likely to engage with and less of the material you would ignore. Its ability to predict your behavior improves over time with your willing cooperation. Your news feed thus becomes steadily narrower in perspective and you find yourself in an echo chamber, because information from outside your group is less and less likely to reach you. In the end, Facebook users are unable to engage with people outside their group because they no longer share a common body of truths.
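None of this requires any sinister intent on the part of the engineers; it falls out of a simple feedback loop. The small Python simulation below is my own toy illustration, not Facebook’s algorithm: a ‘feed’ starts by showing five made-up topics equally, adds weight to a topic every time the simulated user clicks on it, and after a couple of thousand iterations is showing mostly the one topic the user was slightly more interested in to begin with.

import random

random.seed(1)
topics = ["politics_left", "politics_right", "sport", "science", "cooking"]

# Assumed, made-up probabilities that the user clicks on an item from each topic.
user_click_probability = {"politics_left": 0.4, "politics_right": 0.1,
                          "sport": 0.2, "science": 0.2, "cooking": 0.1}

# The feed starts out neutral, showing every topic with equal weight.
feed_weights = {t: 1.0 for t in topics}

def pick_topic(weights):
    """Choose a topic to show, in proportion to its current weight."""
    return random.choices(list(weights), weights=list(weights.values()))[0]

for step in range(2000):
    shown = pick_topic(feed_weights)
    if random.random() < user_click_probability[shown]:   # the user engages
        feed_weights[shown] += 1.0                         # so show more of the same

total = sum(feed_weights.values())
for topic in topics:
    print(f"{topic:15s} {100 * feed_weights[topic] / total:5.1f}% of the feed")

Run it and one topic – typically the one the user already leaned towards – ends up dominating the feed while the others all but disappear: an echo chamber built from nothing more than ‘show people what they click on’.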

The easy availability of various internet tools has led to what is called 'clicktivism'. The premise behind clicktivism is that social media allow quick and easy ways to support an organization or cause, but this often amounts only to slacktivism – a pejorative term for "feel-good" measures in support of an issue or social cause. The "Like" button on Facebook is a popular slacktivist tool. Other slacktivist activities include signing internet petitions, joining a community organization without contributing to its efforts, copying and pasting social network statuses or messages, or altering one's personal data or avatar on social networking services. People can now express concern about social or political issues with nothing more than the click of a mouse, since they can easily "like", "share" or "tweet" about something interesting. The sociologist Zygmunt Bauman said in an interview, 'Social media are a trap':
The question of identity has changed from being something you are born with to a task: you have to create your own community. But communities aren’t created, and you either have one or you don’t. What the social networks can create is a substitute. The difference between a community and a network is that you belong to a community, but a network belongs to you. You feel in control. You can add friends if you wish, you can delete them if you wish. 
You are in control of the important people to whom you relate. People feel a little better as a result, because loneliness, abandonment, is the great fear in our individualist age. But it’s so easy to add or remove friends on the internet that people fail to learn the real social skills, which you need when you go to the street, when you go to your workplace, where you find lots of people who you need to enter into sensible interaction with. 
Pope Francis, who is a great man, gave his first interview after being elected, to Eugenio Scalfari, an Italian journalist who is also a self-proclaimed atheist. It was a sign: real dialogue isn’t about talking to people who believe the same things as you. Social media don’t teach us to dialogue because it is so easy to avoid controversy. But most people use social media not to unite, not to open their horizons wider, but on the contrary, to cut themselves a comfort zone where the only sounds they hear are the echoes of their own voice, where the only things they see are the reflections of their own face. Social media are very useful, they provide pleasure, but they are a trap.
