Tuesday, October 27, 2020

Control through triviality - VI

It would be wrong to think that nobody noticed that we were drowning in distraction. Amusing Ourselves To Death by Neil Postman was published in 1985, in a world not yet invaded by the Internet, cell phones, PDAs, cable channels by the hundreds, DVDs, blogs, flat-screens, HDTV, iPods, downloaded tunes and games played online, on PlayStation or on Game Boy. It sets out what were then urgent premonitions about the deep-seated perils of television. Postman says that TV has turned all public life into entertainment. He warns that we'll be overwhelmed by "information glut" until what is truly meaningful is lost and we no longer care what we've lost as long as we're being amused.

He rues the fact that there is no time for reflection in the world anymore. Today TV no longer dominates the media landscape: "screen time" also means hours spent in front of the computer, the video monitor, the cell phone and the handheld device. Silence has been replaced by background noise. Things have gotten much, much worse since he published the book. The book discusses two frightening visions of the future: one in 1984 by George Orwell, the other in the lesser-known Brave New World by Aldous Huxley. Postman writes that Brave New World, not 1984, is the book to focus on.

The Party of 1984 maintained control of the people by keeping them under constant surveillance, whereas the government of Brave New World kept its citizens so happy that they never felt threatened enough to put up a fight. For Huxley, oppression came in a very different form from what Orwell imagined. Orwell’s Oceania keeps the masses in check with fear, thanks to an endless war and a hyper-competent surveillance state. In Huxley's dystopian World State, the inhabitants merely live for pleasure; the elite amuse the masses into submission with a mind-numbing drug called Soma and an endless buffet of casual sex. Postman wrote that it is not the hard surveillance societies we should be wary of, but societies bored by stimulation and dazed by constant distraction.

As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for people would be so infatuated by various technological narcotics that there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much of it that we would not be able to find the needle in the haystack. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. In 1984, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us. Neil Postman writes:

What Huxley teaches is that in the age of advanced technology, spiritual devastation is more likely to come from an enemy with a smiling face than from one whose countenance exudes suspicion and hate. In the Huxleyan prophecy, Big Brother does not watch us, by his choice. We watch him, by ours. There is no need for wardens or gates or Ministries of Truth.

When a population becomes distracted by trivia, when cultural life is redefined as a perpetual round of entertainments, when serious public conversation becomes a form of baby-talk, when, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility. 

In an article, My Dad Predicted Trump in 1985, Andrew Postman writes that everyone had mistakenly feared and obsessed over an information-censoring, movement-restricting, individuality-emaciating state, while his father had warned about a technology-sedating, consumption-engorging, instant-gratifying bubble: an environment in which the public was ‘being conditioned to get its information faster, in a way that was less nuanced and, of course, image-based’. He quotes his father: “An Orwellian world is much easier to recognize, and to oppose, than a Huxleyan. Everything in our background has prepared us to know and resist a prison when the gates begin to close around us … [but] who is prepared to take arms against a sea of amusements?”

In a letter to Orwell, Huxley stated that instead of the ‘boot-on-the-face’ policy Orwell described, rulers are more likely to ‘find less arduous and wasteful ways of governing and of satisfying its lust for power, and these ways will resemble those which I described in Brave New World.’ He wrote, ‘Within the next generation I believe that the world's rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience.’ And he thought that this change would be brought about as a result of ‘a felt need for increased efficiency’.

In a later essay, Brave New World Revisited, Huxley described the society of Brave New World as one ‘where perfect efficiency left no room for freedom or personal initiative’. He said the changed circumstances since Orwell wrote his novel seemed to indicate that the odds were more in favor of something like Brave New World than of something like 1984, because Orwell ‘failed to take into account man's almost infinite appetite for distractions’. In Brave New World, on the other hand, ‘non-stop distractions of the most fascinating nature . . . are deliberately used as instruments of policy, for the purpose of preventing people from paying too much attention to the realities of the social and political situation’. He wrote:

The other world of religion is different from the other world of entertainment; but they resemble one another in being most decidedly "not of this world." Both are distractions and, if lived in too continuously, both can become, in Marx's phrase, "the opium of the people" and so a threat to freedom. Only the vigilant can maintain their liberties, and only those who are constantly and intelligently on the spot can hope to govern themselves effectively by democratic procedures. 

A society, most of whose members spend a great part of their time, not on the spot, not here and now and in the calculable future, but somewhere else, in the irrelevant other worlds of sport and soap opera, of mythology and metaphysical fantasy, will find it hard to resist the encroachments of those who would manipulate and control it.

It is not a simple dichotomy in which Huxley was right and Orwell was wrong. The internet has strengthened propaganda and surveillance, and Orwell did write about prolefeed - the deliberately superficial entertainment, including literature, movies and music, that keeps the masses content and prevents them from becoming too knowledgeable. It is just that Huxley's dystopia has played a much bigger role in strengthening authoritarian regimes than is appreciated. The world today is a hybrid of the dystopias presented in three books - 1984 by George Orwell, Brave New World by Aldous Huxley and Player Piano by Kurt Vonnegut. (In Player Piano, there is a permanently unemployed working class, dispossessed by managerial engineers and automation.)

All these fears reflect Gandhi's concerns about modernity. What makes modernity especially dangerous, according to Gandhi, is that it comes with a surface gloss which makes people blind to the costs they are obliged to pay. He wrote, 'Modern tyranny is a trap of temptation and therefore does greater mischief'. For Gandhi, people have to wage two types of struggle to gain autonomy - one external and one internal. The external struggle is waged against institutional practices which lead to their degradation. The internal struggle is against one's own senses and passions. For Gandhi, people who always give in to temptations are not autonomous; we can be slaves to our own passions and desires and not just to other people. He believed that those who have failed to attain swaraj within themselves must lose it in the outside world too. He felt that modernity increased the difficulty of both struggles.

The external struggle is against the Orwellian dystopia. The internal struggle is against the Huxleyan dystopia. The hidden hand of the market can be almost as potent an instrument of control as the iron fist of the state. His struggle against industrial civilization stemmed from his fear that it would lead to Vonnegut's dystopia. He argued that modern economic life reduced men to its helpless and passive victims and represented a new form of slavery, more comfortable and insidious and hence more dangerous than the earlier ones. Others have raised some of these issues before and after his time, but he was the only one who consistently raised them while also being a mass leader who could move millions of people. As Nelson Mandela says in this article:

Gandhi remains today the only complete critique of advanced industrial society. Others have criticized its totalitarianism but not its productive apparatus. He is not against science and technology, but he places priority on the right to work and opposes mechanization to the extent that it usurps this right. Large-scale machinery, he holds, concentrates wealth in the hands of one man who tyrannizes the rest. He favors the small machine; he seeks to keep the individual in control of his tools . . . 

Wednesday, October 14, 2020

Control through triviality - V

“There are only two industries that refer to their customers as 'users': illegal drugs and software.” — Edward Tufte

Many persuasive and motivational techniques are used to keep users returning to gaming and social media sites. These include “scarcity” (a snap or status is only temporarily available, encouraging you to get online quickly); “social proof” (many people retweeted an article, so you should go online and read it); “personalization” (your news feed is designed to filter and display news based on your interests); and “reciprocity” (invite more friends to get extra points, and once your friends are part of the network it becomes much more difficult for you or them to leave).

A fear of missing out, commonly known as FoMO, is at the heart of many features of social media design. Groups and forums in social media promote active participation. Notifications and “presence features” keep people informed of each other's availability and activities in real time, so that some start to become compulsive checkers. This keeps us “friended” to people with whom we haven’t spoken in ages (“what if I miss something important from them?”). The feeling that there is a “1% chance you could be missing something important” keeps us subscribed to newsletters even after they have stopped delivering any benefit (“what if I miss a future announcement?”). And it keeps us using social media (“what if I miss that important news story or fall behind what my friends are talking about?”).

One of the ways tech companies capture attention is to use social awareness cues which exploit our need for social approval. The need to belong, to be approved of or appreciated by our peers, is among the highest human motivations. But now our social approval is in the hands of tech companies. When I get tagged by a friend, I imagine him making a conscious choice to tag me. What I don’t see is how a company like Facebook has orchestrated his action. Facebook, Instagram or Snapchat can manipulate how often people get tagged in photos by automatically suggesting all the faces people should tag (e.g. by showing a box with a 1-click confirmation). Through design choices like this, Facebook controls how often millions of people experience social approval online.
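
To see how small a design decision this is, here is a minimal TypeScript sketch of the two tagging designs; every name in it (FaceMatch, oneClickTagFlow and so on) is invented for illustration and does not correspond to any real platform's code.

```typescript
// Hypothetical sketch of two tagging designs. All names are invented for
// illustration; this is not any platform's real code.

interface FaceMatch {
  photoId: string;
  suggestedFriendId: string;
  confidence: number; // 0..1 score from a face-recognition model
}

// Design A: opt-in tagging. The user must search for and pick each friend,
// so most photos end up with few or no tags.
function manualTagFlow(pickFriend: () => string | null): string[] {
  const tags: string[] = [];
  let friend = pickFriend();
  while (friend !== null) {
    tags.push(friend);
    friend = pickFriend();
  }
  return tags;
}

// Design B: pre-filled suggestions with a single "Tag all" confirmation.
// One tap approves every suggestion, so far more tags (and far more
// social-approval notifications) get created.
function oneClickTagFlow(matches: FaceMatch[], confirmAll: () => boolean): string[] {
  const suggestions = matches
    .filter(m => m.confidence > 0.8)
    .map(m => m.suggestedFriendId);
  return confirmAll() ? suggestions : [];
}
```

The code itself is trivial; the point is that whoever sets the default decides how often millions of people receive a social-approval notification.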

Another way to keep people engaged is to exploit the idea of social reciprocity. If you do me a favor, I start feeling that I owe you one next time. You say “thank you” — I have to say “you’re welcome.” You send me an email — it’s rude not to get back to you. You follow me — it’s rude not to follow you back (especially for teenagers). As Kurt Vonnegut said, 'If somebody says "I love you" to me, I feel as though I had a pistol pointed at my head. What can anybody reply under such conditions but that which the pistol holder requires? "I love you, too."'

We are vulnerable to the need to reciprocate others’ gestures, and tech companies now manipulate how often we experience it. It’s in their interest to heighten the feeling of urgency and social reciprocity. For example, Facebook automatically tells the sender when you have “seen” their message, instead of letting you avoid disclosing whether you read it (“now that you know I’ve seen the message, I feel even more obligated to respond”). The “two ticks” on instant messaging tools such as WhatsApp work the same way.

Like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality, they likely unconsciously responded to LinkedIn’s list of suggested contacts. In other words, LinkedIn turns your unconscious impulses (to “add” a person) into new social obligations that millions of people feel obligated to repay. All while they profit from the time people spend doing it.

Another way to hijack people is to keep them consuming, even when they aren’t hungry anymore. Games, music, podcasts and hundreds of other diversions are carefully designed to make us come back for more. This is done by taking an experience that has a definite end and turning it into a bottomless flow that keeps going. News feeds, for example, are purposely designed to auto-refill with reasons to keep you scrolling, and purposely eliminate any reason for you to pause, reconsider or leave. Aza Raskin, who designed infinite scroll, says, "If you don't give your brain time to catch up with your impulses, you just keep scrolling." He said the innovation kept users looking at their phones far longer than necessary.
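
A minimal sketch of the pattern in TypeScript, using the browser's standard IntersectionObserver API; fetchMoreItems and renderItems are hypothetical placeholders standing in for a feed backend and a renderer, not any site's real code.

```typescript
// Minimal infinite-scroll sketch: the feed refills itself whenever the user
// nears the bottom, so there is never a natural stopping point.
// fetchMoreItems and renderItems are hypothetical placeholders.

declare function fetchMoreItems(cursor: string | null): Promise<{ items: string[]; nextCursor: string }>;
declare function renderItems(items: string[], container: HTMLElement): void;

function setUpInfiniteScroll(feed: HTMLElement, sentinel: HTMLElement): void {
  let cursor: string | null = null;
  let loading = false;

  const observer = new IntersectionObserver(async entries => {
    // Fires when the invisible sentinel at the bottom of the feed scrolls into view.
    if (!entries[0].isIntersecting || loading) return;
    loading = true;
    const page = await fetchMoreItems(cursor);
    renderItems(page.items, feed); // append more content below what the user just read
    cursor = page.nextCursor;      // there is always a next page
    loading = false;
  });

  observer.observe(sentinel);
}
```

The crucial choice is that the loop never surfaces an "end of feed" state: there is always a next page, so the decision to stop is left entirely to the user's self-control.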

It’s also why video and social media sites like Netflix, YouTube or Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice (in case you won’t). A huge portion of traffic on these sites is driven by autoplaying the next thing. The continuous nature of the feed leaves no natural stopping point where it would make sense to just quit using the application. When you get one recommendation after another that you like, you may keep watching without being aware of how much time has gone by.
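
The autoplay pattern can be sketched just as briefly; in this hypothetical TypeScript fragment, playVideo, pickNextRecommendation and showCountdown are invented placeholders rather than any real site's API.

```typescript
// Hypothetical autoplay sketch: when a video ends, the next recommendation
// starts after a short countdown unless the viewer actively opts out.
// playVideo, pickNextRecommendation and showCountdown are invented placeholders.

declare function playVideo(videoId: string): Promise<void>;        // resolves when playback ends
declare function pickNextRecommendation(lastId: string): string;   // engagement-ranked suggestion
declare function showCountdown(seconds: number): Promise<boolean>; // false if the user clicks "cancel"

async function autoplayLoop(firstVideoId: string): Promise<void> {
  let current = firstVideoId;
  while (true) {
    await playVideo(current);
    // The default action is "keep watching"; stopping requires a deliberate
    // click within a few seconds.
    const keepGoing = await showCountdown(5);
    if (!keepGoing) break;
    current = pickNextRecommendation(current);
  }
}
```

Doing nothing continues the session; stopping requires a deliberate act within a few seconds, which is exactly the asymmetry described above.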

TikTok, akin to the hugely popular musical.ly app, displays short video performances. (TikTok is banned in India now, but clones will appear; the idea will not go away.) Users promote a variety of talents online, including makeup application, magic acts, cooking and standup comedy. The app has functions that make footage look fancier, which attracts users from other apps. It can gauge users’ tastes from their browsing history and recommend other clips they will probably like, which keeps them hooked.

Tech companies often claim that “we’re just making it easier for users to see the video they want to watch” when they are actually serving their business interests. Increasing “time spent” is the currency they compete for. The daily job of hundreds of engineers in tech companies around the world is to invent new ways to keep you hooked. In Automate This, the Harvard-educated mathematician Jeffrey Hammerbacher tells the author, Christopher Steiner, "The best minds of my generations [sic] are thinking about how to make people click on ads. That sucks." Remember Plato’s allegory of the cave? Instead of staring at the shadows on the wall, we’re all staring at Facebook and Instagram or endlessly watching our favorite series, thanks to the machinations of smart people who lull us into thinking that we made the choice ourselves. Never have so many been manipulated so much by so few.

Leah Pearlman, co-inventor of Facebook's Like button, said she had become hooked on Facebook because she had begun basing her sense of self-worth on the number of "likes" she had. "When I need validation - I go to check Facebook," she said. Tristan Harris, who was a design ethicist at Google, says that he is addicted to e-mail. Even though he knows the tricks that Google uses to make people come back to check e-mail, he says that he is not able to control his urge.

What could be the harm if people are checking their phones all the time, posting pictures of themselves on Instagram, and getting addicted to online games? Many people could get killed or injured as a result of distracted driving caused by texting. It’s easy to say that people should not text and drive, but the design problem - the “error-provocative” aspect of the technology - is ignored. There is also an increase in lifestyle diseases caused by a sedentary lifestyle and lack of interaction with people. Problems like obesity, diabetes and high blood pressure, along with higher rates of depression and anxiety, suggest that digital entertainment is not the best way to spend leisure time.

India is the world capital of selfie deaths, accounting for about 50% of those recorded worldwide. Even mundane, everyday spots such as railways and shopping centres are the scenes of tragic accidents. Some of these incidents are macabre. In a case that garnered worldwide headlines, a group of bystanders took selfies in front of three men who were dying on a road after a crash. No one called an ambulance or helped the victims, who were covered in blood and writhing in pain.

Technology is becoming more and more integrated into every aspect of our lives. Meanwhile, the life span of devices is getting shorter — many products will be thrown away once their batteries die, to be replaced with new devices. Companies intentionally plan the obsolescence of their goods by updating the design or software and discontinuing support for older models. The discarded computers, cell phones, printers, televisions etc. create huge amounts of e-waste. 

Electronic devices contain toxic heavy metals, polluting PVC plastic and hazardous chemicals which can harm human health and the environment. Developed countries ship a lot of their e-waste to developing countries, where workers usually do not wear protective equipment and are often unaware that they are handling dangerous materials. Research has found that inhaling toxic chemicals and direct contact with hazardous e-waste materials result in increases in spontaneous abortions, stillbirths, premature births, reduced birth weights, mutations, congenital malformations, etc. Moreover, e-waste toxins contaminate the air, soil and groundwater.

As games get even more immersive, with augmented reality and virtual reality features combined with monetized incentives and built-in conditioning, the addictive aspects seem likely to increase in the years ahead. Given the choice between a walk or meeting friends face to face and twenty minutes on Facebook, the better choice for both mental and physical health would be the former alternatives. But the current pandemic has ensured that tech companies will keep benefiting even more than they imagined.

Friday, October 2, 2020

Control through triviality - IV

In today's world, the scarcest resource is attention. Advertising companies have fought for decades to capture people's attention and convince them that various useless products are crucial for existence. With the information explosion following the advent of the internet, capturing and retaining attention became even more crucial. The Nobel prize-winning economist Herbert Simon said that ‘a wealth of information creates a poverty of attention’.

Social media companies influence how people think and behave without them even being aware of it. They deceive their users by manipulating their attention and directing it towards their own commercial purposes. They deliberately engineer addiction to the services they provide. The power to shape people’s attention is increasingly concentrated in the hands of a few companies. The business model of social media companies is based on advertising.  Facebook and Google effectively control over half of all internet advertising revenue. The more time users spend on the platform, the more valuable they become to the companies.

The attention merchants of Silicon Valley earn billions of dollars a year from our data. By posting, searching and liking, we perform the free labor that powers one of the most profitable sectors of the economy. The ethicist James Williams said, “Your time is scarce, and your technologies know it.” Technology is persuading millions of people in ways they don’t see. It steers what 2 billion people are thinking and believing every day. Big platforms like Apple, Facebook, Google, YouTube, Snapchat, Twitter, Instagram etc. suck us into their products and take time that we may later wish we had not wasted. 

Systems are getting better and better at steering what people pay attention to and what they do with their time. We might enjoy the thing they persuade us to do, which makes us feel as if we made the choice ourselves. When using technology, we often focus optimistically on all the things it does for us. Many defend their right to make “free” choices but ignore how those choices are manipulated upstream by menus we didn’t choose in the first place. Tech companies give people the illusion of free choice while designing the menu so that they win, no matter what you choose.

Technology can undermine the autonomy of consumers or users because addiction is built into the apps. Many games and online platforms are designed to make users want to come back for more. To get the next round of funding or to push your stock price up, the amount of time people spend on your app has to go up, and that attention is then sold to advertisers. Many designers are thus under pressure to create addictive app features that engage you and suck as much time out of your life as possible. In Antisocial Media, Siva Vaidhyanathan writes:

Google and to a lesser extent Facebook help us manage the torrent of information around us by doing the work of deciding what's valuable or interesting to us. . . . Google and Facebook have cornered the market on [capturing attention].

Monetizing our captured attention pays for the labor and technology that enable Google and Facebook to filter the flood of information so effectively. And while those two companies are far from the only players in the attention economy, they are the best at it.

Walt Kelly's cartoon character Pogo said, 'We have met the enemy and he is us.' Captology is the study of computers as persuasive technologies; the term is derived from an acronym, Computers As Persuasive Technologies. The Persuasive Tech Lab at Stanford University studies techniques to automate persuasion: the design, research, ethics and analysis of interactive computing products (computers, mobile phones, websites, wireless technologies, mobile applications, video games, etc.) created for the purpose of changing people’s attitudes or behaviors. Every day more computing products, including websites and mobile apps, are designed to change what people think and do. People have been fed the propaganda for decades that they make their own choices, so it is easy to manipulate them because they won’t think that their feelings are being produced and manipulated by some external system.

In Hooked: How to Build Habit-Forming Products, Nir Eyal discusses his Hook Model. The Hook is a habit-forming product design (a habit being an activity done with little or no conscious thought). The Facebook, Instagram and Twitter hooks happen every time you interact with the product. Frequent engagement with a service over a short period of time increases the likelihood of a person sticking to that behavior. The four steps of the hook are trigger, action, reward, and investment (a minimal sketch of the full loop follows the list below).

  • Trigger – These can be external triggers like push notifications, or internal ones formed through an association or memory in our minds. The most frequent internal triggers are negative emotions. For example, depressed people check their email more.
  • Action – This is the simplest behavior done in anticipation of a reward. The easier an action is to perform, the more likely it is to happen, so the trigger is made visible and the action extremely easy to take. Every time the user has to think, they’re taking on cognitive load. The rule for forming habits is to reduce cognitive load so that doing becomes easier than thinking.
  • Reward – Rewards reinforce the motivation for performing an action and increase the likelihood of that action being repeated. Predictable rewards don't create desire. Variability in a reward really gets us hooked. A part of the brain called the nucleus accumbens becomes active when we crave something. It becomes most active in anticipation of a reward and less active when we get the reward. 
  • Investment – It occurs when the user puts something into the product or service such as time, data, effort, social capital or money. Inviting friends, stating preferences, building virtual assets, etc are all investments. They increase the likelihood of the next pass through the hook. 
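
As a rough illustration of how the four steps fit together, here is a hedged TypeScript sketch of one pass through the hook; all names (sendPushNotification, variableReward and so on) are invented, and the reward schedule is only meant to show variability, not any product's actual logic.

```typescript
// Illustrative sketch of the trigger -> action -> variable reward -> investment
// loop described above. All names are invented; this is not any product's real code.

interface User {
  id: string;
  storedValue: number; // followers, photos, streaks: the "investment"
}

declare function sendPushNotification(userId: string, message: string): void;

// 1. Trigger: an external nudge timed to bring the user back.
function externalTrigger(user: User): void {
  sendPushNotification(user.id, "You have new activity waiting for you");
}

// 3. Reward: variable, not predictable. A random payout schedule is what
// keeps the anticipation (and the checking) going.
function variableReward(): number {
  return Math.random() < 0.3 ? Math.ceil(Math.random() * 20) : 0; // sometimes many likes, often none
}

// 4. Investment: everything the user adds makes leaving costlier next time.
function invest(user: User, newContent: number): void {
  user.storedValue += newContent;
}

// One pass through the hook; habit-forming products run this loop indefinitely.
function hookCycle(user: User, openApp: () => void): void {
  externalTrigger(user);          // 1. trigger
  openApp();                      // 2. action: made as frictionless as possible
  const likes = variableReward(); // 3. variable reward
  if (likes > 0) invest(user, 1); // 4. investment loads the next trigger
}
```

The reward is deliberately unpredictable: a fixed payout would soon stop generating anticipation, which is the point about variability made above.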

What’s interesting is that while all physical products depreciate, habit-forming technology appreciates! For example, the more content you have on Google Drive or the more followers you have on Twitter, the less likely you’ll be to leave those services. That’s often true even if a better competing service comes along. Over time, fewer and fewer people will remember life before the internet, and eventually no one will know what life was like without constant access to the internet and social media.

Having a quiet dinner with one's family, with the associated chit-chat, or going out to play with friends will become rarer. There will be fewer face-to-face interactions among people. (Even before the pandemic, neighbours were meeting more often on WhatsApp than in person.) No one will remember what it was like to eat dinner without taking a picture of it and posting it on Facebook. As we get used to the creep of technology into our lives, all this comes to seem completely normal. And all of this “disruption” is driven by technologies purposely designed to be addictive.

Max Frisch once remarked that “Technology is the knack of so arranging the world that we do not experience it.” Sean Parker, an early investor in Facebook, said he has become a “conscientious objector” to social media, and that Facebook and others had succeeded by “exploiting a vulnerability in human psychology.” Antonio Garcia-Martinez, a former product manager at the company, has said Facebook lies about its ability to influence individuals based on the data it collects on them. The game designer Ian Bogost has called these addictive technologies the 'cigarette of this century'.

Chamath Palihapitiya, who joined Facebook in 2007 and became its vice president for user growth, said he feels “tremendous guilt” about the company he helped make. “I think we have created tools that are ripping apart the social fabric of how society works,” he told an audience at Stanford Graduate School of Business. Taking a swipe at the wider online ecosystem, he said, “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” referring to online interactions driven by “hearts, likes, thumbs-up.” “No civil discourse, no cooperation; misinformation, mistruth.”