Saturday, July 25, 2020

The tyranny of algorithms - IX

According to the prophets, the arrival of the internet was going to be the biggest thing to happen to democracy since the invention of the ballot box. Nothing like the Rwandan genocide could ever happen again, the former British PM Gordon Brown insisted, 'because information would come out far more quickly about what is actually going on and the public opinion would grow to the point where action would need to be taken’. The message was: large doses of information and communications technology are bound to prove lethal to the most repressive of regimes. What the cyber-utopians failed to grasp was that the internet can just as easily be used to control people as it can be used to educate them.

There were massive protests in Iran in 2009 amid suspicions of a fraudulent election. The protests were thought to be fueled by tweets, and cyber-utopians lost no time in claiming that the Internet would spell the doom of dictators everywhere and that liberal democracy was the only game in town. So much so that 'the Internet' was among the nominees for the Nobel Peace Prize in 2010. What they failed to realize was that tweets don't topple governments, people do. A real revolution sooner or later demands sacrifices from the population, not just typing on computers.

After the failed uprising in Iran, the government hunted down dissidents online, tracking them through their emails and using face-recognition technology to identify people from pictures taken on mobile phones. The authorities turned the same technology to their own ends, sending messages warning Iranians to stay away from street protests. The police dug up personal details, such as Facebook profiles and email addresses, of Iranians living abroad and warned them to stop inciting protests if they did not want their relatives back home to suffer. Governments use social networks to infiltrate protest groups and track down protesters, seeding their own propaganda online.

In 21 Lessons for the 21st Century, Yuval Noah Harari says that many fear AI algorithms because they think the algorithms will not remain obedient to us. But the problem with algorithms is exactly the opposite: they will always do what they are ordered to do. If algorithms and robots are in benign hands, they will produce tremendous benefits. However, if countries of the "Axis of Evil" embrace the new technologies, people might end up in a complete surveillance state where every action and utterance is followed and controlled by a future Big Brother, and humans will come to live in "digital dictatorships." Eventually, the population of a digital dictatorship, under extensive propaganda and in constant fear of being marked as dissenters, will come to obey Big Brother unconditionally. From "1984" by George Orwell:
We do not destroy the heretic because he resists us: so long as he resists us we never destroy him. We convert him, we capture his inner mind, we reshape him. We burn all evil and all illusion out of him; we bring him over to our side, not in appearance, but genuinely, heart and soul. We make him one of ourselves before we kill him. It is intolerable to us that an erroneous thought should exist anywhere in the world, however secret and powerless it may be.
An example of such a digital dictatorship is the 'Social Credit System' (SCS) being implemented in China. Every citizen in China would be given a score, available for all to see. This citizen score is derived from monitoring an individual's social behavior: their spending habits, how regularly they pay bills, and their social interactions. It will become the basis of that person's trustworthiness, which will also be publicly ranked. What people can and can't do, such as the kinds of jobs or mortgages they can get and what schools their children qualify for, will depend on how high their 'citizen score' is.
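To make concrete how reductive such a score is, here is a toy sketch in Python. The feature names, weights, and thresholds are all invented for illustration; China's actual scoring formula has not been made public.

```python
# Hypothetical "citizen score": a weighted sum of monitored behaviors.
# All feature names and weights below are invented, not China's real formula.

WEIGHTS = {
    "bills_paid_on_time": 30.0,     # reward prompt payment
    "jaywalking_incidents": -15.0,  # punish minor infractions
    "hours_of_video_games": -0.5,   # penalize "idle" behavior
    "friends_avg_score": 0.1,       # the social graph leaks into your score
}

def citizen_score(profile: dict) -> float:
    """Weighted sum of monitored behaviors, clamped to 0-1000."""
    base = 500.0
    score = base + sum(WEIGHTS[k] * profile.get(k, 0) for k in WEIGHTS)
    return max(0.0, min(1000.0, score))

def allowed_to_book_flight(profile: dict, threshold: float = 400.0) -> bool:
    # Rights become a function of the score.
    return citizen_score(profile) >= threshold
```

Note how a single number flattens a whole life: the same function that rewards paying bills also docks points for a neighbor's low score, and a right as ordinary as booking a flight becomes a threshold test.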

There are already agencies that track how promptly we pay our debts, giving us a credit score that lenders and mortgage providers use. On eBay, sellers are rated on shipping times and communication, while Uber drivers and passengers rate each other; if your score falls too far, you're in trouble. China's social credit system expands that idea to all aspects of life, judging citizens' behavior and trustworthiness. Get caught jaywalking, fail to pay a court bill, or play your music too loud on the train, and you could lose certain rights, such as booking a flight or train ticket. China may be implementing the idea overtly, but that doesn't mean it is new, or that it doesn't already exist elsewhere in more skeletal form.

Supporters of the SCS see it as an opportunity to improve some of the state's services. Some argue that it would give Chinese citizens much-needed access to financial services. It's all about building trust, says the Chinese government. The 2014 document describing the government's plans notes that because "trust-keeping is insufficiently rewarded, the costs of breaking trust tend to be low." But no technology comes only with benefits; it also carries costs, which its champions play down. Such a system could paint a very inaccurate and incomplete picture of a person.

People do many different things for many different reasons, and when the context is not appreciated, their behavior can be misconstrued. This is what happens when algorithms compute correlations from large amounts of data. Someone who plays video games for ten hours a day, for example, could be classified as idle, when in fact he is a games developer testing a new product. A person who looks up various terrorist organizations could be flagged by an algorithm as someone for security agencies to watch; in reality, he may just be a journalist doing his job. The system can also be used to enforce vague laws such as those against endangering national security or unity.
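That context-blindness can be shown in a few lines. The thresholds and labels below are invented for illustration; real flagging systems are far more elaborate, but they share the same weakness: the rule sees only the raw signal, never the reason behind it.

```python
# Context-blind flagging: identical behavior gets identical labels,
# whatever the reason behind it. Thresholds and labels are invented.

def flag_user(hours_gaming: float, terror_pages_visited: int) -> list:
    flags = []
    if hours_gaming >= 10:
        flags.append("idle")        # could be a QA tester at a games studio
    if terror_pages_visited >= 20:
        flags.append("watchlist")   # could be a journalist researching a story
    return flags
```

The developer and the slacker, the journalist and the extremist, all receive the same label, because the context never enters the model.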

China has developed advanced facial-recognition systems that can follow people across entire cities. In a show of power at the end of 2017, Chinese officials, working in cooperation with BBC News, showed how the system could track down one of the broadcaster's reporters within seven minutes. Ultimately, the problem is that 'socially acceptable behavior' will be defined by the Chinese government, not by a democratic process, since the government will now have a way of monitoring virtually every aspect of citizens' lives.

The system the Chinese are putting in place is just an expanded version of what already exists in many democratic countries. Police and intelligence agencies are using the databases created by the private sector to revolutionize their own role in society. The government will say that you don't have to worry if you have nothing to hide. But if your political activities or interests deviate even slightly from the mainstream, you do. Thousands of people are being caught in data-driven dragnets for being activists, or just for belonging to a suspect "identity" group. Careful protection of the boundary between crime and dissent is not a high priority of the intelligence apparatus. FBI director Robert Mueller said as far back as 2002 that "there is a continuum between those who would express dissent and those who would do a terrorist act."

Tuesday, July 14, 2020

The tyranny of algorithms - VIII

Some magazines now employ a company called Narrative Science to automatically generate online articles about what to expect from upcoming corporate earnings statements. Just feed it some statistics and, within seconds, the clever software produces highly readable stories. Or, as Forbes puts it, "Narrative Science, through its proprietary artificial intelligence platform, transforms data into stories and insights." In an article, "A Robot Stole My Pulitzer!", Evgeny Morozov writes:
Don’t miss the irony here: Automated platforms are now “writing” news reports about companies that make their money from automated trading. These reports are eventually fed back into the financial system, helping the algorithms to spot even more lucrative deals. Essentially, this is journalism done by robots and for robots. The only upside here is that humans get to keep all the cash. 
Apart from sports, finance, and real estate, in which news stories tend to revolve around statistics, Narrative Science has also entered the political reporting arena. It's much cheaper than paying full-time journalists, who tend to get sick and demand respect, and there is no one to fret about terrible working conditions. An article takes only a second to compose, a deadline no human journalist can beat. Narrative Science also promises to be more comprehensive, and more objective, than any human reporter. Few journalists have the time to find, process, and analyze millions of tweets, but Narrative Science can do so easily and almost instantaneously.

In the long run, the civic impact of such technologies may be more problematic. Everything we click, read, search, and watch online is increasingly the result of some optimization effort, whereby our previous clicks, searches, "likes," purchases, and interactions determine what appears in our browsers and apps. Such personalization of the Internet may usher in a world in which we see only articles that reflect our existing interests and never venture outside our comfort zones. What if we click on the same link that, in theory, leads to the same article, but end up reading very different texts? In the same article, Morozov writes:
How will it work? Imagine that my online history suggests that I hold an advanced degree and that I spend a lot of time on the websites of the Economist or the New York Review of Books; as a result, I get to see a more sophisticated, challenging, and informative version of the same story than my USA Today-reading neighbor. 
If one can infer that I’m also interested in international news and global justice, a computer-generated news article about Angelina Jolie might end by mentioning her new film about the war in Bosnia. My celebrity-obsessed neighbor, on the other hand, would see the same story end with some useless gossipy tidbit about Brad Pitt. 
Producing and tweaking stories on the spot, customized to suit the interests and intellectual habits of just one particular reader, is exactly what automated journalism allows — and why it’s worth worrying about. Advertisers and publishers love such individuation, which could push users to spend more time on their sites. But the social implications are quite dubious. 
At the very least, there’s a danger that some people might get stuck in a vicious news circle, consuming nothing but information junk food and having little clue that there is a different, more intelligent world out there. And the communal nature of social media would reassure them that they aren’t really missing anything. 
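The story-customization Morozov imagines can be sketched in a few lines: one "article", two endings, chosen by the reader's inferred interests. The profiles and the text are of course made up.

```python
# One story, different endings per reader profile.
# Interests, story text, and endings are invented for illustration.

BASE_STORY = "Angelina Jolie attended the premiere of her new film."

ENDINGS = {
    "global_justice": " The film dramatizes the war in Bosnia.",
    "celebrity_gossip": " Brad Pitt was spotted in the audience.",
}

def render_article(reader_interests: set) -> str:
    # The "sophisticated" reader gets the substantive ending;
    # everyone else defaults to the gossipy one.
    if "global_justice" in reader_interests:
        return BASE_STORY + ENDINGS["global_justice"]
    return BASE_STORY + ENDINGS["celebrity_gossip"]
```

Both readers believe they have read "the" article; neither ever learns that a different version exists.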
Another human creation that people presume machines cannot produce is music. But emotions are not some mystical phenomenon; they are a biochemical process. Hence, given enough biometric data and enough computing power, what if external algorithms come to understand and manipulate human emotions better than Shakespeare, Picasso, or Lennon could? Allow a learning machine to go over millions of musical experiences, and it will learn how particular inputs result in particular outputs.
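A toy version of "learning how inputs result in outputs" from music is a first-order Markov chain over notes: record which note tends to follow which in a training melody, then sample new sequences from those statistics. Real systems such as EMI are vastly more sophisticated; the melody below is invented.

```python
import random
from collections import defaultdict

def train(melody: list) -> dict:
    """Record which note tends to follow which in the training melody."""
    transitions = defaultdict(list)
    for a, b in zip(melody, melody[1:]):
        transitions[a].append(b)
    return transitions

def generate(transitions: dict, start: str, length: int, seed: int = 0) -> list:
    """Sample a new note sequence from the learned transition statistics."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = transitions.get(out[-1])
        if not followers:
            break  # dead end: no note was ever observed after this one
        out.append(rng.choice(followers))
    return out
```

The machine has no notion of melody or feeling; it only reproduces statistical regularities of its training data, yet even this crude mimicry can sound plausible.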

David Cope, a musicology professor at the University of California in Santa Cruz, created a computer program called EMI (Experiments in Musical Intelligence), which specialized in imitating the style of Johann Sebastian Bach. In a public showdown at the University of Oregon, an audience of university students and professors listened to three pieces — one a genuine Bach, another produced by EMI and a third composed by a local musicology professor, Steve Larson. The audience was then asked to vote on who composed which piece. The result? The audience thought that EMI’s piece was genuine Bach, that Bach’s piece was composed by Larson, and that Larson’s piece was produced by a computer.

Hence in the long run, algorithms may learn how to compose entire tunes, playing on human emotions as if they were a piano. Will this result in great art? As Yuval Noah Harari says, 'To enter the art market, algorithms won’t have to begin by straightaway surpassing Beethoven. It is enough if they outperform Justin Bieber.' In The World Without Mind, Franklin Foer writes:

If algorithms can replicate the process of creativity, then there is little reason to nurture human creativity. Why bother with the tortuous, inefficient process of writing or painting if a computer can produce something seemingly as good and in a painless flash? . . . No human endeavour has resisted automation, so why should creative endeavours be any different?
The engineering mind has little patience for the fetishization of words and images, for the mystique of art, for moral complexity and emotional expression. It views humans as data, components of systems, abstractions. . . The whole effort is to make human beings predictable . . . With this sort of cold-blooded thinking . . . it's easy to see how long-standing values begin to seem like an annoyance . . .

Wednesday, July 1, 2020

The tyranny of algorithms - VII

As Robert Jungk says in Tomorrow is Already Here, 'The devil has many names, and in this century he likes to call himself "Mr. Profit" or "Mr. Efficiency".' He further writes: 'Planning down to the smallest detail, control of each labor process, the abolition of all possible waste of time, characterize the American earning system. The same smooth functioning is required of the man as of the machine. "Efficiency" has grown more important in the working world than freedom.' One manifestation of this obsession with efficiency is "clopening": an employee closing a store late at night and opening it again just a few hours later.

In the US, many businesses rely on scheduling software that determines the number of workers required, and exactly when they are required, using sales patterns and other data from thousands of locations. They can bring in more hands in anticipation of a delivery truck pulling in or the weather changing, and send workers home when real-time analyses show sales are slowing. Managers are often compensated based on the efficiency of their staffing. Scheduling is now a powerful tool to bolster profits, allowing businesses to cut labor costs with a few keystrokes. Yet those advances inject turbulence into parents' routines and personal relationships, undermine efforts to expand preschool access, drive some mothers out of the work force, and redistribute some of the uncertainty of doing business from corporations to families.
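The core of such scheduling software can be caricatured in a few lines: map an hourly sales forecast to a required headcount. The productivity constant and the forecast values are invented for illustration; real systems fold in weather, deliveries, and much more.

```python
import math

# Assumed revenue one worker can handle per hour (invented constant).
SALES_PER_WORKER_HOUR = 120.0

def staff_needed(forecast_sales: list, min_staff: int = 1) -> list:
    """Map an hourly sales forecast to required headcount per hour."""
    return [max(min_staff, math.ceil(s / SALES_PER_WORKER_HOUR))
            for s in forecast_sales]
```

A few keystrokes over the forecast, and workers learn their hours a day or two in advance: the business's demand uncertainty becomes the family's.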

Having the same employee close the store late at night and open it again at dawn makes logistical sense for a company. Many employees find out only a day or two in advance that they are scheduled for a 'clopening'. The resulting unpredictable work schedules prevent parents from committing to regular drop-off times or answering standard questions on subsidy forms and applications for aid: "How many hours do you work?" and "What do you earn?" Previously, inefficiencies in the workplace benefited workers by giving them regular working hours and time to read or study. Now they are under the control of software that keeps them busy every minute (praised by authorities as 'hard work'), making them 'better' cogs in the wheel.

The automation of many entry-level roles will make it even harder for young people to gain traction in the working world. Two-thirds of the automation-related job losses for young people are expected to occur in food, hospitality, and retail; one-third could occur in white-collar jobs, including entry-level roles in accounting, finance, human resources, and administration. In the legal profession, for example, document review and case-law search are not favorite tasks for junior attorneys, but they provide valuable learning opportunities, and AI is now outperforming humans at them. This means aspiring young professionals will need to enter the labor force in higher-level roles. But employers have been saying for years that too many new hires, even those with college degrees, are not work-ready.

Online commerce allows even conscientious consumers to forget that other people are involved. Amazon employs or subcontracts tens of thousands of warehouse workers, with seasonal variation. Accounts from inside the centers describe the work of picking, boxing, and shipping books and dog food and beard trimmers as a high-tech version of the dehumanized factory floor satirized in Chaplin’s “Modern Times.” Pickers holding computerized handsets are perpetually timed and measured as they fast-walk up to eleven miles per shift around a million-square-foot warehouse, expected to collect orders in as little as thirty-three seconds. Warehouse jobs are gradually being taken over by robots. Bezos recently predicted that, in five years, packages will be delivered by small drones. Then Amazon will have eliminated the human factor from shopping.

In 21 Lessons for the 21st Century, Yuval Noah Harari says that computer scientists are developing artificial intelligence (AI) algorithms that can learn and analyse massive amounts of data and recognize patterns with superhuman efficiency. At the same time, biologists and social scientists are deciphering human emotions, desires and intuitions. The merger of info-tech and biotech is giving rise to algorithms that can successfully analyse us and communicate with us, and that may soon outperform human doctors, drivers, soldiers and bankers at such tasks. These algorithms could eventually push hundreds of millions out of the job market.

This has already begun in the field of medicine. The most important medical decisions in your life are increasingly based not on your feelings of illness or wellness, or even on the informed predictions of your doctor, but on the calculations of computers that know you better than you know yourself. The same is likely to happen in more and more fields, starting with simple things, like which book to buy and read. Harari speculates that in the 21st century we will create more powerful myths and more totalitarian religions than in any previous era. With the help of biotechnology and computer algorithms, these religions will not only control our minute-by-minute existence but will also be able to shape our bodies, brains and minds.