Sunday, February 24, 2019

Is the psychological distance between people shrinking or growing? - IV

The difference between ancient myths and modern ones is that the latter are peddled as true. Among these modern myths is the assumption that replacing people with machines running complicated algorithms will give unbiased outputs. You are often told that such systems are more objective and less prone to prejudice than human judgment. Many companies are turning to machine learning to review vast amounts of data, from evaluating credit for loan applications, to scanning legal contracts for errors, to looking through employee communications with customers to identify bad conduct.

Unintended consequences are the norm rather than the exception in human endeavours, and delegating decision-making to machines is no different. In 2016, for example, an attempt by Microsoft to converse with millennials using a chatbot plugged into Twitter famously produced a racist machine that switched from tweeting that “humans are super cool” to praising Hitler and spewing out misogynistic remarks. The system learnt through interaction with users, so its biases arose from the biases of the users driving the interaction. In essence, users repeatedly tweeted offensive statements at the system, and it used those statements as the input for its later responses.
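The feedback loop is easy to reproduce in miniature. Here is a toy sketch (the class and phrases are invented for illustration, not Microsoft's actual design) of a learner that simply reuses whatever it has been fed most often:

```python
from collections import Counter

# Toy "learning" bot: it remembers phrases users send it and replies
# with the one it has seen most often. No malice exists in the code;
# the bias comes entirely from the input distribution.
class EchoLearner:
    def __init__(self):
        self.memory = Counter()

    def learn(self, phrase):
        self.memory[phrase] += 1

    def reply(self):
        # Responds with the most frequently seen phrase.
        return self.memory.most_common(1)[0][0]

bot = EchoLearner()
bot.learn("humans are super cool")
# A coordinated group floods the bot with one hostile phrase...
for _ in range(50):
    bot.learn("some offensive phrase")
# ...and the bot's output now mirrors the skewed input.
print(bot.reply())  # prints "some offensive phrase"
```

Flood such a learner with one kind of input and its output inevitably mirrors that input; nothing in the program itself needs to be written with ill intent.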

In Homo Deus: A Brief History of Tomorrow, Yuval Noah Harari says that high-tech Silicon Valley gurus are creating a new universal narrative that legitimizes the authority of algorithms and Big Data. He calls this novel creed “Dataism”. According to Dataism, 'Beethoven’s Fifth Symphony, a stock-exchange bubble and the flu virus are just three patterns of dataflow that can be analyzed using the same basic concepts and tools. The proponents of the Dataist worldview perceive the entire universe as a flow of data and we are already becoming tiny chips inside a giant system that nobody really understands. The nature of AI algorithms makes it so that we won’t know how or what the system is doing, or how it’s so damn good at predicting and choosing things.'

We may thus be gradually ceding control to algorithms which will make all the important decisions for us. But, as Cathy O’Neil says in Weapons of Math Destruction, an algorithm is just ‘an opinion formalized in code’. We tend to think of machines as somehow cold, calculating and unbiased. We believe that self-driving cars will have no preference in life-or-death decisions between the driver and a random pedestrian. We trust that smart systems performing credit assessments will ignore everything except the genuinely relevant metrics, such as income. And we believe that learning systems will always ultimately enable us to find out the truth because ‘unbiased’ algorithms drive them.

Using machine-learning technology to accomplish various tasks is attractive, but it doesn't eliminate human bias; it just camouflages it with technology. In an ideal world, intelligent systems and their algorithms would be objective. In reality, there are many ways in which machines can be taught to do something immoral, unethical, or just plain wrong. Machine bias, or algorithm bias, is the effect of erroneous assumptions in machine-learning processes. Models of human behaviour often rely on questionable proxies that reflect the biases of the modeller. The models are generally opaque, and businesses guard them as intellectual property defended by legions of lawyers.
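A toy sketch of how a questionable proxy smuggles bias into a supposedly neutral model (all names and numbers here are invented for illustration):

```python
# A lender is forbidden to use group membership, so the model scores
# applicants on postcode instead. Because postcode correlates with
# group, the "neutral" feature carries the bias back in.
applicants = [
    # (group, postcode, repaid_previous_loan)
    ("A", "north", True), ("A", "north", True), ("A", "north", False),
    ("B", "south", True), ("B", "south", True), ("B", "south", False),
]

# The historical data the modeller trained on was itself skewed:
# "south" applicants were approved less often, so postcode came to
# stand in for risk.
postcode_score = {"north": 0.9, "south": 0.4}

def score(applicant):
    _, postcode, _ = applicant
    return postcode_score[postcode]

approved = [a for a in applicants if score(a) >= 0.5]
# Every approved applicant is from group A, although the repayment
# rates in the data above are identical for both groups.
print({group for group, _, _ in approved})  # prints {'A'}
```

The model never looks at the forbidden attribute, yet its decisions sort people by it anyway; this is exactly the kind of assumption that stays invisible while the model is guarded as a trade secret.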

These algorithms have large effects in the real world, which can have unfortunate consequences, leading some to describe data science as the new astrology. Technocrats and managers make debatable value judgments that have their biases written all over them. Complicated mathematical models reframe subtle and subjective conclusions (such as the worth of a worker, service, article, or product) as hard numbers. One opaque model feeds into another and could affect your whole life. For example, a bad credit score could mean that you may not get a house, buy a vehicle or get a job. The techies then claim that it is objective “science” based on measurable data, which is supposed to be accepted unchallenged. As long as the algorithms are secret, you will never know what kind of social sorting is taking place.

Can a computer program be racist? In ‘predictive policing’, historical data about crime is fed into an algorithm, which then gives police information about likely future crime. Such systems are in use in countries like the US and China. But predictive tools are only as good as the data they are fed. As the article ‘Predictive Policing Isn’t About the Future, It’s About the Past’ argues, these systems are based mostly or entirely on historical crime data held by the police, which record how law enforcement responds to particular crimes rather than the true rate of crime. Hence these data are contaminated by underlying biases about where to deploy police and what type of people commit crimes, thereby strengthening those very biases. Forecasts are only as good as the data used for their training.
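The feedback loop can be sketched in a few lines. In this toy simulation (all numbers invented), two districts have identical true crime rates, but patrols are allocated according to recorded crime, and crime is only recorded where the patrols are:

```python
# Two districts with the same underlying crime. Patrols start out
# skewed, perhaps because of an old prejudice.
true_crime = {"district_1": 100, "district_2": 100}  # identical
patrols = {"district_1": 60, "district_2": 40}       # initial skew

for _ in range(10):
    # You only record the crime you are present to observe:
    # recorded crime is proportional to patrol presence.
    recorded = {d: true_crime[d] * patrols[d] / 100 for d in patrols}
    total = sum(recorded.values())
    # Next round's patrols follow this round's records.
    patrols = {d: 100 * recorded[d] / total for d in recorded}

print(patrols)  # prints {'district_1': 60.0, 'district_2': 40.0}
```

The initial skew never corrects itself, because the records mirror the patrols rather than the crime; the data can never reveal that the two districts are identical. That is the sense in which the forecast is "about the past".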

A machine-learning algorithm is used by judges in over a dozen US states to make decisions on pre-trial conditions and, sometimes, on actual sentencing. A study found that the algorithm was twice as likely to incorrectly predict that black defendants were at high risk of reoffending, and conversely twice as likely to incorrectly predict that white defendants were at low risk of reoffending. This difference could (and did) result in tougher or more lenient sentences for convicts.
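What the study measured is a difference in error rates between groups. A minimal sketch of the computation, with invented data whose ratio happens to mirror the finding:

```python
# Each record is (group, predicted_high_risk, actually_reoffended).
# The data below is made up purely to illustrate the arithmetic.
records = [
    ("black", True, False), ("black", True, False), ("black", True, True),
    ("black", False, False), ("black", False, True),
    ("white", True, False), ("white", False, False), ("white", False, False),
    ("white", False, True), ("white", False, True),
]

def false_positive_rate(group):
    # Among people in the group who did NOT reoffend, what fraction
    # was nonetheless flagged as high risk?
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

fpr = {g: false_positive_rate(g) for g in ("black", "white")}
print(fpr)  # prints {'black': 0.666..., 'white': 0.333...}
```

In this invented sample, non-reoffending black defendants are flagged at twice the rate of non-reoffending white defendants; an algorithm can produce such a disparity while appearing even-handed on aggregate accuracy.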

In an article, Beware the Big Errors of ‘Big Data’, Nassim Nicholas Taleb warns that spurious correlations will increase with the voluminous jump in data collection. The huge increase in the haystack will make it harder to find the needle. As Tim Harford says in the article Big data: Are we making a big mistake?, ‘Because found data sets are so messy, it can be hard to figure out what biases lurk inside them – and because they are so large, some analysts seem to have decided the sampling problem isn’t worth worrying about. It is.’
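Taleb's point can be demonstrated directly: among many independent random series, the largest pairwise correlation you can find grows as you add more series, even though every true correlation is zero. A small sketch (the function names are mine):

```python
import random

def pearson(x, y):
    # Plain Pearson correlation coefficient for two equal-length lists.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def max_spurious_correlation(n_series, length=30, seed=0):
    # Generate independent Gaussian noise series, then report the
    # strongest correlation found between any pair of them.
    rng = random.Random(seed)
    series = [[rng.gauss(0, 1) for _ in range(length)]
              for _ in range(n_series)]
    return max(abs(pearson(series[i], series[j]))
               for i in range(n_series) for j in range(i + 1, n_series))

# More haystack, more convincing-looking "needles":
print(max_spurious_correlation(5), max_spurious_correlation(50))
```

With a fixed seed, the 50-series search includes every pair from the 5-series search and many more, so the best spurious correlation can only grow; none of these "relationships" means anything.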

The industries involved in activities like search or credit hide their methods in secret algorithms. The “privacy policies” of various firms are written to their advantage at the expense of the consumer. I have yet to come across anybody who reads them. People mechanically click “I agree” when confronted with “terms of service” agreements, since protesting against any clause in them won’t be of any use. The dice are heavily loaded against them. Often there is also a “unilateral modification” clause that lets companies change the agreement later, with no notice to the persons affected. Frank Pasquale writes in The Black Box Society:
We cannot so easily assess how well the engines of reputation, search, and finance do their jobs. Trade secrecy, where it prevails, makes it practically impossible to test whether their judgments are valid, honest, or fair. The designation of a person as a bad employment prospect, or a website as irrelevant, or a loan as a bad risk may be motivated by illicit aims, but in most cases we’ll never be privy to the information needed to prove that. 
What we do know is that those at the top of the heap will succeed further, thanks in large part to the reputation incurred by past success; those at the bottom are likely to endure cascading disadvantages. Despite the promises of freedom and self-determination held out by the lords of the information age, black box methods are just as likely to entrench a digital aristocracy as to empower experts.

Friday, February 8, 2019

Is the psychological distance between people shrinking or growing? - III

A great mystery of our hyper-connected digital age is that we seem to be drifting apart. Things like mobile phones and social media, as well as unending working hours and traffic jams, have accelerated the atomisation of the individual. Deep friendships have been replaced by screens, gadgets, and bleary-eyed couch-potato stupor. People seem more self-absorbed, more individualistic, less empathetic. They communicate more, but there’s less communication with the people they’re actually around. They remain ignorant, because many of them don’t feel the need to educate themselves outside their little world.

When visitors come, I see that while the adults are in conversation, the children are often in their own worlds, playing with their mobiles. (They are afflicted by the modern disease FOLO: Fear Of Losing Out.) It is no wonder that they don’t know many of their relatives. A family friend said that she once saw her daughter chatting with a friend on Facebook. On checking who it was, she found that the friend was a neighbour who lived next door. Even when told that it would be better to hop across and chat with her friend in person, the daughter continued to use Facebook.

The widespread hypnotisation by TV, laptops and mobile phones has reduced social interaction. When I was growing up in Jamshedpur, I recall a lot more interaction among acquaintances. Almost every weekend, I used to visit some friends or relatives with my family, or they used to come over to visit us. Now I don’t even know some people in the apartment block where I live. A relative who lives in the Gulf and visits India frequently makes it a point to see his relatives every time. He told us that some of them remarked that he comes all the way from the Gulf on a short trip and meets them, yet they are rarely able to meet some of their own neighbours. George Orwell writes in his essay Pleasure Spots:
Much of what goes by the name of pleasure is simply an effort to destroy consciousness. If one started by asking, what is man? what are his needs? how can he best express himself? one would discover that merely having the power to avoid work and live one's life from birth to death in electric light and to the tune of tinned music is not a reason for doing so. Man needs warmth, society, leisure, comfort and security: he also needs solitude, creative work and the sense of wonder. If he recognised this he could use the products of science and industrialism eclectically, applying always the same test: does this make me more human or less human? 
He would then learn that the highest happiness does not lie in relaxing, resting, playing poker, drinking and making love simultaneously. And the instinctive horror which all sensitive people feel at the progressive mechanisation of life would be seen not to be a mere sentimental archaism, but to be fully justified. For man only stays human by preserving large patches of simplicity in his life, while the tendency of many modern inventions - in particular the film, the radio and the aeroplane - is to weaken his consciousness, dull his curiosity, and, in general, drive him nearer to the animals.
Tech enthusiasts looked to search engines as an extraordinary democratization of the Internet. But that is only half the story. The online world also spawns murketing, unfair competition, and distortions of reality that may be having the opposite effect. Whenever you hear the word ‘secret’, your antennae should go up. In a moment of weakness, the government conceded the Right to Information Act. Since then, it has been trying to correct its error.

Bankers hide their risk behind complex securities. There are various accounting tricks for short-term gains. Obligations remain on balance sheets for some purposes, and off them for others. In the article Why is finance so complex?, Steve Randy Waldman says, ‘Finance has always been complex. More precisely it has always been opaque, and complexity is a means of rationalizing opacity in societies that pretend to transparency.’

In many countries, computerized exchanges have made it possible to gain or lose fortunes within seconds. Thus even the length of the wire connecting your computer to the server of the stock exchange is vital for maintaining one’s competitive edge, as Michael Lewis explains in Flash Boys. Such an information advantage can only be obtained by the wealthy, who will also be able to employ economists to give sophisticated reasons why such fixing of markets will benefit you. After all, he who pays the piper calls the tune.

Existing power structures are strengthened by obfuscation. Powerful, wealthy companies can hide information by providing too much of it. If a firm is asked for some information, it can inundate you with tens of thousands of pages of data. Investigators are in effect trying to find a needle in a haystack and waste a lot of time and effort in the process. Meanwhile the institutions will self-righteously quote Louis Brandeis’s comment that “sunlight is said to be the best of disinfectants”.

When large, wealthy companies can do nothing to refute the mountain of evidence against their products and practices, they will employ ‘experts’ to create confusion, uncertainty, and doubt in the minds of the public. In 1969, an internal memo within the Brown and Williamson Tobacco company stated that “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also a means of establishing controversy.” Uncertainty and doubt always tend to maintain the status quo and give people an excuse to continue doing what they really want to do.

The same strategy has also been used to make it appear that no one really knows whether climate change is real or what might be causing it. Many people read about the results of such studies, but never probe into how the studies were actually done. People can then say, “If nobody knows, I might as well continue to drive my SUV, eat my burgers and live just as I always have.” More investigation may reveal that studies of this sort typically have strong financial ties to the industry selling the product, although the firm concerned will always deny that it influenced the report. As Sir Humphrey says in one episode of Yes Minister, ‘He that would keep a secret must keep it secret that he hath a secret to keep.’ In Homo Deus: A Brief History of Tomorrow, Yuval Noah Harari says:
In the past, censorship worked by blocking the flow of information. In the twenty-first century censorship works by flooding people with irrelevant information. We just don't know what to pay attention to and often spend our time investigating and debating side issues. In ancient times having power meant having access to data. Today having power means knowing what to ignore.