Sunday, 13 January 2019

The media are unwittingly selling us an AI fantasy by John Naughton


Don’t believe the hype: the media are unwittingly selling us an AI fantasy
Hype: Hype is the use of a lot of publicity and advertising to make people interested in something such as a product.
Unwittingly: Without knowing or being aware; not intentional; inadvertent

John Naughton

Journalists need to stop parroting the industry line when it comes to artificial intelligence
Parroting: If you disapprove of the fact that someone is just repeating what someone else has said, often without really understanding it, you can say that they are parroting it.

Sun 13 Jan 2019 07.00 GMT

Artificial intelligence (AI) is a term that is now widely used (and abused), loosely defined and mostly misunderstood. Much the same might be said of, say, quantum physics. But there is one important difference, for whereas quantum phenomena are not likely to have much of a direct impact on the lives of most people, one particular manifestation of AI – machine-learning – is already having a measurable impact on most of us.

The tech giants that own and control the technology have plans to exponentially increase that impact and to that end have crafted a distinctive narrative. Crudely summarised, it goes like this: “While there may be odd glitches and the occasional regrettable downside on the way to a glorious future, on balance AI will be good for humanity. Oh – and by the way – its progress is unstoppable, so don’t worry your silly little heads fretting about it because we take ethics very seriously.”
Glitch: A glitch is a problem which stops something from working properly or being successful. Problem, difficulty, fault, flaw
Fret: If you fret about something, you worry about it.

Critical analysis of this narrative suggests that the formula for creating it involves mixing one part fact with three parts self-serving corporate cant and one part tech-fantasy emitted by geeks who regularly inhale their own exhaust. The truly extraordinary thing, therefore, is how many apparently sane people seem to take the narrative as a credible version of humanity’s future.
Cant: If you refer to moral or religious statements as cant, you are criticizing them because you think the person making them does not really believe what they are saying. (cantarella, jerga)
Exhaust: The exhaust of an engine consists of the waste gas that leaves it.

Chief among them is our own dear prime minister, who in recent speeches has identified AI as a major growth area for both British industry and healthcare. But she is by no means the only politician to have drunk that particular Kool-Aid.
Kool-Aid: Kool-Aid is a brand of flavored drink mix owned by Kraft Foods. "Drinking the Kool-Aid" refers to the 1978 Jonestown Massacre; the phrase suggests that one has mindlessly adopted the dogma of a group or leader without fully understanding the ramifications or implications. At Jonestown, Jim Jones' followers followed him to the end: after visiting Congressman Leo Ryan was shot at the airstrip, all the Peoples Temple members drank from a metal vat containing a mixture of "Kool Aid", cyanide, and prescription drugs Valium, Phenergan, and chloral hydrate. Present-day descriptions of the event often refer to the beverage not as Kool-Aid but as Flavor Aid, a less-expensive product from Jel Sert reportedly found at the site. Kraft Foods, the maker of Kool-Aid, has stated the same. Implied by this accounting of events is that the reference to the Kool-Aid brand owes exclusively to its being better-known among Americans.

Why do people believe so much nonsense about AI? The obvious answer is that they are influenced by what they see, hear and read in mainstream media. But until now that was just an anecdotal conjecture. The good news is that we now have some empirical support for it, in the shape of a remarkable investigation by the Reuters Institute for the Study of Journalism at Oxford University into how UK media cover artificial intelligence.

The researchers conducted a systematic examination of 760 articles published in the first eight months of 2018 by six mainstream UK news outlets, chosen to represent a variety of political leanings – the Telegraph, Mail Online (and the Daily Mail), the Guardian, HuffPost, the BBC and the UK edition of Wired magazine. The main conclusion of the study is that media coverage of AI is dominated by the industry itself. Nearly 60% of articles were focused on new products, announcements and initiatives supposedly involving AI; a third were based on industry sources; and 12% explicitly mentioned Elon Musk, the would-be colonist of Mars.
Leaning: Your particular leanings are the beliefs, ideas, or aims you hold or a tendency you have towards them.

Critically, AI products were often portrayed as relevant and competent solutions to a range of public problems. Journalists rarely questioned whether AI was likely to be the best answer to these problems, nor did they acknowledge debates about the technology’s public effects.

“By amplifying industry’s self-interested claims about AI,” said one of the researchers, “media coverage presents AI as a solution to a range of problems that will disrupt nearly all areas of our lives, often without acknowledging ongoing debates concerning AI’s potential effects. In this way, coverage also positions AI mostly as a private commercial concern and undercuts the role and potential of public action in addressing this emerging public issue.”

This research reveals why so many people seem oblivious to, or complacent about, the challenges that AI technology poses to fundamental rights and the rule of law. The tech industry narrative is explicitly designed to make sure that societies don’t twig this until it’s too late to do anything about it. (In the same way that it’s now too late to do anything about fake news.) The Oxford research suggests that the strategy is succeeding and that mainstream journalism is unwittingly aiding and abetting it.
Oblivious: If you are oblivious to something or oblivious of it, you are not aware of it. Ignorant, unconscious.
Poses: If something poses a problem or a danger, it is the cause of that problem or danger.(plantear)
Twig: If you twig, you suddenly realize or understand something
Aid and abet: If one person abets another, they help or encourage them to do something criminal or wrong. Abet is often used in the legal expression 'aid and abet': His wife was sentenced to seven years imprisonment for aiding and abetting him.(còmplice, cómplice)

Another plank in the industry’s strategy is to pretend that all the important issues about AI are about ethics and accordingly the companies have banded together to finance numerous initiatives to study ethical issues in the hope of earning brownie points from gullible politicians and potential regulators. This is what is known in rugby circles as “getting your retaliation in first” and the result is what can only be described as “ethics theatre”, much like the security theatre that goes on at airports.
Plank: A basic element or principle of a platform, programme or strategy; something that supports or sustains
Brownie point: Notional credit earned for a good deed or for trying to please someone. You gain brownie points by saying or doing things that please people, and you can lose them by saying something insulting or rude that they are likely to take offence at.
Gullible: If you describe someone as gullible, you mean they are easily tricked because they are too trusting.

Nobody should be taken in by this kind of deception. There are ethical issues in the development and deployment of any technology, but in the end it’s law, not ethics, that should decide what happens, as Paul Nemitz, principal adviser to the European commission, points out in a terrific article just published by the Royal Society. Just as architects have to think about building codes when designing a house, he writes, tech companies “will have to think from the outset… about how their future program could affect democracy, fundamental rights and the rule of law and how to ensure that the program does not undermine or disregard… these basic tenets of constitutional democracy”.

Yep. So let’s have no more “soft” coverage of artificial intelligence and some real, sceptical journalism instead.

What I’m reading
Music to my ears
Who said analogue nostalgia doesn’t have a future? According to a new BuzzAngle report, vinyl and cassette sales saw double-digit growth last year!

Rise of the machines
One giant step for a chess-playing machine… Science publishes Garry Kasparov’s thoughtful reflections on the Deep Blue supercomputer.

The search engineer
Overlooked no more. The New York Times’s long-overdue obituary of Karen Spärck Jones, the British computer scientist who laid the foundation for search engines.
