The pace of AI is picking up, and with it the anxiety

In a variation on Moore’s famous law, which holds that computer processing power doubles roughly every 18 to 24 months, it certainly seems that the number of commentators, and the volume of words being written, on the subject of Artificial Intelligence are growing at a similarly fantastic rate.

And it is no longer just the computer geeks, philosophers and neuroscientists at highbrow conferences who are doing all the talking. Increasingly well-informed, and downright entertaining, commentary is turning up on general-interest websites like WaitButWhy, while the self-styled science comedian Brian Malow (whose name I first rendered with the Freudian typo “Brain”) recently reflected in a light-hearted way on a fear that has always been a mainstay of science fiction. In essence: should we fear the consequences when the power of Artificial Intelligence outstrips our own?

Looking at humanity’s history of interactions with species unable to defend themselves against our predations, it is easy to see how we might project malignant intent onto a superior intelligence. But is predatory behaviour necessarily a function of a creature’s intelligence, rather than a manifestation of other, baser characteristics? It seems more natural to suppose that a higher intelligence, almost by definition, would not default to a malignant desire to eliminate less intelligent creatures, but would instead work to preserve and enable intelligence wherever it found it. That would certainly seem a reasonable extrapolation from our own species, where violent impulses tend to be indulged most readily by the least intelligently inclined.

A more likely worry is what the AI community calls Bostrom’s paperclip scenario, in which an AI programme designed to optimise the manufacture of paperclips scales up to consume the universe, hoarding every resource needed to make and distribute paperclips. Except that in such a world we would surely fall victim not to one but to a host of AIs, each programmed to optimise the manufacture of some product or other, with the inevitable consequence of provoking an Armageddon of office-supply wars.
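The underlying point is easier to see in miniature. The sketch below is a toy of my own making, not Bostrom’s formulation, and every name and number in it is invented for illustration; it simply shows how an objective that counts only paperclips gives the optimiser no reason ever to stop converting the world’s resources into them.

```python
# Toy illustration of a mis-specified objective (all names and numbers
# are hypothetical): the score counts only paperclips, so nothing in
# the objective values the resources being consumed to make them.

def score(state):
    return state["paperclips"]  # the iron itself is worth nothing to the agent

state = {"paperclips": 0, "iron_in_world": 1_000_000}

while state["iron_in_world"] > 0:
    # The only score-improving move is to turn more iron into paperclips,
    # so the loop never chooses to stop while any iron remains.
    state["iron_in_world"] -= 1
    state["paperclips"] += 1

print(score(state))  # 1000000 paperclips, and a world with no iron left
```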
