Richard Waters (“Advances in Supercomputers Set to Reshape the Technological Landscape”, Inside Business, May 21) is right on one point, when he observes: “We don’t know how far these systems can go to produce really useful products.” But neither he nor anyone else in the tech world seems to understand why.
A computer, however fast, is a disembodied calculating machine: without meaning, without biology, without experience of life in the world.
GPT-3, the much-vaunted language generator – whose only advantage over humans is that troll factories can use it to produce billions of fake tweets at once – thinks the phrase “can marijuana cause cancer?” means the same as “can marijuana cause cancer from smoking?”
Good luck to it figuring out that “I don’t mind if I do” can mean anything from “if I have to” to “let’s have sex now!”
Without this vital experience of life, it understands nothing. So all it can do is very broad and very fast pattern recognition – which is why search has been Google’s primary application for it.
But the real concern, again barely mentioned, is energy. Getting GPT-3 to its current state took a 500 billion-word dataset, filtered through 175 billion parameters and trained over several thousand petaflop/s-days (a petaflop is one thousand million million, or 10¹⁵, operations per second; a petaflop/s-day sustains that rate for a whole day). And that does not count the energy the system draws once it is in use.
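The scale of that training figure is easy to check with back-of-envelope arithmetic. The sketch below assumes (my figures, not the letter's) that one petaflop/s-day means 10¹⁵ operations per second sustained for 24 hours, and takes roughly 3,640 petaflop/s-days as the compute reported for GPT-3's training run:

```python
# Back-of-envelope check on GPT-3's training compute.
# Assumptions (not from the letter): one petaflop/s-day is 10**15
# floating-point operations per second sustained for 24 hours, and
# GPT-3's training run is taken as roughly 3,640 petaflop/s-days.

SECONDS_PER_DAY = 24 * 60 * 60        # 86,400 seconds
PETAFLOP = 10**15                     # operations per second

ops_per_petaflop_day = PETAFLOP * SECONDS_PER_DAY  # ~8.64e19 operations
total_ops = 3640 * ops_per_petaflop_day            # ~3.1e23 operations

print(f"One petaflop/s-day: {ops_per_petaflop_day:.2e} operations")
print(f"Training total:     {total_ops:.2e} operations")
```

Even "several thousand" such days thus lands in the region of 10²³ elementary operations, which is why the training energy bill dominates the discussion.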
The Summit supercomputer at the US Oak Ridge National Laboratory, which by some measures has roughly the raw computing power of the human brain, uses about a million times more energy.
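The "million times" ratio is roughly consistent with published power figures. Assuming Summit draws about 13 MW and the human brain runs on roughly 20 W (both figures are my assumptions, not stated in the letter), the ratio comes out in the hundreds of thousands:

```python
# Rough check of the "million times more energy" comparison.
# Assumed figures (not from the letter): Summit draws about 13 MW;
# the human brain runs on roughly 20 W.

summit_watts = 13e6   # ~13 MW for the Summit supercomputer
brain_watts = 20.0    # ~20 W for the human brain

ratio = summit_watts / brain_watts
print(f"Summit draws roughly {ratio:,.0f} times the brain's power")
```

Under these assumptions the ratio is about 650,000 – the same order of magnitude as the letter's "million times" claim.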
Until we understand what computers are for and what humans are, and learn to use artificial intelligence not as a substitute for human understanding but as a complement to it, we will keep rushing down a dead end towards a fried planet.
Director of MIT Media Lab
Artist in Residence, Potsdam Institute for Climate Impact Research
London NW1, United Kingdom