Donald Trump poses a huge dilemma for commentators: to ignore his daily outrages is to normalize his behavior, but to constantly write about them is to stop learning. Like others, I struggle to get this balance right, which is why I pause today to point out some incredible technological changes happening while Trump has kept us focused on him — changes that will pose as big an adaptation challenge to American workers as transitioning from farms to factories once did.
Two and a half years ago I was researching a book that included a section on IBM’s cognitive computer, “Watson,” which had mastered artificial intelligence well enough to defeat the two all-time “Jeopardy!” champions. After my IBM hosts had shown me Watson at its Yorktown Heights, New York, lab, they took me through a room where a small group of IBM scientists were experimenting with something futuristic called “quantum computing.” They left me thinking this was Star Wars stuff — a galaxy far, far away, and many years off.
Last week I visited the same lab, where my hosts showed me the world’s first quantum computer that can handle 50 quantum bits, or qubits, which it unveiled in November. They may still need a decade to make this computer powerful and reliable enough for groundbreaking industrial applications, but clearly quantum computing has gone from science fiction to nonfiction faster than almost anyone expected.
Who cares? Well, if you think it’s scary what we can now do with artificial intelligence produced by classical binary digital electronic computers built with transistors — like make cars that can drive themselves and software that can write news stories or produce humanlike speech — remember this: These “old” computers still don’t have enough memory or processing power to solve what IBM calls “historically intractable problems.” Quantum computers, paired with classical computers via the cloud, have the potential to do that in minutes or seconds.
For instance, “while today’s supercomputers can simulate ... simple molecules,” notes MIT Technology Review, “they quickly become overwhelmed.” So chemical modelers — who attempt to come up with new compounds for things like better batteries and lifesaving drugs — “are forced to approximate how an unknown molecule might behave, then test it in the real world to see if it works as expected. The promise of quantum computing is to vastly simplify that process by exactly predicting the structure of a new molecule, and how it will interact with other compounds.”
Quantum computers use the capabilities of quantum physics to process information differently from traditional computers. “Whereas normal computers store information as either a 1 or a 0, quantum computers exploit two phenomena — entanglement and superposition — to process information,” explains MIT Technology Review. The result is computers that may one day “operate 100,000 times faster than they do today,” adds Wired magazine.
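For readers who want to see why classical machines struggle here, a toy simulation makes the point (this sketch is my own illustration, not IBM’s code): simulating n qubits classically means tracking 2ⁿ complex amplitudes, so the bookkeeping doubles with every qubit added.

```python
import math

def hadamard(state, target):
    """Put qubit `target` into superposition: |0> -> (|0>+|1>)/sqrt(2)."""
    s = 1 / math.sqrt(2)
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << target)  # basis state with the target bit flipped
        if (i >> target) & 1 == 0:
            new[i] += s * amp
            new[j] += s * amp
        else:
            new[j] += s * amp
            new[i] -= s * amp
    return new

def cnot(state, control, target):
    """Flip `target` wherever `control` is 1 -- this entangles the qubits."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1 and not (i >> target) & 1:
            j = i | (1 << target)
            new[i], new[j] = state[j], state[i]
    return new

# Two qubits start in |00>; Hadamard then CNOT yields an entangled Bell state,
# where measuring one qubit instantly fixes the other.
state = [1 + 0j, 0j, 0j, 0j]
state = cnot(hadamard(state, 0), control=0, target=1)
print([round(abs(a), 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]

# The classical cost of simulating IBM's 50 qubits: 2**50 amplitudes,
# roughly 16 petabytes at 16 bytes per complex number.
print(2 ** 50)
```

The quantum machine holds all of those amplitudes natively; the classical simulator has to write each one down, which is exactly where “historically intractable” begins.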
Talia Gershon, an IBM researcher, posted a fun video explaining the power of quantum computers to optimize and model problems with an exponential number of variables. She displayed a picture of a table at her wedding set for 10 guests, and posed this question: How many different ways can you seat 10 people? It turns out, she explained, there are “3.6 million ways to arrange 10 people for dinner.”
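Gershon’s arithmetic checks out: the number of ways to order 10 guests is 10 factorial, and a one-liner confirms the figure she cites.

```python
import math

# Orderings of 10 wedding guests: 10 choices for the first seat,
# then 9, then 8, and so on down to 1.
arrangements = math.factorial(10)
print(arrangements)  # 3628800 -- the "3.6 million ways" from the video
```

Add just ten more guests and the count jumps to 20! — about 2.4 quintillion — which is the exponential blowup that swamps classical optimizers.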
Classical computers don’t solve “big versions of this problem very well at all,” she said, like trying to crack sophisticated encrypted codes, where you need to try a massive number of variables, or modeling molecules, where you need to account for an exponential number of interactions. Quantum computers, with their exponential processing power, could one day crack much of today’s widely used encryption without breaking a sweat.
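To make the “massive number of variables” concrete, here is a toy sketch of my own (not from the column): brute-forcing a k-bit key means trying up to 2ᵏ candidates, so every added bit doubles the classical workload.

```python
def brute_force(oracle, k):
    """Try every k-bit key until the oracle accepts one; worst case 2**k tries."""
    for candidate in range(2 ** k):
        if oracle(candidate):
            return candidate
    return None

secret = 0b1011_0110_0001  # a hypothetical 12-bit key, trivially small
found = brute_force(lambda guess: guess == secret, 12)
print(found == secret, 2 ** 12)  # True 4096
```

A 12-bit key falls in 4,096 tries; a modern 128-bit key would take 2¹²⁸ — a number with 39 digits — which is why exponential speedups, not faster classical chips, are what make cryptographers sweat.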
It’s just another reason China, the NSA, IBM, Intel and Google are now all racing — full of sweat — to build usable quantum systems.
Look at where we are today thanks to artificial intelligence from digital computers — and the amount of middle-skill and even high-skill work they’re supplanting — and then factor in how all of this could be supercharged in a decade by quantum computing.
As education-to-work expert Heather McGowan (www.futureislearning.com) points out: “In October 2016, Budweiser transported a truckload of beer 120 miles with an empty driver’s seat.... In February 2017, Bank of America began testing three ‘employee-less’ branch locations that offer full-service banking automatically, with access to a human, when necessary, via video teleconference.”
It’s why IBM’s CEO, Ginni Rometty, remarked to me in an interview: “Every job will require some technology, and therefore we’ll need to revamp education. The K-12 curriculum is obvious, but it’s the adult retraining — lifelong learning systems — that will be even more important.”
Each time work gets outsourced or tasks get handed off to a machine, “we must reach up and learn a new skill or in some ways expand our capabilities as humans in order to fully realize our collaborative potential,” McGowan said.
Anyway, I didn’t mean to distract from the “Trump Reality Show,” but I just thought I’d mention that Star Wars technology is coming not only to a theater near you, but to a job near you. We need to be discussing and adapting to its implications as much as we do Trump’s tweets.
Thomas Friedman, a New York Times columnist, was awarded two Pulitzer Prizes for international reporting in Beirut and Israel and one for commentary.