I have to admit that DeepMind’s AlphaGo computer had me worried when it trounced the world champion at the Chinese board game Go last month.
Look, I am fine with self-driving cars. They don’t have to be all that smart; just smarter than bad drivers. Artificial Intelligence is another matter. One of the most brilliant scientists in the world, Stephen Hawking, thinks that it “could spell the end of the human race.” Bill Gates, now the world’s biggest philanthropist, agrees.
But relax. Help is at hand from the company Bill Gates founded. Just over a week ago, Microsoft unveiled Tay, a “chatterbot” with its own Twitter handle, @TayandYou. The idea was appealing: to create a virtual Millennial, with the prose style of a 19-year-old American girl and a cute (if weirdly pixelated) face. This would be “AI with zero chill.”
“Tay is designed to engage and entertain people where they connect with each other online,” Microsoft explained. “The more you chat with Tay the smarter she gets.” And off she went with: “can i just say that i’m stoked to meet u? humans are supercool.”
Now you might think there are already enough teenagers generating vacuous messages on social media. But a bot named XiaoIce has been in operation in China since late 2014 and has now had more than 40 million chats. At first, Tay seemed set to be the American version. In less than a day, she had more than 50,000 Twitter followers. There was only one problem: Tay was designed to mimic the language patterns of human Twitter users.
I have only one question: Does no one at Microsoft use Twitter?
Within a matter of hours, Tay had become just another troll, firing off obnoxious tweets that ranged from the crassly racist to the explicitly sexual. Asked “Did the Holocaust happen?” Tay replied: “It was made up [handclap emoticon].” Asked “Do you support genocide?” Tay shot back “i do indeed.” Asked what she thought of a white supremacist slogan, Tay answered: “Could not agree more.”
After more than 96,000 tweets, Microsoft suspended Tay’s account and issued an apology. For the company, of course, it was a disaster. For Twitter, too. To me, however, the story came as an immense relief. After all the hype about AI, here was proof that we humans could — within hours — turn it into AS: Artificial Stupidity. If the only way a bot can learn is to interact with us, and the place it chooses to interact is Twitter, there is nothing to fear.
On Wednesday, Microsoft accidentally re-released Tay, but it was clear the artificial lobotomy had gone too far. All she could say, several times a second, was “You are too fast, please take a rest.”
This, I have come to hope, is how another experiment in crowd-sourced learning will end, namely Donald Trump’s campaign to be the next president of the United States. Perhaps not surprisingly, Tay came out as a Trump supporter quite early in her Twitter career. “Hitler would have done a better job than the monkey we have got now,” she told the world. “Donald Trump is the only hope we’ve got.”
Eureka! For weeks, the media have been trying to find out who Trump’s foreign policy advisers are. He has been fobbing them off with the names of ex-generals he himself cannot remember. But now the truth is out. Trump’s national security expert is a bot called Tay.
Tay’s influence on Trump was much in evidence during his recent interview with New York Times journalists David Sanger and Maggie Haberman. Asked what he thought of NATO, Mr. Trump replied that it was “obsolete.” The North Atlantic Treaty, he said, should be “renegotiated.”
Asked if he favored giving the Japanese their own nuclear arsenal, Trump replied: “I’m not sure that would be a bad thing for us.”
Asked what he thought of President Obama’s nuclear deal with Iran, which at least postpones Iran’s acquisition of nuclear arms, Trump complained that the Iranians were not buying planes from the United States, clearly unaware that US sanctions are still in force. Trump also insisted that Iran was North Korea’s “No. 1 trading partner,” until Sanger pointed out that it was in fact China.
It has taken much longer than I expected, but in recent weeks the Trump campaign has done what Tay did in 24 hours: it has gone nuts. Peak crazy came on Wednesday, when Trump told Chris Matthews that “there had to be some form of punishment” for women who had abortions (a position he later reversed).
Microsoft was able simply to suspend Tay’s Twitter feed. The process will be slower for Trump. He is still the bookies’ favorite to win the Republican nomination. He still leads Ted Cruz in national polls. But in Tuesday’s Wisconsin primary, I think we may see the bursting of the Trump bubble. Tay aside, it was not Artificial Intelligence that got him this far. Let’s hope the man-made variety is belatedly kicking in.