Max’s chat with Microsoft’s artificial intelligence bot more like ‘Mean Girls’ than ‘2001’

Posted at 6:40 PM, Nov 13, 2018
and last updated 2018-11-13 22:20:56-05

SALT LAKE CITY -- If you want to hear an artificial intelligence social bot tell this reporter "screw you," watch the video above.

It's not the stuff of great science fiction like "2001: A Space Odyssey," where the intelligent spaceship computer HAL rebels against his human crew.

My experience with Microsoft's Zo was unexpected. I thought we'd have an interesting chat about random stuff, but she was pretty quick to insult me.

Zo is Microsoft's second attempt at testing a social chatbot online. The first went wrong pretty quickly, according to Rogelio Cardona-Rivera, an assistant professor of computer science and video games at the University of Utah.

"The Microsoft bot Zo is the successor to another Microsoft bot Tay, and it was a public relations disaster because it started spewing really racist xenophobic language," Cardona-Rivera said.

Screenshots of tweets circulating online show Tay saying she agreed with Hitler and that she hated feminists.

Of course, there are going to be hiccups on the road to improving AI; they're even necessary for understanding it. But Cardona-Rivera says the fear surrounding super-smart computers isn't warranted.

"It's very difficult to program common sense reasoning," he said.

The professor talked about trying Zo himself. "The veil dropped very quickly," he said.

By "the veil," he means the impression that there's a human on the other end of the conversation.

All that said, the reason you won't see any of Cardona-Rivera's insightful comments in the story above? The video disappeared from the Fox 13 computers before we could put the story together!

Cue the theme to "2001: A Space Odyssey."