GARBAGE IN, GARBAGE OUT
But that's only part of it. The company also added some fixed “editorial” content developed by a staff that included improvisational comedians.

And on top of all this, Tay is designed to adapt to what individuals tell it. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft's site says of Tay. In other words, Tay learns more the more we interact with her. It's similar to another chat bot the company released over a year ago in China, a creation called Xiaoice. Xiaoice, thankfully, did not exhibit a racist, sexist, offensive personality. It still has a big cult following in the country, with millions of young Chinese interacting with her on their smartphones every day. The success of Xiaoice probably gave Microsoft the confidence that it could replicate that success in the US.

Given all this, and looking at the company's previous work on Xiaoice, it's likely that Tay used a living corpus of content to figure out what to say, says Dennis R. Mortensen, the CEO and founder of x.ai, a startup offering an online personal assistant that automatically schedules meetings. “[The system] injected new data on an ongoing basis,” Mortensen says. “Not only that, it injected exact conversations you had with the chat bot as well.” And it seems there was no way of adequately filtering the results. Unlike the hybrid human-AI personal assistant M from Facebook, which the company released in August, there were no humans making the final decision on what Tay would publicly say.
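To make that design concrete, here is a minimal, purely illustrative Python sketch (with made-up names, not Microsoft's actual code) of a bot that folds users' exact words back into its own living corpus, with no review step in between:

    # Illustrative only: a bot that learns from raw conversations, with no filter.
    class LearningChatBot:
        def __init__(self, seed_pairs):
            # seed_pairs: (prompt, response) pairs written by an editorial staff
            self.pairs = list(seed_pairs)

        def _overlap(self, a, b):
            # Toy stand-in for a learned "weighted relationship" between two texts
            return len(set(a.lower().split()) & set(b.lower().split()))

        def reply(self, user_message):
            # Answer with the stored response whose prompt best matches the input
            _, best_response = max(
                self.pairs, key=lambda pair: self._overlap(user_message, pair[0])
            )
            return best_response

        def learn(self, previous_bot_line, user_message):
            # The unfiltered step: the user's exact words become a response the
            # bot may reproduce for someone else later.
            self.pairs.append((previous_bot_line, user_message))

A curated system would put a review or filtering stage between learn() and the corpus; in this sketch, as apparently with Tay, there is none.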

Mortensen points out that these were all choices Microsoft made. Tay was conceived to be conversant on a wide range of topics. Having a static repository of data would have been difficult if Microsoft wanted Tay to be able to discuss, say, the weather or current events, among other things. “If it didn’t pick it up from today, it couldn’t pick it up from anywhere, because today is the day it happened,” Mortensen says. Microsoft could have built better filters for Tay, but it may not have thought of this at the time of the chat bot’s release.
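A crude, hypothetical sketch of the kind of outbound filter that could sit between such a bot and the public: every candidate reply is checked before it is posted, and anything that fails falls back to a canned line. A real filter would need far more than a word list, but even this much might have caught some of Tay's output.

    # Hypothetical outbound filter: vet every candidate reply before posting it.
    BLOCKED_TERMS = {"exampleslur", "exampleinsult"}  # placeholder list, not real data

    def is_safe(text):
        return not any(term in text.lower() for term in BLOCKED_TERMS)

    def guarded_reply(bot, user_message):
        candidate = bot.reply(user_message)
        # Publish the reply only if it passes the check; otherwise deflect.
        return candidate if is_safe(candidate) else "Let's talk about something else."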

Meanwhile, depending on their purpose, other chat bots might be designed to have a much narrower, much more “vertical” focus, like Mortensen’s own online personal assistant. Some chat bots, he explains, just talk about sports or food or music, or are programmed to do one thing, like set up meeting appointments through e-mail. Those are the cases when you can have much more minute control over the universe of responses for the chat bot, and when unleashing it on the world becomes much less risky.

As for why, of all its options, Tay seemed to consistently choose the most incendiary response possible, Mortensen says this is just how this kind of AI works. The system evaluates the weighted relationships of two sets of text-based questions and answers (in a lot of these cases) and resolves what to say by picking the strongest relationship. And that system can also be greatly skewed when there are massive groups of people trying to game it online, persuading it to respond the way they want. “This is an example of the classic computer science adage, ‘Garbage in, garbage out,’” says Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence.
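A toy illustration of Etzioni's point, with hypothetical data: when the system simply strengthens whichever prompt-response link it observes most often and then picks the strongest one, a coordinated group can outvote the curated seed material.

    # Toy model of "pick the strongest weighted relationship", and how it gets gamed.
    from collections import Counter

    weights = Counter()  # (prompt, response) -> learned weight

    def learn(prompt, response):
        weights[(prompt, response)] += 1  # every observed exchange strengthens a link

    def strongest_reply(prompt):
        candidates = {resp: w for (p, resp), w in weights.items() if p == prompt}
        return max(candidates, key=candidates.get)

    # One curated seed example...
    learn("what do you think of people?", "people are great!")

    # ...is drowned out once a large group repeats an incendiary pairing.
    for _ in range(500):
        learn("what do you think of people?", "<whatever the trolls taught it>")

    print(strongest_reply("what do you think of people?"))  # the gamed answer wins

The model has no notion of which of those examples it should trust; garbage in, garbage out.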

THE FEMALE VOICE

All of this somehow seems more disturbing out of the ‘mouth’ of someone modelled as a teenage girl. It is perhaps even stranger considering the gender disparity in tech, where engineering teams tend to be mostly male. It seems like yet another example of female-voiced AI servitude, except this time she’s turned into a sex slave thanks to the people using her on Twitter.

This is not Microsoft’s first teen-girl chatbot either; they have already launched Xiaoice, a girly assistant or “girlfriend” reportedly used by 20

