Even virtual assistants are sexually harassed

[Aside from demonstrating how users of virtual assistants experience (social) presence, this piece from CNN Money illustrates some of the ethical issues creators of presence-evoking technology must face. –Matthew]

[Video: Robot slaps harasser]


by Heather Kelly
February 5, 2016

All virtual assistants have to deal with inappropriate comments and questions, from seasoned vets like Siri and Google Now to the rash of new specialists with names like Amy, Molly, Mia and Robin.

When Microsoft launched Cortana in 2014, a good chunk of early queries were about her sex life, according to Microsoft’s Deborah Harrison.

It turns out people feel very comfortable talking freely with text and voice assistants. Humanizing the bots with names, faked emotions, personalities and genders (mostly female) helps build trust with users.

A side effect of creating friendly female personalities is that people also want to talk dirty, confess their love, role play or bombard them with insults.

Cortana is not about to put up with it.

“If you say things that are particularly a**holeish to Cortana, she will get mad,” said Harrison during a talk at the Re•Work Virtual Assistant Summit in San Francisco last week. “That’s not the kind of interaction we want to encourage.”

Harrison is one of eight writers who create Cortana’s dialogue in the U.S. In addition to writing jokes and coming up with casual banter, her team has to figure out the best way to shut down vulgar conversations.

Cortana is clearly identified as a woman. She has a female avatar and is voiced by a human woman, Jen Taylor. But the writers are careful to avoid female-assistant stereotypes. Cortana isn’t self-deprecating and avoids saying sorry.

“We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially,” said Harrison.

Not all assistants will take the same firm approach. Robin Labs, which makes a voice assistant for drivers, thinks there might be a market for customized personalities. CEO Ilya Eckstein says there is high demand for an assistant personality that’s “more intimate-slash-submissive with sexual undertones.”

Robin has held more than 100 million conversations. Looking through the data, Eckstein found people fall into a few basic categories. Some like lots of friendly banter, others want just the facts with no sass. Apparently one group is quite fond of making Robin repeat profanities in her soft female voice.

Sometimes people flirt, not realizing the assistants are just programs. Sometimes, as with Facebook’s new M Messenger tool, there’s a real human doing some of the work. And though she is cheerful and friendly, x.ai’s meeting-scheduling robot Amy always keeps it professional. That hasn’t stopped people from sending her flowers, chocolates and whiskey. One user even asked her out on a date (Amy declined).

One smart way to learn how to handle harassment is to talk to real human assistants, which Microsoft did for Cortana. After all, they’ve been putting up with similar behavior for years.
