How Should Bots Respond to Frustration?

Should we be nicer to our bots? And how exactly should the bots be responding to our anger and frustration?

CNN Money recently ran an article, “Alexa, shut up: Raging against the new machines,” that raises some interesting questions about whether it’s bad to be bad to our virtual assistants — and how their developers should respond to that messy human anger.

When it comes to voice assistants, our evolving relationship is being thoroughly documented on social media.

Dealing with Frustrations

The problem may be harder than it sounds, precisely because people don’t always interact naturally with voice assistants, according to Jay Wilpon, senior vice president of natural language research at Interactions (which provides intelligent virtual assistants). “Listen to any person ask their phone about state capitals or salmon recipes — they over-pronounce words, exaggerate consonants, and speak in short, concise sentences. It’s a form of human-to-machine ‘dialect’ we’ve developed to guarantee the technology understands what we’re saying,” he wrote in an essay titled “Angry customers are shaping the future of AI.”

In that piece, he argues that virtual agents providing enterprise-level customer service are on the forefront of dealing with angry customers. “Since customer service is inherently focused on problems, many customers start their interaction already frustrated,” he noted.

“At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties.” — Hunter Walk

In the interest of science, he urges customers to “speak naturally to those customer service bots. Give ‘em hell if you want! At the end of the day, you’ll be advancing AI — one conversation at a time.”

But a different take comes from Dennis Mortensen, founder of X.ai, which offers a meeting-scheduling virtual assistant named Amy (or Andrew) Ingram. Mortensen tells CNN Money that because the assistant is a machine, “In that setting, people tend to feel comfortable applying their frustration.”

And the stakes may be higher than we think.

Photo of media theorist Sherry Turkle by Flickr user jeanbaptisteparis.

Sherry Turkle, who studies human-technology interactions as director of the Massachusetts Institute of Technology’s Initiative on Technology and Self, worries this may lead to a “coarsening of how people treat each other.” She tells CNN that “we treat machines as though they were people,” and that where this may lead is that “we are drawn into treating people as though they were machines.”

CNN notes a lack of data on how often people “rage against the new machines.” Amazon declined to comment and Apple did not respond to a request for comment, but Microsoft said that Cortana “is hit by curses and offensive language on a daily basis,” and that it has deployed an editorial staff “to craft the right response.”

CNN interviewed Deborah Harrison, one of the team’s writers, who stressed that Cortana “is always ready and eager to have a productive and positive conversation,” while at the same time striving “to shut down offensive behavior immediately.”

In February of 2016, CNN also reported that when it came to Cortana, “a good chunk of early queries were about her sex life,” citing a talk Harrison gave at the Re-Work Virtual Assistant Summit in San Francisco. And if you say something particularly bad to Cortana, “she will get mad. That’s not the kind of interaction we want to encourage.” Harrison says the team learned by talking to human personal assistants who’d already experienced their own real-world harassment. When it was time to program Cortana, “We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially.”

“We are in a position to lay the groundwork for what comes after us.”

It’s a problem that’s not going away, because “Lots of use cases come from that motivation,” says Ilya Eckstein, CEO of bot platform provider Robin Labs. Last October he told Quartz that the company’s analysis identified at least five percent of the interactions in its database as unambiguously sexually explicit.

Anger: The Next Generation

Rudeness to bots is already troubling one important demographic group: parents. “I’ve found my kids pushing the virtual assistant further than they would push a human,” Avi Greengart, a tech analyst for Current Analysis, told Quartz, adding that Amazon’s virtual assistant “never says ‘That was rude’ or ‘I’m tired of you asking me the same question over and over again.’”

“One of the responsibilities of parents is to teach your kids social graces, and this is a box you speak to as if it were a person who does not require social graces.”

In June, venture capitalist Hunter Walk wrote a post titled “Amazon Echo Is Magical. It’s Also Turning My Kid Into a Jerk.” In the short essay, republished by Inc., Walk complained that “Alexa tolerates poor manners.”

“Cognitively I’m not sure a kid gets why you can boss Alexa around but not a person,” he wrote. “At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties.”

Walk elaborated in an interview with Quartz about his concerns over how his daughter was being influenced by interactions with Amazon’s device. “Because she is a little girl, she needs to speak loudly, and there’s an unintentional aggressive tone to her command.” Though to be fair, one of Walk’s friends — also a parent — describes Alexa as a godsend.

“Need to get your kids to the table? Ask Alexa to set the timer for two minutes,” writes Rebecca Hanover Kurzweil. “They listen to Alexa. When she rings, they come…”

In June USA Today spoke to Jenny Radesky, a University of Michigan pediatrician who studies digital media and co-authored the American Academy of Pediatrics’ guidelines for media use. She suggests parents try modeling kindness for their children when interacting with a virtual assistant.

There’s also a positive possibility. MIT Technology Review talked to Cynthia Breazeal, who directs the Personal Robots Group at the MIT Media Lab, reporting that she “sees a huge opportunity for virtual assistants like Alexa, Google Home, and others to be designed in ways that push us to treat others the way we want to be treated,” helping children build social skills.

Maybe that’s why Elizabeth Reese, a writing manager at Microsoft whose responsibilities include Cortana, writes of the virtual assistant that “one of her core principles is that she’s always kind.” And she believes people in the tech industry who focus on delivering products with built-in kindness offer “a techy reminder that if bots and other AIs can be courteous and careful of our feelings, we shouldn’t forget that it’s possible in our human-to-human interactions too.”

IBM’s Solution

Other companies are tackling the problem on the backend. Rob High, chief technology officer of IBM Watson, told CNN Money that its AI platform already analyzes the tone and emotion of voice interactions to identify angry customers. First, Watson tries to “interpret and respond to their intention”; if it can’t, it apologizes and says it doesn’t understand. And if all else fails, “We will offer to turn it over to a live agent.”
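That three-step flow (answer, apologize, escalate) is easy to picture as code. Here’s a minimal sketch in Python; the function names, keyword matching, and thresholds are all invented for illustration, not taken from IBM’s actual API.

```python
# A sketch of the escalation flow Rob High describes: try to answer the
# customer's intention, apologize when the bot doesn't understand, and hand
# off to a live agent when anger runs high or retries run out. All names,
# keywords, and thresholds here are invented; this is not IBM's actual API.

MAX_RETRIES = 2
ANGER_THRESHOLD = 0.75  # hypothetical tone score that triggers early hand-off


def analyze_anger(message: str) -> float:
    """Toy stand-in for a tone-analysis service: score anger by keyword."""
    angry_words = {"ridiculous", "useless", "hate", "worst"}
    hits = sum(word in message.lower() for word in angry_words)
    return min(1.0, hits / 2)


def handle_turn(message: str, failures: int) -> tuple[str, int]:
    # An angry customer, or one we've already failed twice, goes to a human.
    if analyze_anger(message) >= ANGER_THRESHOLD or failures >= MAX_RETRIES:
        return "Let me connect you with a live agent.", failures
    # First, interpret and respond to the customer's intention (toy matcher).
    if "refund" in message.lower():
        return "I can help with that refund.", 0  # success resets the counter
    # Otherwise, apologize and admit we didn't understand.
    return "I'm sorry, I didn't understand. Could you rephrase?", failures + 1


if __name__ == "__main__":
    failures = 0
    for msg in ["Play the news", "PLAY. THE. NEWS.", "This useless thing is ridiculous"]:
        reply, failures = handle_turn(msg, failures)
        print(reply)
```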

On Watson’s site, the company points out that it can also analyze the tone of tweets and online reviews, and handle other use cases. The site claims that it can “Monitor customer service and support conversations so you can respond to your customers appropriately and at scale. See if customers are satisfied or frustrated and if agents are polite and sympathetic… Enable your chatbot to detect customer tones so you can build dialog strategies to adjust the conversation accordingly.”

IBM Watson Tone Analyzer demo
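For readers who want to experiment, the service is exposed through IBM’s SDKs. Here’s a brief sketch assuming the ibm-watson Python package; the API key, service URL, and sample text are placeholders to swap for your own credentials.

```python
# Scoring a customer message with Watson Tone Analyzer through the ibm-watson
# Python SDK (pip install ibm-watson). The API key and service URL below are
# placeholders for your own IBM Cloud credentials.
from ibm_watson import ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

tone_analyzer = ToneAnalyzerV3(
    version="2017-09-21",
    authenticator=IAMAuthenticator("YOUR_API_KEY"),
)
tone_analyzer.set_service_url(
    "https://api.us-south.tone-analyzer.watson.cloud.ibm.com"
)

result = tone_analyzer.tone(
    tone_input={"text": "I've asked three times and it still doesn't work!"},
    content_type="application/json",
).get_result()

# The response lists each detected tone (anger, sadness, tentative, and so on)
# with a confidence score.
for tone in result["document_tone"]["tones"]:
    print(tone["tone_name"], tone["score"])
```

Each detected tone comes back with a confidence score, which is what makes threshold-style dialog strategies like the one sketched above possible.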

As the holidays began last year, IBM even combined its Watson Conversation and Watson Tone Analyzer services from its Bluemix platform into Santa’s little Twitter bot, ElfCheck. It analyzes the tone of any Twitter timeline that mentions its name with the proper syntax, and “If it finds too much anger or disgust it will put you on the naughty list.”

 ElfCheck combines IBM Watson Conversation, Tone Analyzer and Personality Insights.
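The naughty-list rule itself reduces to a threshold over aggregated tone scores. Here’s a toy version of that logic; the scores, threshold, and sample timeline are invented, since ElfCheck’s actual implementation isn’t public.

```python
# A toy version of ElfCheck's naughty-list rule: average the strongest
# negative tone (anger or disgust) across a timeline's tweets and flag
# anyone above a threshold. Scores, threshold, and the sample timeline
# are invented; ElfCheck's actual implementation isn't public.
NAUGHTY_THRESHOLD = 0.5


def naughty_or_nice(tone_scores: list[dict[str, float]]) -> str:
    """tone_scores holds one dict of tone -> score per analyzed tweet."""
    negative = [
        max(scores.get("anger", 0.0), scores.get("disgust", 0.0))
        for scores in tone_scores
    ]
    average = sum(negative) / len(negative) if negative else 0.0
    return "naughty" if average > NAUGHTY_THRESHOLD else "nice"


timeline = [
    {"anger": 0.8, "disgust": 0.1},  # a rant
    {"joy": 0.9},                    # a happy tweet
    {"anger": 0.9, "disgust": 0.7},  # another rant
]
print(naughty_or_nice(timeline))  # -> naughty
```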

Basic Instincts

But there may be a darker truth: that humans yell at their virtual assistants because they enjoy it. Last October Sarah Larson, a staff writer at the New Yorker, wrote that yelling at Alexa “can help put you in touch with your id. And it adds to the pleasure.”

So maybe virtual assistants like Alexa are just teaching us something uncomfortable about ourselves.

One user (posting as “typicallydownvoted”) remembers times when Alexa wouldn’t play the news until the command was shouted, along with a few choice expletives. The irony may be that Alexa is teaching exactly the wrong lesson.

Maybe it’s all a reminder that while bots can be frustrating, we humans have our own shortcomings to work on. One Reddit user remembers calling Alexa a derogatory word, only to be greeted with a penitent, “I’m sorry, I’ll try to do better next time…”

“Then I felt like a jerk…”


Feature image via Pixabay.




Source: InApps.net
