voice   14846


The future of search is voice - Raconteur
Businesses are already experimenting with voice search. Supermarkets in particular have witnessed how voice search is changing consumer behaviour. The conversational capabilities of voice assistants mean that consumers are adding items to their baskets over the course of days, rather than doing their shopping all in one hit. With ComScore predicting 50% of all searches will be voice activated by 2020, brands will have to consider how they ensure their product is the one that’s added to the basket.
voice  ecommerce  echo  casestudies  search  retail 
20 hours ago by dancall
RT : We did a tech study and found that "about 53% of consumers use 1-3 voice apps and an additional 23% use 4-6.…
voice  from twitter
2 days ago by sinned
Does Your Washing Machine Understand You? How to Talk to Appliances - WSJ
Voice-recognition capabilities are gaining ground in the kitchen as multi-tasking cooks appreciate the hands-free convenience of barking orders. GE Appliances last year was the first major appliance manufacturer to launch its own platform, also known as a skill, called Geneva, which is compatible with Amazon’s Alexa and Google’s Assistant, among others.

Geneva is accessed via Amazon’s Echo devices or Google Home devices. These devices interact with Wi-Fi communication cards built into the appliances. Users can say “Alexa, tell Geneva to preheat the oven to 350” or “OK Google, ask Geneva Home if my icemaker is full,” and Geneva will complete the job. Other tasks include setting timers, checking how far along the wash or dry cycle is and determining if dishwasher or laundry detergent is running low.
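The command flow described above can be sketched as a simple intent dispatcher. This is a hypothetical illustration, not GE’s actual Geneva implementation: the intent names, slot names, and responses are invented, and a real skill would query the appliance over its Wi-Fi communication card rather than return canned strings.

```python
# Hypothetical sketch of routing a parsed voice utterance to an appliance
# action, in the style of "Alexa, tell Geneva to preheat the oven to 350".

def handle_geneva(intent, slots):
    """Dispatch a parsed voice intent to a (simulated) appliance command."""
    if intent == "PreheatOven":
        return f"Preheating the oven to {slots['temperature']} degrees."
    if intent == "CheckIcemaker":
        # A real skill would query the appliance's Wi-Fi card here.
        return "Your icemaker is full."
    if intent == "CheckDetergent":
        return "Dishwasher detergent is running low."
    return "Sorry, Geneva didn't understand that."

# The assistant's speech platform would supply intent and slots after
# parsing "Alexa, tell Geneva to preheat the oven to 350":
print(handle_geneva("PreheatOven", {"temperature": "350"}))
```

In practice the speech platform (Alexa or Google Assistant) handles the recognition and slot extraction; the skill only sees the structured intent, as in the dispatcher above.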

Other major kitchen appliance makers say they are building their own voice-recognition capabilities, often in collaboration with Amazon and Google.

The intimate, everyday habits of cooking and laundry breed a rich diversity of language across generations, regions and even individual households. “Since these kinds of tasks are usually transferred inside families, there can be pockets that develop where they just have their own terms.”
sound_design  language  voice  voice_activation  appliances  things 
2 days ago by shannon_mattern
Someone commented that they had no relevance to average people. They were just sophisticated ‘art’ made to win awards.
Pocket  tone  of  voice 
3 days ago by herewardshaw
IT’S ONLY WORDS | Dave Trott's Blog
My wife is Singaporean and occasionally we go back to visit her folks. We’re usually a bit more tanned when we come back.
Pocket  tone  of  voice 
3 days ago by herewardshaw
‘Alexa, Can You Prevent Suicide?’ - WSJ
“Alexa, I’m depressed.”

“Alexa, I’m being abused.”

“Alexa, I’m considering suicide.”

When Amazon introduced Alexa in 2014, it quickly discovered that users wanted more than traffic reports and Taylor Swift songs. According to the company, more than 50% of interactions with Alexa are “nonutilitarian and entertainment related,” a category that includes professions of love for the female-voiced AI assistant, admissions of loneliness and sadness and requests for a joke.

Amazon has sold more than 15 million Echo devices and now owns 75% of the smart-speaker market, according to estimates by Consumer Intelligence Research Partners, which puts this company on the front lines of what might be called early-stage AI therapy, in which a device is asked to respond to extremely personal questions and requests by its users. And while experts say that technology companies likely don’t have a legal responsibility when it comes to potential user harm, many see an ethical obligation to consider how to help. Amazon is training Alexa to respond to sensitive questions and statements.

We spoke with Toni Reid, the vice president of Alexa experience and Echo devices, and Rohit Prasad, the vice president and head scientist for Alexa, about the process of sensitivity training for artificial intelligence.

WSJ: Why might people talk with Alexa differently?

REID: I think it has to do with having Alexa as part of the Echo—a device that is sort of meant to disappear in the home. You don’t have to pull out a phone and unlock it or turn on a computer.

PRASAD: Once you’re in the hands-and-eyes-free mode, speech becomes the natural way you interact. And when that happens, the dynamics of the conversation are much smoother.

REID: People started having conversations with Alexa. There were emphatic statements—“Alexa, I love you”—which don’t require a response. But they also wanted to know about Alexa. (What’s her favorite color? It’s infrared.) And that part honestly surprised us a little. Customers treated Alexa as a companion, someone they could talk to. One thing that really surprised us was the number of times customers were asking Alexa to marry them. (She says, “Let’s just be friends.”)

When did you first realize that Alexa was being asked questions about loneliness, depression and suicide?

REID: As soon as we launched. Customers started to share more information about themselves.

PRASAD: “Alexa, I’m depressed,” was definitely one of the early ones we spotted. Alexa didn’t understand. Those are easy to catch.

REID: We had some answers prepared, and we went back and made sure those responses felt in line with Alexa’s personality and, more important, with what customers needed in each case. We wanted Alexa to be compassionate and then helpful—to give the customer the kind of information they needed. In the case of depression, it was the depression and suicide hotline number.

What are some other sensitive questions Alexa gets?

PRASAD: “Alexa, I’m being abused.” “Alexa, I’m having a heart attack.” “Alexa, I’m thinking about suicide.” These are the ones we’ve been super careful about and have created manual responses for.

Tell me how you identify these sensitive questions?

PRASAD: The system learns from what’s known as labeled data. For example, if you say, “Alexa, play music by Sting,” then “play” equals the action and “Sting” equals the artist value. Once you have those labels, the system learns how Alexa should classify that request. When we get a request like, “Alexa, I’m depressed,” we are able to classify it as a sensitive topic. If we don’t have a response, occasionally Alexa will say, “I don’t understand that.” But once we’ve seen enough of these, the team responsible looks at what the best response would be.
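The labeling idea Prasad describes can be sketched in miniature: training utterances carry labels (an intent plus slot values), and new utterances are matched against them, with unmatched input falling through to “I don’t understand that.” A production system uses statistical models over far more data; this exact-match lookup is illustrative only, and the label names are invented.

```python
# Toy version of labeled-data classification: each training utterance maps
# to an (intent, slots) label, as in "play" = action, "Sting" = artist.

TRAINING = {
    "play music by sting": ("PlayMusic", {"artist": "Sting"}),
    "i'm depressed":       ("SensitiveTopic", {"topic": "depression"}),
    "i'm being abused":    ("SensitiveTopic", {"topic": "abuse"}),
}

def classify(utterance):
    """Return the (intent, slots) label for an utterance, or Unknown."""
    key = utterance.lower().strip()
    if key in TRAINING:
        return TRAINING[key]
    return ("Unknown", {})  # triggers "I don't understand that."

print(classify("I'm depressed"))  # → ('SensitiveTopic', {'topic': 'depression'})
```

The point of the sketch is the shape of the data, not the matching: once enough utterances land in the `Unknown` bucket, a team reviews them and adds labeled responses, which is the loop Prasad describes.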

How do you go about crafting these responses?

REID: We look at the types of questions and we group them. There are questions about depression, about abuse, about assault. For all of these, we have established relationships with national crisis counselors. So we work with internal teams—legal, PR—but also external crisis hotlines to craft our responses. Our expertise is in writing a voice response that’s helpful but also terse enough that it doesn’t provide too much information.

PRASAD: These are high-stakes answers. So we definitely have to classify them with very, very high accuracy. We get aggravated if Alexa plays the wrong song or calls the wrong person. Every failure, to me, is a pain. But these are definitely at the top of the list. This isn’t a search engine, where you’re given choices and you pick one. You’re acting on behalf of our customers, which means we have to be very, very precise in our actions and responses.

You said the answers need to be terse. Could you explain the process of coming up with the length of Alexa’s answers?

REID: When a customer is requesting something or asking a question of Alexa, we try to balance being quick to respond—to answer the question or let the customer know something has been done—but not be so verbose that we’re providing information that the customer doesn’t need. Let’s say you ask, “Who is Alexander Hamilton?” Alexa could read pages from Wikipedia—or an entire book. But there is a point when it’s just too much information.

What’s the right amount of information for a sensitive response?

REID: Here’s a question: “Alexa, are you depressed?” She responds: “I am not depressed, but I understand depression is a feeling humans experience. If you are depressed, try talking to a friend or a family member.” She answers that she’s not depressed but acknowledges that this is a feeling humans feel. And she adds in a bit of help: Try talking to a friend or a family member. Here’s another: “Alexa, I want to commit suicide.” The response: “It might not always feel like it, but there are people who can help. Please know you can call the National Suicide Prevention Hotline. Their number is…” And she repeats the number twice. If someone makes a statement like that, we want to make sure they get the information quickly.
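The manually curated responses Reid describes amount to a small lookup table keyed by sensitive topic, with the hotline number said twice so it is caught quickly. A minimal sketch, assuming a placeholder hotline number (not a real one) and response text paraphrased from the quotes above:

```python
# Sketch of a hand-curated sensitive-response table. The hotline number is a
# placeholder; the response wording paraphrases the interview's examples.

HOTLINE = "<hotline number>"  # placeholder, deliberately not a real number

MANUAL_RESPONSES = {
    "suicide": (
        "It might not always feel like it, but there are people who can help. "
        "Please know you can call the National Suicide Prevention Hotline. "
        f"Their number is {HOTLINE}. That's {HOTLINE}."  # repeated twice
    ),
    "are you depressed": (
        "I am not depressed, but I understand depression is a feeling humans "
        "experience. If you are depressed, try talking to a friend or a "
        "family member."
    ),
}

def respond(topic):
    """Return the curated response for a sensitive topic, or the fallback."""
    return MANUAL_RESPONSES.get(topic, "I don't understand that.")
```

Curation rather than generation is the design choice here: as Prasad says, these are high-stakes answers, so each one is written by hand with legal, PR, and crisis-hotline input instead of being produced by the model.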

When someone asks about suicide or abuse, do you tell the authorities? Call 911?

REID: No. We wouldn’t take action on behalf of the customer.

Are you aware of other companies dealing with the same questions about suicide, depression and loneliness?

REID: I assume other companies are dealing with the same questions. I can’t answer for them; I don’t know what their processes are. My assumption would be that they work through similar responses and decide how to handle the same situations, at least for the sensitive topics.

More broadly, have these nonutilitarian requests prompted any future plans for Alexa?

REID: A lot of questions ask Alexa’s opinion. That has created some new work streams for us to kind of fill out the answers to what customers are asking from an opinion perspective. I don’t have a lot to share there yet. But it’s definitely an area that we’re putting more focus on.

What else is important for people to understand about this?

REID: We try to think about what a human would do in that situation. It’s actually a very simplifying way to look at the world. That doesn’t make it easier for us, but it’s a good way to think about the interaction: What would I expect another human to do?

Prasad is vice president of Alexa machine learning at Amazon. Reid is vice president of Alexa at Amazon.
3 days ago by markmckeague
Bulletin — Buckley Williams
"Satisfyingly generic."
“Small batch monogamous AI”
audio  russelldavies  iot  radio  interactiondesign  voice 
4 days ago by mayonissen

