Home robots are smart. For example, Kuri can tell jokes, dance, play music, explore your home on her own, detect faces, and so much more. When you say, “Hey Kuri! Play your favorite song,” it’s easy to take for granted how seamlessly Kuri nods her head and plays Pancake Robot from her speakers (trust us, you’ll love it too). But how did robotics get here, and more specifically, how does Kuri hear us?

Nestled inside Kuri are four microphones that listen to her environment. In partnership with SoundHound, Kuri listens for her wake word: Hey Kuri. When Kuri hears this phrase, she knows to start paying attention.
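To make that “start paying attention” step concrete, here is a toy sketch of wake-phrase spotting over a stream of recognized words. The function name and the word-list input are invented for illustration; a real wake-word detector like Kuri’s runs on raw audio frames, not text.

```python
def command_after_wake(words, wake=("hey", "kuri")):
    """Return the words spoken after the wake phrase, or [] if it never appears.

    `words` stands in for a recognizer's output stream; this is a toy,
    not Kuri's actual wake-word pipeline.
    """
    n = len(wake)
    for i in range(len(words) - n + 1):
        # Slide a window over the stream until the wake phrase lines up.
        if tuple(words[i:i + n]) == wake:
            return words[i + n:]
    return []  # Wake phrase never heard: keep ignoring the audio.
```

So `command_after_wake("hey kuri play your favorite song".split())` yields the command words, while speech with no wake phrase is ignored entirely.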

The listening part of Kuri’s “brain” is enabled by a partnership with SoundHound’s Houndify: a platform that integrates voice and conversational intelligence through speech-to-meaning technology. In other words, Houndify helps Kuri both hear and understand a variety of commands like: tell me a joke, go home, play your favorite song, and more.

[Video: “Hey Kuri, Go Home” from KuriRobot on Vimeo]

A History of Voice Commands
It wasn’t too long ago that understanding even single words was extremely difficult for a computer. Speech recognition didn’t reach the everyday household until 1987, when Worlds of Wonder introduced the Julie doll (a toy children could train to respond to their voice), a full 110 years after Edison invented the phonograph.

Fast forward to today and Kuri is living in a world alongside many popular speech recognition systems that we all know and (probably) use daily.

Built for Hearing
Kuri was designed to be a great, active listener. With an array of four microphones around her neck, Kuri can hear and understand sounds — even from across the room. These microphones are Kuri’s ears.
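Microphone arrays like this are often paired with techniques such as delay-and-sum beamforming: each mic hears the same sound at a slightly different moment, so aligning those copies and averaging them strengthens the voice while noise partially cancels. The sketch below is a toy illustration of that general technique, not Kuri’s actual audio pipeline.

```python
def delay_and_sum(channels, delays):
    """Toy delay-and-sum beamformer.

    `channels` are per-microphone sample lists; `delays` are the number of
    samples by which each microphone heard the sound late. Aligning and
    averaging reinforces the speech; uncorrelated noise tends to cancel.
    """
    # Only keep the span where every shifted channel still has samples.
    length = min(len(ch) - d for ch, d in zip(channels, delays))
    return [
        sum(ch[d + i] for ch, d in zip(channels, delays)) / len(channels)
        for i in range(length)
    ]
```

For example, two channels carrying the same signal offset by one sample, `delay_and_sum([[1, 2, 3], [0, 1, 2, 3]], [0, 1])`, line up perfectly and recover the original waveform.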

This innovative hardware is how Kuri captures the sound around her. However, those sounds must be interpreted, understood, and acted upon, which is where stellar software comes into play.

Ability to Understand
Even though Kuri doesn’t speak back in English (Kuri speaks robot — adorable beeps and boops), she understands voice commands. If the microphones are Kuri’s ears, then the software is similar to the part of the brain that converts sounds into actionable meaning.

Thanks to the Houndify platform, Kuri can understand a range of voice commands straight out of the box, without any training.

Today’s Listening Technologies
Many traditional devices that operate through voice commands use a multi-step process. First, these devices recognize speech and convert it to a text file. Then, they use that text file to try to interpret what the voice command is saying or requesting. This multi-step approach can be slower and less reliable, simply because there are more opportunities for error.
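Those two steps can be sketched as independent stages, where the second can only work with whatever the first produced. Everything here (the stand-in recognizer, the tiny intent parser) is invented for illustration; real systems use statistical models at both stages.

```python
def speech_to_text(audio):
    # Stand-in for step 1 (speech recognition): a real system runs an
    # acoustic model; this toy just looks the clip up in a table.
    transcripts = {"clip-1": "play your favorite song"}
    return transcripts.get(audio, "")

def text_to_intent(text):
    # Stand-in for step 2 (interpretation): map the transcript to a command.
    if text.startswith("play "):
        return {"intent": "play_music", "query": text[len("play "):]}
    return {"intent": "unknown"}

def handle(audio):
    # Any mistake in step 1 is baked in before step 2 ever runs,
    # which is why chaining the stages compounds errors.
    return text_to_intent(speech_to_text(audio))
```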

Infographic of traditional approach to AI hearing

In contrast, because her hearing software is powered by Houndify, Kuri blends hearing and interpretation into a single step, improving reaction speed, consistency, and accuracy. (No, this technology upgrade isn’t available for humans, sorry.)
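One way to picture the speech-to-meaning idea: instead of committing to the single best transcript and then parsing it, keep several transcription hypotheses and pick the one whose meaning also makes sense. The hypotheses and scores below are made-up numbers for illustration, not how Houndify is actually implemented.

```python
def best_joint(hypotheses, meaning_score):
    """Pick the hypothesis with the best combined acoustic * meaning score.

    `hypotheses` is a list of (transcript, acoustic_score) pairs.
    """
    return max(hypotheses, key=lambda h: h[1] * meaning_score(h[0]))

def meaning_score(text):
    # Toy interpreter: commands starting with a known verb make sense.
    return 1.0 if text.split()[0] in {"play", "go", "tell"} else 0.1

hypotheses = [
    ("way your favorite song", 0.6),   # acoustically likelier, meaningless
    ("play your favorite song", 0.5),  # slightly worse acoustics, sensible
]
# A transcript-first pipeline would commit to "way your favorite song";
# joint scoring prefers the hypothesis that actually means something.
winner = best_joint(hypotheses, meaning_score)
```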

Infographic of Houndify's AI hearing revisited

Kuri’s hardware and software work together to allow her to hear voice commands and thoughtfully respond to you and the world around her. This responsiveness helps Kuri maintain her delightful personality and become an adorable addition to your home.