In the mission to make controlling and interacting with devices as natural as possible, we've developed touchscreens, speech recognition, and motion tracking. Chirp Microsystems, a Berkeley, CA-based startup, believes the next evolution of the user interface (UI) will happen because of sound.

Chirp has unveiled a microelectromechanical systems (MEMS)-based time-of-flight (ToF) sensor that uses ultrasound transducers to give users touch-free control over their devices via hand and finger gestures. A recent demo of Chirp's technology shows a user controlling a tablet simply by making hand gestures in the air.

You've probably already seen this done with light- and camera-based systems, but Michelle Kiang, CEO and co-founder of Chirp, told Design News that those solutions don't offer the same level of accuracy and consume much more power than ultrasound. “When you think of infrared or camera-based solutions, in terms of power consumption just the device itself can operate in the tens of milliwatts,” she said. “With ultrasound we can operate in the tens of microwatts – a thousand-times reduction in power consumption.”

Moreover, Kiang added, camera-based systems require machine vision, which eats up a lot of processing power, to do their job. “Whereas ultrasound only measures range and distance, so even when you use multiple devices the calculations are pretty simple.”

Chirp's sensor utilizes an array of small ultrasound transducers to send out a pulse of sound waves. Like a bat using echolocation, the sound waves bounce off an object (like your hand) and return to the chip. By calculating the time of flight, the chip can determine the location of the object relative to the device and trigger a programmed behavior. Waving your hand through the air might scroll a screen; moving your hand toward or away from the device could control volume or intensity; and a small finger movement could click or activate a command.
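The arithmetic behind that kind of trigger is straightforward range-finding. Chirp hasn't published its firmware, but a minimal sketch in Python (with illustrative names and a hypothetical range-to-volume mapping) shows the idea:

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def range_from_echo(round_trip_time_s: float) -> float:
    """One-way distance to the target from a round-trip time of flight.

    The pulse travels out to the target and back, so the one-way
    range is half the total path the sound covers.
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

def volume_from_range(distance_m: float, max_range_m: float = 0.5) -> float:
    """Map hand distance to a 0..1 volume level (closer = louder)."""
    clamped = min(max(distance_m, 0.0), max_range_m)
    return 1.0 - clamped / max_range_m

# An echo arriving 1.75 ms after the pulse puts the hand ~30 cm away.
d = range_from_echo(1.75e-3)
print(f"range: {d:.3f} m, volume: {volume_from_range(d):.2f}")
```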

Chirp’s technology originated at the Berkeley Sensor and Actuator Center (BSAC) at UC Berkeley and UC Davis, where researchers discovered a new way to miniaturize MEMS ultrasound sensors. Chirp licensed the technology when it was founded in 2013.

The transducers are piezoelectric micromachined ultrasonic transducers (PMUTs), which convert ultrasonic pressure hitting a membrane on the sensor into an electrical signal. Because PMUTs don't use a backplate like conventional piezoelectric sensors, there is no hard limit on the membrane's motion, resulting in a better-quality signal.

Chirp's ToF sensor combines a PMUT with a proprietary ultra-low-power, mixed-signal integrated circuit that handles the time-of-flight calculations. Because the chip doesn't rely on an external processor, it can accomplish its task with much less power and in a much smaller form factor. And because it works like echolocation, the sensor lets devices track movements in 3D space, adding many more control options than the two-dimensional confines of a touchscreen, for example. David Horsley, co-founder and CTO of Chirp, goes into detail on the technology's inner workings in an article for IEEE Spectrum.
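Chirp hasn't detailed how its chip fuses readings, but the “pretty simple” multi-sensor math Kiang refers to is classic trilateration: each sensor's range measurement constrains the target to a sphere, and intersecting the spheres pins down the 3D position. A minimal sketch, assuming four sensors at known, hypothetical (non-coplanar) positions:

```python
import numpy as np

def locate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a 3D position from ranges to known sensor positions.

    Each measurement gives |x - p_i| = r_i. Subtracting the first
    sphere equation from the others cancels the quadratic term and
    leaves a linear system in x, solved here by least squares.
    """
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical layout: four sensors on a device, one raised 2 cm.
anchors = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                    [0.0, 0.1, 0.0], [0.1, 0.1, 0.02]])
fingertip = np.array([0.05, 0.03, 0.20])  # "true" position, 20 cm out
ranges = np.linalg.norm(anchors - fingertip, axis=1)
print(locate(anchors, ranges))  # recovers ~[0.05, 0.03, 0.20]
```

With exact ranges the system recovers the position exactly; on real hardware, measurement noise makes the least-squares fit the natural choice.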

According to the company, its ToF sensor is one thousand times smaller than a conventional ultrasound transducer and can sense not only broad gestures like hand waving but also “microgestures,” such as finger movements, with 1 mm accuracy. This fine accuracy makes the ToF sensor ideal for wearables, where a device may be too small to properly utilize a touchscreen, and for hands-free uses such as sterile medical or clean room environments and infotainment applications.

To be fair, the foundation of what Chirp is doing is not new. Researchers and businesses have been experimenting for decades with so-called acoustic gesture control or recognition – using sound to detect movements that control an interface. Nintendo even released a gaming peripheral back in 1989, the Power Glove, that utilized ultrasonic sensors to let users control video games with hand movements.

The only problem with the Power Glove was that, in technical terms, it was a piece of junk that didn't actually work (Nintendo discontinued it only a year after release).

What Chirp has done, however, represents the newest, lowest-power, smallest-form-factor application of sound-based gesture recognition technology.

Enabling Mobile VR

Chirp is aiming for the ToF chip to make big (sound)waves in VR/AR. Many VR applications right now focus on motion-tracking controllers like the HTC Vive controller and Oculus Touch. These controllers come in pairs and are designed to mimic the use of human hands as closely as possible. Rather than using a mouse or joystick to pick up an object, for example, HTC and Oculus controllers let users pick things up in a virtual space by actually squeezing their hands and fingers on a trigger.

The drawback to these solutions (aside from the consumer cost) is that they use light- and camera-based tracking, again leading to power consumption issues, not to mention the additional hardware required for the setup. They also restrict the user to a given area of space when moving around.


“We're very excited about VR and AR because we believe our solution is a disruptive solution to the tracking problem everyone is facing right now,” Kiang said. By embedding its chip into a head-mounted display (HMD) as well as a separate controller, Chirp says it can create a solution that is lower power, lower latency, lower cost, and better resolution than camera-based systems for VR and AR. Kiang said Chirp's sensor offers the potential for full 360-degree immersive experiences, a wider field of view than camera-based systems, and operation under any lighting conditions, unlike light-based systems.

Having the sensors embedded into an HMD also offers a very attractive feature for mobile VR and AR users – inside-out positional tracking. “Because one side [the controller] is transmitting ultrasound and the other side [the HMD] is receiving ultrasound it enables you to track a device (controller or HMD),” Kiang said. What this means is that users aren't restricted to just looking around virtual environments, as is the case with today's mobile VR products like the Samsung Gear VR. Instead, users will be able to move around VR environments as well – creating a mobile VR experience on par with tethered hardware but without the space restrictions of camera- and light-based systems.

But when users think about all the control schemes available to them – everything from touchscreens to voice recognition to the everyday, standard keyboard and touchpad/mouse – are any of these really so inconvenient? Touchscreens especially have proven themselves a more than worthy solution for many mobile applications (just ask all the iPhone users out there).

Kiang fully agrees with this assessment and said that Chirp's aim is not to replace touchscreens entirely; she admits there are plenty of applications where touchscreens are ideal. “We're looking at devices that don't necessarily need or want a touchscreen, but would love to add interactivity,” she said. Wearables such as smart glasses again come up as an ideal application. “It's a way to add interactivity without having to add a touchscreen or external controller,” Kiang said. Also, ultrasound isn't subject to some of a touchscreen's limitations: it can still be used in the presence of water or dirt, and it doesn't require a free hand on the screen.

“Anytime we think of things with a touchscreen, we can improve the experience,” she said, adding that Chirp will be able to expand the market since touchscreens can't be used in every device in every application. “We have developed a really unique ultrasound technology that addresses a broad range of user interface challenges we're seeing in different spaces.”

Chris Wiltz is the Managing Editor of Design News.
