Experimental headphones can hear commands that the user is mouthing inaudibly

Artificial Intelligence Earphones

In design, less is more. At least, that's what experimental designer Johann Robler believes.

His newest creation is an earphone that uses artificial intelligence to interpret commands a user mouths silently, by tracking changes in the shape of their ear canal. The device – called aeioTUBA – can be used as a hands-free controller for smartphones and other mobile devices. But its purpose extends far beyond hands-free navigation.

The earphone can be programmed to recognize certain mouthed commands and react accordingly: whisper "go left," for example, and it changes direction immediately rather than waiting for a follow-up command.
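The article does not describe aeioTUBA's actual software, but the programmed command-and-react behaviour above could be sketched as a simple dispatch table; every name below is an illustrative assumption, not part of the real device.

```python
# Hypothetical sketch: map each recognized (silently mouthed) command
# to a handler that updates the device state. Illustrative only.

ACTIONS = {
    "go left": lambda state: {**state, "direction": "left"},
    "go right": lambda state: {**state, "direction": "right"},
    "volume up": lambda state: {**state, "volume": min(state["volume"] + 1, 10)},
}

def handle_command(state, command):
    """Apply the handler for a recognized command; unknown commands leave state unchanged."""
    handler = ACTIONS.get(command)
    return handler(state) if handler else state

state = {"direction": "straight", "volume": 5}
state = handle_command(state, "go left")
print(state["direction"])  # prints "left" - the command takes effect immediately
```

Because each command maps directly to an action, a recognized command can take effect at once, without waiting for further input.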



The concept was inspired by how far technology has come since we first began communicating with each other through speech: now we no longer have to verbally tell our devices what we want them to do, because they can already infer it from our cues, such as body language.

This experiment shows how far we've come as designers, but it also serves as another reminder: less is more.


The researchers behind it claim that their invention can also be used to control home appliances, such as televisions or air conditioners. They say it could help people who are hard of hearing make voice calls without having to rely on an assistant or other assistive technology.

"The best thing about aeioTUBA was when it got nominated alongside experts and professionals at the Red Dot Awards," Robler told DW.

You can see from the video below that the device works much like a microphone turned inward: instead of picking up sound from your mouth, it senses what is happening inside your ear.

"It was really amazing because we did not expect our project to be recognized this way," he added.

The German Design Award Nominee 2020 says he's fascinated by how "the increasing miniaturization of technology is making such novel applications possible – and that they are finally becoming affordable."

The concept behind aeioTUBA was to open up new possibilities for interacting with smartphones without having to use our hands. As a hands-free controller for smartphones and other mobile devices, it lets users operate their devices through silently mouthed commands alone.

His inspiration comes from imagining "how simple human gestures could affect machines."

The idea of using gestures to interact with machines is not new, but it is still rare in everyday products. Conventional input such as typing on a keyboard can be slow and frustrating, and it always demands free hands and the user's full attention.

The same goes for smartphones and tablets: they're great tools for communication and entertainment, but they would be even more useful if we could control them without free hands or an audible voice. That's where this project comes in: a silently mouthed command is enough.

"The idea behind aeioTUBA was to open up new possibilities for interacting with smartphones without having to use our hands," Robler explained.


"The project started as an idea and then evolved into something bigger," Robler told me in an email.

He said he also wanted to make audio interfaces more accessible for people with limited mobility in their hands or without full use of them.

The device was created as a bachelor's degree project at HAW Hamburg. It works by tracking changes in the shape of the user's ear canal, which shifts as the mouth moves, even when no sound is spoken.

When it detects these movements, it activates an audio interface that allows users to control technology like smartphones or computers through mouthed commands alone.

"The main idea behind this project was to make hands-free navigation possible," says Peikert. While driving, for instance, you would normally need to reach for your phone to change the radio station or turn up the volume of the music playing.

The device helps people with limited mobility in their hands control technology more easily, including those who are paralyzed or otherwise have limited use of their limbs. It interprets mouth movements to control a computer or smartphone, and it also works with touchscreen devices such as tablets, which means users can even play games using only their mouths.

Conclusion

One of the most important aspects of a design is that it can be used in many different ways, and this earphone certainly qualifies.

It's one thing to invent something that helps people with limited mobility in their hands control technology. The more important aspect of Robler's invention, though, is that it shows how simple human gestures can affect machines.
