Robots now have a sense of taste. A team of researchers from the Autonomous University of Barcelona announced earlier this month that it had created a robotic tongue that can tell the difference between styles of beer, from a lager to a double malt.
Let that sink in for a second: There’s a robot out there with a more sophisticated drinking palate than the average bloke at the bar.
The researchers’ robotic “tongue” uses an array of sensors, each of which responds to different chemicals. It works much like its human counterpart, in which tastebuds (mounds of sensory flesh) detect the chemical signatures of food and pass them on as taste. For a robot, reading those same chemical signatures is the closest analogue to tasting.
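To make that concrete, here is a toy sketch of how an electronic tongue might turn sensor readings into a verdict. This is not the Barcelona team’s actual method, and the sensor values and beer “fingerprints” below are invented for illustration: each style gets a reference pattern of chemical readings, and an unknown sample is matched to the nearest one.

```python
# Toy electronic-tongue classifier (illustrative only, not the real system):
# each beer style has a hypothetical "fingerprint" of chemical sensor
# readings, and a sample is labelled with the closest fingerprint.
import math

# Invented 3-sensor fingerprints for two beer styles
FINGERPRINTS = {
    "lager":       [0.2, 0.5, 0.1],
    "double malt": [0.7, 0.3, 0.6],
}

def classify(sample):
    """Return the style whose fingerprint is nearest (Euclidean distance)."""
    def distance(fp):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(sample, fp)))
    return min(FINGERPRINTS, key=lambda style: distance(FINGERPRINTS[style]))

print(classify([0.25, 0.45, 0.15]))  # close to the lager fingerprint
```

Real electronic tongues use many more sensors and proper statistical classifiers, but the principle is the same: taste becomes pattern-matching on chemistry.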
That means that robots, in various forms and models, now have all five classic senses: touch, taste, sight, hearing and smell. Is the future depicted in countless sci-fi films like Robot & Frank really far off?
Robotic touch and sight sensors have been used in artificial limbs and eyes for years now. Artificial retinas, for example, with several thousand light sensors, can now restore sight to people who have lost it to disease. In both cases, the devices behave much like the organs they replace: they take in information and send it on to the brain.
Hearing, in animals, is the vibration of sound translated into signals the body can process. Siri, for example, matches the vibrations of a user’s voice to commands (…sometimes). Dreams of electric butlers have yielded more advanced (and more human) methods of hearing.
The HEARBO (HEAR-ing roBOT) can not only hear things, but it can also order sounds by priority. It can tell the difference between a new sound and an existing one, an important sound and one that is best ignored.
This is similar to what humans do. In a crowded room, we can separate conversations to pick out the most interesting one. Except HEARBO does it better. Whereas we can only pick up one voice, maybe two, the robot can break up and analyse seven different sounds at once.
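As a rough illustration of the kind of bookkeeping described above (this is not HEARBO’s actual algorithm, and the sound labels and loudness values are invented), a listener can track which sounds it has heard before, flag new ones, and rank everything so that novel and loud sounds come first:

```python
# Toy sound-prioritisation sketch (illustrative only): novel sounds outrank
# familiar ones; within each group, louder sounds come first.
heard_before = set()

def prioritise(sounds):
    """sounds: list of (label, loudness) pairs. Returns labels by priority."""
    ranked = sorted(
        sounds,
        key=lambda s: (s[0] in heard_before, -s[1]),  # novel first, then loud
    )
    heard_before.update(label for label, _ in sounds)
    return [label for label, _ in ranked]

print(prioritise([("fan hum", 0.2), ("doorbell", 0.6)]))
# → ['doorbell', 'fan hum']  (both novel, so loudness decides)
print(prioritise([("fan hum", 0.2), ("glass breaking", 0.9)]))
# → ['glass breaking', 'fan hum']  (the fan is now familiar background)
```

The hard part in a real robot is upstream of this: separating a noisy room into those seven distinct streams in the first place.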
Last year, Professor Joseph Ayers of Northeastern University started working on a way to let robots smell. In humans, smell works when an odor molecule binds to a specific receptor in the nose, which then signals the brain. The odors we can detect depend on the range of receptors in our noses. Similarly, Ayers is equipping robots with bacteria that contain smell receptors. When the right odor comes along, the bacteria release chemicals the robot can then detect. Sound familiar?
In the aforementioned robot-tongue study, the researchers concluded that robots could one day be given a sense of taste. OK, it’s not consciousness. But they suggest that if the technology were refined enough (and it’s pretty refined already), it could supplant professional beer and wine tasters in creating better drinks. It wouldn’t be long before robot judges at wine events became the norm as well.
You might say a robot judge is a long way off, and you’re probably right. The reason, however, isn’t that the technology would be difficult to build. It’s that no-one could take a two-foot-tall bot seriously.
In order to pick out our own kind, we’ve evolved to look instinctively for human characteristics: a smile, two forward-facing eyes, walking on two legs. We also tend to find human traits comforting, or at the very least familiar (go figure).
But what if what we’re seeing falls into the gap between what is and isn’t human? Psychologists call this the “Uncanny Valley”: the point at which we’re made uncomfortable by something that is neither clearly non-human nor quite human. At that point, it sits in the “dip” of the valley.
Androids make such effective villains in films because of this fact of human nature. On the big screen, they often speak slowly and move robotically, but they have skin and eyes and limbs, so they remind us of humans. The skinless Terminator, the disembodied-but-still-speaking head of David from Prometheus, and the automaton-like coolness and masked visage of famous horror villains like Michael Myers or Jason Voorhees: all of them play on the dip of the Uncanny Valley to frighten us. We’re frightened because they’re not quite right, and that could mean they’re diseased, dying, or deranged.
Conversely, robots like the Keepon (“dancing” in the video below) and Honda’s ASIMO are bizarrely humanoid, but the more they act human, the more humorous we find them.
A similar effect can often be seen in the stylisation of cartoon characters, or in the photo below, in which Snow White’s features have been redrawn in realistic proportion and, instead of being endearing, look creepy.
Before androids hit the market, the people marketing them will have to pick a side of the Uncanny Valley, making them either distinctly non-human or very convincing as humans.
If it’s the latter, we’re in for quite a ride. After all, if robots have all five senses, can the ability to listen, play, dance, and love be that far off?
The last word, as ever, will go to the needs of the marketplace. If people want humanoid robots to cook for them, to take care of their elders, to love and need them, it will happen.
For now, I’m glad our researchers have their priorities straight. I could go for a pint.