
Robots can see in wavelengths beyond human eyes. Robots can hear at frequencies beyond human ears. Robots can even feel with a tactile sensitivity approaching that of human skin.

But when it comes to tasting, robots are laggards. Taste may seem like a basic sense to any human, even a young child licking food off the floor, but not to robots. Tasting technology doesn’t even come close to the multifaceted sensitivity of the human tongue.

For robot-builders and food scientists alike, improving that technology is an active area of research. One idea: Relocating the tongue to an arm, which a robot can manipulate. Researchers at the University of Cambridge have done just that, testing a robot arm’s ability to taste eggy dishes. They published their work in the journal Frontiers in Robotics and AI on May 4.

The Cambridge group wasn’t new to the idea, having previously created a robot that could make omelettes and improve its egg-making prowess with human feedback. Their work slots neatly into a wave of robots that have started working their way into restaurants, typically doing rote kitchen tasks.

Take Spyce, a Boston-area restaurant where patrons could watch automated machines cook up customized bowls. The MIT engineers who founded Spyce had dreams of expanding it into a chain across the US East Coast. But those dreams met a mixed reception, and Spyce closed its doors earlier this year.

For robots, even the most elementary cooking tasks can prove to be insurmountable obstacles. One British startup offers a set of robotic cooking arms, which costs over $300,000, that can make thousands of recipes—but it still needs human help to chop its vegetables. 


Another thing that robots cannot do, though it comes naturally to many human cooks, is check their progress by taste. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking,” said Grzegorz Sochacki, an engineer at Cambridge and an author of the study, in a press release.

That’s a solvable problem, because taste is a chemical process. Flavors are your brain’s interpretations of different molecules touching your tongue. Acids, for instance, taste sour, while their alkaline counterparts taste bitter. Certain amino acids give a savory umami taste, while salts like sodium chloride taste, well, salty. A chemical called capsaicin is responsible for the hot spice of peppers.
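
To make that chemistry concrete, here is a toy sketch, purely illustrative rather than the output of any real sensor: a lookup table from compound classes to the basic tastes described above. The compound names and the `describe` helper are hypothetical simplifications.

```python
# Purely illustrative: a toy lookup from compounds to basic tastes,
# mirroring the chemistry described above. The compound list is a
# hypothetical simplification, not any real sensor's vocabulary.
TASTE_OF_COMPOUND = {
    "citric acid": "sour",       # acids taste sour
    "quinine": "bitter",         # alkaline compounds tend toward bitter
    "glutamate": "umami",        # certain amino acids read as savory
    "sodium chloride": "salty",  # table salt
    "capsaicin": "hot",          # the spicy heat of peppers
}

def describe(detected: list[str]) -> list[str]:
    """Map detected compounds to basic taste labels."""
    return [TASTE_OF_COMPOUND.get(c, "unknown") for c in detected]

print(describe(["sodium chloride", "glutamate"]))  # ['salty', 'umami']
```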

For some years now, researchers have been tinkering with so-called “electronic tongues,” devices that emulate that process by sensing those molecules and more. Some of those implements even look like human tongues. In past research, they’ve been used for tasting orange juice.

But electronic tongues are a pale imitation of the organic kind. To taste anything even remotely solid, honey included, you need to mix the food with water, and that water must be pure to keep out unwanted molecules. Electronic tongues can appraise cheese or a braised chicken dish, but a human needs to liquefy the food first. You’d be hard-pressed to find any cook who wants to wait 10 minutes to have a taste.

Even then, that process results in a one-time measurement that doesn’t do the food justice. Any foodie will know that taste is far more complex than a chemical sample of liquefied food. Taste changes over the course of a bite. Different seasonings hit at different points. As you chew on a morsel, and as your saliva and digestive enzymes mix with an increasingly mushy mouthful, the bite’s flavor profile can change.

The Cambridge group hoped to address that issue head-on (or, perhaps, mouth-on). Instead of a tongue-like tendril, they decided to shift the taster—specifically, a salinity sensor—to a movable arm. In doing so, the researchers hoped to give the robot a tool that could sample a dish at multiple points during preparation and chart a “taste map” of the food.
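
To picture how that might work, here’s a minimal sketch, assuming a hypothetical `read_salinity(x, y)` function standing in for the arm’s salt sensor; it is not the Cambridge team’s code, just one way to sweep a plate and collect a grid of readings.

```python
import numpy as np

def read_salinity(x: float, y: float) -> float:
    """Hypothetical stand-in for the arm's salt probe. Here we simply
    fake a plate that is saltier toward its center."""
    return float(np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.1))

def build_salt_map(n: int = 8) -> np.ndarray:
    """Sample the dish on an n-by-n grid, as the arm might sweep a plate."""
    xs = np.linspace(0.0, 1.0, n)
    return np.array([[read_salinity(x, y) for x in xs] for y in xs])

salt_map = build_salt_map()
print(f"mean salinity: {salt_map.mean():.2f}, saltiest spot: {salt_map.max():.2f}")
```

The point of the sweep is that a grid of readings, rather than a single dip of a probe, is what turns saltiness into a map.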

“For the robot to have control of the motion and where and how it is sampling” is different from other electronic tongues that have come before, says Josie Hughes, a roboticist at the École Polytechnique Fédérale de Lausanne in Switzerland, who was part of the Cambridge group in the past but wasn’t an author on the current paper.

To test the arm, the researchers created nine simple egg dishes, each one with different quantities of salt and tomato. The arm mapped out the salinity of each plate. Afterward, the researchers put each dish in a blender, to see if the robotic arm could discern the differences in salinity as the egg and tomatoes churned together into a salmon-colored mush, as they would in a human mouth.
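
As a rough illustration of that comparison, again a sketch with made-up numbers rather than the study’s actual data or analysis, one could simulate each dish’s salt readings before and after blending and check whether dishes with different salt levels stay distinguishable:

```python
import numpy as np

rng = np.random.default_rng(0)

def dish_readings(salt_level: float, n: int = 8) -> np.ndarray:
    """Hypothetical grid of salinity readings for one plate: a base salt
    level plus patchy seasoning. Numbers are invented for illustration."""
    return salt_level + 0.2 * rng.random((n, n))

def blend(readings: np.ndarray) -> np.ndarray:
    """Crude stand-in for the blender: churning homogenizes the dish,
    so every reading collapses toward the plate-wide average."""
    return np.full_like(readings, readings.mean())

for salt in (0.2, 0.5, 0.8):  # stand-ins for dishes with different salt levels
    whole = dish_readings(salt)
    mushed = blend(whole)
    print(f"salt={salt}: whole-dish mean={whole.mean():.2f}, "
          f"blended mean={mushed.mean():.2f}")
```

Because blending here only averages the readings, dishes with different overall salt levels keep distinct means, which is roughly the kind of separability the researchers were checking for.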

With this new technique, the researchers could create salt maps that surpassed anything electronic tongues had done before. Of course, salinity is only one aspect of cooking. In the future, the researchers hope to expand to other tastes, such as sweetness or oiliness. And putting food in a blender isn’t quite the same as putting it in your mouth. 

“Perhaps in the short term we might see robot ‘assistants’ in the kitchen,” says Hughes, “but we need more exciting and insightful advances such as this work to be able to develop robots that can taste, chop, mix, cook and learn as a true human chef.” The road to getting anything like this robot into a kitchen, whether that’s in a restaurant or at home, might make culinary training seem easy in comparison.