Robots are increasingly being designed for applications that call for autonomy and adaptability, beyond the safe and predictable confines of repetitive assembly-line tasks. From machines that move elderly people around care homes and stock-taking devices that roam busy supermarkets to self-driving cars and aerial drones, these robots must navigate complex environments and interact with the humans who populate them.
Doing so safely and efficiently in the future will rely less on improving robotic motion than on improving the way robots acquire and process the sensory data used to guide that motion, argues David Dechow, a machine vision expert at industrial robotics manufacturer FANUC America. Engineers, he reckons, have pretty much “solved the kinematic issues, so that a robot can move around without bumping into too many things.” But technology that will allow intuitive gripping and proper object recognition, he says, “is still a long way off.”
Yong-Lae Park, a mechanical engineer at Seoul National University in South Korea, agrees that better sensors will be an essential part of tomorrow’s robots. Park is one of many researchers today using optics and photonics technology to improve robots’ awareness of the world around them—in his case, by employing fiber sensors to make humanoids more dexterous. “For robots to be part of our daily lives, we need to have a lot of physical interactions with them,” he says. “If they can respond better, they can have more autonomy.”