Personal Robots Learn by Seeing it Done

Some personal assistant robots are nearing commercial use, but the tasks they perform are still being refined. Image: Samsung
Anyone who watched kids’ TV shows like The Jetsons or Lost in Space in the 1960s can be forgiven for expecting we’d have robotic aides by now. Rosie the Robot cleaned the Jetson house and offered trays of appetizers to guests. On Lost in Space, the robot accompanied the Robinsons, keeping the family of space colonists from danger.

The day of the assistive robot is coming, though creating and training these devices is astronomically harder than The Jetsons made it seem. “Our hope is that robots in the future will be able to reliably help older adults and people with physical disabilities,” said Zackory Erickson, a Georgia Tech Ph.D. student in the Healthcare Robotics Lab.
 
Yet, programming those robots presents numerous challenges. Assistive robots will need to learn on the job, as they assist people with physical disabilities with activities of daily living, including bathing, dressing, feeding and walking, Erickson said.
 
“Each of these tasks poses a high amount of variability and requires robots to be very adept in physical human-robot interaction and to adapt to new scenarios,” he said. “For example, there are hundreds of different types of foods a robot may be asked to help someone eat; there are hundreds of garment types and fabric materials for robots to manipulate when providing dressing assistance.”
 
Researchers in the Georgia Tech lab are using digitally simulated environments to train robotic caregivers of the future. Sim2Real (or simulation-to-reality) is the term for giving robots the tools to learn real-life skills via simulation.
 
Sim2Real describes a collection of techniques used to transfer robotic simulations to real robotic movements, Erickson said. The researchers believe Sim2Real techniques will help assistive robots learn as they go.
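Erickson did not detail the lab’s specific methods, but one widely used Sim2Real technique is domain randomization: varying the simulator’s physics on every training episode so learned behavior does not overfit a single, inevitably imperfect, model of reality. The Python sketch below is illustrative only; the one-dimensional toy robot and all parameter names are hypothetical.

import random

def sample_physics():
    """Domain randomization: draw fresh simulator parameters each
    episode so behavior must hold up across many plausible realities."""
    return {
        "friction": random.uniform(0.4, 1.2),
        "motor_gain": random.uniform(0.8, 1.2),
        "sensor_noise": random.uniform(0.0, 0.05),
    }

def rollout(position, goal, physics, steps=50):
    """Toy 1-D episode: a proportional controller drives the robot
    toward the goal under whatever dynamics were sampled."""
    for _ in range(steps):
        observed = position + random.gauss(0.0, physics["sensor_noise"])
        command = 0.5 * (goal - observed)            # simple fixed policy
        position += physics["motor_gain"] * physics["friction"] * command
    return abs(goal - position)                      # final error

errors = [rollout(0.0, 1.0, sample_physics()) for _ in range(1000)]
print(f"mean final error across randomized physics: {sum(errors) / len(errors):.3f}")

A controller that tolerates every sampled variation is far more likely to survive the jump to real hardware than one tuned to a single simulated world.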

The PyRobot, an early assistive robot at Georgia Tech, looks like a Roomba vacuum with a 360-degree arm mounted on top. It can move from a starting point to an end point on its own. Image: Georgia Tech
Training robots in digitally simulated environments is cheaper and faster than training them in the real world, Erickson added.
 
Traditionally, researchers might set up a real-world course and have the robot run it multiple times in order to learn it. With Sim2Real, researchers 3D-scan a real environment, create a virtualized replica in simulation, and let the robot learn the skill there. In addition, multiple simulations can be run at the same time, or in parallel, speeding up learning even further, Erickson said.
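That parallelism is simple to sketch. In the hypothetical Python snippet below, run_episode merely fakes a simulation; the point is the shape of the loop: each worker process runs its own copy of the environment, so experience accumulates many times faster than one real robot could gather it.

import multiprocessing as mp
import random

def run_episode(seed):
    """Stand-in for one simulated navigation episode; a real system
    would step a physics simulator here and return the experience."""
    rng = random.Random(seed)
    return {"steps": rng.randint(20, 200),           # how long the run took
            "success": rng.random() < 0.7}           # did it reach the goal?

if __name__ == "__main__":
    with mp.Pool(processes=8) as pool:               # eight simulations at once
        results = pool.map(run_episode, range(1000))
    successes = sum(r["success"] for r in results)
    print(f"{successes}/1000 simulated episodes reached the goal")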

 
Now, researchers in the Georgia Tech lab can teach a robot to navigate from one point to another in about three days, said Joanne Truong, a Ph.D. student in the university’s school of electrical engineering who works on the Sim2Real project.
 
“We don’t want to hardcode everything onto the robot. We want to teach the robot to learn on its own,” Truong said. “Robots learn through experience. For example, if it moves a certain way and gets to its goal it remembers that. But in order to teach this way we need years of experience, which is infeasible to do with a real robot.”
 
The Sim2Real-trained robot collects billions of frames of experience from the simulation, an amount that would take years to gather on a physical robot. Humans take for granted the many movements and calculations they make for even the simplest tasks, she noted.
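What Truong describes is the core loop of reinforcement learning: try an action, observe the outcome, and strengthen whatever led toward the goal. The article does not name the lab’s exact algorithm; tabular Q-learning on a toy grid, sketched below, is the standard classroom version of the idea.

import random
from collections import defaultdict

ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]         # right, left, down, up
SIZE, GOAL = 5, (4, 4)

def step(state, action):
    """Move on a 5x5 grid, clipped at the walls; reward only at the goal."""
    r = min(max(state[0] + action[0], 0), SIZE - 1)
    c = min(max(state[1] + action[1], 0), SIZE - 1)
    return (r, c), (1.0 if (r, c) == GOAL else 0.0)

q = defaultdict(float)                               # the robot's "memory"
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for _ in range(5000):                                # thousands of cheap simulated runs
    state = (0, 0)
    while state != GOAL:
        if random.random() < epsilon:                # occasionally try something new
            action = random.choice(ACTIONS)
        else:                                        # otherwise repeat what worked
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward = step(state, action)
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt

print(f"learned value of the best first move: {max(q[((0, 0), a)] for a in ACTIONS):.2f}")

After enough simulated episodes, the table steers the robot along a short path to the goal, which is exactly the kind of memory Truong describes, scaled down to a 5x5 world.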

The team works with a PyRobot, which looks like a Roomba vacuum with a 360-degree arm mounted on top. One of the skills Truong and her team have taught the bot is point goal navigation, in which the robot moves from a starting point to an end point on its own; a minimal sketch of the idea follows below. While that may seem like a simple feat, Truong explained just how difficult teaching robots even simple tasks can be.
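Stripped to its geometry, point goal navigation is a loop: estimate the heading and distance to the goal, turn, move, repeat. The Python sketch below is purely illustrative and is not PyRobot’s actual interface; it also ignores the perception and mapping problems that make the real task hard.

import math

def navigate(pose, goal, step=0.1, max_turn=0.3, tol=0.05):
    """Turn toward the goal (bounded turn rate), then move forward;
    yields each intermediate (x, y, heading) pose until within tol."""
    x, y, theta = pose
    while math.hypot(goal[0] - x, goal[1] - y) > tol:
        desired = math.atan2(goal[1] - y, goal[0] - x)
        error = (desired - theta + math.pi) % (2 * math.pi) - math.pi
        theta += max(-max_turn, min(max_turn, error))
        if abs(error) < 0.2:                         # roughly facing the goal
            x += step * math.cos(theta)
            y += step * math.sin(theta)
        yield (x, y, theta)

for pose in navigate((0.0, 0.0, 0.0), (1.0, 2.0)):
    pass                                             # a real robot would act here
print(f"arrived near ({pose[0]:.2f}, {pose[1]:.2f})")

On a real robot, the clean (x, y, heading) pose and the goal location are exactly the quantities that must be estimated from noisy sensors, which is where the difficulty Truong describes begins.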

“Figuring out all of the things that are required for a robot to understand the sentence, ‘Go to the kitchen and get me some orange juice’ is extremely complex,” Truong said. “The robot must understand the words you are saying, the reasoning behind what you want it to do, what object you want it to grab, and where in the environment it needs to go. It also needs to understand that orange juice is usually found in the kitchen, therefore it needs to map and learn how to get to the kitchen, needs to learn how to open the fridge and get the orange juice."
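The article does not say how the Georgia Tech system structures that reasoning. The sketch below is purely hypothetical: it hard-codes the chain of subtasks Truong lists, just to make the decomposition concrete. Every line hides an open research problem in language grounding, mapping, or manipulation.

from typing import NamedTuple

class Subtask(NamedTuple):
    skill: str        # navigate, open, grasp, hand over...
    target: str       # what the skill applies to

def decompose(command: str) -> list[Subtask]:
    """Hypothetical planner output for the orange-juice request; a real
    system would have to infer each step, not look it up."""
    return [
        Subtask("navigate", "kitchen"),              # juice is usually there
        Subtask("open", "fridge"),
        Subtask("grasp", "orange juice"),
        Subtask("navigate", "user"),
        Subtask("hand over", "orange juice"),
    ]

for task in decompose("Go to the kitchen and get me some orange juice"):
    print(task.skill, "->", task.target)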


Companies such as Samsung are also developing personal care robots. Earlier this year, Samsung introduced three new bots that act as personal assistants. The first, Bot Handy, uses AI and a robotic arm to perform household chores, such as setting a table or loading a dishwasher. Bot Care is a personal assistant that helps with scheduling and work-related tasks. The third is a smart vacuum that doubles as a security camera.
 
In the lab, Erickson is proud to play a role in the creation of assistive robots. “Having these caregiving robots assisting people with disabilities in our homes or clinics is still many years away, but from a research perspective it is an incredibly inspiring field with many open research questions that can lead to real-world impacts in people’s everyday lives,” he said.

Jean Thilmany is a technology writer based in Saint Paul, Minn.
 
 
