Robotics Blog: A Robot’s Machine Vision is the Key to its Future

Robotic systems will rely heavily on advanced machine vision systems as they continue to evolve.
Over the last few years, robotic systems have become extremely proficient at picking, sorting, placing, and cataloging items on the manufacturing floor and in the consumer market. The main driving force behind this progress has been advances in machine vision systems.
 
The robotics industry is pushing the boundary of where robots can operate. Machine vision systems, paired with AI and deep learning, enable robots to work faster on the line and in new settings such as supermarkets, hospitals, and restaurants.
 
“If you look at the amount of vision that’s embedded in robots now, a lot of that’s driven by the fact that you have a great camera on your phone, which has driven the cost way down,” said Tom Ryden, executive director of MassRobotics. “The development of the lower cost sensors has made a dramatic impact in today’s robots. By placing these sensors throughout my robot, it can start to understand a lot more about its environment.”

Listen to our Podcast: The Transition of Robots from the Factory into Society
 
According to the International Federation of Robotics’ 2021 World Robotics Report, industrial robot installations rose by 10 percent and service robot use increased by 12 percent. Overall, robot density has nearly doubled globally. Advanced vision systems are one of the main reasons for this rise in robotic installations.
 
“State-of-the-art machine vision technologies are what make highly automated and consistently networked industrial process chains possible,” said Steve Kinney, director of training, compliance, and technical solutions at Smart Vision Lights, in a recent interview with the Association for Advancing Automation (A3). “Machine vision provides sophisticated capabilities for reliably identifying, locating, and positioning workpieces or, for example, optical character recognition, which supports reliable identification. The seamless, machine-vision-based monitoring of automated production processes also enables collaborative robots and their human colleagues to work together much more safely.”
 
The COVID-19 pandemic has pushed organizations to adopt more robots to combat labor shortages. In particular, companies are looking for easily programmable robots with advanced vision systems, which can take on a wider variety of tasks with less programming effort.
 
Robots like Tally from Simbe Robotics use advanced machine vision systems to scan products on the shelf and detect people and obstacles. Credit: Simbe Robotics
Machine vision allows robots to operate in the moment. A typical industrial robot without a vision system, also known as a blind robot, must be programmed with a physical definition of the product it interacts with, which restricts it to a single, predefined process. A vision system lets the robot identify varying parts as they come down the manufacturing line.
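To make the contrast concrete, here is a minimal, hypothetical sketch, not any of the systems described in this article, of how a camera-equipped cell might label incoming parts by their silhouette instead of relying on a hard-coded part definition. It assumes Python with OpenCV, a top-down grayscale frame of the conveyor, and an invented file name and decision rule.

import cv2

def identify_parts(frame_gray, min_area=500):
    """Return a rough label and bounding box for each part silhouette in the frame."""
    # Separate dark parts from the (assumed lighter) conveyor background.
    _, mask = cv2.threshold(frame_gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    parts = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:   # ignore specks and image noise
            continue
        x, y, w, h = cv2.boundingRect(contour)
        aspect = w / float(h)
        # Toy decision rule for illustration; real lines use trained classifiers or deep learning.
        label = "bracket" if aspect > 1.5 else "fastener"
        parts.append((label, (x, y, w, h)))
    return parts

frame = cv2.imread("conveyor_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical captured image
for label, box in identify_parts(frame):
    print(label, box)

The same camera frame could just as easily feed pose estimation for pick-and-place, which is what makes a vision-equipped line adaptable to new parts without reprogramming the robot for each one.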
 
Sam Lopez, senior vice president of sales and marketing at Matrox Imaging, told A3 that vision systems create a more adaptable robot and manufacturing line.
 
“Adding vision to an automation line, robot system, or autonomous vehicle not only greatly increases the capabilities of that particular system but can also improve the interoperability in the factory between these systems or with human interactions. With increased focus on an Industry 4.0–connected factory, the capabilities offered by vision become even more important,” Lopez said.

Watch our Video: Robots Take to the Streets
 
These enhanced vision systems also carry over to robots’ work outside of the factory. As robots enter public spaces, they will need to identify people, buildings, street signs, pets, and various other obstacles in order to operate. For example, Tally, the retail inventory robot from Simbe Robotics that works in supermarkets, uses lidar and vision cameras to scan inventory while avoiding people, so it does not interrupt their shopping experience.
 
“Similar to autonomous cars, Tally uses lidar and Intel RealSense depth cameras to not only help us detect obstacles but to avoid them. This allows Tally to be unobtrusive and not get in the way of the shopper’s experience,” said Jeff Gee, chief design officer at Simbe Robotics.
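As a rough illustration of the idea in Gee’s description, the sketch below reduces a single depth-camera frame to a simple “obstacle ahead” decision. It is not Simbe’s implementation: the frame is synthetic, and the region of interest and 0.75 m safety threshold are invented; a real robot would read live frames from its camera SDK and fuse them with the lidar scan.

import numpy as np

STOP_DISTANCE_M = 0.75                       # assumed safety threshold, not a real spec
ROI = (slice(120, 360), slice(200, 440))     # center of a 480x640 frame, straight ahead

def obstacle_ahead(depth_m):
    """Return True if anything in the forward region of interest is closer than the threshold."""
    roi = depth_m[ROI]
    valid = roi[roi > 0]                     # depth sensors report 0 where there is no return
    return valid.size > 0 and float(valid.min()) < STOP_DISTANCE_M

frame = np.full((480, 640), 3.0)             # synthetic empty aisle: 3 m to the shelves
frame[200:280, 300:340] = 0.6                # a shopper standing 0.6 m in front of the robot
print("slow down:", obstacle_ahead(frame))   # prints: slow down: True

Swapping in live depth frames and combining this kind of check with lidar is what lets a robot like Tally stay out of shoppers’ way while it scans the shelves.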
 
As robots continue to evolve, their sight will play a key role in how they interact with and behave in our world.
 
Carlos M. Gonzalez is special projects manager.
