Collaborative Robots Learn to Collaborate
Accessible 3D vision unlocks the potential of machine learning for making our autonomous partners more humanlike.
To be truly collaborative, robots must be capable of more than working safely alongside human beings. Russell Toris, director of robotics at Fetch Robotics, says robots also need to act (and “think”) more like people.
This is particularly true of autonomous mobile robots (AMRs) like those manufactured by Fetch. Typically employed for material transport and data collection (such as counting inventory), these wheeled systems use vision sensors and navigation software to dynamically adapt to new environments and situations. Increasingly common in warehouses and distribution centers, this technology is likely to spread to other applications and industries, including our own. In fact, our January issue’s coverage of JIMTOF in Japan touched on the promise of machine-tending robot arms on wheels. Whatever the application, ensuring that a robot can safely occupy the same spaces as humans is one proposition. Ensuring that its behavior does not make humans uncomfortable, which can hinder implementation and waste resources, is another proposition entirely.
Mr. Toris cites the example of two people approaching each other from opposite directions in a narrow hallway. They typically will acknowledge each other in some way before passing, even if only by a nod or a glance. One or both will likely slow down, and perhaps step aside to allow the other a wider berth. A robot with a myopic focus on moving as efficiently as possible from point A to point B will not be nearly as considerate. It might not collide with the person striding toward it, but its movements will seem as cold as they are efficient, and possibly even threatening.
The robot is “aggressive.” The robot is “rude.” The robot is “acting drunk.” We cannot help but assign human traits to inanimate objects, particularly objects that act autonomously and purport to be our collaborators. This tendency influences our behavior, Mr. Toris says. Employees might lose time keeping a wary eye out for rampaging robots. They might even stop working entirely to observe odd behavior. Whatever the specifics of the situation, we are less likely to use any technology to its full potential, or even use it at all, if it evokes feelings of hesitation or intimidation.
Instead, what if AMRs could maintain a comfortable distance as they pass? What if they could differentiate between a person, a forklift and a pallet, and adjust their behavior accordingly? Robots may not be able to nod or glance, but what if they could use sound or light (say, a turn signal) to notify people of their intentions? Making behavior more natural and more predictable is a primary design philosophy at Fetch Robotics. “We design robots for people, not robots for robots,” Mr. Toris says.
This is possible through the intersection of two inherently intertwined technologies. The first is 3D vision systems, which are more affordable than ever due to advances in seemingly unrelated fields like autonomous vehicles, Mr. Toris says. Although the 2D laser sensors used for most AMRs are extremely accurate and capable of detecting distant objects, their vision is limited to a shin’s-eye view of the most basic geometric shapes. Add 3D cameras to complement the 2D sensors, as Fetch has done, and the robots can paint a more comprehensive picture of their environments. More robust visual data is critical not only for distinguishing objects, but also for fueling the machine-learning algorithms that enable the robots to determine how best to respond to those objects.
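To make the idea concrete, here is a minimal sketch (not Fetch’s actual software; all function names, thresholds and units are illustrative assumptions) of how 3D camera points might complement a shin-height 2D laser hit: the vertical extent of the point cluster hints at whether the obstacle is a person, and a person earns a wider berth.

```python
# Illustrative sensor-fusion sketch. A 2D laser alone sees obstacles at
# shin height; adding 3D camera points lets the planner reason about an
# obstacle's vertical extent. Thresholds below are invented for the demo.

def vertical_extent(points):
    """Height span in meters of the 3D points attributed to one obstacle.
    Each point is an (x, y, z) tuple in the robot's frame, z pointing up."""
    zs = [z for _, _, z in points]
    return max(zs) - min(zs)

def planned_clearance(points, default_m=0.3, person_m=0.8):
    """Lateral clearance to plan around an obstacle. A vertical extent
    in a roughly human range suggests a person, so widen the berth."""
    if 1.2 <= vertical_extent(points) <= 2.2:
        return person_m
    return default_m
```

A pallet’s points cluster near the floor, so it gets the default clearance; a tall, person-sized column of points triggers the wider margin. The real decision would of course involve far richer features than raw height.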
Fetch must teach its robots in order for them to learn, and teaching requires masses of data. To collect that data, the company has constructed a mock warehouse to train AMRs at its facility in San Jose, California. Mobile robots have been navigating the aisles for four years now, filtering rich vision-sensor feedback through artificial neural networks (ANNs) to distinguish obstacles and determine not only how to navigate around them, but how to do so appropriately. These ANNs consist of layer upon layer of interconnected, computerized nodes, creating a vast web that filters data from the robot’s sensors (2D lasers complemented by 3D cameras). Each time the robot identifies and/or responds correctly to an obstacle, individual nodes are weighted accordingly. This makes the same outcome more likely in the future, even when the ANN is tasked with filtering novel sensor data from a novel environment.
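The node-weighting idea described above can be shown at toy scale. Real ANNs stack many layers of nodes trained by backpropagation; the sketch below (an assumption for illustration, not Fetch’s training pipeline) uses a single artificial neuron and the classic perceptron rule: whenever the prediction is wrong, nudge each weight toward the correct answer.

```python
# Toy single-neuron demo of weight adjustment. Each training pass nudges
# the weights whenever the neuron's output disagrees with the label,
# which is the same "reward correct responses" mechanism at minimal scale.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is nonnegative."""
    return 1 if x >= 0 else 0

def train(samples, labels, lr=0.1, epochs=20):
    """Perceptron learning rule: w += lr * (label - prediction) * input."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b
```

Trained on a simple pattern (fire only when both inputs are present), the weights settle so the neuron reproduces the pattern, including on the same inputs seen in a new order, which is the germ of generalizing to novel sensor data.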
Four years’ worth of data from the mock warehouse ensures that the latest AMRs will benefit from all the experience of their predecessors, Mr. Toris says. Four years from now, the dataset will be even more robust, and new machine-learning techniques likely will be available. Whatever the future of AMRs in CNC machine shops, it is well worth considering how robot design might change as a result of both technological developments and changes in thinking about the nature of automation.