The role of presence sensing in collaborative robot applications

June 17, 2015
What should we expect from presence sensing with the increase in collaborative robot applications—where humans and robots work in the same space?

We asked for advice from a panel of industry veterans. They include Chris Elston, senior controls engineer, Yamaha Robotics; Scott Mabie, general manager of Americas region, Universal Robots; Helge Hornis, manager intelligent systems group, Pepperl+Fuchs; Victor Caneff, business development manager, assembly and robotics, Banner Engineering; and Balluff marketing managers Wolfgang Kratzenberg, industrial identification, Henry Menke, position sensing, and Shishir Rege, networking.

Here's what they had to say. 

Mabie: Presence sensing will play a huge role in collaborative robot applications. It is better to be more aware than less. If we can improve the robot's environmental awareness to achieve more dynamic planning and collision avoidance, then why not? Of course, the presence-sensing technology needs to be cost-effective, easily accessible and readily integrated.

One of the key value drivers of the UR robot is the ease of integration with peripherals such as sensors. We're making our software intuitive and GUI-based, as opposed to conventional command- or language-based programming, though that lower-level access remains available for advanced programmers who want it. Our Polyscope interface is now more open and accessible to end users, enabling them to build their own interfaces and wizards and even implement their own custom communication protocols.

Hornis: Safety will continue to play an important role. But there is a second level of safety that will be of greater importance in automation and robotics in the future. The biggest issues with safety systems as we know and use them today are cost and complexity. We expect assistance systems, not unlike what we see in modern cars, to be much more important in the future. Assistance systems will not provide functional safety, but they will help operators and robots in collaborative situations, and they will be used where currently nothing is used because of cost and complexity. It is too early to tell what kinds of assistance systems will be accepted, but, again looking at the automotive market, it seems clear that including a large number of low-cost technologies—ABS, ESP, blind-spot assistance, distance radar, backup sensors—is ultimately much better than having just one safety-rated solution at the same price point, perhaps a backup sensor built around an expensive safety-rated scanner.

Caneff: Although any robot has the potential for use in collaborative operation, the new-generation robots on the market that are smaller and have power- and force-limiting functions and speed monitoring are the ones commonly thought of as collaborative robots. These functions alone do not necessarily address all safety requirements, and monitoring the separation between operators and the robot may be needed to comply with applicable standards such as ANSI/RIA R15.06-2012, Safety Requirements for Industrial Robots and Robot Systems. Current safeguarding technologies such as safety laser scanners and light curtains are still applicable; however, advancements in vision and lidar technology for safety-rated 3D monitoring of the space around a robot's work envelope would open up more potential applications for collaborative operation.
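The separation monitoring Caneff describes boils down to one question: is the measured distance larger than the ground a person could close while the robot detects them and comes to a stop? The sketch below is a simplified illustration only; the function name, parameters and default margin are assumptions, and a real system would apply the full protective-separation-distance calculation required by the applicable standards.

```python
def separation_ok(distance_m: float,
                  human_speed_mps: float,
                  robot_stop_time_s: float,
                  robot_stop_dist_m: float,
                  margin_m: float = 0.1) -> bool:
    """Simplified speed-and-separation check (illustrative only).

    Returns True if the measured separation exceeds the distance a
    person could cover while the robot stops, plus the robot's own
    stopping distance and a sensing-uncertainty margin.
    """
    required = (human_speed_mps * robot_stop_time_s
                + robot_stop_dist_m
                + margin_m)
    return distance_m > required

# A 2.0 m scanner reading with a 1.6 m/s walking speed, 0.5 s stop
# time and 0.3 m stopping distance passes; a 1.0 m reading does not.
print(separation_ok(2.0, 1.6, 0.5, 0.3))   # True
print(separation_ok(1.0, 1.6, 0.5, 0.3))   # False
```

When the check fails, a safety controller would command a protective stop or a speed reduction rather than simply logging the violation.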

Elston: Collaborative robots seem to be in an arena of their own among industrial robots. They are used when ease of programming and reduced safety costs are desired, and end users typically favor them for their ease of integration. Because collaborative robots must rely on torque sensing to know when a human is near, they tend to have preventive sensing built in alongside safety sensing. The trade-off with this type of torque sensing, however, is that speed and precision are given up. Collaborative robots have an advantage in carrying this technology onboard, whereas the same "touchy-feely" response would have to be added to a conventional industrial robot as an extra. The catch is that we don't want to give up the speed and precision we are already familiar with from industrial robots.

Rege: The age of robots started with replacing humans where operations are repetitive, hazardous or tedious. Then the drive for productivity and efficiency took robotics automation to new heights, elevating not only robots' operating speeds but also their payload capacities. This led to tremendous growth of robots in most sectors of manufacturing. Presence sensing related to operations, for example, picking the correct object or number of objects, and presence sensing related to safety, for personnel inside robot work zones, became more and more important. This development led to the growth of photoelectric and ultrasonic sensors on robots' end-of-arm tools, in addition to safety gates and interlocks, zone monitoring and other safety devices around the robot installation.

Now that robotics automation has come full circle, working alongside humans instead of replacing them, the role of presence sensing has evolved to include smarter sensing technology, not only on the end-of-arm tools, but also around the robot itself. The new era of cage-free, or collaborative, robots brings innovative new applications to life, and with them come challenges of ensuring the safety of people working around the robot while maintaining the efficiency or throughput for which the robots were employed in the first place.

Software-based safety, which now comes standard with most robots, relies heavily on smart sensing technologies such as safe-zone-monitoring sensors, vision systems and force-torque sensing. But visualization of the zones is still caged inside the machine HMIs. Programmable tower lights can play a significant role in visualizing a robot's working conditions. Traditional stack lights could only show the robot's current operating status, but new programmable tower lights can be mapped directly to the robot's working condition. For example, as someone enters a robot's operating zone, the light can show the robot slowing down by using a level mode of operation. The tower light can also show whether maintenance is required and what type, or it can show the performance condition of the robot, such as its efficiency. With this on-demand, on-site visualization, people working around robots get a direct indication of how their behavior affects the operation's efficiency, much like a digital gauge in a car showing how driving behavior affects mileage.
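The mapping Rege describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API; the segment count, state names and color scheme are all assumptions chosen for the example.

```python
SEGMENTS = 10  # assume a 10-segment programmable tower light

def tower_light_pattern(zone_state: str, speed_fraction: float) -> list:
    """Map robot status to one color per light segment, bottom to top.

    zone_state:     "clear", "outer" (person in outer zone) or
                    "inner" (person in inner zone)
    speed_fraction: current speed as a fraction of full speed, 0.0-1.0;
                    in level mode, the lit column rises with speed.
    """
    color = {"clear": "green", "outer": "yellow", "inner": "red"}[zone_state]
    lit = round(max(0.0, min(1.0, speed_fraction)) * SEGMENTS)
    return [color] * lit + ["off"] * (SEGMENTS - lit)

# Robot at half speed while a person is in the outer zone: the bottom
# five segments glow yellow, visibly signaling the slowdown.
print(tower_light_pattern("outer", 0.5))
```

Because the pattern is just data, the same function could drive the light over whatever fieldbus or IO-Link channel the installation already uses.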

About the Author

Mike Bacidore | Editor in Chief

Mike Bacidore is chief editor of Control Design and has been an integral part of the Endeavor Business Media editorial team since 2007. Previously, he was editorial director at Hughes Communications and a portfolio manager of the human resources and labor law areas at Wolters Kluwer. Bacidore holds a BA from the University of Illinois and an MBA from Lake Forest Graduate School of Management. He is an award-winning columnist, earning multiple regional and national awards from the American Society of Business Publication Editors. He may be reached at [email protected]