David A. Bruce is engineering manager, general industry & automotive segment, at FANUC America. He answered our questions about robotic integration.
As industrial robot installations continue to rise, they continue to be integrated into work cells for palletizing, machine tending, welding and other industrial applications. What innovative or efficient robot-integration applications have you seen or been involved with?
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: In my role in the general industry and automotive segment supporting all our intelligent options, I get the chance to work on proof-of-concept systems for large potential applications. One that I worked on back in 2020 has turned into a multiple-system order for one of the largest logistics companies on the planet.
The application is flat sorting, sometimes called package induction: in a logistics facility, packages that are mostly, but not always, flat must be singulated into a highly automated sorting system at a rate of more than 1,000 parts per hour.
The challenge with this application involves not only very good machine vision to identify the package type and the best pick point, but also efficient, intelligent path planning that uses the absolute maximum acceleration possible: too much acceleration rips the package off the tooling, while too little puts the system behind its goal rate.
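The acceleration tradeoff described above can be sketched numerically. The following is an illustrative calculation, not FANUC software; the friction-style gripper model, the triangular motion profile, and all parameter names and values are hypothetical assumptions.

```python
# Hypothetical sketch of the acceleration window for a vacuum pick:
# the rate target sets a floor on acceleration, the gripper sets a ceiling.

def max_safe_accel(package_mass_kg: float,
                   suction_force_n: float,
                   safety_factor: float = 1.5) -> float:
    """Peak horizontal acceleration (m/s^2) before the package is
    ripped off the tooling, with a safety margin applied (simplified model)."""
    g = 9.81
    # Net holding force beyond what gravity already consumes.
    holding = suction_force_n - package_mass_kg * g
    if holding <= 0:
        raise ValueError("suction cannot hold the package even statically")
    return holding / (package_mass_kg * safety_factor)

def min_accel_for_rate(move_distance_m: float,
                       parts_per_hour: float,
                       overhead_s: float = 0.8) -> float:
    """Acceleration needed for a triangular accel/decel move that fits
    in the cycle time left after vision and gripping overhead."""
    cycle_s = 3600.0 / parts_per_hour
    move_s = cycle_s - overhead_s
    if move_s <= 0:
        raise ValueError("overhead alone exceeds the cycle budget")
    # Triangular profile: d = a * (t/2)^2  =>  a = 4d / t^2
    return 4.0 * move_distance_m / move_s ** 2

needed = min_accel_for_rate(1.2, 1000)   # floor: m/s^2 to hit 1,000 pph
limit = max_safe_accel(0.5, 30.0)        # ceiling: m/s^2 before losing the package
assert needed <= limit, "rate target not achievable with this gripper"
```

The path planner's job is essentially to keep every move inside this window while the window itself changes with each package's mass and pick point.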
This application is almost completely manual today and is one of the top applications that companies are looking to automate, and with the right technology and integration it is a perfect fit for vision-guided robotics.
Robot duty is also a large concern for these systems, because the robot works as fast as it can with a different path each cycle. The specific robot model and its precise placement in the work cell determine whether the robot will perform well and last several years, and this is where an offline simulation tool helps: it provides very accurate estimates of duty and motion speeds and allows our team to identify the best robot model and its best position in the system.
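One common way a duty estimate of this kind is expressed is as the RMS motor torque over a cycle compared with the motor's continuous rating. The sketch below is a generic illustration of that idea, not FANUC's simulation tool; the sample values and the torque rating are invented for the example.

```python
# Illustrative duty estimate: RMS joint torque over one cycle as a
# percentage of the motor's continuous (rated) torque.
import math

def duty_percent(torque_samples_nm, rated_torque_nm) -> float:
    """Return RMS torque as a percentage of the continuous rating.
    Sustained values over 100% suggest the joint is overworked."""
    rms = math.sqrt(sum(t * t for t in torque_samples_nm) / len(torque_samples_nm))
    return 100.0 * rms / rated_torque_nm

# Hypothetical torque samples (N*m) for one joint over one pick cycle.
cycle = [40, 55, 60, 30, 10, 5]
print(round(duty_percent(cycle, rated_torque_nm=50.0), 1))  # -> 78.5
```

An offline simulator repeats this kind of calculation per joint over many simulated cycles, which is why it can rank candidate robot models and mounting positions before any hardware is ordered.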
What have been the biggest improvements to robot integration and interoperability with other robots or components in work cells in the past five years?
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: FANUC’s version of this is dual-check safety, which lets a user specify zones, in Cartesian or joint space, where the robot cannot be, or must be, based on safety-rated signals from other devices in the work cell.
It is called dual-check because two separate processors constantly monitor the user-specified requirements against the robot’s position and current speed vector; if the two processors ever disagree, a fault stops the robot.
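The dual-check idea can be sketched in a few lines. This is a conceptual illustration only, not FANUC's implementation: in a real controller the two monitors run on independent hardware, and the zone, speed and state names here are invented for the example.

```python
# Conceptual sketch of dual-check zone monitoring: two independent
# verdicts on the same safety rule, with any disagreement treated as a fault.
from dataclasses import dataclass

@dataclass
class Zone:
    """Axis-aligned Cartesian box the tool center point must stay inside."""
    xmin: float; xmax: float
    ymin: float; ymax: float
    zmin: float; zmax: float

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.xmin <= x <= self.xmax and
                self.ymin <= y <= self.ymax and
                self.zmin <= z <= self.zmax)

def monitor(zone: Zone, tcp, speed: float, speed_limit: float) -> bool:
    """One safety processor's verdict: inside the zone and under the limit."""
    return zone.contains(*tcp) and speed <= speed_limit

def dual_check(zone: Zone, tcp, speed: float, speed_limit: float) -> str:
    verdict_a = monitor(zone, tcp, speed, speed_limit)
    verdict_b = monitor(zone, tcp, speed, speed_limit)  # independent hardware in reality
    if verdict_a != verdict_b:
        return "FAULT"                  # processors disagree: stop immediately
    return "RUN" if verdict_a else "STOP"

work = Zone(0.0, 2.0, 0.0, 1.5, 0.0, 1.2)
print(dual_check(work, (0.5, 0.5, 0.5), speed=0.8, speed_limit=1.0))  # RUN
print(dual_check(work, (2.5, 0.5, 0.5), speed=0.8, speed_limit=1.0))  # STOP (outside zone)
```

The point of the redundancy is that a single failed processor cannot silently approve an unsafe state: agreement is required before the robot is allowed to keep moving.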
As manufacturers and warehouses/distribution centers push toward autonomous operations, what is the benefit of collaborative robots (cobots) and their ability to work alongside humans?
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: While the flexibility and shorter engineering time of a collaborative robot–based system are attractive, the fact is collaborative robots are slower than regular industrial robots, which means they are simply not an option in applications with very high throughput requirements.
That being said, many applications are prototyped using a cobot simply because cobots are much easier and safer to work with in the lab than regular industrial robots. Not having to separate a cobot from its environment is a clear cost savings in material and engineering time, and the user interfaces of most cobots are generally much easier to navigate than those of traditional industrial robots. This makes setup and execution of a simple robotic system much easier, provided the cobot can achieve the required throughput.
When will robotics technology become user-friendly enough that integration, installation and operation is plug-and-play and no longer requires extensive engineering?
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: I am not one to make predictions, and I have been in the industry long enough to have heard many predictions of dramatically shorter engineering time for robotic systems. Engineering time certainly has decreased over those 20-plus years, but through small advances, in simulation, dual-check safety and vision, for example. Realizing a successful, reliable robotic manufacturing system still requires some due diligence, but that time is decreasing and will continue to decrease.
What future innovations will impact the integration of robotics technology in work cells and with other industrial machinery?
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: Deep learning for image recognition and segmentation is really starting to be used quite a lot in robotic systems as a way to handle very complex scenes and determine where the robot should move to complete its task, especially in singulation applications in logistics and elsewhere.
The next big artificial-intelligence (AI) advancement that could dramatically expand the use of industrial robotics is deep reinforcement learning for path planning, where raw pixel data from a vision system is used to decide on the specific joint angles that move the robot to complete a task efficiently, without any explicit instructions from conventional programming languages.
This could be considered the holy grail for AI and robotics, and there are a number of small and large companies trying to realize it, but so far it has been only somewhat successful in the lab.
Tell us about your company’s state-of-the-art robotic integration into a work cell or transporting materials/goods between cells.
David A. Bruce, engineering manager, general industry & automotive segment, FANUC America: All FANUC robots come ready for FANUC’s fully embedded machine-vision system, iRVision, which supports 2D and 3D cameras and makes adding full-featured vision guidance to any application very easy. No extra computer hardware is required with iRVision because the robot controller is also the machine-vision processor. With FANUC’s latest controller platform, R-30iB Plus, iRVision is in its third generation, this time using a single 75-ohm coaxial cable interface for both 2D and 3D cameras.
There is also a new feature for mobile robots: a FANUC robot mounted on an autonomous mobile robot (AMR) can use a robot-mounted 2D iRVision camera to locate each fixture or machine the AMR delivers it to (Figure 1). This option, called the One-Marker Offset, makes locating each workspace in full 3D easy and economical.