
See Your Vision Improve

Sept. 11, 2007
Half the Battle in Vision/Motion Control Integration Is Getting the Two Systems to Talk the Same Language
About the Author
In 2007, Loren Shaum was a contributing editor for Industrial Networking and Control Design, and principal at Comtec, based in Syracuse, Ind., which provides research in the machine and general factory automation markets.

Having been associated with vision technology since its “practical” introduction in the early ’80s, I’ve seen machine vision go from 50-lb boat-anchor versions to sleek, highly intelligent embedded camera systems. Compared to other machine automation technologies, vision has advanced perhaps the most over the past 10 years.

Early on, Control Design watched vision expand from an end-user-initiated, offline inspection tool into an input-sensor option for machine control systems. But it was a slow journey. Given its expense and complexity, machine vision was a solution looking for a problem.

Motion and Vision

In Control Design’s August 1999 cover story, “20/20 Motion Control,” technical editor Rich Merritt began to see the promise emerge. He found that National Instruments had built vision and motion control hardware and software around its PC-based LabVIEW package. “While it doesn’t offer the complete openness of Visual Basic and ActiveX, NI does provide an integrated programming package for developing applications,” he reported. “This is important because half the battle in vision/motion control integration is getting the two systems to talk the same language.”

Merritt found this to users’ liking. “A piece of cake,” is how Dino Farina, president of Image Therm Engineering, Waltham, Mass., described working with NI’s hardware and software in that article. Farina built an optical micrometer with an infrared laser to measure and map the thickness of silicon wafers. The system used vision to locate the proper starting point, and then moved the laser in X-Y directions to scan various sections of the wafer. Equipment used included a 200 MHz Pentium running Windows NT, a plug-in image-acquisition board, and a plug-in motion control board. As the laser traversed the surface, the vision system read 30 frames per second. “We just plugged in the boards, used NI’s virtual instrumentation (VI) software to communicate, and had the hardware up and running in a few minutes,” said Farina. This was a harbinger of things to come.
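
Farina’s workflow amounts to a simple vision/motion hand-off: use the camera to find a reference point, then step the laser across the wafer in X-Y while sampling at roughly the camera’s frame rate. The sketch below is a minimal, simulated illustration of that loop, not NI’s actual LabVIEW or board-level API; every hardware call here is a hypothetical stand-in.

```python
# Minimal sketch of a vision-guided X-Y scan like the one Farina describes.
# All hardware functions are simulated stand-ins, not a real vendor API.

import random
import time

FRAME_PERIOD_S = 1.0 / 30            # ~30 frames per second, per the article

def locate_start_point():
    """Stand-in for the vision step that finds the wafer's reference point."""
    return (0.0, 0.0)                  # (x, y) in mm, simulated

def move_to(x, y):
    """Stand-in for the motion board commanding an X-Y move."""
    pass                               # a real system would wait for in-position

def read_thickness():
    """Stand-in for one laser-micrometer sample."""
    return 725.0 + random.uniform(-0.5, 0.5)   # microns, simulated

def scan_wafer(step_mm=5.0, size_mm=50.0):
    x0, y0 = locate_start_point()
    thickness_map = {}
    y = y0
    while y <= y0 + size_mm:
        x = x0
        while x <= x0 + size_mm:
            move_to(x, y)
            thickness_map[(x, y)] = read_thickness()
            time.sleep(FRAME_PERIOD_S)          # pace readings at the frame rate
            x += step_mm
        y += step_mm
    return thickness_map

if __name__ == "__main__":
    readings = scan_wafer()
    print(f"collected {len(readings)} thickness samples")
```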

The FDA Assist

In April 2000, “QC Inspection Eyes Help From Vision Systems” confirmed the trend. “The big push into vision started in the early 1990s, when the FDA said a vision system could be used instead of human quality control personnel in the pharmaceutical industry,” said Ed Rogan, senior vision marketing specialist at Omron Electronics. “The FDA said pharmaceutical companies could replace 200% manual inspection (two people examining the same product) with a vision system operating 100% of the time.” Rogan said it didn’t take long for pharmaceutical companies to start installing vision systems for quality control inspections.

As system prices dropped, we reported a dilemma in our August 2002 machine vision roundup. “While it’s clear the drop in the cost of machine vision has brought many users into the camp, the flip side is some potential users actually are wondering if a $1,500 system really can do the same job as one that cost $10,000 just a few years ago.” We found the answer is training: “Vision suppliers are upping their commitment to provide training and education to spread the gospel.”

Rapid Growth

In a November 2003 TechFlash column, senior technical editor Dan Hebert reported, according to the Automated Imaging Association, “smart camera sales grew significantly in terms of dollar amounts, chalking up a whopping 42% increase in sales.” The stage was set for an onslaught of “smaller, cheaper, faster, better” smart camera vision mania from the likes of DVT, Cognex, PPT, and others. Trade associations predicted strong growth for smart cameras, and vendors confirmed this trend. “We sold 15,000 smart cameras from 1996, when the company started, through 2001,” said Endre Toth, director of business development for Vision Components, in the article. “In 2002 alone, we sold 10,000 smart cameras.”

Hebert helped define the smart camera revolution by reporting how “a smart camera not only captures an image, it also contains a frame grabber and internal memory. Images captured by a smart camera can be transmitted via industry-standard digital data interfaces such as Ethernet, FireWire, and CameraLink digital video. They are often equipped with local diagnostics to detect problems, and transmit information about malfunctions via their digital interface. Many smart cameras have internal logic that can be user-programmed to make real-time control decisions. Some can be linked in peer-to-peer networks to create a field-based control system. Smart cameras have much in common with smart field devices such as instruments and sensors used in distributed control systems.”
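
The division of labor Hebert describes is easy to picture in code: the camera grabs the frame, runs the user-programmed inspection locally, and ships only the result over Ethernet rather than streaming raw images to a host. The following is a minimal sketch of that idea under invented assumptions; the frame data, the inspection rule, and the result-message format are all hypothetical, not any vendor’s protocol.

```python
# Minimal sketch of smart-camera behavior: capture, decide locally, transmit
# only the result. Frame contents, inspection rule, and message format are
# invented for illustration.

import json
import random
import time

def grab_frame(width=64, height=48):
    """Stand-in for the on-board frame grabber: an 8-bit grayscale frame."""
    return [[random.randint(0, 255) for _ in range(width)] for _ in range(height)]

def inspect(frame, threshold=128, min_bright_fraction=0.4):
    """User-programmed logic run inside the camera: a simple brightness check."""
    pixels = [p for row in frame for p in row]
    bright = sum(1 for p in pixels if p > threshold) / len(pixels)
    return bright >= min_bright_fraction, bright

def result_message(passed, bright_fraction):
    """What the camera might send over Ethernet instead of the raw image."""
    return json.dumps({
        "timestamp": time.time(),
        "result": "PASS" if passed else "FAIL",
        "bright_fraction": round(bright_fraction, 3),
    })

if __name__ == "__main__":
    passed, fraction = inspect(grab_frame())
    print(result_message(passed, fraction))   # e.g., delivered to a PLC or host PC
```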

In 2004, Merritt noted how far vision had come in terms of price/performance. “Applications for machine vision cover a wide spectrum,” he wrote. “We spoke to five companies that use DVT’s vision systems. Three are doing parts inspections, one is bending tiny parts, and one is moving an airplane fuselage into position.” The latter application is quite complex even by today’s standards and was deployed by system integrator Delta Sigma of Acworth, Ga. “We take measurements from eight cameras to drive a 14-axis servo system to align large sections of an aircraft fuselage,” said Roger Richardson, president. “Our measurement and positioning accuracy are better than 0.001 in.”
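
The core of that application is a closed loop: fuse the camera measurements into an error estimate, command a servo correction, and repeat until the error falls inside tolerance. The sketch below is a deliberately simplified, single-axis simulation of that loop, not Delta Sigma’s system; the camera noise model, control gain, and convergence behavior are all assumptions made for illustration.

```python
# Minimal sketch of vision-driven alignment: average several camera readings,
# command a proportional correction, repeat until within tolerance.
# Single axis and noise model are hypothetical simplifications.

import random

TOLERANCE_IN = 0.001                  # positioning target quoted in the article

def camera_offsets(true_error_in, n_cameras=8, noise_in=0.0002):
    """Stand-in for several cameras each measuring the section's misalignment."""
    return [true_error_in + random.gauss(0.0, noise_in) for _ in range(n_cameras)]

def align_section(initial_error_in=0.250, gain=0.8, max_iterations=50):
    error = initial_error_in
    for iteration in range(max_iterations):
        measurements = camera_offsets(error)
        measured = sum(measurements) / len(measurements)   # fuse camera readings
        if abs(measured) < TOLERANCE_IN:
            return iteration, error
        error -= gain * measured                            # servo correction move
    return max_iterations, error

if __name__ == "__main__":
    steps, residual = align_section()
    print(f"converged in {steps} moves, residual error {residual:.4f} in")
```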

Future Foreseen

As the machine vision market consolidates, the technology expands. In our June 2006 machine vision roundup, we learned from Frost & Sullivan analyst Vishnu Sivadevan that high-end machine vision applications are moving to 3-D. “Upgrading to 3-D inspection systems from 2-D would be a phenomenal leap in performance for certain applications,” observed Sivadevan.

Researchers are working toward real-time, autonomous robotic guidance using machine-vision systems. Sivadevan said the U.K. Engineering and Physical Sciences Research Council is working with a group of U.K. universities to incorporate artificial intelligence in robotics. The Department of Electrical and Electronic Engineering at the University of Manchester developed a vision chip capable of foveal and peripheral vision similar to the retina of the human eye.

Defined as a smart sensor, the vision chip combines a vision sensor with a microprocessor capable of processing complex images at rapid rates. It is expected to be used in laser-guided crawlers and for tasks such as machining and inspection.