
Vision System Makes Robot a Guitar Hero

Nov. 11, 2009
Roxanne the Robot Is Now a Guitar Hero Master. This Is Thanks to Ingenious Machine Design and Integration of Vision System Technology. See How the Rrobot Became an Overnight Video Game Player Pro

Pete Nikrin graduated from Minnesota West Community and Technical College in Pipestone, Minn., in 2008 and now works as a manufacturing engineer at Meier Tool & Engineering (www.meiertool.com) in Anoka, Minn. He designed a robot to compete with a friend whom he had introduced to the Guitar Hero game and who, after just two weeks of playing, had surpassed him.

Bill Manor, robotics instructor at Minnesota West, suggested Nikrin incorporate a vision sensor with a right-angle lens from Banner Engineering, which Minnesota West had purchased in a startup education kit.

To develop his Guitar Hero robot, Nikrin used a mannequin—complete with Minnesota West sweatshirt, hat and painted fingernails—and installed the camera lens as the robot's left eye, which would be positioned toward the TV or computer screen.

[Photo] GRUNGE VISION: Minnesota West's Roxanne can hit 100% accuracy at times on Guitar Hero's medium mode, thanks to a vision system incorporated by its creators. (Photo: Banner Engineering)

The robot, named Roxanne, identified the notes to be played by using an Edge vision tool, which detects, counts and locates transitions between bright and dark pixels in an image area.
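For readers unfamiliar with edge-based tools, the underlying idea can be sketched in a few lines of Python: scan a row of pixels and report where the brightness crosses between dark and bright. The threshold value and sample data below are illustrative only, not taken from Banner's sensor.

    # Minimal sketch of an edge-based detection, assuming an 8-bit grayscale
    # pixel row; the threshold and example values are hypothetical.
    def find_edges(pixel_row, threshold=128):
        """Return the indices where the row transitions between dark and bright."""
        edges = []
        for i in range(1, len(pixel_row)):
            prev_bright = pixel_row[i - 1] >= threshold
            curr_bright = pixel_row[i] >= threshold
            if prev_bright != curr_bright:      # dark->bright or bright->dark
                edges.append(i)
        return edges

    # Example: a dark note circle with a bright white dot at its center
    row = [20, 22, 25, 30, 240, 245, 238, 28, 24, 21]
    print(find_edges(row))   # two transitions bracket the bright dot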

"We set up five Edge tools that ran horizontally across the screen, one for every fret, and positioned the tools to focus on the notes at the bottom of each," said Nikrin. "The Edge tools sent a constant signal as the five vertical fret lines progressed, and when a bright white dot appeared in the middle of a dark colored circle, the Edge tool allowed the sensor to detect it."

Jeff Curtis, senior applications engineer at Banner (www.bannerengineering.com), worked with Nikrin and Manor to ensure the robot's processing time was fast enough to keep up with the video game. Once a note was identified, communicating that signal efficiently depended on a good deal of programming, as well as Ethernet communication through a Modbus register. A PLC was programmed to constantly monitor the vision sensor's register. When the Edge tool sensed a note, the PLC detected the change in the register, and its logic fired a solenoid that activated the robot's finger. Just as a human player would react, the finger then pressed down on the appropriate note on the guitar. This setup resulted in a 9-msec processing speed.
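The article does not detail the PLC program or the sensor's register map, but a rough Python analogue of the polling loop looks like the following. It assumes the pymodbus library; the sensor IP address, register address, bit-per-fret encoding and fire_solenoid() helper are all hypothetical stand-ins.

    # Sketch of a Modbus polling loop, written in Python for clarity;
    # the real logic ran in a PLC scanning the vision sensor's register.
    import time
    from pymodbus.client import ModbusTcpClient   # pymodbus 3.x import path

    SENSOR_IP = "192.168.0.1"     # hypothetical address of the vision sensor
    NOTE_REGISTER = 100           # hypothetical register updated by the Edge tools

    def fire_solenoid(fret_bit):
        """Stand-in for the PLC output that presses the robot's finger."""
        print(f"press fret {fret_bit}")

    def poll_sensor():
        client = ModbusTcpClient(SENSOR_IP)
        client.connect()
        last = 0
        try:
            while True:
                result = client.read_holding_registers(NOTE_REGISTER, count=1)
                value = result.registers[0]
                if value != last:             # a change means a note was detected
                    for bit in range(5):      # assume one bit per fret
                        if value & (1 << bit) and not last & (1 << bit):
                            fire_solenoid(bit)
                    last = value
                time.sleep(0.001)             # poll fast enough for a ~9-msec response
        finally:
            client.close()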

The team also needed to ensure Roxanne could play within a range of lighting conditions, as well as confirm the robot was correctly oriented with the monitor displaying the video game. They solved this problem by using a Locate tool, an edge-based vision tool that finds the absolute or relative position of a target in an image by locating its first edge.
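The idea behind an edge-based Locate tool can be sketched the same way: find the first bright feature in the image, in this case the reflective tape Curtis describes next, and use its measured position to compensate for small shifts of the monitor. The threshold and expected tape position below are illustrative.

    # Sketch of locating a reference feature by its first dark-to-bright edge;
    # the threshold and calibration position are hypothetical.
    def locate_first_edge(pixel_row, threshold=200):
        """Return the index of the first dark-to-bright transition, or None."""
        for i in range(1, len(pixel_row)):
            if pixel_row[i - 1] < threshold <= pixel_row[i]:
                return i
        return None

    EXPECTED_TAPE_X = 12          # where the tape sits when the robot is aligned

    def alignment_offset(pixel_row):
        """How far the image has shifted relative to the calibrated position."""
        tape_x = locate_first_edge(pixel_row)
        if tape_x is None:
            raise RuntimeError("reference tape not found - check lighting/orientation")
        return tape_x - EXPECTED_TAPE_X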

"We honed a Locate tool and gave it a fixed point—a piece of reflective tape on the PC monitor—to focus on," said Curtis.