"Beam me up, Scotty" is a generational phrase associated with the future that we imagined but thought would never come. I beg to differ.
Although demolecularizer/remolecularizers might be far-fetched, many other technologies from Star Trek are real today, with more on their way, I suspect.
In our daily industrial comfort zone, we rely on sensors and controllers to collect data and send it to us so we can make decisions. SCADA and HMI are our windows into our real-time world; historical data gives us a look back.
An operator might have only one screen to look at, or many. The assault of information is huge. Well, that is about to change. The interface we have with our beloved process now has options.
Remember Comdex from the '80s? It has been replaced by the Consumer Electronics Show (CES) in Las Vegas. CES is consumer-based, but industry is more entwined with consumer technology than ever.
Take Google's Project Glass. It involves stylish eyewear that has a touchpad on the side, full audio in and out, and the ability to send voice commands to the mother ship, and early design specifications suggest that the glasses could respond to head movements.
Such an interface bodes well for the floor supervisor, who could receive an email from a robot indicating it is short on parts, or that the chip-n-saw is stalled due to low oil pressure. A voice command could then call the maintenance department to relay the information for near-zero response time.
Although you could say it's a walking screen, this ability to be an unobtrusive tool could have lots of advantages. We might consider it non-disruptive, since it is an incremental move from any handheld device.
Nintendo's Wii interface, with its handheld gyros, has huge potential. The Wii controller, in my view, can easily be used for 3D modeling, viewing and zooming with intuitive hand gestures. Many SCADA systems employ 3D, but it's tough to manipulate on a touchscreen.
Microsoft's Kinect gaming system shows similar promise. The ability to use hand and body gestures has a big future. The buzz with Windows 8 is "touch." Kinect is about non-touch. Move your hand forward, and it zooms. Move your hand quickly, and the screen changes. Raise three fingers, and the screen changes to a predetermined view.
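At its simplest, a gesture interface like this reduces to a lookup from recognized gestures to HMI actions. Here is a minimal sketch in Python; the gesture names and actions are invented for illustration and do not correspond to any real Kinect SDK.

```python
# Hypothetical sketch: dispatching recognized gestures to HMI actions.
# Gesture and action names are assumptions made for this example.

HMI_GESTURES = {
    "push_forward": "zoom_in",        # move your hand forward -> zoom
    "swipe_fast": "next_screen",      # fast hand movement -> change screen
    "three_fingers": "preset_view_3", # three raised fingers -> preset view
}

def dispatch(gesture: str) -> str:
    """Map a recognized gesture to an HMI action; unknown gestures do nothing."""
    return HMI_GESTURES.get(gesture, "no_op")

print(dispatch("push_forward"))  # zoom_in
print(dispatch("wave"))          # no_op
```

In a real system, the hard part is the recognizer that produces the gesture name in the first place; the mapping itself stays this simple.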
As Jay Leno might say, "What could go wrong with that?" While hand gestures could be good for navigating, they might not be precise enough for changing analog setpoints.
I once used my buddy's camera and found that the focus point was controlled by where your eye looked. You looked at the object in the frame you wanted to give highest priority, and that's where the camera focused.
Wouldn't you know, a company developed a technology that tracks your eye movements and follows where you look. The technology is PCEye tracking, and it is accurate to within 1 mm.
My immediate thought was of a man I met in Florida many years ago. He was a quadriplegic. He controlled his wheelchair using his mouth and a pressure tube, but he had little other way of interfacing with his world. PCEye changes all that.
With voice commands and eye movement, anyone can control their PC. The technology replaces the mouse and keyboard, which is cool. But imagine where this can take us.
You have seen pictures of control rooms with massive banks of screens and dynamic displays. Multiple operators are needed simply to cover the amount of screen real estate and functionality on display.
Imagine changing the application's "focus" to wherever you are looking. Accuracy of 1 mm is fairly precise, I would say. That would let us look at an object on a screen to change the view, call up its data, or even send a message.
When you consider the options, the future can be very exciting. Imagine the possibilities. Captain Kirk did that with every turn in the galaxy. Nothing was ever ignored.