To exercise control, you need intelligence, and that requires information. In the machine world, that job falls to data acquisition (DAQ) systems. These consist of sensors, measurement hardware and software that sample and convert real-world physical conditions into something a computer understands. Armed with that, control and other decisions can then be made.
Setting up a DAQ system properly requires getting things right at the beginning, matching sensor capabilities to information needs, and, as in real estate, paying attention to location, location and location. It’s also important not to forget about the future.
For Graham Green, senior DAQ product manager at National Instruments, the key to getting the right results from the whole process lies in getting things right at the beginning. It’s common for engineers to start by evaluating hardware products when building a data acquisition system, but that is the wrong tack to take, he says.
“A better approach is starting with the decision you are planning to make based upon your measurement data,” Green says. “A good starting point is documenting exactly what data would provide the necessary insights to make that decision.”
Two basic areas have the greatest potential to hamstring data collection and hinder decision making, says Green: the channel count and channel mix through which the data will be collected, and the ability to analyze and display the information acquired.
With regard to the first, it’s important to collect data on a device or system in its environment. Controlling the noise of an engine, for example, requires measuring that noise, engine conditions and possibly the environment. That means there must be enough channels of the right mix, such as one for a microphone array, another for engine speed, a third for temperature, and so on.
As for what to do with the data collected, the software must be capable of analysis. This may be through vendor-supplied application-specific toolkits, custom in-house developed software that can be imported into the analysis package or run as a stand-alone, or some combination. In any case, the data acquisition software has to transform raw data into something from which a decision can be made.
Larry Trammell, technical marketing director at Microstar Laboratories, adds that determining the channel number and mix requires understanding both the data needs and sensor capabilities. If a signal source, for example, drifts such that its repeatability is only plus or minus a percent, then a 16-bit measurement is overkill because the least significant eight bits only contain noise.
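Trammell's point can be made concrete with some back-of-the-envelope arithmetic. The short Python sketch below estimates how many bits of an ADC's resolution survive a given source repeatability; the helper function and its name are illustrative, not part of any vendor's toolkit.

```python
import math

def effective_bits(adc_bits: int, repeatability_fraction: float) -> float:
    """Rough estimate of how many ADC bits carry real information when
    the source wanders by +/- repeatability_fraction of full scale.
    (Hypothetical helper for illustration, not a vendor API.)"""
    full_scale_counts = 2 ** adc_bits
    # Counts spanned by a one-sided drift of the given fraction.
    drift_counts = repeatability_fraction * full_scale_counts
    # Bits consumed by drift; whatever is left is meaningful.
    return adc_bits - math.log2(drift_counts)

# A source repeatable only to +/- 1% buries at least the bottom eight
# bits of a 16-bit converter, leaving fewer than seven meaningful bits.
print(round(effective_bits(16, 0.01), 1))  # 6.6
```

By this estimate, anything beyond roughly a 10-bit converter adds digits that only record the source's own drift.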
Aside from drift in the signal source itself, there are also the sensors to consider, which determine the kind of signal conditioning and data processing necessary. For instance, a negative temperature coefficient (NTC) thermistor yields a large voltage swing but requires complicated linearization, Trammell says. A platinum resistance temperature detector (RTD), on the other hand, delivers good linearity over a modest voltage range but with a large common-mode offset. A third type of temperature sensor, the thermocouple, produces microvolt-level signals with quantization noise roughly equal to the signal.
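As an illustration of the linearization burden an NTC thermistor imposes, the standard Steinhart-Hart equation can be applied in a few lines of Python. The coefficients below are typical textbook values for a 10 kΩ part, used purely for illustration; real values come from the sensor's datasheet or a three-point calibration.

```python
import math

# Example Steinhart-Hart coefficients for a 10 kOhm NTC thermistor.
# These are illustrative; use the values from your sensor's datasheet.
A, B, C = 1.129148e-3, 2.34125e-4, 8.76741e-8

def ntc_temperature_c(resistance_ohms: float) -> float:
    """Linearize an NTC thermistor reading with the Steinhart-Hart
    equation: 1/T = A + B*ln(R) + C*ln(R)**3, with T in kelvin."""
    ln_r = math.log(resistance_ohms)
    kelvin = 1.0 / (A + B * ln_r + C * ln_r ** 3)
    return kelvin - 273.15

print(round(ntc_temperature_c(10_000.0), 1))  # 25.0, the part's rated point
```

An RTD, by contrast, can often be handled with a simple polynomial or even a linear fit over a modest span, which is exactly the trade-off Trammell describes.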
Setting the spacing in time of the incoming signals is also important in preserving the information needed for a decision. “Too many samples clog the system with redundancy. Too few samples lose information to aliasing. The right amount of samples will provide sufficient noise immunity and preserve meaningful bandwidth,” Trammell says.
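The fold-back arithmetic behind aliasing is simple enough to sketch. The Python snippet below computes the apparent frequency of an undersampled tone (the function name is illustrative; the math is the standard folding about multiples of the sample rate).

```python
def alias_frequency(signal_hz: float, sample_rate_hz: float) -> float:
    """Frequency at which a sampled tone appears, folding about the
    nearest multiple of the sample rate."""
    nearest_multiple = round(signal_hz / sample_rate_hz)
    return abs(signal_hz - nearest_multiple * sample_rate_hz)

# A 60 Hz component sampled at only 50 S/s masquerades as 10 Hz.
print(alias_frequency(60.0, 50.0))   # 10.0
# Sampled above the Nyquist rate, it appears at its true frequency.
print(alias_frequency(60.0, 200.0))  # 60.0
```

Once a component has folded down in this way, no amount of post-processing can distinguish it from a genuine low-frequency signal, which is why the sample rate must be settled before data collection starts.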
Equally and perhaps even more important is how to handle the spatial distribution of signal sources. If closely related and coming from a limited area, then capturing and delivering data in a coordinated manner makes sense, explains Trammell. If spread out in several pockets, then a small modular system at each activity location is probably best. Highly distributed sensors of low data density can be a wiring nightmare, which can be alleviated through the use of smart sensors that tap into network communication lines.
Environmental considerations should not be overlooked, says Bill McGovern, national sales manager at Dataforth. For instance, having a sensor outdoors or on a plant floor means it is subject to wider temperature swings than would be the case if it were in an office, and this environmental factor must be accounted for.
The outside environment can also impact a data acquisition system in other ways. For example, local weather can cause voltage surges that travel down wiring. “If you’re operating in a laboratory condition, there’s probably not a problem with that at all. But if you’re bringing in signals from outdoors or an adjacent building, they’re very real conditions that need to be addressed,” McGovern says of such voltage swings.
Thus, there may be a need for input protection on individual channels. Channels may also need individual configuration because of the nature of the data being collected. A temperature reading, for instance, likely needs only low bandwidth if just a few digits are updated every few minutes. Vibration data, on the other hand, may consist of tens of thousands of readings a second and so requires a high-bandwidth channel.
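A per-channel configuration along these lines might be captured in a simple record. The sketch below is hypothetical; the field names and rates are illustrative and not tied to any vendor's DAQ API.

```python
from dataclasses import dataclass

@dataclass
class ChannelConfig:
    """Illustrative per-channel settings: name, bandwidth, protection."""
    name: str
    sample_rate_hz: float   # the data's bandwidth need drives this
    input_protection: bool  # surge clamps for field wiring

channels = [
    # A temperature point updated every few minutes needs almost no bandwidth,
    # but its outdoor wiring warrants surge protection.
    ChannelConfig("boiler_temp", sample_rate_hz=0.01, input_protection=True),
    # Vibration needs tens of thousands of samples per second.
    ChannelConfig("motor_vibration", sample_rate_hz=25_600.0,
                  input_protection=False),
]

for ch in channels:
    print(f"{ch.name}: {ch.sample_rate_hz} S/s, protected={ch.input_protection}")
```

Keeping such settings explicit per channel is what lets one system mix slow thermal points with fast vibration channels without over-provisioning either.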
Beyond the ability to adjust individual channels, there also is often a requirement to modify an entire system. This can happen as a pilot solution is expanded into one for production and manufacturing. Then the ability to add on as needed is important, and so modularity of a data acquisition system is vital, McGovern says.
He adds that, since a DAQ system often has to interface with existing software and hardware, capabilities such as handling a legacy serial communication link are important. So, too, is testing the system at startup, as well as setting up procedures for its general maintenance and periodic recalibration once in use.
Meeting data-acquisition needs entails figuring out what data is needed and how it will be used, along with defining measurement hardware, interface requirements and software, says Steve Byrom, product manager in the control instruments business division at Yokogawa of America. However, settling those issues only completes part of the job.
Any decision involving a data acquisition system should also consider the scalability of the DAQ solution. Doing so allows for future growth and changing requirements while protecting the investment. Such future-proofing involves answering some questions about the system.
“Can I add more channels later and do it easily?” asks Byrom. “Can my new DAQ system connect to other plant systems such as PLCs using common data protocols such as Modbus and EtherNet/IP?”
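The sort of plant-system connectivity Byrom describes rests on well-documented wire formats. As a sketch of what that looks like at the protocol level, this Python snippet assembles a raw Modbus TCP "read holding registers" request per the published Modbus application protocol (the function here is illustrative; production systems would normally use an established Modbus library instead of hand-built frames).

```python
import struct

def modbus_read_holding_registers(transaction_id: int, unit_id: int,
                                  start_address: int, count: int) -> bytes:
    """Build a raw Modbus TCP request for function 0x03 (read holding
    registers): MBAP header (transaction id, protocol id 0, remaining
    byte count, unit id) followed by the PDU."""
    pdu = struct.pack(">BHH", 0x03, start_address, count)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_read_holding_registers(1, 1, 0x0000, 2)
print(frame.hex())  # 000100000006010300000002
```

Because the framing is an open standard, a DAQ system that speaks it can poll or be polled by PLCs from different vendors, which is precisely the interoperability question Byrom raises.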
Finally, what’s required in data acquisition is often a unique mix of channels and analysis, Green says. This setup can change over time with greater understanding of the problem, making modularity, flexibility and expansion capability important. Those traits can pay off in making sure a DAQ system does what it was built to do.
“Having the ability to customize a flexible system to meet the specific needs of your application will also ensure confidence in the data-driven decisions you make,” says Green.