About the Author
After working as a semiconductor process engineer, Hank Hogan hung up his cleanroom suit and now writes about process control and other technologies from Austin.
To exercise control, you need intelligence, and that requires information. In the machine world, that job falls to data acquisition (DAQ) systems. These consist of sensors, measurement hardware and software that sample real-world physical conditions and convert them into something a computer understands. Armed with that information, control and other decisions can then be made.
Setting up a DAQ system properly requires getting things right at the beginning, matching sensor capabilities to information needs and, as in real estate, paying attention to location, location, location. It’s also important not to forget about the future.
For Graham Green, senior DAQ product manager at National Instruments, the key to getting the right results from the whole process lies in getting things right at the beginning. It’s common for engineers to start by evaluating hardware products when building a data acquisition system, but that is the wrong tack to take, he says.
“A better approach is starting with the decision you are planning to make based upon your measurement data,” Green says. “A good starting point is documenting exactly what data would provide the necessary insights to make that decision.”
Two basic areas have the greatest potential to hamstring data collection and hinder decision making, Green says: the count and mix of channels through which the data will be collected, and the ability to analyze and display the information once acquired.
With regard to the first, it’s important to collect data on a device or system in its environment. Controlling the noise of an engine, for example, requires measuring that noise, engine conditions and possibly the environment. That means there must be enough channels of the right mix, such as one for a microphone array, another for engine speed, a third for temperature, and so on.
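As a rough illustration of what such a channel plan might look like, here is a short Python sketch; the channel names, sensor choices, sample rates and ranges are hypothetical, not drawn from any particular test or vendor’s hardware.

```python
# Hypothetical channel plan for an engine noise test. Names, rates and
# ranges are illustrative only, not tied to any specific hardware.
from dataclasses import dataclass

@dataclass
class Channel:
    name: str              # what the channel measures
    sensor: str            # sensor type feeding the channel
    sample_rate_hz: int    # samples per second
    range_desc: str        # expected signal range / units

channel_plan = [
    Channel("cabin_noise", "microphone array (4 mics)", 51_200, "+/-5 Pa"),
    Channel("engine_speed", "optical tachometer", 1_000, "0-8000 rpm"),
    Channel("exhaust_temp", "K-type thermocouple", 10, "0-900 C"),
    Channel("ambient_temp", "platinum RTD", 1, "-20-60 C"),
]

for ch in channel_plan:
    print(f"{ch.name:13s} {ch.sensor:28s} {ch.sample_rate_hz:>8} S/s  {ch.range_desc}")
```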
As for what to do with the data collected, the software must be capable of analysis. This may be through vendor-supplied application-specific toolkits, custom software developed in-house that can be imported into the analysis package or run stand-alone, or some combination of the two. In any case, the data acquisition software has to transform raw data into something from which a decision can be made.
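A minimal sketch of that transformation, assuming a single microphone channel: scale raw ADC counts into pascals, compute a sound pressure level and compare it against a pass/fail limit. The scaling factors and the 80 dB limit are illustrative values, not from any real test.

```python
# Turn raw samples into a decision: scale ADC counts to sound pressure,
# compute an RMS level in dB SPL, and compare against a limit.
import math

def counts_to_pascals(counts, full_scale_counts=32768, full_scale_pa=5.0):
    """Convert signed 16-bit ADC counts to sound pressure in pascals."""
    return [c / full_scale_counts * full_scale_pa for c in counts]

def spl_db(pressures_pa, p_ref=20e-6):
    """RMS sound pressure level in dB relative to 20 uPa."""
    rms = math.sqrt(sum(p * p for p in pressures_pa) / len(pressures_pa))
    return 20 * math.log10(rms / p_ref)

raw = [1200, -950, 1500, -1320, 1100]        # stand-in for acquired samples
level = spl_db(counts_to_pascals(raw))
print(f"Level: {level:.1f} dB SPL -> {'PASS' if level < 80 else 'FAIL'}")
```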
Larry Trammell, technical marketing director at Microstar Laboratories, adds that determining the channel count and mix requires understanding both the data needs and the sensor capabilities. If a signal source, for example, drifts such that its repeatability is only plus or minus a percent, then a 16-bit measurement is overkill because the least significant eight bits contain only noise.
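One rough way to put numbers on that argument is to estimate how many bits of a 16-bit word the source’s repeatability can actually support. The figures below are a back-of-the-envelope illustration, not a precise noise analysis.

```python
# Back-of-the-envelope check of how many ADC bits a drifting source can use.
# With repeatability of only about +/-1% of full scale, anything finer is noise,
# so roughly the bottom half of a 16-bit word carries no usable information.
import math

adc_bits = 16
repeatability = 0.01                         # ~1% of full scale

useful_bits = math.log2(1 / repeatability)   # ~6.6 bits carry real information
wasted_bits = adc_bits - useful_bits

print(f"Useful resolution: ~{useful_bits:.1f} bits")
print(f"Bits buried in source noise/drift: ~{wasted_bits:.1f} of {adc_bits}")
```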
Aside from drift in the signal source itself, there are also the sensors to consider, which determine the kind of signal conditioning and data processing necessary. For instance, a negative temperature coefficient (NTC) thermistor yields a large voltage swing but requires complicated linearization, Trammell says. A platinum resistance temperature detector (RTD), on the other hand, delivers good linearity over a modest voltage range but with a large common-mode offset. A third type of temperature sensor, the thermocouple, produces microvolt-level signals with noise roughly comparable to the signal itself.
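To give a concrete sense of the linearization a thermistor demands, the sketch below applies the Steinhart-Hart equation; the coefficients are typical textbook values for a 10 kΩ NTC device, not for any specific part.

```python
# One example of the "complicated linearization" an NTC thermistor needs:
# the Steinhart-Hart equation, 1/T = A + B*ln(R) + C*ln(R)^3.
# Coefficients below are typical textbook values for a 10 kOhm thermistor.
import math

A, B, C = 1.129148e-3, 2.34125e-4, 8.76741e-8

def thermistor_temp_c(resistance_ohm):
    """Convert measured thermistor resistance to temperature in Celsius."""
    ln_r = math.log(resistance_ohm)
    temp_k = 1.0 / (A + B * ln_r + C * ln_r ** 3)
    return temp_k - 273.15

for r in (32650, 10000, 3603):   # roughly 0, 25 and 50 C for a 10k NTC
    print(f"{r:>6} ohm -> {thermistor_temp_c(r):6.1f} C")
```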
Setting the spacing in time of the incoming samples is also important in preserving the information needed for a decision. “Too many samples clog the system with redundancy. Too few samples lose information to aliasing. The right number of samples will provide sufficient noise immunity and preserve meaningful bandwidth,” Trammell says.
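A quick way to sanity-check that balance is to work from the Nyquist criterion, which requires sampling at more than twice the highest frequency of interest. The bandwidth, margin and channel count in the sketch below are assumed values for illustration.

```python
# Sample-rate sanity check: Nyquist says sample at more than twice the highest
# frequency of interest; a practical margin covers anti-alias filter roll-off.
bandwidth_hz = 10_000                 # assumed highest frequency of interest
nyquist_rate = 2 * bandwidth_hz
practical_rate = 2.5 * bandwidth_hz   # illustrative margin above Nyquist

duration_s = 10
channels = 6
samples = practical_rate * duration_s * channels

print(f"Nyquist minimum: {nyquist_rate} S/s, practical: {practical_rate:.0f} S/s")
print(f"10 s on {channels} channels -> {samples:,.0f} samples to store and process")
```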
Equally important, and perhaps more so, is how to handle the spatial distribution of signal sources. If the sources are closely related and confined to a limited area, then capturing and delivering data in a coordinated manner makes sense, explains Trammell. If they are spread out in several pockets, then a small modular system at each activity location is probably best. Highly distributed sensors with low data density can be a wiring nightmare, one that can be alleviated through the use of smart sensors that tap into network communication lines.
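As a rough sketch of the smart-sensor approach, a node might push its readings onto the network rather than run analog wiring back to a central chassis. The host address, port and message format below are assumptions for illustration, not a description of any particular product.

```python
# Minimal sketch of a "smart sensor" node publishing readings over the network
# to a central collector instead of using dedicated analog wiring.
import json
import socket
import time

COLLECTOR = ("192.168.1.50", 5005)     # hypothetical central DAQ host and port

def read_local_sensor():
    """Stand-in for the node's local measurement (e.g., degrees C)."""
    return 23.7

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for _ in range(3):                     # a few readings for the example
    msg = {"node": "pump-station-3",
           "temp_c": read_local_sensor(),
           "timestamp": time.time()}
    sock.sendto(json.dumps(msg).encode(), COLLECTOR)
    time.sleep(1.0)
sock.close()
```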