By Jim Montague, Executive Editor
There's an embarrassment of riches in data collection these days. In fact, there are so many methods to gather, archive and analyze production information that it can be difficult for users to sort out and implement the most useful and affordable data acquisition (DAQ) solution for their applications. Some new tools and methods can make this job easier.
"Data collection is now simpler than it's ever been, thanks to networking and connectivity enabled by standard technologies, interfaces and protocols," says Ben Orchard, applications engineer at Opto 22 (www.opto22.com). "We now have PACs and PLC-based data-collection systems with sensors, processing capability and memory, plus networking capability and database connectivity."
This all starts with Ethernet because it's a standard physical medium that can tie data-collection hardware directly to the archive or database. "Sensing and data-collection systems that use Ethernet can deliver acquired data to Microsoft SQL, Access or another database or archive running on the PC," says Orchard. "Perhaps the most significant shift is the use of standard protocols to translate machine data into formats that can easily be input, understood, interpreted and passed on. Today, this is done by the PACs and PLC-based collection hardware that communicate via standard industrial protocols such as Modbus, Profibus, EtherNet/IP and others, along with standard IT protocols such as FTP for data transfer and SMTP for email."
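That machine-to-database path is easy to picture in code. The sketch below builds and parses Modbus/TCP "Read Holding Registers" frames per the open Modbus specification and logs the values to a SQL database; sqlite3 stands in here for the Microsoft SQL or Access archive Orchard mentions, and the table name and schema are illustrative assumptions, not any vendor's format.

```python
import sqlite3
import struct

def build_read_request(txn_id, unit, start, count):
    """Build a Modbus/TCP request for function 0x03 (Read Holding Registers)."""
    pdu = struct.pack(">BHH", 0x03, start, count)          # function, start addr, quantity
    mbap = struct.pack(">HHHB", txn_id, 0, len(pdu) + 1, unit)  # MBAP header
    return mbap + pdu

def parse_read_response(frame):
    """Extract 16-bit register values from a Modbus/TCP 0x03 response frame."""
    txn_id, proto, length, unit, func, byte_count = struct.unpack(">HHHBBB", frame[:9])
    if func != 0x03:
        raise ValueError("unexpected function code: %#x" % func)
    return list(struct.unpack(">%dH" % (byte_count // 2), frame[9:9 + byte_count]))

def log_values(conn, values):
    """Archive the parsed register values in a SQL table (illustrative schema)."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (channel INTEGER, value INTEGER)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", list(enumerate(values)))
    conn.commit()
```

In practice the request frame would go out over an Ethernet socket to the PAC or PLC; the point is that both ends of the path, the industrial protocol and the SQL archive, are open standards.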
Similarly, some developers have built DAQ devices that combine signal-conditioning and conversion tasks, and then employ software tools to log and archive the resulting data. For example, Remote Data Acquisition (ReDAQ) system software from Dataforth (www.dataforth.com) assigns an address to each hardware device's data slot, polls each one for its process data value, and records up to one value per second. "Users can access the resulting HTML- and XML-based database with a web browser and configure the database to show charts and tables, as well as perform emulation, simulation and alerts," says Bill McGovern, Dataforth's national sales manager. "We think this will help traditional distributed control systems (DCSs) and cabinets evolve from centralized control rooms into truly distributed systems: a few shoebox-sized units with stand-alone PC capabilities, spread over an entire site, each containing the whole archive for that facility."
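The address-then-poll pattern described above can be sketched in a few lines. This is a generic illustration, not Dataforth's implementation: `read_channel` is a hypothetical driver call standing in for the hardware read, and the XML layout is assumed, not ReDAQ's actual schema.

```python
import time
import xml.etree.ElementTree as ET

def sample_loop(read_channel, addresses, duration_s, interval_s=1.0):
    """Poll each addressed data slot once per interval (one value per
    second by default) and collect readings as browsable XML.
    read_channel(addr) is a hypothetical stand-in for the device read."""
    root = ET.Element("readings")
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        for addr in addresses:
            rec = ET.SubElement(root, "reading", address=str(addr),
                                timestamp="%.3f" % time.time())
            rec.text = str(read_channel(addr))
        time.sleep(interval_s)
    return ET.tostring(root, encoding="unicode")
```

Because the log is plain XML, a web browser or any standard parser can render it as charts and tables, which is the accessibility McGovern is pointing to.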
Likewise, instead of merely collecting data on a set schedule, these new data-acquisition methods allow users to gather more targeted information in response to particular events and conditions. Jan Pingel, FactoryTalk Historian product manager at Rockwell Automation (www.rockwellautomation.com), says, "To get the right data and calculations quickly, more users employ automatically configuring historians, which can check the tags on a data server, usually via OPC connections, and show results on a spreadsheet. We extend this by allowing them to create and move files and show relevant tags to configure, which makes our historian easier to use. In fact, our upcoming machine historian will become part of OEMs' initial discovery, setup, validation and factory acceptance test (FAT) process, instead of being added after the equipment is already installed. This historian will sit next to the controller on the backplane and join the user's overall control strategy by giving a wider and deeper view of the machine and production process."
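The tag-checking step Pingel describes, browsing a data server's namespace and configuring the historian from what's found, reduces to a recursive walk. The sketch below is a minimal, generic version: `browse` is a hypothetical stand-in for an OPC browse call, not Rockwell Automation's actual API.

```python
def discover_tags(browse, root=""):
    """Recursively walk a data server's tag namespace and return the
    fully qualified names of every leaf tag, ready to historize.
    browse(path) -> list of (name, is_leaf) pairs; it is a hypothetical
    stand-in for an OPC browse request."""
    tags = []
    for name, is_leaf in browse(root):
        path = f"{root}.{name}" if root else name
        if is_leaf:
            tags.append(path)           # a measurable point: record it
        else:
            tags.extend(discover_tags(browse, path))  # a folder: descend
    return tags
```

An auto-configuring historian runs something like this at setup time, then presents the discovered tag list, on a spreadsheet, as Pingel notes, for the user to prune rather than type in from scratch.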
Because DAQ is becoming much more software-based, however, it's crucial for users to plan and pick the right file formats for organizing their data, reports Brett Burger, data acquisition product manager at National Instruments (www.ni.com). "It's a lot like properly organizing your desktop. We're fans of open-source .TDMS (technical data management streaming) files," says Burger. "Users also need flexible and versatile DAQ systems that can perform many types of measurements in one box and adapt and grow as needed. For example, our CompactDAQ and CompactRIO can mix and match parts to do multiple measurements and pull signal conditioning tasks into their modules so users can focus on their measurements."
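Burger's "organizing your desktop" analogy maps to the three-level hierarchy TDMS uses: a file contains groups, groups contain channels, and descriptive properties can hang off any level. The plain-Python model below illustrates that organization only; it is not the binary .tdms format (open-source libraries such as npTDMS read and write real files), and the group, channel, and property names are invented for the example.

```python
def make_tdms_layout():
    """Model the file -> group -> channel hierarchy of a TDMS file.
    Properties (metadata) can attach at the file, group, or channel
    level; raw measurement samples live on the channels."""
    return {
        "properties": {"author": "test rig"},        # file-level metadata
        "groups": {
            "Vibration": {                            # one logical test group
                "properties": {"sample_rate_hz": 10000},
                "channels": {
                    "accel_x": [0.01, 0.02, 0.015],   # sample data
                },
            },
        },
    }

def channel_data(layout, group, channel):
    """Look up one channel's samples by group and channel name."""
    return layout["groups"][group]["channels"][channel]
```

Keeping metadata attached at the level it describes is what makes the format searchable later, which is the organizational payoff Burger is advocating.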