
Think global I/O, connect (sometimes) local

Jan. 15, 2014
The two flavors of I/O connections: the hard-wired, local variety and the distributed, remote type.
About the Author

After working as a semiconductor process engineer, Hank Hogan hung up his cleanroom suit and now writes about process control and other technologies from Austin.

I/O connections come in two flavors. There's the hard-wired, local variety and the distributed, remote type. For machine builders, deciding which way to go basically comes down to the number of nodes and the distances to them. Generally, both users and suppliers will tell you that machines with more I/O points located farther away come out ahead with a distributed, remote connection strategy.

"If you have a smaller volume of I/O, and they're not spaced that far away from the control cabinet itself, then sticking with the traditional parallel wiring, everything wired back to the PLC, is probably the most cost-effective option," says Jason Haldeman, lead product marketing specialist for I/O and Light at Phoenix Contact.

The location of this dividing line for the number of nodes is fixed by two constraints. The first is a hard number. "PLCs have only so many connections available inside their chassis," Haldeman says. "If the number of I/O points causes that to be exceeded, one option is to go to an expansion chassis, but those tend to be expensive." He adds that, in situations such as these, it might make sense to go with distributed I/O since it sidesteps chassis limits.

The second constraint is cost. A hard-wired approach brings everything back to a control panel, which tends to be centrally located, and there's a cost associated with all that wiring, says Joe Wenzel, regional marketing lead for Rockwell Automation.

Typically, a hard-wired approach requires landing six terminations per I/O point. Each termination involves stripping wire and inserting it into a termination block, a process that's not free.

"A lot of customers will say it costs them $10 per point, for example, to do that," Wenzel says. "So every wire I do here could be $60 or some number. And there's also the risk of miswiring with a higher number of terminations."

In contrast, remote I/O often requires landing only two terminations. Better still is a sensor hung directly on the network, which requires no conventional wired connections at all.
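Taken together, those figures suggest a simple back-of-envelope comparison. The sketch below uses the $10-per-termination figure and the six-versus-two termination counts cited above; the fixed cost assumed for the remote I/O drop is a made-up placeholder, not a vendor number.

    # Back-of-envelope wiring-cost comparison. The $10-per-termination figure
    # and the six-vs.-two termination counts come from the article; the fixed
    # cost of the remote I/O drop is a hypothetical placeholder.
    COST_PER_TERMINATION = 10       # dollars, per the "$10 per point" example
    HARDWIRED_TERMS_PER_POINT = 6   # six terminations per hard-wired I/O point
    REMOTE_TERMS_PER_POINT = 2      # roughly two terminations per remote point
    REMOTE_DROP_COST = 500          # hypothetical fixed cost of a remote I/O drop

    def hardwired_cost(points: int) -> int:
        """Termination cost for a traditional parallel-wired panel."""
        return points * HARDWIRED_TERMS_PER_POINT * COST_PER_TERMINATION

    def remote_cost(points: int) -> int:
        """Termination cost plus the fixed cost of one remote I/O drop."""
        return REMOTE_DROP_COST + points * REMOTE_TERMS_PER_POINT * COST_PER_TERMINATION

    # Find the I/O count where distributed wiring starts to win on cost alone.
    for n in range(1, 101):
        if remote_cost(n) < hardwired_cost(n):
            print(f"Break-even at about {n} I/O points "
                  f"(hard-wired ${hardwired_cost(n)}, remote ${remote_cost(n)})")
            break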

At some number of nodes, the savings from wiring alone will tip the scales toward a distributed approach. A node cluster also can be an argument for using distributed I/O, at least in selected areas.

"If I have a distant machine location where I have areas with high densities of I/O, then it would make sense to put a distributed I/O system out in those areas," says Rockwell Automation's regional marketing lead, Scot Wlodarczak.

Any wiring expense might not be a one-time event, and that could make remote I/O more attractive, comments Thomas Edwards, senior technical advisor at Opto 22. Very large machines might be assembled once by the system builder, tested and then shipped in pieces to the customer. Once there, the machine has to be reassembled and checked out again. What's more, after initial installation, a machine could be moved as factory floors are modernized or expanded.

"If the machine is big enough that it would get shipped or moved or even moved around the plant in pieces, making the connections becomes a real pain in the neck," Edwards says. "It's a big expense, and it's a very common point of failure, probably a much higher failure rate than any of the electronics. If you can eliminate that, you have a more reliable and less expensive machine."

As for distance, longer runs tend to favor distributed I/O. All signals lose strength when traveling down a wire, if for no other reason than the resistance of the wire and the resulting voltage drop. That loss is more pronounced as data rates go up due to the rapid switching of the signal. Thus, there's a need to periodically boost the signal. Ethernet-based technologies offer very inexpensive ways to do this. About 100 meters of wiring can run between signal-boosting switches.
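As a rough illustration of that 100-meter limit, a short sketch can count the intermediate switches a single long cable run would need; the run lengths here are purely hypothetical.

    import math

    SEGMENT_LIMIT_M = 100  # copper Ethernet segment limit between switches

    def switches_needed(run_length_m: float) -> int:
        """Intermediate signal-boosting switches for one cable run."""
        if run_length_m <= SEGMENT_LIMIT_M:
            return 0
        return math.ceil(run_length_m / SEGMENT_LIMIT_M) - 1

    # Hypothetical runs from the control panel to distant I/O drops.
    for run in (80, 250, 450):
        print(f"{run} m run -> {switches_needed(run)} intermediate switch(es)")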

Johnston Hall, commercial engineer for PLCs, networks and distributed I/O at Omron Automation and Safety, notes that the decision to go remote might be easier if other devices are present. Devices that might already be running remotely on a network connection include bar code readers, ID tags, servos, motors and even vision systems.

"If you have to communicate to them anyway, you need remote I/O," Hall explains. "It's another reason to think that putting the digital I/O on the same cable might make sense."

One past reason for going with local I/O no longer applies, says Kurt Wadowick, I/O and safety specialist at Beckhoff Automation. Local, hard-wired connections once were significantly faster than remote, distributed technology, but that is no longer true if the right connection technology is used.

"With EtherCAT, the data is organized right at the I/O module to the exact process image the PLC program needs, and is transferred directly to the controller's memory without requiring any effort from the controller CPU," Wadowick says.

He adds that typical update rates possible with such an approach are 200 16-bit analog points in 50 µs. Some 256 digital I/O points can update in 12 µs.
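Those figures are consistent with the 100 Mbit/s line rate of the Fast Ethernet physical layer EtherCAT uses. A quick arithmetic check, counting payload bits only and ignoring frame overhead, puts them in perspective.

    # Quick arithmetic on the quoted update rates (payload bits only;
    # Ethernet/EtherCAT frame overhead is ignored for simplicity).
    def payload_rate_mbps(points: int, bits_per_point: int, cycle_us: float) -> float:
        """Process-data rate in Mbit/s: bits per microsecond equals Mbit/s."""
        return points * bits_per_point / cycle_us

    analog = payload_rate_mbps(points=200, bits_per_point=16, cycle_us=50)
    digital = payload_rate_mbps(points=256, bits_per_point=1, cycle_us=12)

    print(f"200 x 16-bit analog points every 50 us -> {analog:.1f} Mbit/s")
    print(f"256 digital points every 12 us         -> {digital:.1f} Mbit/s")
    # Both figures sit comfortably under Fast Ethernet's 100 Mbit/s line rate.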

Finally, it's important to remember that the question of local versus remote is not "either/or" but "and." All PLCs have some provision for local connections and most installations make use of the capability. Thus, the majority of machines have a mix of local and remote I/O connections, says Kevin Wu, distributed I/O product manager for industry and factory automation at Siemens Industry.

The push to implement a distributed, remote I/O system comes when connections grow too numerous or runs too long. Remote I/O also might be implemented as part of a retrofit to add safety features, but everything begins with the local connection.

"A remote I/O station only happens when there's a local I/O station," Wu says. "Local always comes first because that's basically the location that all your field devices are closest to. As you expand the number of devices, that's where you have to consider a remote I/O rack."