As manufacturing and IT converge, the battles from 30 years ago over who will be in charge are returning. Back then, plant-floor people wrestled with IT over who was responsible for the minicomputers, PLCs and data acquisition equipment that controlled factories and processes.
The IT people claimed that computers fell under their responsibility, and the plant-floor people claimed that PLCs were relay replacers, not computers. Finally, after many bloody territorial wars, a truce was declared: IT people kept control of accounting and payroll computers, and plant people got the PLCs and industrial computers.
A tour of data centers in Holland and the Hannover Fair industrial trade show in Germany revealed the battle over data center control is currently raging in Europe, and it will soon reach the North American shores.
RFID is a major driver, because it's becoming so pervasive. RFID tags are being put on everything from incoming parts and components to systems being shipped out the door. The various pieces are tracked as they travel through the factory or process plant, and RFID data goes into and out of a host of computers.
ERP, CMMS, asset management and MES also are drivers, because these IT programs use data collected on the factory floor. Such data has to be acquired, sorted, formatted, packed up and shipped off via a plant network to various packages that need it.
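The acquire-format-ship step described above can be sketched in a few lines. The field names and message shape here are hypothetical illustrations, not taken from any of the packages the article names:

```python
import json
from datetime import datetime, timezone

def package_reading(tag_id: str, station: str, value: float) -> str:
    """Format one plant-floor reading as a JSON message for a downstream
    package (ERP, MES, CMMS and the like). All field names are assumptions
    chosen only to illustrate the acquire-format-ship step."""
    record = {
        "tag_id": tag_id,      # e.g., an RFID tag read at a station
        "station": station,    # where the reading was taken
        "value": value,        # the measured or counted quantity
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

# One reading, packed up and ready to ship over the plant network:
msg = package_reading("EPC-0042", "paint-line-3", 87.5)
```

In practice, the sorting and routing happens in middleware between the plant network and each consuming package; this sketch covers only the formatting step.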
By one estimate, a typical manufacturing plant has 200 different software programs processing this data. Historically, these packages ran on their own computers, which were located in control rooms, maintenance offices, accounting departments, IT computer rooms, QC labs or back at the home office.
However, modern server technology makes it possible to put all the computers into a single data center. IT workers, maintenance people, accountants, plant engineers and control engineers use client workstations, while the applications are housed in the servers. If you use CMMS software, you are probably already running it on a server and accessing it from handheld terminals and plant floor workstations.
As the various servers congregate in a data center, questions arise: Where will the data center be located? Who will be in charge of operations? Who will maintain it?
Brian Koch, a product manager at American Power Conversion Corp. (www.apc.com), says he's seen IT people and facility managers almost come to blows over the issue. He works with Rockwell Automation (RA) in a partnership where RA uses APC's UPS products in its automation systems, so he's been in a lot of plants with data centers in Europe and North America.
IT and plant-floor people have different attitudes toward maintenance of these systems, Koch says. For example, IT's solution to a problem often is to shut down and reboot, which is fine for an accounting system. Plant-floor people know you can't do that with control systems.
Koch says that European companies, especially large operations, are willing to give up individual computers in exchange for servers, because it's more economical. "We see most data centers being installed in the manufacturing areas of large companies," he notes. Companies in North America are much slower to accept servers and data centers, but they are coming around, especially in new plants.
In addition to the territorial battles looming over who controls what, several maintenance and plant engineering problems arise with data centers. These include supplying clean power to the servers, cooling them, providing security, and maintaining the entire data center complex.
For example, installing dozens of pizza-box server computers in 19-in. racks poses severe power and heat problems. If a cabinet uses 30 kW of power to run the computers, it generates 30 kW of heat. In most new data centers, there is a return to the raised floor design of computer rooms of 30 years ago, where cold air comes out of the floor, blows through the racks, exits out the back, and then is drawn off. Sizing the air handlers, ducts, fans and so on is a considerable problem for plant engineers.
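The 30 kW rack example invites a quick sizing check. A common rule of thumb for standard air is Q[BTU/hr] = 1.08 × CFM × ΔT[°F]; the 20 °F temperature rise assumed below is illustrative, not a figure from the article:

```python
def required_airflow_cfm(heat_kw: float, delta_t_f: float = 20.0) -> float:
    """Estimate the airflow (CFM) needed to carry away a rack's heat load,
    using the standard-air rule of thumb Q[BTU/hr] = 1.08 * CFM * dT[F].
    The 20 F air-temperature rise is an assumed value for the sketch."""
    btu_per_hr = heat_kw * 3412.14  # 1 kW of heat = 3412.14 BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

# A single 30 kW cabinet needs on the order of 4,700 CFM at a 20 F rise:
print(round(required_airflow_cfm(30.0)))
```

Numbers like these are why sizing the air handlers, ducts and fans for rows of such cabinets is a real engineering job rather than an afterthought.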
Companies are re-evaluating the role of the raised floor. "Some customers are trying to cool their data centers at the rack or row level versus cooling at the room level," Koch says. This helps solve hot-spot issues. Just another problem for a plant engineer.
At Delft University in the Netherlands, it was freezing inside the data center. At the Deloitte Cyber Centre in Amsterdam, it was comfortable.
Supplying backup power to keep critical manufacturing systems running is another plant engineering problem. Deloitte has mirrored data centers in case of a catastrophic failure: either data center can run the facility if the other fails. Both Delft and Deloitte use APC's UPSs and backup diesel generators to keep the systems running through power outages. Both of these backups require a great deal of engineering and planning.
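The either-site-can-carry-the-load rule can be reduced to a toy selector. This is an illustration of the idea only, not Deloitte's actual failover logic, which would also weigh replication state and operator approval:

```python
def active_site(primary_ok: bool, secondary_ok: bool) -> str:
    """Pick which mirrored data center carries the load. A minimal
    sketch of the rule described in the article: either center can
    run the facility if the other fails."""
    if primary_ok:
        return "primary"
    if secondary_ok:
        return "secondary"
    # Both sites down: ride through on UPS and diesel while crews respond.
    return "outage"
```

The hard engineering is not this decision but everything behind it: keeping the two sites synchronized so the standby copy is actually fit to take over.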
One of the most difficult plant engineering problems is sizing a UPS to fit current and future needs, says Erik Ubels, director of the data center at Deloitte. "We now use modular UPSs from APC, so we can simply plug in what we need when we expand," he explains. "In the past, we had to size UPSs to accommodate what we thought we would need several years in the future. That was very expensive. Modular UPSs solve that problem."
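Ubels' point about modular sizing can be illustrated with a small calculation. The 10 kW module size and the N+1 spare below are assumptions made for the sketch, not APC specifications:

```python
import math

def modules_needed(load_kw: float, module_kw: float = 10.0,
                   spare_modules: int = 1) -> int:
    """How many plug-in UPS modules cover a given load, plus spares for
    redundancy (an N+1 scheme). The 10 kW module rating is an assumed
    value; actual module sizes vary by product line."""
    return math.ceil(load_kw / module_kw) + spare_modules

# Capacity grows with the load instead of being bought years in advance:
print(modules_needed(35.0))  # 4 modules for the load + 1 spare = 5
```

The economic point is in the default arguments: when the plant expands, the same call with a larger load tells you how many modules to plug in, with no stranded capacity purchased up front.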
In Europe, energy is a major issue, thanks to the high cost of power and reliance on Russia to supply natural gas. When Russia cut off gas to Ukraine during a political squabble, it sent shivers down the spines of governments and industries across the continent. In The Netherlands, laws were passed to reward companies that conserved energy.
Therefore, when Deloitte built its two mirrored data centers, they were designed for maximum energy efficiency. In the winter, heat from the servers is directed into the building's floor, where it provides much of the heating requirements. The HVAC system also draws in cool outside air when ambient conditions warrant. Because of this, Deloitte gets a considerable incentive from the government.
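The outside-air strategy amounts to a simple control check. The setpoint and margin below are illustrative assumptions, not Deloitte's actual values; a real economizer control would also weigh humidity and enthalpy:

```python
def use_outside_air(outdoor_c: float, supply_setpoint_c: float = 18.0,
                    margin_c: float = 2.0) -> bool:
    """Dry-bulb economizer check: draw in outside air whenever it is
    comfortably below the supply-air setpoint. Setpoint and margin are
    assumed values chosen for this sketch."""
    return outdoor_c <= supply_setpoint_c - margin_c

print(use_outside_air(5.0))   # a cool Dutch day: free cooling
print(use_outside_air(25.0))  # a warm day: mechanical cooling needed
```

Every hour the check returns true is an hour the chillers sit idle, which is where the energy savings, and the government incentive, come from.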
When a company puts its computer resources into a data center, it has to protect them, which is another job for plant engineers. Paul Boontz, project manager at Delft University, says they had to rebuild their data center completely because a disgruntled student burned down the original one. The new data center is protected by multiple doors, ID cards, pass codes and walls that resist forced entry, fire and attacks.
Delft's security pales in comparison to Deloitte's data center fortress. Deloitte, a worldwide financial and consulting company, derives almost all its income from work and services done in the data center. The center is surrounded by a high fence, a guardrail to keep vehicles from smashing through, pressure sensors in the parking lot to detect the movements of vehicles, infrared sensors to track the comings and goings of people, computers that monitor the sensors looking for suspicious behavior, and multiple levels of security, starting at the front door.
For a visitor, getting into the data center's inner sanctum, the server farm, requires a guide, an ID card, passage through multiple entry points and, finally, a man-trap entry with two doors. Security computers know who the visitors are and where they are at all times. Setting up such a system is a new challenge for plant engineers.
The final challenge is maintenance of the data center, a job that requires knowledge of computer hardware, power, UPSs, backup generators, HVAC systems and security. It's almost an entirely new maintenance category, requiring very specific training.
Designing and building data centers in manufacturing plants also requires those same specific skills. APC, which was once just a UPS and power conditioner vendor, now helps customers design and build data centers. "We had to help out," says Koch, "because nobody else knew how to do it. Now we've become experts."
In a few years, when data centers become commonplace in North American facilities, Plant Services magazine readers also may become experts in this new field.