What is Big Data’s role in process optimization?

By Bob Rice

From the 20,000-foot view, Big Data involves the collection, consolidation, and analysis of data that is disparate in both source and format, with the goal of uncovering new insights and creating value. While Big Data got its start in the consumer and financial services sectors, its potential for unearthing valuable information has more recently captured the attention of the global manufacturing sector. Indeed, Big Data represents a major component of Industry 4.0 and the Industrial Internet of Things (IIoT), and it capitalizes on cloud computing as well. But buzzwords aside, what role does Big Data play in optimizing the production processes that serve as the foundation of manufacturing?

Before digging into the ways in which Big Data can be used in process optimization, it’s valuable to consider why Big Data is only now making its entrance into the manufacturing realm. Three key factors stand out: data, storage, and analytics. Over the past several years the manufacturing industry has seen a dramatic drop in the costs of both sensor technologies and data storage. Those two developments have made it cost-effective for manufacturers to collect and store more data than was previously feasible. The third factor – significant advances in analytical technologies – is equally important, as innovative analytics are driving new predictive diagnostic and optimization capabilities.

So how does Big Data facilitate optimization at a typical production facility? Here are a few examples – the tip of the proverbial Big Data iceberg:

Nonobvious analytics
For years, alerts and alarms have pointed out obvious issues affecting plant performance. Triggered when a given constraint is surpassed, these simple notifications have helped staff improve their management of expansive and hazardous manufacturing environments. But alerts and alarms are limited: they are confined to simple thresholds such as HiHi and LoLo and to equally simple operators (e.g., total change greater than X, or value below a threshold for Y amount of time). In contrast, Big Data solutions can analyze the dynamic relationships that exist among multiple data sources, revealing insights no single threshold can. As one example, the same spectral analysis tools used to decode audio signals can assess frequencies within a plant's process data to isolate the root cause of performance issues.
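To make the spectral-analysis idea concrete, here is a minimal sketch, not drawn from the article itself, that uses NumPy and simulated sensor readings (the function name, signal, and sampling parameters are all illustrative assumptions) to pull a dominant oscillation frequency out of ordinary process data:

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Return the strongest frequency component (in Hz) of a process signal."""
    # Subtract the mean so the DC component does not dominate the spectrum.
    centered = signal - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]

# Simulated sensor data: a 0.5 Hz oscillation buried in noise,
# sampled at 10 Hz for 60 seconds.
rng = np.random.default_rng(42)
t = np.arange(0, 60, 0.1)
reading = 50 + 2 * np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 0.5, t.size)

print(round(dominant_frequency(reading, sample_rate_hz=10), 3))  # → 0.5
```

A hidden 0.5 Hz cycle like this, invisible to a HiHi/LoLo alarm because the value never leaves its band, shows up plainly as a spike in the frequency spectrum.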

Operational intelligence
Most practitioners would agree that, historically, their plants operated as a collection of related silos. Each functional group – engineering, operations, maintenance, and so on – focused narrowly on its respective area, and information and communication were confined largely to the staff within that group. As a result, few if any personnel had a complete operational picture. A key benefit of Big Data solutions is their ability to draw on information from across those functional silos. Some diagnostic solutions, for instance, use everyday process data to uncover common mechanical issues that would otherwise go unnoticed. Using all available plant data to uncover performance issues transcends the limits imposed by traditional roles and enhances the operational intelligence of all plant staff.

Changing mindsets
There was a time not too long ago when data was captured and stored only because a given manufacturer was legally required to do so – think of government agencies such as the EPA, the FDA, and the NRC and the regulations they impose. Widespread at the time was the view that data was little more than a needless expense, collected only to satisfy some government requirement. Perspectives have clearly changed with time and innovation, and many manufacturers now see their data as a potential profit center. General Electric is just one company that has built a business around Big Data, investing heavily in predictive analytics and saving customers millions in otherwise lost productivity. The company's data clustering and machine-learning technologies help proactively identify conditions that lead to asset failure and unplanned downtime.

Raising standards
All process manufacturers strive for consistency in their production. But consistency can vary with factors as seemingly inconsequential as the season of the year, the individuals who staff each shift, and the source of production inputs, among others. Individually, these differences may not be significant, but in aggregate they can have a profound effect on control, and the ability of staff to first recognize and then adjust for so many subtle changes is limited. Big Data can help. It allows manufacturers to understand the daily changes in production inputs, staffing, environmental conditions, and the like, helping them maintain – if not elevate – their standards. From KPIs that assess the overall health of an entire plant to those that evaluate the oscillatory behavior of a single PID loop, Big Data is enabling today's manufacturers to establish tighter control and raise their standards.
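As one illustration of a loop-level KPI like the one mentioned above, the sketch below implements a hypothetical oscillation metric (not taken from any particular vendor's toolset): it scores a control error signal on how regular the intervals between its zero crossings are, since a sustained limit cycle produces very regular intervals while noise-like behavior does not.

```python
import numpy as np

def oscillation_index(error, sample_time_s=1.0):
    """Crude oscillation KPI for a control loop: the coefficient of variation
    of the intervals between zero crossings of the control error.
    Values near 0 suggest a sustained, regular oscillation; larger values
    suggest irregular, noise-like behavior."""
    signs = np.sign(error - np.mean(error))
    crossings = np.where(np.diff(signs) != 0)[0]   # indices where the sign flips
    intervals = np.diff(crossings) * sample_time_s
    if len(intervals) < 2:
        return float("inf")                        # too few crossings to judge
    return np.std(intervals) / np.mean(intervals)

t = np.arange(0, 120, 1.0)
oscillating = np.sin(2 * np.pi * t / 20 + 0.3)     # 20-second limit cycle
rng = np.random.default_rng(1)
noisy = rng.normal(0, 1, t.size)                   # no underlying cycle

print(oscillation_index(oscillating))  # → 0.0 (perfectly regular crossings)
print(oscillation_index(noisy))        # noticeably larger
```

A plant-wide Big Data platform could compute a metric like this for every PID loop on every shift, flagging loops whose score degrades after a change in inputs or staffing.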

While Big Data’s uses and benefits continue to grow, Big Data has already proved to be an important part of process optimization. Big Data is no longer just a tool for people running the supply chain; it's a tool for everyone in manufacturing.
