A New Recipe for Operational Excellence in Chemical Manufacturing

By David McKnight

Striving for operational excellence in process manufacturing can be grueling work, and even more so for chemical manufacturers, whose processes can span shifts, days, or weeks, requiring multiple handovers and risking the loss of valuable batch-wise history.

Part of the issue is hesitancy toward digitalization. Paper tracking is still common across the industry, especially in regulated processes, making it incredibly challenging to mine and analyze data for improvement opportunities. In addition, hiring and retaining operators with the considerable expertise the work demands is a perpetual challenge. And, as in every industry, reducing operating costs, improving time-to-market for new products, and ensuring manufacturability can become major hurdles for the chemical manufacturer.

Compounding these complexities is the sheer volume of chemicals manufactured annually. One estimate suggests there are over 350,000 chemicals and chemical mixtures in commercial use, including 85,000 considered toxic and subject to additional regulatory and safety requirements. Given these interconnected and increasingly complex challenges, operational agility is hard to come by.

There is hope, however, in newly accessible digital innovations. Process Digital Twin and Predictive Quality tools are helping more chemical manufacturers keep pace with mounting cost constraints and rising demand for innovation:

    • Process Digital Twin: A virtual replica of the entire manufacturing process, built so that AI and simulation techniques can be applied for analysis and optimization.
    • Predictive Quality: A branch of industrial machine-learning analytics that forecasts the quality or yield of a process step based on process conditions and material attributes.

Consider process manufacturing, which involves a series of steps (e.g., heating, mixing, pulverizing) that continuously transform raw materials into a product. A Process Digital Twin describes and tracks the relationships among all process, setpoint, and material variables in a centralized, human-readable form. It often relies on multiple, converged IT and OT data sources to build the information model.
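
As a concrete illustration, here is a minimal sketch of one way such an information model might be represented in code. All names and values (ProcessStep, MaterialLot, the setpoints) are hypothetical and not drawn from any particular product's schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a Process Digital Twin information model.
# Every class, field, and value here is hypothetical.

@dataclass
class MaterialLot:
    lot_id: str
    attributes: dict[str, float]          # e.g. {"moisture_pct": 1.2}

@dataclass
class ProcessStep:
    name: str                             # e.g. "mixing"
    setpoints: dict[str, float]           # commanded values (IT source)
    process_variables: dict[str, float]   # measured values (OT source)
    inputs: list[MaterialLot] = field(default_factory=list)
    downstream: list["ProcessStep"] = field(default_factory=list)

# Linking steps captures the upstream/downstream relationships in a
# centralized, human-readable form.
mixing = ProcessStep("mixing", {"temp_C": 150.0}, {"temp_C": 148.7},
                     inputs=[MaterialLot("RM-001", {"viscosity": 62.0})])
curing = ProcessStep("curing", {"pressure_bar": 8.0}, {"pressure_bar": 7.9})
mixing.downstream.append(curing)
```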

Variation in an upstream step can affect interim products and have a compounded negative effect on downstream steps and, ultimately, product quality. Accurately tracing fluid material flow in a continuous process is a key and complex aspect of a Process Digital Twin, requiring a combination of fluid dynamics and business rules. Since expert examination of material-process-human-method (a.k.a. “4M”) data can support computation, simulation, and mitigation of downstream effects, providing this Process Digital Twin information in real time is foundational to any digital quality assurance program.
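
To make the material-tracing idea concrete, here is a deliberately simplified sketch. It assumes ideal plug flow through a single vessel, so residence time is just volume divided by volumetric flow rate; a production twin would layer real fluid-dynamic models and business rules on top, as noted above. All numbers are illustrative.

```python
# Simplified plug-flow assumption: material leaving a vessel at time t
# entered it one residence time earlier (volume / flow rate).

def residence_time_s(volume_m3: float, flow_m3_per_s: float) -> float:
    """Mean residence time of an idealized plug-flow vessel, in seconds."""
    return volume_m3 / flow_m3_per_s

def entry_timestamp(exit_time_s: float, volume_m3: float,
                    flow_m3_per_s: float) -> float:
    """Estimate when material exiting now originally entered the vessel."""
    return exit_time_s - residence_time_s(volume_m3, flow_m3_per_s)

# Material leaving a 2 m^3 vessel fed at 0.01 m^3/s entered ~200 s earlier,
# letting the twin link an output sample back to its upstream 4M context.
print(entry_timestamp(exit_time_s=1000.0, volume_m3=2.0, flow_m3_per_s=0.01))
```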

In addition, Process Digital Twins can help organizations answer such questions as:

      • Can a portion of the process run more efficiently at a higher temperature, pressure, or speed?
      • Can different or cheaper catalysts or solvents be used or added to the process to improve efficiency?
      • Can existing manufacturing equipment be repurposed for new products? How would the recipe need to adapt?

Understanding the by-product effects of changes is equally important, and addressable with digital twin models. Micro-adjustments to a recipe or its ingredients can have a substantial impact on the energy, material, and human resources required to produce a product, but often at the risk of off-quality.
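
To illustrate the what-if pattern behind the questions above: evaluate a candidate recipe adjustment against a surrogate model of cost and quality before touching the physical process. The cost and quality functions below are invented placeholders; a real twin would use models calibrated to plant data.

```python
# Toy what-if evaluation. Both models are hypothetical placeholders.

def energy_cost(temp_C: float, mix_time_min: float) -> float:
    """Toy linear cost model (arbitrary units)."""
    return 0.8 * temp_C + 2.5 * mix_time_min

def predicted_quality(temp_C: float, mix_time_min: float) -> float:
    """Toy response surface: quality peaks near 155 C and 30 min."""
    return 100 - 0.05 * (temp_C - 155) ** 2 - 0.2 * (mix_time_min - 30) ** 2

baseline = (150.0, 35.0)
candidate = (158.0, 28.0)   # "run hotter, mix shorter" scenario

for label, (t, m) in [("baseline", baseline), ("candidate", candidate)]:
    print(f"{label}: cost={energy_cost(t, m):.1f}, "
          f"quality={predicted_quality(t, m):.1f}")
```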

Predictive Quality is a key beneficiary of a comprehensive Process Digital Twin. Typically, Predictive Quality relies on supervised learning approaches, where an algorithm is first trained on historical data to identify the correlations and patterns of inputs that drive an output (i.e., regression analysis); it then applies those patterns to predict the output dynamically, based on inputs arriving in real time (i.e., forecasting).
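
The sketch below shows the training half of that pattern using scikit-learn. The features, coefficients, and data are synthetic stand-ins for historical batch records.

```python
# Train a regressor on (synthetic) historical batch data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical features: [mix_temp_C, mix_time_min, raw_material_viscosity]
X = rng.normal([150, 30, 60], [5, 3, 4], size=(500, 3))
# Hypothetical target: the measured critical-to-quality value per batch.
y = 0.4 * X[:, 0] - 0.8 * X[:, 1] + 0.6 * X[:, 2] + rng.normal(0, 1, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out batches: {model.score(X_test, y_test):.2f}")
```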

With Predictive Quality, the inputs will include recipe and machine setpoints, process variables, material properties, environmental conditions, and human factors. Despite the name, Predictive Quality is typically not predicting good or bad quality (i.e., classifying); rather, its output is a forecast of a critical-to-quality parameter value. Manufacturers can expect at least a 5% reduction (a Hitachi benchmark) in the cost of poor quality from implementing predictive quality solutions.
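
And the forecasting half: a small helper that takes a trained regressor (such as the one fitted above) and live process inputs, then flags batches heading out of tolerance while there is still time to intervene. The spec limits and readings are hypothetical.

```python
# Hypothetical real-time check: forecast the critical-to-quality value
# from live inputs and compare it to (illustrative) spec limits.
import numpy as np

def check_batch(model, live_inputs: np.ndarray,
                lower_spec: float, upper_spec: float) -> bool:
    """Return True if the forecast quality parameter is within tolerance."""
    forecast = float(model.predict(live_inputs)[0])
    if not lower_spec <= forecast <= upper_spec:
        print(f"Forecast {forecast:.1f} out of spec -- alert operators.")
        return False
    print(f"Forecast {forecast:.1f} within spec.")
    return True

# e.g., mid-batch readings [mix_temp_C, mix_time_min, raw_viscosity]:
# check_batch(model, np.array([[153.2, 31.5, 57.8]]), 65.0, 75.0)
```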

Recently, Hitachi Vantara partnered with a tire manufacturer that struggled with wasted batches of rubber in mixing, the first step of the tire manufacturing process. Per quality assurance procedure, completed batches of rubber from the mixing step are lab-tested for several critical product parameters, including a viscosity parameter. If the results fall outside specified tolerances, the entire batch must be scrapped. With Predictive Quality, using several process parameters and raw material properties, Hitachi and the manufacturer were able to accurately predict the viscosity of a batch in real time, with enough lead time for operators to intervene and course-correct, often saving the batch. Ultimately, the tire company’s wastage in the mixing process was reduced by 30%. Downstream throughput and quality improvements were also measured, thanks to the consistency of the input material.

While process manufacturers face continual pressure to be more efficient and to innovate, the good news is that the industry is flush with brilliant engineers, deep process expertise, and plenty of data. Hitachi Vantara offers decision support tools and analytics, along with industry-leading IT/OT system integration. Process manufacturers should look to realize these benefits through applied digital technologies, starting with Process Digital Twin and Predictive Quality.
