
Feb 19

So Many Models, So Little Time

My favorite “Far Side” cartoon has Einstein at a chalkboard full of derived equations, ending with the ultimate equation “time = money.” In my mind, the negative free time of the process control engineer casts some doubt on whether this endangered species still exists. There have been sightings, but the uncertainty principle says we can only ascertain their location or function, not both.

Experimental models do a good job of minimizing the time and expertise required of process control engineers by not relying upon process knowledge. Since these models are identified from test data, they are consistent with the ultimate goal of matching reality even if process understanding lags behind. Each technique excels at a particular aspect: neural networks (NN), projection to latent structures (PLS), and model predictive control (MPC) excel at identifying the nonlinear, interdependent, and dynamic nature of process inputs, respectively. The strong point of one method is often the weak point of another, and in the end somebody with some sort of process understanding should check that the models make physical sense.

There are several watch-outs. For example, avoid extrapolation by a NN outside of its training data range, because nonlinear relationships can take off exponentially. Since PLS and MPC assume linearity, you have to be careful about deviating too far from an operating point; turndown and startup may require the identification and switching of different models. NN and PLS don’t try to model the process time constant or integrating process gain, so there is a model mismatch for well-mixed volumes where the residence time translates to a process time constant or a “near” or “real” integrating process gain. Also, NN and PLS are often sold on just throwing existing historical data at them, ignoring the transfer of variability by closed control loops and the lack of perturbed process inputs. The richness of the dynamics, the rangeability, and the identification of cause and effect all suffer. What has been so important to the success of MPC seems to have been lost.
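The mismatch for a well-mixed volume can be sketched with a first-order-plus-dead-time response, where the residence time sets the process time constant. This is a minimal illustration with assumed numbers (volume, flow, dead time), not anything from a particular plant; a static NN or PLS fit sees only the final value, never this trajectory:

```python
# Sketch (assumed numbers): step response of a first-order lag whose
# time constant equals the residence time of a well-mixed volume.

def fopdt_step(gain, tau, dead_time, dt, t_end):
    """First-order-plus-dead-time response to a unit step at t = 0."""
    n = int(t_end / dt) + 1
    pv = [0.0] * n
    for i in range(1, n):
        t = i * dt
        u = 1.0 if t >= dead_time else 0.0   # delayed unit step
        # Euler integration of tau * dPV/dt = gain * u - PV
        pv[i] = pv[i - 1] + dt * (gain * u - pv[i - 1]) / tau
    return pv

# Well-mixed tank: residence time = volume / flow (assumed values)
volume, flow = 10.0, 2.0          # m^3, m^3/min
tau = volume / flow               # 5 min process time constant
pv = fopdt_step(gain=1.0, tau=tau, dead_time=1.0, dt=0.01, t_end=30.0)
print(round(pv[-1], 2))           # approaches the steady-state gain of 1.0
```

A model identified only from steady-state historical data would collapse this whole transient into a single gain, which is exactly the dynamic mismatch described above.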

What about all the other types of models?

Tiebacks are very attractive because they initially require hardly any effort; they can be automatically generated from the configuration. They are great for control system familiarization, interface improvements (e.g., operator training and critiquing of graphics), and I/O checkout. They can also mimic the process response through heuristic customization of ramp rates triggered by piping path logic, letting you test out the configuration, which is particularly important for complex continuous and batch control systems.
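The ramp-rate tieback idea can be sketched in a few lines. This is a hypothetical illustration, not any vendor's tieback implementation: the PV simply ramps toward a target at a configured rate whenever the piping path logic says flow is established, with no real process model behind it.

```python
# Sketch (hypothetical): a tieback mimics process response by ramping
# the PV at a configured rate when the piping path logic is satisfied.

def tieback_step(pv, valve_open, path_clear, target, ramp_rate, dt):
    """Advance the tieback PV one scan toward target if the path is active."""
    if valve_open and path_clear:
        step = ramp_rate * dt
        if pv < target:
            pv = min(pv + step, target)
        else:
            pv = max(pv - step, target)
    return pv

# Usage: level rises at 2 %/s while the fill valve is open and path is clear
pv = 0.0
for _ in range(100):                      # 100 scans of 0.1 s
    pv = tieback_step(pv, valve_open=True, path_clear=True,
                      target=50.0, ramp_rate=2.0, dt=0.1)
print(round(pv, 6))                       # 20.0 after 10 s at 2 %/s
```

The crude ramp is the point: it is enough to exercise interlocks, sequences, and graphics, but it says nothing about real process gains, lags, or dead times.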

Finally, there are the models based on chemistry and physics (not necessarily popular subjects). Very sophisticated software has been developed to provide a graphical flow sheet simulation of processes. Unfortunately, these generally require a sophisticated budget and user. Most of the big players focus on continuous steady-state operation, the traditional realm of chemical engineering programs; separate special-purpose packages are typically required for batch. My experience with “state of the art” process modeling software is that it does a good job of process design but is not as good as you might expect at showing process dynamics, especially considering it carries the label “high fidelity.” The process gain is off because the installed characteristic of the control valve and the measurement scale are not included, the process dead time is too small because transportation and mixing delays are missing, and the process time constant is too small because thermal lags and jackets/coils are missing. To top it off, the trends are way too smooth because there is no mixing or sensor noise and no limit cycles from control valve stick-slip or backlash. For more enlightenment on the issues with dynamic process simulators, see the Control magazine August 2005 article titled “The Light at the End of the Tunnel is a Train (Virtual Plant Reality)”.
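The process-gain point is easy to quantify. As a sketch with assumed numbers (a rangeability of 50 and an inherent equal-percentage curve standing in for the installed characteristic), the slope of flow versus lift changes by an order of magnitude across the throttling range, so a simulator that maps controller output straight to flow reports the wrong process gain almost everywhere:

```python
# Sketch (assumed numbers): the local gain of an equal-percentage valve
# changes with lift, so omitting the valve characteristic skews the
# apparent process gain seen by the control system.

def eq_pct_flow(position, rangeability=50.0, max_flow=100.0):
    """Inherent equal-percentage characteristic: flow vs fractional lift."""
    return max_flow * rangeability ** (position - 1.0)

def local_gain(position, d=0.01):
    """Numerical slope of flow vs lift (flow units per fraction of lift)."""
    return (eq_pct_flow(position + d) - eq_pct_flow(position - d)) / (2 * d)

# Valve gain at 20% vs 80% lift: far from the constant-gain assumption
print(round(local_gain(0.2), 1), round(local_gain(0.8), 1))
```

The real installed characteristic (with pressure-drop ratio effects) deviates even further from a straight line, which is exactly why leaving the valve out of a “high fidelity” model quietly distorts the gain.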

When you sit back (something I am getting better at being partly retired) and look at the whole picture, it seems fractured.

Why aren’t there basic generic first-principles models that focus on the process dynamics without getting bogged down in the complexity needed for process design? Why aren’t there hybrid models that take advantage of the best of what each method has to offer? What would we call these models that provide the type of fidelity needed for process control? Are we stuck in a rut because each expert thinks their particular method is best? Are there people with broad enough skills and attitude to pull it off?