White Paper on the Enabling Foundations for all Numerical Method Tools


    Craig F. Passe

    Consulting Engineer


    The statement, “Turning your data into Gold,” becomes achievable when you understand the foundations of sampled data systems. An old belief held that, with the many historian systems attached to the regulatory control DCSs, an engineer only needed to extract the process control data and build an n-dimensional neural network model. That modeling effort would then enable the engineer to improve his process with new discoveries and visions of this multi-dimensional process. In actual fact, that kind of effort did not often result in great successes in those early days. I started looking at the data acquisition setup of these process historians and quickly realized that most were set to record all of the process variables at a one-minute sampling rate, for no other reason than “it seemed like a good rate for the historian at the time and didn’t use that much disk space with the compression techniques being used” (a mindset from the ’70s and ’80s).

    Very few of these customers were aware of the Nyquist and Shannon theorems for properly sampling continuous signals. When sampled data systems acquire their data at less than the Nyquist sampling rate, the data is aliased and does not accurately represent the process being recorded. Models built from this inaccurate data can therefore be equally inaccurate. (See the graphical aliasing example at the end of this paper.)
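To make the aliasing effect concrete, here is a minimal sketch (using NumPy; the 3 Hz signal and 4 samples/s rate are illustrative numbers, not from this paper) of how an under-sampled sinusoid folds down to a false lower frequency:

```python
import numpy as np

def apparent_frequency(f_signal, f_sample):
    """Frequency an under-sampled sinusoid appears to have after aliasing.

    Folds f_signal back into the baseband [0, f_sample / 2].
    """
    return abs(f_signal - round(f_signal / f_sample) * f_sample)

# A 3 Hz pressure oscillation logged at only 4 samples/s
# (Nyquist demands a rate greater than 2 * 3 = 6 samples/s).
print(apparent_frequency(3.0, 4.0))  # -> 1.0: the historian records a 1 Hz wave

# The recorded samples of the true 3 Hz sine are indistinguishable
# from samples of a (phase-flipped) 1 Hz sine:
t = np.arange(0, 2, 1 / 4.0)
print(np.allclose(np.sin(2 * np.pi * 3 * t), -np.sin(2 * np.pi * 1 * t)))  # -> True
```

No amount of post-processing can recover the true 3 Hz behavior from those samples; the information is lost at acquisition time.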

    It became very difficult for me at the point of sale to tell a customer that his DCS and historian systems were not properly set up for “sampled data” numerical solutions. These Neural Network and other numerical method tools were now capable of making significant impacts on the bottom-line profitability of any process industry with DCS foundations, provided the data acquisition system properly sampled the process data.

The Enabling Foundations:

     Most people understand that to build a high-rise building, you need an adequate foundation upon which it will stand. This foundation concept is equally true for using the modern numerical method tools that have been developed for sampled data systems. The only enabling product that I am aware of that comes ready right off the CD is Vsystem from Vista Control Systems in Los Alamos, NM. The following numerical method tools are now being used on real-time computer systems, and all rely on sampled data:

    - Statistics
    - Fast Fourier Transforms (FFT)
    - Statistical Quality Control (SQC)
    - Statistical Process Control (SPC)
    - Regression analysis
    - Principal Component Analysis (PCA)
    - Neural Network N-dimensional Models
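As a small illustration of how these tools depend on proper sampling, the sketch below (NumPy; the 0.05 Hz flow oscillation and 0.5 samples/s rate are hypothetical) shows an FFT correctly recovering a signal's frequency when the sampling rate satisfies the Nyquist criterion:

```python
import numpy as np

# Hypothetical example: a 0.05 Hz flow oscillation sampled every 2 s
fs = 0.5                       # samples per second, well above 2 * 0.05 Hz
t = np.arange(0, 400, 1 / fs)  # 200 samples over 400 s
signal = np.sin(2 * np.pi * 0.05 * t)

# Magnitude spectrum of the real-valued signal
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# Location of the dominant non-DC peak
peak = freqs[np.argmax(spectrum[1:]) + 1]
print(peak)  # -> 0.05: the true frequency is recovered
```

Had the same signal been logged at the one-minute default rate (1/60 Hz, below the required 0.1 Hz), the FFT would instead report an aliased, fictitious frequency.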

    The concept of “garbage in, garbage out” is all too real for these fine tools. Since these tools all rely on sampled data, it is of the utmost importance that users understand what they are sampling before they set the sampling rates for the variables they intend to use for measurement and control. For an accurate sampled representation of those signals, we must satisfy the sampling theorems of Nyquist and Shannon.

    The continuous process industries of chemicals/polymers, refining/petrochemicals, pulp and paper, utility power generation stations, source and wastewater treatment facilities, and others often appear to run at steady-state conditions. The data doesn’t appear to change very much, and the equipment is very large, so changing the temperature of these large reactors can take a relatively long time. This is true, but there are often other significant process variables, such as pressures and flows, that can and do change quite rapidly. A sampled data historian system that is set up for the temperatures is often not set up properly to capture the dynamics of pressures, flows, and other “occasionally adjusted” process variables.
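The per-variable mismatch described above can be checked mechanically. A minimal sketch, with hypothetical tag names and dynamics, testing a one-minute historian scan against each variable's fastest significant cycle:

```python
scan_period_s = 60.0  # the common one-minute historian default

# Shortest significant cycle of each variable, in seconds
# (tag names and time scales are hypothetical, for illustration only)
fastest_cycle_s = {
    "reactor_temperature": 3600.0,  # drifts over roughly an hour
    "feed_flow": 30.0,              # valve adjustments every ~30 s
    "header_pressure": 5.0,         # fast pressure swings
}

# Nyquist: the scan period must be shorter than half the signal's
# shortest cycle, or the recorded data will be aliased.
verdict = {tag: ("OK" if scan_period_s < cycle / 2 else "ALIASED")
           for tag, cycle in fastest_cycle_s.items()}

for tag, status in verdict.items():
    print(f"{tag}: {status}")
# reactor_temperature: OK
# feed_flow: ALIASED
# header_pressure: ALIASED
```

A single historian scan class cannot serve all three tags; the fast variables need their own, faster acquisition rate.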

The Enabling Statement:

     Some DCS vendors now offer higher-speed I/O and higher-speed regulatory controllers. These modern DCS systems are becoming capable of properly capturing the dynamics of their industry’s process variables.

    “If these new higher speed DCS systems are coupled with the appropriate Supervisory Control System and Historian, then the proper foundation has been established for using these numerical method Advanced Process Control (APC) and optimization techniques”. (See Figure 1)

    Figure 1: Layered architecture. At the top sit the linear and nonlinear optimizer, the VOA, the APC models, and the expert system models and SOPs; these run on the Supervisory Control System and High Speed Historian, which connects to the Business Network. Below it, the Regulatory Control System (the DCS) sits on the Process Control Network, served by remote, active, and standby bus I/O.

    One of the key words in the statement above is “coupled”. We now live in a distributed computing environment and must never overload any of the distributed components. The regulatory control system has the very important task of safely controlling the process. That is a full-time job and should not be interrupted by requests to plot long spans of history or to compare today’s operational parameters with those of the “greatest” day. Functions like that should be off-loaded to the supervisory control computer. This “coupled” team combines to make the process control perform better. With a solid platform of accurate data, the supervisory control system can schedule the numerical methods programs to perform their advanced control and optimization functions in an appropriate part of the distributed computing environment.

    These new Neural Network multivariable modeling tools are capable of finding the process nonlinearities and the relationships between the manipulated variables, the state variables, and the resultant product properties. When these relationships are better understood, one can more accurately adapt the process control to reduce manufacturing costs. These modeling tools also include an integrated optimizer (linear and nonlinear) capable of addressing the costs of energy, materials, and product specifications. The optimizer can be set up to minimize the cost of production against the product specifications of the industry, or of particular customers, at the current time. When these conditions change, the optimizer can use that information to adjust manufacturing conditions to meet the new requirements.

     Many processes have different grades or levels of their products. To maximize profits in these processes, one has to minimize the off-specification material produced during a grade-change transition. Classical linear controllers cannot deal with the nonlinear effects that occur during transitions. A shorter transition time between product grades minimizes off-spec material and therefore significantly increases profitability.

Return on Investment (ROI) Picture:

     Most manufacturers today have already made significant investments in process automation for their DCS and supervisory control plant information systems. However, it appears that the larger return on investment comes from the levels above the DCS foundation: the APC and optimization levels. Figure 2 below illustrates the ROI versus the investment at each level of process automation and production control.

     Figure 2: Investment versus ROI across the automation pyramid. From bottom to top: Regulatory Control Level, Supervisory Control Level, Multivariable Control Level, and Optimization Level; the required investment decreases toward the top while the ROI increases.

    This is not a “which comes first, the chicken or the egg” scenario. You must have invested in the correct lower automation levels to properly enable the use of the less capital-expensive upper levels. It is very clear what must come first. We early modelers using the neural network tools made most of the mistakes, and hopefully we have learned from those mistakes. Lord Kelvin stated, “You can’t make what you can’t measure, because you can’t tell when you have got it made.” I am going to recommend that if you have already gone to all the trouble of installing instruments and sensors on your process, you might as well make sure that you store that information properly, without any errors.

    Who knows what you can learn from your past? Who knows what use you can put that data to in the future? Accurate information is power.

A Simple Picture of Under-Sampled Data Aliasing:

     The picture below gives a simple example of what would happen if we sampled these three continuous signals at less than the Nyquist sampling rate. The top flow controller signal would completely miss the small transitions in that flow adjustment. The other two signals likewise show the information that would be missing from the composition signals if the sampling rate were too slow. Any modeling effort from that data would give less than desirable results.

     [Figure: three continuous signal traces (a flow controller signal and two composition signals) plotted over a 0–5 time axis]

Summary Statements:

    “If the foundations for APC and optimization have been properly established, then the historical data acquisition part of the supervisory control system is adequate for projects to maximize the ROI”.

    “If the existing plant information system is not capable of truly capturing the real process behavior, then you must invest in an additional Supervisory Control System that does have that capability for the required dynamic variables. And Vsystem from Vista Control Systems is the only product that I am aware of that comes off the CD ready to serve this function”.

    - - - Craig F. Passe, Consulting Engineer


    - H. Nyquist, "Certain topics in telegraph transmission theory," Trans. AIEE, vol. 47, pp. 617–644, Apr. 1928.
    - C. E. Shannon, "Communication in the presence of noise," Proc. IRE, vol. 37, no. 1, pp. 10–21, Jan. 1949.
