InTech

JUL-AUG 2018

Issue link: http://intechdigitalxp.isa.org/i/1010819


PROCESS AUTOMATION

ity. However, adding a deadband or rate limiting to the output is also effective in tempering the controller, although such action is not self-adaptive to changes in noise amplitude. Statistical filters are also useful to temper process coefficient adjustment. A statistically based filter moves rapidly when a change is confidently detected and holds a constant value in between. When the process noise amplitude changes, the responsiveness automatically changes. However, the code for this CUSUM filter is about 10 lines, and it requires that the noise be relatively compliant with the assumption of independent sample-to-sample fluctuations. The user interpretation of the statistical trigger will be unfamiliar to many.

Kalman: The Kalman filter is a statistical filter that is significantly more complicated than the other filters summarized in this article. It compares data to a model, then reports the value that has the greater statistical confidence. Although common in the electronics and aerospace industries, where linear models are appropriate and fast computers are justified, it is not common in the CPI.

Outlier removal filters

Here the concept is that the signal is steady, but an occasional event happens to provide a one-time (or brief), wholly uncharacteristic value. This is often called an outlier, a phantom, or a spurious event. The filter purpose is not to average, but to ignore that outlier, which might be related to occasional dropped data or an electrically induced spike. Electrical sources include voltage or current surges from a nearby lightning strike, radio communications (RFI), nearby motors starting up, or electric floor scrubbers. Loose or corroding wiring connections, sensors that receive mechanical shock, or dropped data packets in an overloaded communication system can also cause short-lived or one-sample outliers.
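The statistically based CUSUM filter described earlier really is only about 10 lines of code. The following is an illustrative Python sketch, not code from the article; the threshold, the half-sigma slack term, and the assumed noise standard deviation are all hypothetical tuning choices:

```python
def cusum_filter(measurements, threshold=5.0, sigma=1.0):
    """Hold a constant output until a CUSUM statistic confidently signals
    a change, then jump to the new level. sigma is an assumed noise
    standard deviation; threshold is expressed in sigma units."""
    held = measurements[0]
    s_pos = s_neg = 0.0            # one-sided cumulative sums
    filtered = []
    for x in measurements:
        dev = (x - held) / sigma   # scaled deviation from the held value
        s_pos = max(0.0, s_pos + dev - 0.5)  # 0.5-sigma slack absorbs noise
        s_neg = max(0.0, s_neg - dev - 0.5)
        if s_pos > threshold or s_neg > threshold:
            held = x               # confident change detected: track the data
            s_pos = s_neg = 0.0
        filtered.append(held)
    return filtered
```

Because the deviations are scaled by the noise estimate, the responsiveness adapts as the noise amplitude changes, which is the self-adaptive behavior the article attributes to the statistical approach.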
Median filter: The median filter reports the middle of the most recent values—not the middle in chronological order, but the middle in value. For instance, if the three most recent values are 5, 6, and 3, the middle value, 5, is reported. Often, redundant sensors are used in which the middle of three measurements is taken as the process value, in a procedure termed voting. However, voting is a special case of parallel measurements at the same time. In a median filter, the middle-of-three is taken from a sequence of data. A median filter could be based on three, five, seven, or more sequential data. If you suspect that two outliers could happen sequentially, because of some common cause, then a median of five will reject them. The median filter does temper noise a bit, but the application intent should be to remove outliers.

Figure 4 illustrates the median filter (middle of 3) applied to the same set of data. Note:
- When the signal makes a step change at the 30th sampling, the filter has a delay of about half the number of data in the window.
- The outlier at sample 120 is wholly ignored.
- Throughout, the vagaries of the filtered signal substantially mimic the measurement.

Note: The median filter removes outliers and rapidly tracks real changes. However, the user must choose an N that is large enough to exclude persistent outliers. Masking outliers can misrepresent important features, and noise is not removed.

Data reconciliation: Here the concept is that a sensor, or several sensors, acquire a systematic bias, so the reported measurements are subject not only to random noise but also to systematic error. Good and frequent calibration could eliminate this problem, but continual instrument recalibration is often not convenient. In data reconciliation, the objective is to use simple process models to back out the values of the systematic errors from the data.
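The middle-of-N median filter described above takes only a few lines. A minimal Python sketch (the function name and default window size are illustrative, not from the article):

```python
from collections import deque
from statistics import median

def median_filter(measurements, n=3):
    """Report the middle-in-value of the most recent n samples (n odd).
    Until n samples have arrived, the median of those available is used."""
    window = deque(maxlen=n)   # automatically keeps only the n newest values
    filtered = []
    for x in measurements:
        window.append(x)
        filtered.append(median(window))
    return filtered
```

With the article's example, recent values 5, 6, and 3 yield 5. A one-sample spike can never be the middle of three values, so it is wholly ignored; two consecutive outliers would require n = 5.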
Data reconciliation is a powerful technique, but it requires valid models, redundant process measurements, and online computing an order of magnitude above the other algorithms discussed here.

Heuristic methods: These are user defined, as appropriate, and could be based on any number of data consistency or validation checks that a human observer might use to judge veracity. For instance, if a sudden change in one measurement correlates to a simultaneous or prior change in another, then the one-time effect may be interpreted as real, not an outlier. The logic is usually implemented as "if-then-else" rules that pass through valid data and perhaps flag what appear to be outliers. Flagged data could then be "thrown out" or ignored by process control applications, but still be retained for his-

Figure 4. Characteristic performance of a median filter (middle of 3) to a process step change and spike
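Such if-then-else validation logic might look like the following Python sketch. The measurement names (a flow signal corroborated by a valve signal), the thresholds, and the corroboration rule are all hypothetical, chosen only to illustrate the pattern of flagging, rather than discarding, suspect samples:

```python
def flag_outliers(flow, valve, max_jump=5.0, corroborate_tol=2.0):
    """If-then-else style validation: a sudden jump in the flow measurement
    is accepted as real only when a simultaneous change in the valve signal
    corroborates it; otherwise the sample is flagged as a suspected outlier.
    Flagged samples are reported, not removed, so they remain available."""
    flags = [False]                       # first sample has no predecessor
    for i in range(1, len(flow)):
        jump = abs(flow[i] - flow[i - 1])
        corroborated = abs(valve[i] - valve[i - 1]) > corroborate_tol
        # flag a large one-sample jump that no other signal explains
        flags.append(jump > max_jump and not corroborated)
    return flags
```

Note that an uncorroborated spike is flagged both when it appears and when the signal returns to normal, since both transitions are unexplained jumps; a process control application could then ignore the flagged values while a historian retains them.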
