DATA RECONCILIATION AND GROSS ERROR DETECTION IN PROCESS PLANTS


The main assumption in all commercial Data Reconciliation packages is that measurement values correspond to steady state. For this reason, only one value per instrument is considered, and practitioners are forced to work with daily averages of the gathered data. Yet process plants are never truly at steady state. In spite of the questionable validity of the steady-state assumption, data reconciliation packages have been relatively successful in industry, especially for stable plants.
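To make the steady-state formulation concrete, the sketch below poses the classical weighted least-squares reconciliation of daily-averaged flows subject to a linear mass balance. The three-stream splitter, the measurement values, and the variances are illustrative assumptions, not data from this project.

    import numpy as np

    # Hypothetical 3-stream splitter: stream 1 in, streams 2 and 3 out,
    # so the single steady-state balance is x1 - x2 - x3 = 0.
    A = np.array([[1.0, -1.0, -1.0]])        # balance (incidence) matrix
    y = np.array([100.0, 60.0, 45.0])        # daily-averaged measurements (assumed)
    sigma = np.diag([2.0, 1.5, 1.5]) ** 2    # assumed measurement variances

    # Weighted least squares:  min (y - x)' Sigma^-1 (y - x)  s.t.  A x = 0
    # has the closed form      x_hat = y - Sigma A' (A Sigma A')^-1 A y
    x_hat = y - sigma @ A.T @ np.linalg.solve(A @ sigma @ A.T, A @ y)
    print(x_hat)                             # reconciled flows close the balance exactly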

However, several penalties are paid for using a technology based on the steady-state assumption. First, any slow drift is ignored and averaged out. Second, gross error detection is affected: if gross errors are true outliers rather than reflections of leaks or instrument biases, they get averaged in with good measurements. Also, some good measurements can be wrongly identified as gross errors; as a consequence, degrees of redundancy are lost and the precision of the reconciled data suffers. There is therefore a need to capture these outliers before they corrupt the measurement averages. Additionally, if averaged measurements containing gross errors are not eliminated and are used in the reconciliation, the results are corrupted (and probably biased). All these penalties affect daily balances and have an important impact on plant economics and management. Finally, the steady-state assumption of current data reconciliation technology has been disputed in several forums, from practitioners to academia.
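As a small illustrative calculation (hypothetical numbers, assuming 24 hourly readings per day), the sketch below shows how a single outlier is diluted by the daily average yet still shifts the mean that enters the balance:

    import numpy as np

    hourly = np.full(24, 100.0)   # 24 hourly readings of a steady flow
    hourly[10] = 160.0            # one spurious spike (an outlier)

    print(hourly.mean())          # 102.5 -> the spike is diluted but biases the daily average
    print(np.median(hourly))      # 100.0 -> a robust statistic is unaffected by the spike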

There is increasing awareness in academia and in industry that the accuracy of results, as well as gross error detection (especially of leaks), can be improved using data reconciliation techniques based on dynamic state estimation. There has been some work in this area, especially along the line of Kalman filtering based algorithms. However, although the data obtained are of good quality, this type of local estimator leaves the practitioner with large volumes of data and no simple description of the patterns of change. Setting aside the computational challenges, these methods, like any one-step procedure, will follow the ups and downs of the measurement data. Reconciliation of a slow drift accompanied by large-variance measurements will produce reconciled values that follow the saw-tooth pattern of the measurements and fail to reveal the underlying drift.
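The sketch below, with hypothetical tuning, illustrates the kind of one-step local estimator referred to above: a scalar Kalman filter with a random-walk state model tracking a slowly drifting flow under large measurement noise. It produces one estimate per sample and no explicit parameterization (e.g., a slope) of the drift.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    true_flow = 100.0 + 0.05 * np.arange(n)       # slow drift of 0.05 per sample
    y = true_flow + rng.normal(0.0, 3.0, n)       # large-variance measurements

    q, r = 0.01, 9.0                              # assumed process / measurement variances
    x_hat, p = y[0], r
    estimates = []
    for yk in y:
        p = p + q                                 # predict (random-walk state model)
        k = p / (p + r)                           # Kalman gain
        x_hat = x_hat + k * (yk - x_hat)          # correct with the current measurement
        p = (1.0 - k) * p
        estimates.append(x_hat)

    # 'estimates' still holds n local values that track the data point by point;
    # nothing in the output describes the drift itself (for example, its slope).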

Dynamic Data Reconciliation is the long-term response for the software industry, and it will slowly emerge as an issue of increasing importance in practice. Many researchers have been advocating that this line of work is the one that will produce meaningful answers as on-line optimization and parameter estimation increase their demands for accuracy.

This project focuses on several different approaches that can provide a simple representation of data over long periods of time, as well as on a revision of gross error detection (bias and leak) techniques in view of the new framework.

In the area of gross error detection, a theory of gross error equivalency has been developed. In addition, a new method for collective compensation has been developed. Many papers on gross error detection have been published.
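As a rough illustration only, the sketch below shows one common way simultaneous (collective) bias compensation can be posed on a linear balance: estimate all suspected bias magnitudes at once from the constraint residuals, then reconcile the compensated measurements. The network, numbers, and suspect set are assumptions for illustration and do not reproduce the exact method developed in this project.

    import numpy as np

    A = np.array([[1.0, -1.0, -1.0]])              # same splitter balance as above
    sigma = np.diag([2.0, 1.5, 1.5]) ** 2
    y = np.array([100.5, 66.0, 39.5])              # stream 2 carries an assumed +6 bias

    suspects = [1]                                 # suspected biased instruments (0-based)
    E = np.zeros((3, len(suspects)))
    E[suspects, np.arange(len(suspects))] = 1.0    # selects the suspect measurements

    V = A @ sigma @ A.T                            # covariance of the constraint residuals
    r = A @ y                                      # residuals r = A (E delta + random error)
    M = A @ E

    # Generalized least-squares estimate of all suspected bias magnitudes at once
    delta_hat = np.linalg.solve(M.T @ np.linalg.solve(V, M),
                                M.T @ np.linalg.solve(V, r))

    y_comp = y - E @ delta_hat                     # compensate the suspect measurements
    x_hat = y_comp - sigma @ A.T @ np.linalg.solve(V, A @ y_comp)   # then reconcile
    print(delta_hat, x_hat)                        # delta_hat = 5.0 here, blurred by random error

With a single balance and a single suspect, the residual is absorbed entirely by the bias estimate; in larger networks the same formula must distribute the residuals among several candidate biases and leaks, which is, loosely speaking, where questions of equivalency between different gross error candidates arise.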

Our latest advances are in the value of instrumentation, which is also connected to the value of data reconciliation.

A program (Precise) was developed based on our gross error detection methodology and the gross error equivalency theory.

Software

Contact me for access to all published work.