Foreword    xiii
Preface    xv
Acknowledgments    xix
|
1 Motivations for Automating Process Fault Analysis    1
1.1    1
1.2    1
1.3 The Changing Role of Process Operators in Plant Operations    3
1.4 Methods Currently Used to Perform Process Fault Management    5
1.5 Limitations of Human Operators in Performing Process Fault Management    10
1.6 The Role of Automated Process Fault Analysis    12
1.7 Anticipated Future CPI Trends    13
1.8 Process Fault Analysis Concept Terminology    14
|
|
2 Method of Minimal Evidence: Model-Based Reasoning    21
2.1    21
2.2    22
2.3 Method of Minimal Evidence Overview    23
2.3.1 Process Model and Modeling Assumption Variable Classifications    28
2.3.2 Example of a MOME Primary Model    31
2.3.3 Example of MOME Secondary Models    36
2.3.4 Primary Model Residuals' Normal Distributions    39
2.3.5 Minimum Assumption Variable Deviations    41
2.3.6 Primary Model Derivation Issues    44
2.3.7 Method for Improving the Diagnostic Sensitivity of the Resulting Fault Analyzer    47
2.3.8 Intermediate Assumption Deviations, Process Noise, and Process Transients    48
2.4 Verifying the Validity and Accuracy of the Various Primary Models    49
|
|
3 Method of Minimal Evidence: Diagnostic Strategy Details    55
3.1    55
3.2    56
3.3 MOME Diagnostic Strategy    57
3.3.1 Example of MOME SV&PFA Diagnostic Rules' Logic    57
3.3.2 Example of Key Performance Indicator Validation    67
3.3.3 Example of MOME SV&PFA Diagnostic Rules with Measurement Redundancy    71
3.3.4 Example of MOME SV&PFA Diagnostic Rules for Interactive Multiple-Faults    74
3.4 General Procedure for Developing and Verifying Competent Model-Based Process Fault Analyzers    79
3.5 MOME SV&PFA Diagnostic Rules' Logic Compiler Motivations    80
3.6 MOME Diagnostic Strategy Summary    83
|
|
4 Method of Minimal Evidence: Fuzzy Logic Algorithm    87
4.1    87
4.2    88
4.3    90
4.4 MOME Fuzzy Logic Algorithm    91
4.4.1 Single-Fault Fuzzy Logic Diagnostic Rule    93
4.4.2 Multiple-Fault Fuzzy Logic Diagnostic Rule    97
4.5 Certainty Factor Calculation Review    102
4.6 MOME Fuzzy Logic Algorithm Summary    104
|
|
5 Method of Minimal Evidence: Criteria for Shrewdly Distributing Fault Analyzers and Strategic Process Sensor Placement    109
5.1    109
5.2 Criteria for Shrewdly Distributing Process Fault Analyzers    109
5.2.1    110
5.2.2 Practical Limitations on Target Process System Size    110
5.2.3 Distributed Fault Analyzers    112
5.3 Criteria for Strategic Process Sensor Placement    113
|
|
6 Virtual SPC Analysis and Its Routine Use in FALCONEER™ IV    117
6.1    117
6.2    118
6.3 EWMA Calculations and Specific Virtual SPC Analysis Configurations    118
6.3.1 Controlled Variables    119
6.3.2 Uncontrolled Variables and Performance Equation Variables    120
6.4 Virtual SPC Alarm Trigger Summary    123
6.5 Virtual SPC Analysis Conclusions    124
|
|
7 Process State Transition Logic and Its Routine Use in FALCONEER™ IV    125
7.1 Temporal Reasoning Philosophy    125
7.2    126
7.3 State Identification Analysis Currently Used in FALCONEER™ IV    128
7.4 State Identification Analysis Summary    131
|
|
|
8    133
8.1    133
8.2 Summary of the MOME Diagnostic Strategy    133
8.3 FALCON, FALCONEER, and FALCONEER™ IV Actual KBS Application Performance Results    134
8.4 FALCONEER™ IV KBS Application Project Procedure    136
8.5 Optimal Automated Process Fault Analysis Conclusions    138
|
|
Appendix A Various Diagnostic Strategies for Automating Process Fault Analysis    141
A.1    141
A.2    142
A.3    143
A.4    143
A.5    144
A.6 Diagnostic Strategies Based on Qualitative Models    145
A.7 Diagnostic Strategies Based on Quantitative Models    145
A.8 Artificial Neural Network Strategies    147
A.9 Knowledge-Based System Strategies    147
A.10 Methodology Choice Conclusions    148
|
|
Appendix B The FALCON Project    163
B.1    163
B.2    164
B.3 The Diagnostic Philosophy Underlying the FALCON System    164
B.4 Target Process System    165
B.5    167
B.5.1 The Inference Engine    168
B.5.2 The Human-Machine Interface    169
B.5.3 The Dynamic Simulation Model    169
B.5.4 The Diagnostic Knowledge Base    172
B.6 Derivation of the FALCON Diagnostic Knowledge    173
B.6.1 First Rapid Prototype of the FALCON    173
B.6.2 FALCON System Development    173
B.6.3 The FALCON System's Performance Results    182
B.7 The Ideal FALCON System    183
B.8 Use of the Knowledge-Based System Paradigm in Problem Solving    184
|
|
Appendix C Process State Transition Logic Used by the Original FALCONEER KBS    187
C.1    187
C.2 Possible Process Operating States    187
C.3 Significance of Process State Identification and Transition Detection    189
C.4 Methodology for Determining Process State Identification    189
C.4.1 Present-Value States of All Key Sensor Data    189
C.4.2 Predicted Next-Value States of All Key Sensor Data    190
C.5 Process State Identification and Transition Logic Pseudocode    191
C.5.1 Attributes of the Current Data Vector    191
C.5.2 Method Applied to Each Data Vector    192
|
|
Appendix D FALCONEER™ IV Real-Time Suite Process Performance Solutions Demos    197
D.1 FALCONEER™ IV Demos Overview    197
D.2    197
D.2.1 Wastewater Treatment Process Demo    197
D.2.2 Pulp and Paper Stock Chest Demo    199

Index    203