Process simulation - Esko Flashcards
In what ways are the data used?
They are used in modeling, optimization, monitoring, control, maintenance, and economic evaluations.
What kinds of errors can be found in the data?
1- Random errors
2- Gross errors
What should be done to the data before using them?
1- the data have to be consistent with the mass and energy balances
2- possible gross errors have to be detected and treated accordingly
What are the steps taken in process data collection, and what are its applications?
1- data acquisition
2- data retrieval
3- data validation and reconstruction
4- data filtering
5- data reconciliation
Applications are: parameter estimation, simulation, optimization, advanced control, accounting, instrument maintenance
Describe random errors?
- Random errors follow a normal distribution, which describes process data well: several independent phenomena, a cumulative sum of effects
- Model: y_m = y + e, where y_m is the measured value, y the true value, and e the error; σ² is the variance and σ the standard error
- The error mean value is zero: E(e) = 0
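The random-error model y_m = y + e can be illustrated with a short simulation (all numbers are illustrative, not from the source):

```python
import numpy as np

# Simulate the model y_m = y + e with zero-mean Gaussian noise.
rng = np.random.default_rng(0)

y_true = 100.0   # true value y (illustrative)
sigma = 2.0      # standard error sigma (illustrative)
e = rng.normal(loc=0.0, scale=sigma, size=10_000)  # random errors, E(e) = 0
y_measured = y_true + e                            # measured values y_m

# The sample error mean is close to 0 and the sample spread close to sigma.
print(e.mean(), y_measured.std())
```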
Describe gross errors?
- Non-random events
- Causes: miscalibration, malfunction, sensor or equipment fouling
- The error has a certain magnitude, which can change over time
- It has four forms: 1- bias 2- total failure 3- drifting 4- precision degradation
What are the consequences of errors?
1) untreated errors can lead to poor plant performance
2) poor control-system behavior
3) the benefits of optimization are not achieved
4) the process drifts to an uneconomic or unsafe state
5) the results of parameter estimation are wrong
What is the purpose of data reconciliation?
what are the methods used in data reconciliation?
and what does it require?
Purpose: To minimize the effects of random errors in the process data before they are used in applications.
Methods:
- use a process model
- use constraints
- estimate process variables so that they satisfy:
• mass balances
• energy balances
• other constraints
It requires redundancy: variables that are both measured and can be calculated from the other measurements.
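For the linear case, the reconciliation described above has a closed-form weighted-least-squares solution: adjust the measurements as little as possible (weighted by their variances) while forcing the balances to close. A minimal sketch with illustrative numbers, for a splitter where stream 1 = stream 2 + stream 3:

```python
import numpy as np

# Mass-balance constraint A y = 0 for a splitter: f1 - f2 - f3 = 0.
A = np.array([[1.0, -1.0, -1.0]])   # constraint matrix
y = np.array([101.0, 45.0, 54.0])   # measured flows; balance does not close
Sigma = np.diag([1.0, 0.5, 0.5])    # measurement variance-covariance matrix

# Weighted-least-squares estimate satisfying the constraints:
#   y_hat = y - Sigma A^T (A Sigma A^T)^-1 A y
V = A @ Sigma @ A.T
y_hat = y - Sigma @ A.T @ np.linalg.solve(V, A @ y)

print(y_hat)      # reconciled flows: [100. 45.5 54.5]
print(A @ y_hat)  # ~0: the mass balance now closes
```

Note that redundancy is what makes this possible: all three flows are measured, so each one could also be calculated from the other two.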
How are the variables classified?
They are classified into 5 types: 1- measured 2- redundant 3- unmeasured 4- observed 5- unobserved
What are the requirements of the gross error detection strategies?
- Detection problem: ability to detect the presence of one or more gross errors in the data
- Multiple gross error identification problem: ability to locate and identify multiple gross errors which may be present simultaneously in the data
- Estimation problem: ability to estimate the magnitude of the gross errors
What are the gross error detection methods?
1- Outlier detection: check the adjustments after reconciliation and compare with the 95 % confidence region
2- Global test: calculate a test statistic and compare with the χ² distribution
3- Nodal test: calculate statistics for each constraint
4- Measurement test: calculate statistics based on the adjustments
5- Generalized likelihood test: calculate statistics using the maximum likelihood principle
How are outliers detected?
- Perform data reconciliation
- For each adjustment a_i: if |a_i| > 2σ, then this measurement is a probable candidate for an erroneous measurement
- Remove the bad measurement and repeat the whole process
- Normal distribution: the interval ±2σ around the mean is the confidence interval within which about 95 % of all measurements should fall
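A minimal sketch of the 2σ outlier check (the adjustments and standard errors are illustrative numbers):

```python
import numpy as np

# Flag measurements whose reconciliation adjustment exceeds 2 * sigma.
y = np.array([101.0, 45.0, 54.0])       # measured values
y_hat = np.array([100.0, 45.5, 54.5])   # reconciled values
sigma = np.array([0.3, 0.5, 0.5])       # standard errors of the measurements

a = np.abs(y - y_hat)                   # adjustments |a_i|
suspects = np.where(a > 2 * sigma)[0]
print(suspects)  # [0]: the first measurement is a probable gross-error candidate
```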
How is a global test performed?
- Calculate the constraint residual r = A y
- Calculate its variance-covariance matrix V = A Σ Aᵀ
- Calculate the global test statistic γ = rᵀ V⁻¹ r
- Compare with the χ² distribution at significance level 0.05 = 1 − 0.95 and with degrees of freedom equal to the number of independent rows of A; if γ is larger, there probably are gross errors
- Assumption: if there are no gross errors, then r is normally distributed
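The steps above can be sketched as follows (numbers are illustrative; the χ² critical value for one degree of freedom at the 0.05 level is taken from standard tables):

```python
import numpy as np

# Global test on constraint residuals.
A = np.array([[1.0, -1.0, -1.0]])   # one independent constraint row
y = np.array([101.0, 45.0, 54.0])   # measurements
Sigma = np.diag([1.0, 0.5, 0.5])    # measurement variance-covariance matrix

r = A @ y                           # constraint residual r = A y
V = A @ Sigma @ A.T                 # Var(r) = A Sigma A^T
gamma = r @ np.linalg.solve(V, r)   # test statistic gamma = r^T V^-1 r

threshold = 3.841  # chi-squared critical value, df = 1, significance 0.05
print(gamma > threshold)  # False here: no gross error detected at this level
```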
What are the types of process problems, and what are the proposed solution strategies?
Problems are:
- simulation problems
- design problems
- optimization problems
- data reconciliation
Solution strategies are:
- sequential modular
- sequential modular with a non-linear solution for the tear variables
- equation oriented
Describe the sequential modular solver (SM) in Aspen. What kinds of problems does it have?
It produces a solution by performing block-by-block calculations:
- every block is solved independently
- when there are recycles, iterations are performed until the iteration streams (tear streams) no longer change
Problems:
- performs well only in pure simulation problems
- design problems: always an additional iteration loop for each design constraint
- optimization/reconciliation requires special arrangements
- not very flexible
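The tear-stream iteration of the sequential modular strategy can be sketched as a simple fixed-point loop (the block models and numbers are hypothetical, not from the source):

```python
# Flowsheet sketch: feed + recycle -> reactor -> separator -> (product, recycle)

def reactor(flow):
    # hypothetical block: 80 % of the inlet passes through
    return 0.8 * flow

def separator(flow):
    # hypothetical block: half of the outlet is recycled
    recycle = 0.5 * flow
    return flow - recycle, recycle   # (product, recycle)

feed = 100.0
recycle = 0.0   # initial guess for the tear stream

# Block-by-block calculation, repeated until the tear stream no longer changes.
for _ in range(100):
    out = reactor(feed + recycle)
    product, new_recycle = separator(out)
    if abs(new_recycle - recycle) < 1e-8:
        break
    recycle = new_recycle

print(round(recycle, 4))  # converged tear stream, 200/3 ≈ 66.6667
```

Direct substitution like this converges here because the recycle loop attenuates the signal (gain 0.4 per pass); the "non-linear solution for the tear variables" strategy replaces this loop with a faster root-finding method.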