Principles of measurement Flashcards
Describe the basic components of a standard measurement system.
measurement systems are designed to measure a variable, process it and display it for interpretation.
they consist of:
* input - e.g. pressure
* transducer - converts one form of energy to another, e.g. pressure into an electrical current
* transmission path - cables that transmit the current
* processor - amplification, filtering, A-D conversion
* display unit
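A minimal Python sketch of this chain, assuming a pressure input and an invented linear 4-20 mA transducer response (names and numbers are illustrative, not a real device API):

```python
# Minimal sketch of a measurement chain:
# input (pressure) -> transducer -> transmission -> processor -> display

def transducer(pressure_kpa: float) -> float:
    """Convert pressure to a current (one energy form to another).
    Assumed linear response: 4-20 mA over 0-100 kPa."""
    return 4.0 + 16.0 * (pressure_kpa / 100.0)

def processor(current_ma: float) -> float:
    """Amplify/scale back to a pressure reading in kPa."""
    pressure = (current_ma - 4.0) / 16.0 * 100.0
    return round(pressure, 1)          # crude A-D style quantisation

reading = processor(transducer(37.5))  # transmission path assumed lossless
print(f"Displayed pressure: {reading} kPa")
```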
what is the difference between analogue and digital signals?
2 types of data representation/signals:
analogue = a continuous waveform, e.g. a BP waveform
digital = a single numerical value, e.g. the MAP
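A small Python sketch of the distinction, modelling the analogue BP trace as a simple sinusoid (an assumption; real arterial waveforms are more complex) and deriving a single digital value (the MAP) by sampling it:

```python
import math

# Analogue: a continuous arterial BP waveform, modelled here as a
# sinusoid around a mean of 93 mmHg at ~72 beats/min (1.2 Hz).
def bp_waveform(t: float) -> float:
    return 93.0 + 27.0 * math.sin(2 * math.pi * 1.2 * t)

# Digital: sample the waveform at discrete points (A-D conversion),
# then reduce it to a single number such as the MAP.
samples = [bp_waveform(n / 100.0) for n in range(500)]  # 100 Hz for 5 s
map_value = sum(samples) / len(samples)
print(f"MAP ≈ {map_value:.0f} mmHg")  # one digital value from the waveform
```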
what are the requirements for an ideal clinical measuring system?
economical
- cheap, easy to use, does not require frequent calibration
accurate
- highly accurate
- reliable
- high signal-to-noise ratio
practical
- outputs in a convenient form, e.g. not too many decimal places
define accuracy
the degree to which the measured value agrees with the true value.
calibration is required to maintain accuracy.
define sensitivity
the degree to which the system detects small changes in the measured variable (the change in output per unit change in input). High sensitivity may or may not be beneficial, e.g. when recording a large range of values, responding to every small change can be a hindrance.
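A toy Python comparison, expressing sensitivity as the slope of the transfer curve (the gains below are invented):

```python
# Sensitivity = change in output per unit change in input.
def output_mv(pressure_mmhg: float, gain_mv_per_mmhg: float) -> float:
    return pressure_mmhg * gain_mv_per_mmhg

for gain in (0.5, 5.0):  # low vs high sensitivity
    delta = output_mv(101.0, gain) - output_mv(100.0, gain)
    print(f"gain {gain} mV/mmHg -> 1 mmHg change moves output {delta:.1f} mV")
# the high-sensitivity device makes small changes visible, which can be
# a hindrance when the input spans a large range
```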
define linearity..
the output is directly proportional to the input
define non-linearity..
the output is not proportional to the input; it changes disproportionately, e.g. an exponential relationship
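A short Python illustration of the two cases, using an invented gain and an exponential as the example non-linear response:

```python
import math

GAIN = 2.0

def linear(x: float) -> float:
    return GAIN * x      # output proportional to input

def non_linear(x: float) -> float:
    return math.exp(x)   # disproportionate change with input

for x in (1.0, 2.0, 3.0):
    print(x, linear(x), round(non_linear(x), 1))
# the linear output doubles/triples with the input; the exponential does not
```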
define drift
Drift refers to a progressive deviation of the output from the true value over time, i.e. accuracy is lost over time.
it may be:
* linear/offset error - the output is offset by the same amount across the range; requires one-point calibration
* gain/gradient error - the error increases/decreases as the input values increase; requires two-point calibration
caused by changes in the measurement equipment, e.g. ageing (both error types are simulated in the sketch below).
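A toy Python simulation of the two drift patterns, with invented values:

```python
true_values = [0.0, 25.0, 50.0, 75.0, 100.0]

offset_error = [v + 3.0 for v in true_values]   # constant offset
gain_error   = [v * 1.10 for v in true_values]  # error grows with input

print(offset_error)  # every reading 3 units high -> one-point calibration
print(gain_error)    # error is 0 at 0 but 10 at 100 -> two-point calibration
```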
define hysteresis
the phenomenon whereby the state of a system (or its output) depends not only on the current input value but also on the history of the system, e.g. whether the input is increasing or decreasing.
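A toy Python example (a thermostat with invented switching thresholds): at the same input of 20.0 the output differs depending on whether the temperature was rising or falling:

```python
# Output depends on the input AND on whether the heater was already on
# (the system's history).
def thermostat(temp_c: float, heater_on: bool) -> bool:
    if heater_on:
        return temp_c < 21.0  # stays on until temp rises past 21
    return temp_c < 19.0      # only switches on once temp falls below 19

state = False
for t in [20.0, 18.5, 20.0, 21.5, 20.0]:
    state = thermostat(t, state)
    print(t, state)
# at t = 20.0 the output is False, then True, then False - it depends on
# the direction from which the input approached 20.0
```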
define precision
Precision relates to the reproducibility of repeated measurements, i.e. high precision means little random error.
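A quick Python illustration, treating the spread (standard deviation) of invented repeated readings as a proxy for precision:

```python
from statistics import stdev

# Repeated measurements of the same true value (numbers invented).
precise   = [100.1, 99.9, 100.0, 100.1, 99.9]
imprecise = [97.0, 103.5, 99.0, 101.5, 104.0]

print(stdev(precise))    # small spread -> little random error -> precise
print(stdev(imprecise))  # large spread -> imprecise
```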
define bias..
depends on context
in data/measurement:
Bias is the difference between the expected (average) result of a measurement or estimation and the true value, e.g. if a thermometer consistently reads 2 degrees high, that is its bias.
in statistics it is the tendency to favour/disfavour an outcome based on unfair/uncontrolled variables.
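The measurement sense of bias in a few lines of Python, using the thermometer example (readings invented):

```python
from statistics import mean

true_temp = 37.0
readings = [39.1, 38.9, 39.0, 39.2, 38.8]  # invented thermometer readings

bias = mean(readings) - true_temp
print(f"bias = {bias:+.1f} °C")  # +2.0 °C: the thermometer reads 2° high
```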
draw graphs for:
* accurate and precise data
* accurate and imprecise
draw graphs for:
* inaccurate and precise data
* inaccurate and imprecise
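One way to sketch the four classic target-style scatter plots in Python (assuming matplotlib is available; all points are randomly generated around an invented true value):

```python
import random
import matplotlib.pyplot as plt

random.seed(0)

def cloud(cx, cy, spread, n=15):
    """Scatter n points around centre (cx, cy) with the given spread."""
    return ([cx + random.gauss(0, spread) for _ in range(n)],
            [cy + random.gauss(0, spread) for _ in range(n)])

# true value at the origin; biased (inaccurate) clouds are centred away from it
cases = {
    "accurate & precise":     cloud(0.0, 0.0, 0.2),
    "accurate & imprecise":   cloud(0.0, 0.0, 1.5),
    "inaccurate & precise":   cloud(3.0, 3.0, 0.2),
    "inaccurate & imprecise": cloud(3.0, 3.0, 1.5),
}

fig, axes = plt.subplots(2, 2, figsize=(8, 8))
for ax, (title, (xs, ys)) in zip(axes.flat, cases.items()):
    ax.scatter(xs, ys, s=15)
    ax.plot(0.0, 0.0, "rx", markersize=12)  # the true value
    ax.set_title(title)
    ax.set_xlim(-5, 6)
    ax.set_ylim(-5, 6)
plt.tight_layout()
plt.show()
```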
describe accuracy in terms of bias and precision …
an output needs to be both precise and unbiased (i.e. not drifted) to be accurate
what types of measurement error are there, and what calibration methods correct them?
- linear/offset error - the output is offset by the same amount across the range; requires one-point calibration
- gain/gradient error - the error increases/decreases as the input values increase; requires two-point calibration
what is calibration?
the process of comparing the output of a measuring device against a known standard to ensure the accuracy of the device.
devices vary in the level of calibration needed (one-point, two-point, multi-point) and in how often it is required (this depends on the drift).
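A sketch of one-point vs two-point calibration in Python (standards and readings invented):

```python
# One-point: compare one known standard to the device reading; corrects
# a constant (offset) error only.
standard = 0.0
reading_at_standard = 3.0
offset = reading_at_standard - standard

def one_point_correct(reading: float) -> float:
    return reading - offset

# Two-point: two known standards let us correct both gain (slope) and offset.
low_std, high_std = 0.0, 100.0
low_read, high_read = 2.0, 112.0
gain = (high_read - low_read) / (high_std - low_std)  # 1.10
offset2 = low_read - gain * low_std                   # 2.0

def two_point_correct(reading: float) -> float:
    return (reading - offset2) / gain

print(one_point_correct(53.0))  # 50.0 after removing the constant offset
print(two_point_correct(57.0))  # 50.0 after removing gain and offset error
```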