QUALITY ASSURANCE and QUALITY CONTROL Flashcards
Process by which the lab ensures quality results by closely monitoring preanalytical, analytical, & postanalytical stages of testing
QUALITY ASSURANCE
Examples of Preanalytical phase; everything that precedes test performance:
- Test ordering
- Patient preparation
- Patient ID
- Specimen collection
- Specimen transport
- Specimen processing
Examples of Analytical phase; everything related to assay:
- Test analysis
- Quality control (QC)
- Reagents
- Calibration
- Preventive maintenance (involves machines)
Examples of Postanalytical phase; everything that comes after test analysis:
- Verification of calculations & reference ranges
- Review of results
- Notification of critical values
- Result reporting
- Test interpretation by physician
- Follow-up patient care
- Delta check
Part of analytical phase of quality assurance; process of monitoring results from control samples to verify accuracy of patient results
Quality Control (QC)
It is the average of data points:
Mean
It is the midpoint of distribution
Median
It is the most frequent observation
Mode
It is the difference between highest and lowest value; easiest measure of spread
Range
Most frequently used measure of variation:
Standard Deviation (SD)
An index of precision used to compare the dispersion of two or more groups of data with different units/concentrations
Coefficient of variation (CV)
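The descriptive statistics above can be sketched with Python's standard library, using a hypothetical set of QC results (values are illustrative only):

```python
import statistics

# Hypothetical QC results for a glucose control (mg/dL)
qc = [100, 102, 98, 101, 99, 103, 100, 97]

mean = statistics.mean(qc)            # average of data points
median = statistics.median(qc)        # midpoint of the distribution
mode = statistics.mode(qc)            # most frequent observation
value_range = max(qc) - min(qc)       # highest minus lowest value
sd = statistics.stdev(qc)             # standard deviation (sample)
cv = sd / mean * 100                  # coefficient of variation, in %

print(f"mean={mean}, median={median}, mode={mode}, "
      f"range={value_range}, SD={sd:.2f}, CV={cv:.2f}%")
```

Because CV is expressed as a percentage of the mean, it lets you compare dispersion between analytes with different units or concentrations, which raw SD cannot.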
Used to determine if there is a significant difference between the means of two groups of data; compares accuracy
T-test
mnemonic: “ATM”
A - Accuracy
T - T-test
M - Mean
Used to determine if there is a significant difference between the SD of two groups of data; compares precision:
F-test
mnemonic: “SPF”
S - SD
P - Precision
F - F-test
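A minimal sketch of both statistics, computed by hand on two hypothetical methods' control results (the data and method names are assumptions; only the test statistics are computed, not p-values):

```python
import statistics

# Two hypothetical sets of control results from two methods (mg/dL)
method_a = [100, 101, 99, 100, 102, 98]
method_b = [103, 104, 102, 105, 103, 103]

# F-test statistic: ratio of the larger variance to the smaller one
# (compares the precision / SD of the two groups)
var_a = statistics.variance(method_a)
var_b = statistics.variance(method_b)
f_stat = max(var_a, var_b) / min(var_a, var_b)

# Two-sample t-test statistic (equal-variance form):
# compares the means of the two groups (accuracy)
n_a, n_b = len(method_a), len(method_b)
mean_a, mean_b = statistics.mean(method_a), statistics.mean(method_b)
pooled_var = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
t_stat = (mean_a - mean_b) / (pooled_var * (1 / n_a + 1 / n_b)) ** 0.5

print(f"F = {f_stat:.2f}, t = {t_stat:.2f}")
```

Each statistic is then compared against a critical value from the F or t table at the chosen significance level to decide whether the difference is significant.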
____% of the data fall between +/-1 SD from the mean
68%
_____% of the data fall between +/-2 SDs from the mean.
95%
_____% of the data fall between +/-3 SDs from the mean
99.7%
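These percentages (more precisely 68.3%, 95.4%, and 99.7%) follow from the normal distribution and can be reproduced with the error function:

```python
import math

# Fraction of a normal distribution within +/- k SDs of the mean
for k in (1, 2, 3):
    frac = math.erf(k / math.sqrt(2))
    print(f"+/-{k} SD: {frac * 100:.1f}%")
```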
Relationship of SD with Dispersion:
Directly proportional
Relationship of SD with Precision:
Inversely proportional
Nearness or closeness of assayed values to the true value:
Accuracy
Nearness or closeness of assayed values to each other
Precision (Reproducibility)
Ability of an analytical method to maintain accuracy and precision over an extended period of time
Reliability
Degree by which a method can easily be repeated
Practicability
Ability to measure the smallest concentration of the analyte of interest
Analytical sensitivity
Ability to measure only the analyte of interest
Analytical specificity
Also known as linearity; range of values over which lab can verify accuracy of test system
Reportable range
Formerly called 'normal values'; can vary for different patient populations.
Reference interval
Reference interval is established by testing minimum of ____ healthy subjects & determining range in which 95% fall
120 healthy subjects
Verifying a reference interval requires how many study individuals?
As few as 20 study individuals
(as few as 40 study individuals, per Bishop, 8th edition)
Reporting a positive result in a patient who has the disease
True Positive (TP)
Reporting a positive result in a patient who doesn’t have the disease
False Positive (FP)
Reporting a negative result in a patient who doesn’t have the disease
True negative
Reporting a negative result in a patient who has the disease
False negative
% of population with the disease that test positive; ability of the analytical method to detect the proportion of individuals with the disease
Diagnostic Sensitivity
Formula of Dx. Sensitivity:
Dx. Sensi = TP / (TP + FN) x 100
% of population without the disease that test negative; ability of the analytical method to detect the proportion of individuals without the disease
Diagnostic Specificity
Formula of Dx. Specificity:
Dx. Speci = TN / (TN + FP) x 100
Important in “ruling out” the disease and selecting screening test:
Diagnostic Sensitivity
Important in “ruling in” the disease and for confirmatory test:
Diagnostic Specificity
% of time that a positive result is correct; probability that a positive result is a true positive
Positive Predictive Value (PPV)
Formula for PPV:
PPV = TP / (TP + FP) x 100
% of time that a negative result is correct; probability that a negative result is a true negative
Negative Predictive Value (NPV)
Formula for NPV:
NPV = TN / (TN + FN) x 100
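The four diagnostic-performance formulas above can be computed together from a 2x2 table of outcomes; the counts below are hypothetical:

```python
# Hypothetical 2x2 outcome counts from a validation study
TP, FP, TN, FN = 90, 5, 95, 10

sensitivity = TP / (TP + FN) * 100  # % of diseased patients testing positive
specificity = TN / (TN + FP) * 100  # % of disease-free patients testing negative
ppv = TP / (TP + FP) * 100          # % of positive results that are correct
npv = TN / (TN + FN) * 100          # % of negative results that are correct

print(f"Sensitivity={sensitivity:.1f}%  Specificity={specificity:.1f}%  "
      f"PPV={ppv:.1f}%  NPV={npv:.1f}%")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the tested population.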
Assayed on a regular schedule to verify that a laboratory procedure is performing correctly:
QC samples
For new instrument or new lot of reagents, analyze QC materials for ____ days
20 days
Note: also make new LJ chart for new reagent/instrument
Characteristics of ideal QC materials:
- Must resemble human samples
- Inexpensive and stable for long periods
- No communicable disease
- No known matrix effects
- With known analyte concentrations (for assayed controls)
- Convenient packaging for easy dispensing and storage
Most common presentation for evaluating QC results; shows each QC result sequentially over time; also called a Shewhart plot
The Levey-Jennings Control Chart
When do you stop plotting on the LJ chart?
When a new reagent lot or instrument is introduced (a new chart is then started).
Errors observed on LJ charts:
Trend
Shift
Outliers
LJ chart error where control values are increasing or decreasing for six consecutive runs
Trend
Main cause of trend error:
Deterioration of reagents
LJ chart error where six consecutive values are on the same side of the mean
Shift
Main cause of shift error:
Improper calibration of instrument
Highly deviating values; control result outside established limits:
Outliers
1 control > ±2s from the mean; warning flag of a possible change in accuracy or precision
1(2s)
1 control > ±3s from the mean
1(3s)
2 consecutive controls > 2s from the mean on the same side
2(2s)
2 consecutive controls differ by > 4s
R(4s)
4 consecutive controls > 1s from the mean on the same side
4(1s)
10 consecutive controls on the same side of the mean
10x
Example of Westgard rules that are random errors:
1(2s)
1(3s)
R(4s)
Example of Westgard rules that are systematic errors:
2(2s)
4(1s)
10x
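The Westgard rules above can be sketched as checks on a run of control z-scores. This is a simplified, single-control illustration (assumed data; real QC software also applies the rules across control levels and across runs):

```python
def westgard_flags(z_scores):
    """Flag common Westgard rule violations in a run of control
    z-scores, where z = (result - mean) / SD."""
    z = list(z_scores)
    flags = set()
    last = z[-1]
    if abs(last) > 2:
        flags.add("1(2s)")      # warning: possible accuracy/precision change
    if abs(last) > 3:
        flags.add("1(3s)")      # random error
    if len(z) >= 2:
        prev = z[-2]
        if abs(prev) > 2 and abs(last) > 2 and prev * last > 0:
            flags.add("2(2s)")  # systematic error: same side of mean
        if abs(prev - last) > 4:
            flags.add("R(4s)")  # random error: range rule
    if len(z) >= 4 and (all(v > 1 for v in z[-4:])
                        or all(v < -1 for v in z[-4:])):
        flags.add("4(1s)")      # systematic error
    if len(z) >= 10 and (all(v > 0 for v in z[-10:])
                         or all(v < 0 for v in z[-10:])):
        flags.add("10x")        # systematic error (shift)
    return flags

print(westgard_flags([0.4, 2.5]))   # warning only
print(westgard_flags([2.1, 2.3]))   # warning plus 2(2s) rejection
```

A typical multirule strategy uses 1(2s) as the warning trigger and the remaining rules to decide whether the run is rejected.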
Type of error that is present in all measurements; due to chance; no means of predicting it; error that doesn’t recur in regular pattern:
Random error
Error that influences ALL observations consistently in one direction; recurring error inherent in the test procedure
Systematic error
Examples of Random errors:
- Error due to dirty glassware
- Use of wrong pipet
- Voltage fluctuation
- Sampling error
- Anticoagulant or drug interference
Examples of Systematic errors:
- Dirty photometer
- Faulty ISE
- Evaporation or contamination of standards or reagents
Also known as external quality assessment; consists of evaluation of method performance by comparison of results versus those of other laboratories for the same set of samples:
Proficiency testing
Components of a QA program:
- Patient identification
- Collection of samples
- Testing
- Delta checks
- Critical values/Panic values
- Data reporting
- Preventive maintenance
Comparison of patient data with previous results:
Delta checks
Failed delta check deviation:
> 20% deviation
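A delta check can be sketched as a simple percent-deviation comparison; the 20% limit mirrors the figure above, though in practice limits are analyte-specific (an assumption in this sketch):

```python
def delta_check(current, previous, limit_pct=20.0):
    """Compare a patient's current result to the prior result.
    Returns (percent deviation, passed); a failed check flags the
    result for review (possible sample mix-up or error)."""
    deviation = abs(current - previous) / abs(previous) * 100
    return deviation, deviation <= limit_pct

print(delta_check(130, 100))  # (30.0, False) -> failed delta check
print(delta_check(105, 100))  # (5.0, True)   -> passed
```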