Midterm 1 Review Flashcards
Define Accuracy.
A measure of the closeness of a measured value to the true (accepted) value.
Define Precision.
A measure of reproducibility. The agreement between replicate measurements of the same quantity
Random Error?
Affects precision. Arises from small, uncontrollable fluctuations in the measurement, not from an assignable cause. The random nature of indeterminate errors makes it possible to treat these effects by statistical methods.
Systematic/Determinate Error?
Have a definite value and an assignable cause and are of the same magnitude for replicate measurements made in the same way. Systematic errors lead to bias in measurement results.
Confidence Interval?
Provides an expected range for the true mean: an interval surrounding an experimentally determined mean x̄ within which the population mean μ is expected to lie with a certain degree of probability.
Confidence limit?
The numbers at the upper and lower end of a confidence interval.
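The two definitions above can be sketched numerically. This is a minimal example using Student's t; the replicate values and the t-factor for 4 degrees of freedom are assumed for illustration only.

```python
import math

replicates = [10.1, 10.3, 9.9, 10.2, 10.0]  # hypothetical replicate measurements
n = len(replicates)
mean = sum(replicates) / n
# sample standard deviation (n - 1 degrees of freedom)
s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))

t_95 = 2.776  # Student's t for 95% confidence, 4 degrees of freedom
half_width = t_95 * s / math.sqrt(n)
# the confidence limits: upper and lower ends of the confidence interval
lower, upper = mean - half_width, mean + half_width
```

The interval (lower, upper) is the confidence interval; `lower` and `upper` themselves are the confidence limits.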
Null Hypothesis?
The hypothesis that there is no significant difference between two values; any observed difference is attributed to random error.
Selectivity?
Refers to the degree to which the method is free from interference by other species contained in the sample matrix. (How a method is influenced by other species in the sample)
Sensitivity?
The instrument's response to a change in analyte concentration. Affected by the slope and precision of the calibration curve.
Calibration Sensitivity?
For two methods with equal precision, the one with the steeper calibration curve is more sensitive.
S = mc + S_blank, where m is the calibration slope, c the analyte concentration, and S_blank the blank signal.
Analytical Sensitivity?
For two calibration curves of equal slope, the one with higher precision is more sensitive.
γ = m/s_S, where m is the slope and s_S the standard deviation of the signal.
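Both sensitivity measures can be computed from a calibration data set. A hedged sketch follows; the calibration points and the replicate signals are assumed example values, not data from any real method.

```python
import statistics

conc = [0.0, 1.0, 2.0, 3.0, 4.0]          # analyte concentrations (assumed)
signal = [0.10, 1.12, 2.08, 3.11, 4.09]   # measured signals (assumed)

# Least-squares slope m of the calibration curve = calibration sensitivity
n = len(conc)
mean_c, mean_s = sum(conc) / n, sum(signal) / n
m = (sum((c - mean_c) * (s - mean_s) for c, s in zip(conc, signal))
     / sum((c - mean_c) ** 2 for c in conc))

# Analytical sensitivity gamma = m / s_S, where s_S is the standard
# deviation of replicate signals at one concentration (assumed values)
replicate_signals = [2.08, 2.10, 2.06, 2.09]
s_S = statistics.stdev(replicate_signals)
gamma = m / s_S
```

Because gamma divides the slope by the signal noise, a steep but noisy curve can be less analytically sensitive than a shallower, more precise one.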
Limit of Detection (LOD)
The smallest analyte concentration (or amount) that can be detected with statistical confidence.
Limit of Quantification (LOQ)
The smallest analyte concentration (or amount) that can be quantified with statistical confidence.
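A common way to estimate these two limits is the 3s/10s rule based on the blank noise and the calibration slope. This is a sketch under assumed values; the blank readings and slope are illustrative only.

```python
import statistics

blank_signals = [0.020, 0.025, 0.018, 0.022, 0.021]  # replicate blank readings (assumed)
s_blank = statistics.stdev(blank_signals)
m = 0.997  # calibration slope, signal per unit concentration (assumed)

lod = 3 * s_blank / m    # limit of detection: smallest detectable concentration
loq = 10 * s_blank / m   # limit of quantification: smallest quantifiable concentration
```

Note that the LOQ is always larger than the LOD: quantifying a value to a given precision demands more signal than merely detecting its presence.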
Limit of Linearity (LOL)
The point at which the signal is no longer proportional to analyte concentration (the upper end of the linear dynamic range).
Matrix?
The component of the sample other than the analyte of interest.