Metrology Flashcards
What is Metrology?
It is the science of measurement.
Describe legal metrology.
It ensures the conservation of national standards and guarantees their accuracy in comparison with international standards.
Define Accuracy.
Accuracy is the maximum amount by which the result varies from the true physical value, i.e. the nearness of the measured value to its true value, often expressed as a percentage.
Define Precision.
Precision is the degree of repeatability of the measuring process.
How to calculate error?
1) Error = Measured value - True value
2) Error (%) = (Measured value - True value) / True value * 100
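A minimal Python sketch of both formulas, using made-up illustrative numbers:

    # Illustrative, made-up values: an instrument reading vs. the true value
    measured_value = 25.04   # mm, value indicated by the instrument
    true_value = 25.00       # mm, accepted true value

    error = measured_value - true_value                               # formula 1
    percent_error = (measured_value - true_value) / true_value * 100  # formula 2

    print(f"Error: {error:.2f} mm")          # 0.04 mm
    print(f"Error: {percent_error:.2f} %")   # 0.16 %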
What are the three measurement concepts?
1) Measurand: the physical quantity to be measured, such as length, weight, or angle.
2) Comparator: the means of comparing the measurand with a known standard for evaluation.
3) Reference: the internationally accepted physical quantity or property to which quantitative comparisons are made.
Describe what calibration is.
Calibration is the procedure used to establish a relationship between the values of the quantities indicated by the measuring instrument and the corresponding values given by standards for specified conditions.
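As an illustration only (the data and the straight-line model below are assumptions, not part of the definition), a static calibration is often summarized by fitting a line through paired standard/indicated values and then using it to correct new readings:

    import numpy as np

    # Hypothetical calibration data: values given by the standard vs.
    # the corresponding values indicated by the instrument
    standard = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
    indicated = np.array([0.2, 10.1, 20.3, 30.2, 40.4])

    # Least-squares straight line: indicated = slope * standard + offset
    slope, offset = np.polyfit(standard, indicated, 1)

    # Invert the relationship to correct a new instrument reading
    reading = 25.3
    corrected = (reading - offset) / slope
    print(f"corrected value: {corrected:.2f}")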
What is the difference between Static calibration and dynamic calibration?
Static calibration is when the variable involved remains constant while calibrating a given instrument, i.e. it is not time-dependent. If the value is time-dependent, or time-based information is required, the calibration is dynamic.
Define:
- Systematic/controllable errors
- Avoidable errors
Systematic/controllable errors are errors that deviate by a fixed amount from the true value and can be controlled in both their magnitude and direction.
Avoidable errors are errors made in taking the reading, for example reading errors, the parallax effect, misalignment, and zero errors.
What are random errors, and what are their potential sources?
Random errors are deviations of variable magnitude and direction that occur when repeated measurements of a physical quantity are carried out.
Potential sources:
- Transient fluctuations in the instrument
- Play in the linkages of the instrument
- Errors in reading the fractional part of engraved scale divisions
When is a system free from hysteresis?
When the measured quantity remains the same irrespective of whether the measurements have been obtained in ascending or descending order.
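To make this concrete, here is a minimal sketch with hypothetical readings: the same inputs are measured on an ascending sweep and then a descending sweep, and any gap between the two is hysteresis.

    # Hypothetical readings at the same input points, taken while
    # increasing (ascending) and then decreasing (descending) the input
    ascending  = [0.0, 10.1, 20.2, 30.1, 40.0]
    descending = [0.3, 10.4, 20.5, 30.3, 40.0]

    # Hysteresis at each point = gap between the two sweeps
    gaps = [abs(a - d) for a, d in zip(ascending, descending)]
    print(f"max hysteresis: {max(gaps):.2f}")  # 0.30 here; 0.00 for a hysteresis-free system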
Define Linearity.
Linearity is defined as the maximum deviation of the output of the measuring system from a specified straight line applied to a plot of the data points.
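A minimal sketch of this definition, assuming a least-squares reference line and made-up data points:

    import numpy as np

    # Hypothetical input/output data from a measuring system
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.1, 1.0, 2.2, 2.9, 4.1])

    # Reference straight line fitted to the data (one common choice)
    slope, offset = np.polyfit(x, y, 1)

    # Linearity error = maximum deviation of the output from that line
    linearity = np.max(np.abs(y - (slope * x + offset)))
    print(f"linearity error: {linearity:.3f}")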
What’s the difference between an endpoint line and a terminal line?
An endpoint line has a y-intercept, whereas a terminal line passes through the origin.
What is the resolution of measuring instruments?
It is the smallest change in a physical property that an instrument can sense.
What is threshold?
Threshold is the numerical value of the input that causes a detectable change in the output.