Metrology Flashcards
What is Metrology?
It is the science and study of measurements.
Describe LEGAL Metrology.
It ensures the conservation of national standards and guarantees their accuracy by comparison with international standards.
Define Accuracy.
Accuracy is the maximum amount by which the result deviates from the true physical value, i.e. the closeness of the measured value to the true value, often expressed as a percentage.
Define Precision.
Precision is the degree of repeatability of the measuring process.
How to calculate error?
1) Error = Measured Value - True Value
2) % Error = (Measured Value - True Value) / True Value * 100
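The two formulas above can be sketched in a few lines of Python; the gauge-block values used here are illustrative, not from the cards.

```python
# Hypothetical example: a gauge block with a true length of 25.000 mm
# is measured as 25.012 mm (illustrative values only).
true_value = 25.000      # mm, accepted/true value
measured_value = 25.012  # mm, instrument reading

absolute_error = measured_value - true_value                   # Error = Measured - True
percent_error = (measured_value - true_value) / true_value * 100  # % Error

print(f"Absolute error: {absolute_error:.3f} mm")  # 0.012 mm
print(f"Percent error: {percent_error:.3f} %")     # 0.048 %
```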
What are the three measurement concepts?
1) Measurand: a physical quantity such as length, weight, and angle to be measured.
2) Comparator: to compare the measurand with a known standard for evaluation
3) Reference: the internationally accepted physical quantity or property against which quantitative comparisons are made.
Describe what calibration is.
Calibration is the procedure used to establish a relationship between the values of the quantities indicated by the measuring instrument and the corresponding values given by standards for specified conditions.
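A minimal sketch of that relationship, assuming a static calibration against known standards and a straight-line fit (the readings below are made-up illustrative data):

```python
# Relate instrument readings to standard (reference) values with a
# least-squares straight line, then use the line to correct raw readings.
standards = [0.0, 10.0, 20.0, 30.0, 40.0]  # reference values
readings  = [0.1, 10.3, 20.2, 30.5, 40.4]  # instrument indications (illustrative)

n = len(standards)
mean_x = sum(readings) / n
mean_y = sum(standards) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(readings, standards)) / \
        sum((x - mean_x) ** 2 for x in readings)
intercept = mean_y - slope * mean_x

def calibrated(reading):
    """Convert a raw reading to a corrected value via the calibration line."""
    return slope * reading + intercept

print(f"corrected(20.2) = {calibrated(20.2):.2f}")
```

The fit is the relationship between indicated and standard values for the specified conditions; a real calibration would also record those conditions and the residual uncertainty.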
What is the difference between Static calibration and dynamic calibration?
Static calibration is when the variables involved remain constant while calibrating a given instrument, i.e. it is not time-dependent. If the value is time-dependent, or time-based information is required, the calibration is dynamic.
Define:
- Systematic/controllable errors
- Avoidable errors
Systematic/controllable errors are errors that deviate by a fixed amount from the true value and can be controlled in both their magnitude and direction.
Avoidable errors are errors that can be eliminated with care, for example reading errors, the parallax effect, misalignment, and zero errors.
What are random errors and name the potential sources?
Random errors provide a measure of random deviation when measurements of a physical quantity are carried out with variable magnitudes/directions.
Potential sources:
- Transient fluctuations in the instrument
- Play in the linkages of the instrument
- Errors in reading the fractional part of engraved scale divisions
When is a system free from hysteresis?
When the measured quantity remains the same irrespective of whether the measurements were obtained in ascending or descending order.
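A hysteresis check can be sketched by comparing readings taken at the same input levels while loading up and unloading down; the data here is illustrative, and a hysteresis-free system would show zero difference at every level.

```python
# Readings at the same input levels, taken ascending then descending
# (illustrative values only).
ascending  = [0.0, 5.1, 10.2, 15.1, 20.0]
descending = [20.0, 15.4, 10.6, 5.5, 0.2]  # recorded from full scale back to zero

# Pair each input level's ascending and descending readings and take the
# worst-case difference as the hysteresis of the system.
hysteresis = max(abs(u - d) for u, d in zip(ascending, reversed(descending)))
print(f"Max hysteresis: {hysteresis:.1f}")
```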
Define Linearity.
Linearity is defined as the maximum deviation of the output of the measuring system from a specified straight line applied to a plot of the data points.
What’s the difference between an endpoint line and a terminal line?
An endpoint line is drawn through the end points of the data and may have a non-zero y-intercept, whereas a terminal line passes through the origin.
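The two cards above can be combined in a short sketch: fit the endpoint line through the first and last data points, then report linearity as the maximum deviation from it. The data is illustrative.

```python
# Linearity with respect to the end-point line: the straight line through
# the first and last data points (illustrative values only).
inputs  = [0.0, 1.0, 2.0, 3.0, 4.0]
outputs = [0.0, 1.1, 2.3, 3.1, 4.0]

# End-point line through (inputs[0], outputs[0]) and (inputs[-1], outputs[-1]).
# Because it starts at (0, 0) here, it happens to coincide with a terminal line.
slope = (outputs[-1] - outputs[0]) / (inputs[-1] - inputs[0])
intercept = outputs[0] - slope * inputs[0]

# Linearity = maximum deviation of the output from the reference line
deviation = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
print(f"Nonlinearity (end-point line): {deviation:.1f}")
```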
what is the resolution of measuring instruments?
It is the smallest change in physical property that an instrument can sense.
What is Threshold?
Threshold is the minimum numerical value of the input that causes a detectable change in the output.
What is drift and what may cause it?
Drift can be defined as the variation caused in the output of an instrument, which is not caused by any change in the input.
It can be caused by internal temperature variations and a lack of component stability.
what is zero stability?
It is the ability of an instrument to return to the zero reading after the input signal comes back to the zero value and other variations due to temperature, pressure, vibration, and magnetic effects have been eliminated.
Define Loading effects.
The loading effect is defined as the incapability of a measuring system to faithfully measure, record, or control the measurand in an undistorted form.
State what the Dynamic Response of a system is and name the two different types.
The dynamic response is the behaviour of the measuring system under the varying conditions of input with respect to time.
Steady-state periodic quantity - has a defined, repeating time cycle
Transient quantity - a magnitude whose time variation does not repeat
Name and describe the Three characteristics of a measuring system.
Speed of response - the speed with which the measuring instrument responds to the changes in the measured quantity
Measuring lag - the delay between a change in the measured quantity and the instrument beginning to respond to it.
Time delay - Time for the measuring system to respond after a dead time to the applied input
State the three stages of a measurement system and their functions.
- Primary detector-transducer stage - senses the physical quantity (input signal) and transforms it into an analogous signal. Known as a transducer or sensor; it should respond only to the input quantity of interest.
- Intermediate modifying stage - the signal is modified and amplified appropriately with the help of conditioning, increasing the signal-to-noise ratio, and processing devices.
- Output or terminating stage - presents the value of the output that is analogous to the input value.