Week 1 Flashcards
it is the process of quantifying or assigning numerical values to the characteristics or properties of a phenomenon
Measurement
measurement can be used for
precision and objectivity
comparisons and analysis
communication
it provided standard weights and measures and served as the national physical laboratory for the United States
National Bureau of Standards (1901-1988)
The NBS took custody of the copies of the
kilogram and meter bars
The NBS developed measurements for
electrical units and light measurement
is an agency of the United States Department of Commerce whose mission is to promote American innovation and industrial competitiveness
National Institute of Standards and Technology (NIST)
NIST activities are organized into physical science laboratory programs that include
- nanoscale science and technology
- engineering
- information technology
- neutron research
- material measurement
- physical measurement
NIST roles
- Maintaining standards
- Calibration services
- Certification
- Research and development
- Dissemination of information
- International collaboration
SRMs stands for
Standard Reference Materials
is the art of testing the validity of measurements by an instrument in normal operation by comparing them with measurements made by a primary or secondary standard
calibration
is the scheduled adjustment of instruments to maintain accuracy and reliability by comparing their outputs to a reference standard
routine calibration
this process is essential for ensuring precise measurements and compliance with industry standards
routine calibration
Routine calibration steps
- Visual inspection for obvious physical defects
- Visual inspection for proper installation and application in accordance with the manufacturer's specifications
- Zero setting of all indicators
- Leveling of devices that require this precaution
- Operational test to detect major defects
Refers to the degree of agreement between the measured value and the true (accepted) value
Accuracy
Closeness with which the reading approaches the true value or standard.
Accuracy
Refers to the degree of agreement of a set or group of measurements among themselves
Precision
Describes the reproducibility of results, that is, the agreement between numerical values that have been measured in exactly the same way
Precision
is the numerical difference
between the indicated or measured value and the true value
Error
Types of Errors
(1) gross errors
(2) determinate/systematic errors
(3) indeterminate/random/accidental errors
Errors that are so serious that there is no real alternative to abandoning the experiment.
Gross Errors
(type of error)
instrument breakdown
gross error
Are caused by a miscalibrated instrument that affects all measurements, or by the design of the experiment
Determinate/Systematic Errors
(type of error)
using a thermometer that consistently reads a degree
Celsius higher than the actual temperature.
Determinate/Systematic Errors
a scale that consistently reads 1% higher than the true weight of the object
Determinate/Systematic Errors
caused by nonideal instrument behavior, by faulty calibration, or by use under inappropriate conditions
instrumental errors
caused either by carelessness, lack of
experience, or bias on the part of the observer
personal errors
due to incorrect application and faulty installation
application errors
Naturally occurring errors that are to be expected with any experiment
random errors
(type of error)
parallax errors
random errors
(type of error)
environmental errors
random errors
(type of error)
instrumental limitation
random error
difference between the measured value and the true value and is reported in the same units as the measurement
absolute error
ways of expressing accuracy
- absolute error
- relative error
the ratio of the absolute error to the true value
relative error
formula of finding the absolute error
E = measured value - true value
formula of finding the relative error
Relative error = (measured value - true value) / true value
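The two formulas above can be sketched in a few lines of Python; the thermometer readings in the example are hypothetical, chosen only to illustrate the calculation.

```python
def absolute_error(measured, true):
    """Absolute error E = measured value - true value (same units as the measurement)."""
    return measured - true

def relative_error(measured, true):
    """Relative error = absolute error / true value (dimensionless)."""
    return (measured - true) / true

# Hypothetical example: a thermometer reads 25.4 C when the true value is 25.0 C.
print(absolute_error(25.4, 25.0))  # ~0.4 C
print(relative_error(25.4, 25.0))  # ~0.016, i.e. about 1.6%
```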
(true or false)
large static errors are desirable
false
(undesirable)
the deviation of the instrument reading from the true value
static error
is the degree of closeness with which the same value of a variable can be measured at different times
reproducibility
(true or false)
perfect reproducibility signifies that the instrument has no drift
true
means a gradual separation of the measured value from the calibrated value, usually after a long interval of time
drift
important property of instruments which is determined by design
sensitivity
The numerical value of the sensitivity is influenced
by the
requirements of instrument application
It reflects how sensitive the meter is to alterations in the quantity being measured
responsiveness
refers to the minimal change in the measured quantity required to cause a noticeable shift in the meter's indication
responsiveness
can be defined as the interval between the lowest and highest readings the instrument can measure
range
the difference between range values
span
maximum - minimum
span
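The span formula (maximum minus minimum) can be shown with a one-line sketch; the thermometer range used here is a hypothetical example.

```python
def span(minimum, maximum):
    """Span = maximum reading - minimum reading of an instrument's range."""
    return maximum - minimum

# Hypothetical example: a thermometer with a range of -50 C to 150 C.
print(span(-50, 150))  # 200
```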
is the number of figures that should be retained as valid and is dependent on the probable error associated with the observation or reading
significant figures
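A common related task is rounding a reading to a given number of significant figures; the `round_sig` helper below is written for this sketch (it is not a standard library function).

```python
import math

def round_sig(x, n):
    """Round x to n significant figures (illustrative helper)."""
    if x == 0:
        return 0.0
    # Position of the leading digit determines how many decimal places to keep.
    exponent = math.floor(math.log10(abs(x)))
    return round(x, n - 1 - exponent)

print(round_sig(0.0123456, 3))  # 0.0123
print(round_sig(98765, 3))      # 98800
```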
The art and science of applying measuring instruments and controlling devices to a system or process for the purpose of determining the identity or magnitude of certain varying physical quantities or chemical phenomena
instrumentation
refers to devices as simple as direct-reading thermometers, or as complex as multi-sensor components of industrial control systems
instrumentation