Week 1 Flashcards

1
Q

it is a process of quantifying or assigning numerical values to the characteristics or properties of a phenomenon

A

Measurement

2
Q

measurement can be used for

A

precision and objectivity
comparisons and analysis
communication

3
Q

it provided standard measures and weights; served as the national physical laboratory for the United States

A

National Bureau of Standards (1901-1988)

4
Q

The NBS took custody of the copies of

A

kilogram and meter bars

4
Q

The NBS developed measurements for

A

electrical units and light measurement

5
Q

is an agency of the United States Department of Commerce whose mission is to promote American innovation and industrial competitiveness

A

National Institute of Standards and Technology (NIST)

6
Q

NIST activities are organized into physical science laboratory programs that include

A
  • nanoscale science and technology
  • engineering
  • information technology
  • neutron research
  • material measurement
  • physical measurement
7
Q

NIST roles

A
  • Maintaining standards
  • Calibration services
  • Certification
  • Research and development
  • Dissemination of information
  • International collaboration
8
Q

SRMs stands for

A

Standard Reference Materials

9
Q

is the art of testing the validity of measurements by an instrument in normal operation by comparison with measurements made by a primary or secondary standard

A

calibration
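
A minimal illustration of the idea (not part of the original cards): the hypothetical Python sketch below compares readings from an instrument under test against values reported by a reference standard and estimates the instrument's average offset. All data values are invented.

    from statistics import mean

    # Hypothetical paired data: readings from the instrument under test
    # and the corresponding values reported by a reference standard.
    instrument_readings = [10.2, 20.4, 30.1, 40.5, 50.3]
    reference_values = [10.0, 20.0, 30.0, 40.0, 50.0]

    # Error of each reading relative to the standard.
    errors = [meas - ref for meas, ref in zip(instrument_readings, reference_values)]

    # A simple calibration check: the mean error estimates the instrument's bias,
    # so the correction to apply is -bias.
    bias = mean(errors)
    print("errors:", [round(e, 2) for e in errors])
    print(f"estimated bias: {bias:+.3f}")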

10
Q

is the scheduled adjustment of instruments to maintain accuracy and reliability by comparing their outputs to a reference standard

A

routine calibration

11
Q

this process is essential for ensuring precise measurements and compliance with industry standards

A

routine calibration

12
Q

Routine calibration procedures

A
  • Visual inspection for obvious physical defects
  • Visual inspection for proper installation and application
    in accordance with manufacturer’s specifications
  • Zero setting of all indicators
  • Leveling of devices which require this precaution
  • Operational test to detect major defects
13
Q

Refers to the degree of agreement between the measured value and the true (accepted) value

A

Accuracy

14
Q

Closeness with which the reading approaches the true value or standard.

A

Accuracy

15
Q

Refers to the degree of agreement of a set or group of measurements among themselves

A

Precision

16
Q

Describes the reproducibility of results, that is, the agreement between numerical values that have been obtained in exactly the same way

A

Precision
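
To make the accuracy/precision distinction concrete, here is a small Python sketch with invented readings: accuracy compares the readings to the true (accepted) value, while precision looks only at the agreement of the readings among themselves.

    from statistics import mean, stdev

    true_value = 50.0  # accepted value, assumed for the example

    # Invented repeat measurements from two instruments.
    readings_a = [49.9, 50.1, 50.0, 49.8, 50.2]  # accurate and precise
    readings_b = [52.1, 52.0, 52.2, 51.9, 52.0]  # precise but not accurate

    for name, readings in (("A", readings_a), ("B", readings_b)):
        mean_error = mean(readings) - true_value  # agreement with the true value
        spread = stdev(readings)                  # agreement among the readings
        print(f"Instrument {name}: mean error = {mean_error:+.2f}, spread = {spread:.2f}")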

17
Q

is the numerical difference
between the indicated or measured value and the true value

A

Error

18
Q

Types of Errors

A

(1) gross errors
(2) determinate/systematic errors
(3) indeterminate/random/accidental errors

19
Q

Errors that are so serious that there is no real alternative to abandoning the experiment.

A

Gross Errors

20
Q

(type of error)
instrument breakdown

A

gross error

21
Q

Are caused by a miscalibrated instrument that affects all
measurements, or by the design of the experiment.

A

Determinate/Systematic Errors

22
Q

(type of error)
using a thermometer that consistently reads a degree
Celsius higher than the actual temperature.

A

Determinate/Systematic Errors

23
Q

a scale that consistently reads 1% higher than the true weight of the object

A

Determinate/Systematic Errors

24
Q

caused by nonideal instrument behavior, by faulty calibrations, or by use under inappropriate conditions

A

instrumental errors

25
Q

caused either by carelessness, lack of
experience, or bias on the part of the observer

A

personal errors

26
Q

due to incorrect application and faulty installation

A

application errors

27
Q

Naturally occurring errors that are to be expected with any experiment

A

random errors

28
Q

(type of error)
parallax errors

A

random errors

29
Q

(type of error)
environmental errors

A

random errors

30
Q

(type of error)
instrumental limitation

A

random error
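
The contrast between systematic and random errors can be illustrated with a short simulation (a sketch with made-up parameters): a systematic error shifts every reading by the same amount, while random errors scatter the readings around that shifted value.

    import random

    random.seed(0)
    true_value = 100.0  # assumed true value for the simulation
    bias = 1.0          # systematic error: constant offset, e.g. a miscalibrated scale
    noise_sd = 0.5      # random error: standard deviation of the scatter

    readings = [true_value + bias + random.gauss(0, noise_sd) for _ in range(5)]
    print("readings:", [round(r, 2) for r in readings])

    # Every reading is pushed about 1.0 high (the systematic part); averaging many
    # readings reduces the random scatter but does not remove the bias.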

31
Q

difference between the measured value and the true value and is reported in the same units as the measurement

A

absolute error

32
Q

ways of expressing accuracy

A
  • absolute error
  • relative error
33
Q

the ratio of the absolute error to the true value

A

relative error

34
Q

formula for finding the absolute error

A

E = measured value - true value

35
Q

formula for finding the relative error

A

Relative error = (measured value - true value) / true value
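
A small worked example tying the two formulas together, with values invented for illustration:

    measured_value = 19.8
    true_value = 20.0

    absolute_error = measured_value - true_value   # E = measured value - true value
    relative_error = absolute_error / true_value   # absolute error / true value

    print(f"absolute error: {absolute_error:+.2f}  (same units as the measurement)")
    print(f"relative error: {relative_error:+.2%}")  # often quoted as a percentage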

36
Q

(true or false)
large static errors are desirable

A

false
(undesirable)

37
Q

the deviation of the instrument reading from the true value

A

static error

38
Q

is the degree of closeness with which the same value of a variable can be measured at different times

A

reproducibility

39
Q

(true or false)
perfect reproducibility signifies that the instrument has no drift

A

true

40
Q

means a gradual separation of the measured value from the calibrated value, usually after a long interval of time

A

drift

41
Q

important property of instruments which is determined by design

A

sensitivity

42
Q

The numerical value of the sensitivity is influenced
by the

A

requirements of instrument application

43
Q

It reflects how sensitive the meter is to alterations in the quantity being measured

A

responsiveness

44
Q

refers to a minimal change in the measured quantity required to cause a noticeable shift in meter’s indication

A

responsiveness

45
Q

can be defined as the measure of the instrument between the lowest and highest readings it can measure

A

range

46
Q

the difference between range values

A

span

47
Q

maximum - minimum

A

span
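
As a quick illustration of how these quantities relate, here is an assumed example (the thermometer limits are invented): the range runs from the lowest to the highest reading the instrument can indicate, and the span is the difference between those two values.

    # Assumed example: a thermometer that can indicate from -20 degC to 120 degC.
    lowest_reading = -20.0    # lower range value
    highest_reading = 120.0   # upper range value

    span = highest_reading - lowest_reading  # maximum - minimum

    print(f"range: {lowest_reading} degC to {highest_reading} degC")
    print(f"span: {span} degC")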

48
Q

is the number of figures that should be retained as valid and is dependent on the probable error associated with the observation or reading

A

significant figures
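
Rounding a result to a chosen number of significant figures can be done with a short helper; the function below is a hypothetical illustration, not something referenced in the cards.

    from math import floor, log10

    def round_sig(value: float, figures: int) -> float:
        """Round value to the given number of significant figures."""
        if value == 0:
            return 0.0
        # The position of the leading digit decides how many decimals to keep.
        exponent = floor(log10(abs(value)))
        return round(value, figures - 1 - exponent)

    print(round_sig(0.0123456, 3))  # 0.0123
    print(round_sig(98765.4, 3))    # 98800.0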

49
Q

The art and science of applying measuring instruments and controlling devices to a system or process for the purpose of determining the identity or magnitude of certain varying physical quantities or chemical phenomena

A

instrumentation

50
Q

refers to devices as simple as direct-reading thermometers,
or as complex as multi-sensor components of industrial control systems

A

instrumentation