Week 2: Theory of Errors (18th-19th C) Flashcards

1
Q

How was probability viewed in the 18th century?

A
  1. Objective interpretation
     A literal interpretation: probability describes the frequency of events.
     If P(head) = 0.5, then in 1000 coin tosses I should expect ~500 heads.
     Closely aligns with the modern interpretation of probability: with many realizations of an event, the outcome converges on the expected result.
    • Assumes infinitely many tosses, though, which is impractical
  2. Subjective interpretation (dominant)
     Probability describes degrees of confidence rather than an expected result.
     More practical, in that it does not rely on infinite coin tosses to be meaningful.
     Belief that probability lends itself as a tool (a confidence bar) for decision making.
    • Akin to Pascal’s viewpoint of statistical analysis as a tool
2
Q

What is determinism?

A

• Determinism is built on a few core beliefs:
o All substances are made of particles
 Much like stellar bodies, there are forces interacting between them
o Everything is governed by the laws of motion and the initial conditions of the world
 By knowing the initial conditions and understanding the forces at play, you can have perfect knowledge of the world
o The system is complex but fully determined
 Humans are merely incapable of understanding its full extent
o It is responsible for, and can explain, all the natural phenomena in the world

Its roots trace back to the 17th century and Newton’s Principia.
It had many champions, e.g. Laplace.

3
Q

How does determinism mesh with probability?

A
  • The combination of ontological determinism with epistemological uncertainty emerged as the dominant view among scientists and theorists
  • It would remain so through the 19th century and into the 20th, until the rise of quantum mechanics
4
Q

What was the problem with Astronomical data?

A

• Starting in the 15th and 16th centuries, overseas navigation became key to European exploration
o Precision became increasingly critical through the 17th and 18th centuries

• The problem was rooted in DATA SYNTHESIS
o Events happened just once, but there were multiple data sets from many observatories
 Reconciling these data sets was an issue seen not only in astronomy, but also in optics and electrical science

5
Q

How did data synthesis change? When?

A

Pre-19th century, data synthesis focused on finding the “best” data (as demonstrated by Nevil Maskelyne)
o There is only one correct result; we should seek it out and disregard the rest
 This implies that all the other observations are incorrect, and their observers incompetent and not up to their task
 It made data collection a matter of great moral/ethical responsibility: follow the best procedures and collect the “correct” data
o But how do you identify the best? Most consistent? Best fit against theory?

The 19th century brought a shift in data synthesis (e.g. Friedrich Bessel)
o Realizing that identifying the best was not feasible, astronomers sought to use all the data
 Synthesize all the sets somehow, using probability theory and statistics
o This removed the moral implications: the deltas between observations were due to psychological/physiological differences
o This change in mindset gave rise to the Theory of Errors

6
Q

How was the new data synthesis done?

A

• Techniques to synthesize data began in the 19th century
o Adrien-Marie Legendre, working with the positional data of comets, realized there were variations in the measurement data
o 1805: Legendre publishes the Method of Least Squares

Problem: a body has N measured quantities, governed by N+1 parameters. If we have M data sets, with M ≫ N+1, how do we mesh them?

CRITICAL INSIGHT: Assume each measurement set has its own associated error.

Minimizing the total squared error through calculus, he derives N+1 equations which can be solved: the METHOD OF LEAST SQUARES.

This became the dominant method through the 19th century.
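As an illustrative sketch (not Legendre's original notation or data), the procedure for a two-parameter case, fitting a line to M noisy measurements, looks like this: setting the derivative of the total squared error to zero yields one "normal equation" per parameter.

```python
# Minimal least-squares sketch: fit y = a + b*x to M noisy observations
# by minimizing the total squared error. Data here are invented.
import numpy as np

rng = np.random.default_rng(0)
M = 50
x = np.linspace(0.0, 10.0, M)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, M)  # true a = 2, b = 3, small Gaussian error

# Design matrix: one column per parameter (intercept, slope).
A = np.column_stack([np.ones(M), x])

# Setting d(error)/d(params) = 0 gives the normal equations
# (A^T A) p = A^T y -- as many equations as parameters.
p = np.linalg.solve(A.T @ A, A.T @ y)
a_hat, b_hat = p
print(a_hat, b_hat)  # close to the true values 2 and 3
```

With M much larger than the number of parameters, the overdetermined system is reconciled into a single estimate instead of discarding all but one "best" data set.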

7
Q

Why was Least Squares accepted?

A

Legendre’s 1805 derivation was grounded in calculus, but it needed support from probability theory. That came shortly after:

Gauss’s maximum likelihood estimate showed that the least-squares solution was the most likely to be correct, provided the errors were Gaussian.

Laplace’s Central Limit Theorem showed that the average of many IID RVs approaches a Gaussian distribution.

8
Q

Carl Friedrich Gauss

A

A German mathematician.
Performed a MAXIMUM LIKELIHOOD ESTIMATE on the method of least squares,
i.e. what values of the parameters maximize the probability of being correct?

Assumed the errors of the data sets were IID RVs.
Showed the least-squares method had the highest probability of being correct, as long as the errors followed a Gaussian distribution.
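The argument can be sketched as a short derivation (assuming M independent residuals $r_i$ with a common Gaussian error of variance $\sigma^2$; the notation here is modern, not Gauss's own):

```latex
% Likelihood of the M residuals under IID Gaussian errors:
L = \prod_{i=1}^{M} \frac{1}{\sqrt{2\pi}\,\sigma}
    \exp\!\left(-\frac{r_i^2}{2\sigma^2}\right)
% Taking the logarithm:
\ln L = -M \ln\!\left(\sqrt{2\pi}\,\sigma\right)
        - \frac{1}{2\sigma^2} \sum_{i=1}^{M} r_i^2
% Maximizing \ln L over the parameters is therefore equivalent to
% minimizing \sum_i r_i^2, which is exactly the least-squares criterion.
```

So the maximum-likelihood estimate and the least-squares solution coincide precisely when the error distribution is Gaussian.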

9
Q

Adrien-Marie Legendre

A

A French mathematician.

o 1805: Legendre publishes the Method of Least Squares

10
Q

Pierre Simon Laplace

A

A French mathematician; we previously saw his work on inverse/conditional probability.

The Central Limit Theorem:
o Suppose you have N independent and identically distributed random variables (IID RVs)

o As N increases, the distribution of their average S_N approaches a Gaussian distribution
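A minimal simulation of the theorem's statement (sample sizes invented; uniform variables chosen only because they are visibly non-Gaussian themselves):

```python
# Central Limit Theorem sketch: averages of N IID uniform random variables
# distribute approximately as a Gaussian whose spread shrinks like 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
N = 100          # variables averaged per trial
trials = 20000   # number of averages sampled

# Uniform(0, 1) has mean 1/2 and variance 1/12.
samples = rng.uniform(0.0, 1.0, size=(trials, N))
means = samples.mean(axis=1)

# CLT prediction: means ~ Gaussian(1/2, sqrt((1/12)/N)).
predicted_std = np.sqrt((1.0 / 12.0) / N)
within_one_sigma = np.mean(np.abs(means - 0.5) < predicted_std)
print(means.mean(), means.std(), predicted_std, within_one_sigma)
```

The fraction of averages falling within one predicted standard deviation comes out near the Gaussian value of ~68%, even though each underlying variable is flat, not bell-shaped.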

11
Q

Nevil Maskelyne

A

An English astronomer.
1. In the late 18th century, he was reforming the data at his Greenwich observatory
2. He noticed some assistants’ recorded times were consistently delayed compared to accepted results
 He assumed the assistants were incapable and fired them

12
Q

Friedrich Bessel

A

A German astronomer and mathematician.

  1. Noticed the same phenomenon, where some observers’ readings always deviated
  2. Through the 1820s, he performed experiments to analyze these errors
     He asked assistants to perform contrived measurements, such as timing a known duration, to quantify their individual errors
  3. Realized the errors were consistent for each observer

• Bessel developed the idea of personal equations to account for this

13
Q

Personal Equations

A

Developed by Friedrich Bessel, early 19th century.

  1. Attributed the deltas to psychological/physiological traits of the observers
     In the same vein as “reaction time”
  2. The personal equation of an observer was inherent to the data they provided
     All data was subsequently provided alongside the recorder’s personal equation
  3. This further validated least-squares data synthesis, as opposed to trying to find a “best” data set
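A hypothetical sketch of how such a correction might be applied in practice; the observer names, readings, and calibration scheme below are invented for illustration, loosely following Bessel's known-duration experiments:

```python
# Hypothetical "personal equation" correction: each observer's constant
# timing bias, measured on a known-duration calibration trial, is
# subtracted from their field data before the observations are combined.
known_duration = 10.0  # seconds: the contrived calibration measurement

# Observer -> (calibration reading, field observations). Invented data.
observers = {
    "A": (10.4, [31.5, 32.1, 31.9]),  # reads 0.4 s late
    "B": (9.8,  [31.0, 31.4, 31.2]),  # reads 0.2 s early
}

corrected = []
for name, (calibration, readings) in observers.items():
    bias = calibration - known_duration      # the observer's personal equation
    corrected.extend(r - bias for r in readings)

combined = sum(corrected) / len(corrected)   # simple average of corrected data
print(round(combined, 4))
```

After correction, the two observers' systematically offset readings cluster together and can be averaged, rather than one observer's data being kept as "best" and the other's discarded.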
14
Q

What were the implications of Personal Equations?

A
  1. We must actively reduce uncertainty in our measurements
     We can train observers to mitigate delays
     We can develop automatic measurement techniques (~19th century)
  2. We can study uncertainty
     It is human nature to differ; this is something that can be studied and measured, rather than treated as incompetence
     This gives rise to the beginnings of “psychophysics”, a precursor to psychology that develops in the second half of the 19th century
