Definitions and Explanations Flashcards
Poisson statistics
the number of photons arriving at our detector from a given source will fluctuate,
so we treat the arrival rate of photons statistically
Poisson statistics assumptions
- Photons arrive independently in time
- Average photon arrival rate is constant
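A minimal sketch of these two assumptions (with made-up numbers for the rate and exposure): divide an exposure τ into many small time bins, let a photon arrive independently in each bin with constant probability R·dt, and the resulting counts should average close to Rτ.

```python
import random

def photon_counts(rate, tau, n_trials, dt=5e-3, seed=0):
    """Divide the exposure tau into small bins of width dt; in each bin a
    photon arrives independently with constant probability rate*dt (the two
    Poisson assumptions). Return the total count N per simulated exposure."""
    rng = random.Random(seed)
    n_bins = int(tau / dt)
    p = rate * dt
    return [sum(1 for _ in range(n_bins) if rng.random() < p)
            for _ in range(n_trials)]

counts = photon_counts(rate=5.0, tau=2.0, n_trials=500)
mean_count = sum(counts) / len(counts)  # should be close to R*tau = 10
```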
as Rτ increases
the shape of the Poisson distribution becomes more symmetrical,
tending towards a normal (Gaussian) distribution
variance
a measure of the spread in the Poisson distribution; for a Poisson distribution the variance equals the mean count Rτ
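A quick numerical check that variance ≈ mean for Poisson samples; the sampler here uses Knuth's product-of-uniforms method, an assumed generator not taken from the notes.

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: multiply uniform draws together until the product
    drops below exp(-lam); the number of factors needed, minus one,
    is a Poisson(lam) variate."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
xs = [sample_poisson(4.0, rng) for _ in range(5000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# for a Poisson distribution, var should be close to mean (both ~4)
```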
Laplace’s basis for plausible reasoning
Probability measures our degree of belief that something is true
probability density function
used when measuring continuous variables, which can take infinitely many possible values; probabilities are obtained by integrating the PDF over an interval
sketch of a poisson PDF
see notes
sketch of a uniform PDF
see notes
sketch of a central/normal or gaussian pdf
see notes
sketch of a cumulative distribution function (CDF)
see notes
the 1st moment is called
the mean or expectation value
the 2nd moment is called
the mean square
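A tiny worked example (invented data) of the first two moments, and how the variance follows from them as mean square minus squared mean:

```python
data = [1.0, 2.0, 2.0, 3.0, 4.0]
n = len(data)
m1 = sum(data) / n                  # 1st moment: mean (expectation) = 2.4
m2 = sum(x * x for x in data) / n   # 2nd moment: mean square = 6.8
variance = m2 - m1 ** 2             # spread about the mean = 1.04
```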
the median divides
the total probability into two equal halves: it is the value of x at which the CDF equals 1/2
the mode is
the value of x for which the pdf is a maximum
central limit theorem
explains the importance of the normal pdf in statistics: the sum (or mean) of many independent random variables tends to a normal pdf, whatever the pdf of the individual variables
but still based on the asymptotic behaviour of an infinite ensemble of samples that we didn't actually observe
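A sketch of the theorem in action (illustrative numbers only): means of uniform(0,1) draws cluster around 0.5 with spread sqrt(1/12)/sqrt(n), even though the underlying pdf is flat, not normal.

```python
import random
import statistics

rng = random.Random(1)
n, trials = 12, 5000
# each entry is the mean of n uniform(0,1) draws
means = [sum(rng.random() for _ in range(n)) / n for _ in range(trials)]
mu = statistics.mean(means)    # ~0.5, the uniform mean
sd = statistics.stdev(means)   # ~sqrt(1/12)/sqrt(n) ~= 0.083
```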
Isoprobability contours for the bivariate normal pdf
ρ > 0 : positive correlation, y tends to increase as x increases
ρ < 0 : negative correlation, y tends to decrease as x increases
as |ρ| → 1
the contours become narrower and steeper
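A sketch of sampling from a bivariate normal with a chosen correlation (the construction y = ρx + sqrt(1−ρ²)z with x, z standard normal is a standard trick; ρ = 0.8 is an assumed example value). The sample correlation should recover ρ.

```python
import math
import random

def bivariate_normal_pairs(rho, n, seed=0):
    """Sample (x, y) from a bivariate normal with unit variances and
    correlation rho, via y = rho*x + sqrt(1 - rho^2)*z, x and z ~ N(0,1)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        z = rng.gauss(0.0, 1.0)
        pairs.append((x, rho * x + math.sqrt(1.0 - rho * rho) * z))
    return pairs

pairs = bivariate_normal_pairs(0.8, 5000)
xs = [p[0] for p in pairs]
ys = [p[1] for p in pairs]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
num = sum((x - mx) * (y - my) for x, y in pairs)
den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
r = num / den   # sample correlation, should be close to rho = 0.8
```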
the principle of maximum likelihood
is a method for estimating the parameters of a distribution which best fit the observed data: we choose the parameter values that maximise the likelihood (the probability of the observed data given those parameters)
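A sketch under assumed data: for Poisson counts the maximum-likelihood rate is the sample mean, and a brute-force scan of the log-likelihood over a grid of candidate rates lands there. The count values are hypothetical.

```python
import math

counts = [3, 5, 4, 6, 2, 4, 5, 3]   # hypothetical photon counts

def log_likelihood(lam, data):
    # ln L = sum over data of [N ln(lam) - lam - ln(N!)]
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1) for n in data)

# scan a grid of candidate rates and keep the most likely one;
# for Poisson data this should coincide with the sample mean
grid = [0.1 * i for i in range(1, 101)]
best = max(grid, key=lambda lam: log_likelihood(lam, counts))
sample_mean = sum(counts) / len(counts)   # 4.0
```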
if we obtain a very small P-value
we can interpret this as providing little support for the null hypothesis,
which we may then choose to reject
Monte Carlo methods
method for generating random variables
we can test pseudo-random numbers for randomness in several ways:
a) histogram of sampled values
b) correlations between neighbouring pseudo-random numbers
c) autocorrelation
d) chi-squared test
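A sketch of two of these checks applied to Python's built-in generator (an assumed test subject): the lag-1 autocorrelation between neighbouring values should sit near zero, and a chi-squared statistic over a 10-bin histogram should be comparable to its 9 degrees of freedom.

```python
import random
import statistics

rng = random.Random(42)
x = [rng.random() for _ in range(10000)]

# (b)/(c): correlation between neighbouring values, lag-1 autocorrelation
mu = statistics.mean(x)
num = sum((x[i] - mu) * (x[i + 1] - mu) for i in range(len(x) - 1))
den = sum((xi - mu) ** 2 for xi in x)
r1 = num / den   # near 0 for a good generator

# (a)/(d): histogram plus chi-squared test of uniformity over 10 bins
expected = len(x) / 10
observed = [0] * 10
for xi in x:
    observed[min(int(xi * 10), 9)] += 1
chi2 = sum((o - expected) ** 2 / expected for o in observed)
# chi2 should be of the same order as the 9 degrees of freedom
```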
Markov Chain Monte Carlo
method for sampling from PDFs using a random walk
- start off at some randomly chosen value (a(1), b(1))
- propose a new point by taking a random step from (a(n), b(n))
- compute the likelihood ratio L(new) / L(a(n), b(n)); accept the move with probability min(1, ratio), otherwise stay at the current point
- repeat from step 2; after an initial burn-in the chain of points samples the PDF, and its densest region marks the maximum likelihood
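A minimal sketch of a Metropolis-style MCMC sampler; the target, a standard normal with log p(x) = −x²/2, is an assumed example, and the step size and burn-in length are illustrative choices.

```python
import math
import random

def metropolis(logpdf, x0, step, n, seed=0):
    """Metropolis sampler: propose a Gaussian random step from the
    current point and accept it with probability min(1, p_new / p_old);
    on rejection the chain stays put. The visited points sample the PDF."""
    rng = random.Random(seed)
    x, lp = x0, logpdf(x0)
    chain = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = logpdf(xp)
        if rng.random() < math.exp(min(0.0, lpp - lp)):
            x, lp = xp, lpp
        chain.append(x)
    return chain

# target: standard normal, log p(x) = -x^2/2 up to a constant
chain = metropolis(lambda x: -0.5 * x * x, x0=3.0, step=1.0, n=20000)
kept = chain[2000:]   # discard burn-in
mean = sum(kept) / len(kept)                          # should be near 0
var = sum((v - mean) ** 2 for v in kept) / len(kept)  # should be near 1
```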