Formula Flashcards
expectation value of the number of photons
E(N) = <N> = Rτ
Poisson distribution probability of receiving N photons in time τ
p(N) = [(Rτ)^(N) e^(-Rτ)]/N!
where Rτ = µ
variance of N
var(N) = E{[N-E(N)]^2}
var(N) = Rτ
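The three Poisson cards above can be checked numerically. A minimal Python sketch (the values of R and τ and the truncation of the sums at N = 100 are illustrative assumptions, not from the cards):

```python
import math

def poisson_pmf(n, mu):
    """p(N) = (Rτ)^N e^(-Rτ) / N!  with mu = Rτ."""
    return mu**n * math.exp(-mu) / math.factorial(n)

R, tau = 1.5, 2.0          # hypothetical rate (photons/s) and exposure (s)
mu = R * tau               # = 3.0

# E(N) and var(N) from the pmf, truncating the sums at N = 100
mean = sum(n * poisson_pmf(n, mu) for n in range(101))
mean_sq = sum(n**2 * poisson_pmf(n, mu) for n in range(101))
var = mean_sq - mean**2

print(mean, var)  # both come out ≈ mu = Rτ
```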
arrival rate
R(hat) = N(obs)/τ
probability density function
Prob(a ≤ x ≤ b) = (b ∫ a) p(x) dx
normalisation
(∞ ∫ -∞) p(x) dx = 1
uniform pdf
p(x) = { 1/(b-a)   a < x < b
       { 0         otherwise
central/normal or gaussian pdf
p(x) = [1/(σ√(2π))] exp[-(x-µ)^2/(2σ^2)]
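As a sanity check, the Gaussian pdf should satisfy the normalisation card; a quick trapezoidal integration in Python (the ±10σ window and step count are arbitrary choices):

```python
import math

def gauss_pdf(x, mu=0.0, sigma=1.0):
    # p(x) = [1/(σ√(2π))] exp[-(x-µ)²/(2σ²)]
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

a, b, n = -10.0, 10.0, 20000          # ±10σ window, 20000 trapezoids
h = (b - a) / n
area = 0.5 * (gauss_pdf(a) + gauss_pdf(b)) + sum(gauss_pdf(a + i * h) for i in range(1, n))
area *= h
print(area)  # ≈ 1.0 (normalisation)
```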
CDF
P(x’) = (x’ ∫ -∞) p(x) dx
P(-∞) = 0 P(∞) = 1
the nth moment of a pdf discrete case
<x^(n)> = (b Σ x=a) x^(n) p(x)Δx
the nth moment of a pdf continuous case
<x^(n)> = (b ∫ a) x^(n) p(x) dx
1st moment
= mean or expectation value
2nd moment
= mean square
variance : discrete case
var(x) = <x^2> - <x>^2 (2nd moment - square of 1st moment), with moments as sums
variance: continuous case
var(x) = <x^2> - <x>^2 (2nd moment - square of 1st moment), with moments as integrals
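For the uniform pdf the moments come out in closed form (<x> = (a+b)/2, var = (b-a)^2/12), which makes a good numerical check of the moment and variance cards; a sketch with arbitrary a, b:

```python
a, b = 2.0, 5.0
n = 100000
h = (b - a) / n

def moment(k):
    # k-th moment <x^k> = (b ∫ a) x^k p(x) dx with p(x) = 1/(b-a), midpoint rule
    return sum((a + (i + 0.5) * h)**k / (b - a) for i in range(n)) * h

mean = moment(1)           # 1st moment: (a+b)/2 = 3.5
mean_sq = moment(2)        # 2nd moment: mean square
var = mean_sq - mean**2    # (b-a)^2/12 = 0.75
```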
median
P(x(med)) = (x(med) ∫ -∞) p(x’) dx’ = 0.5
sample mean
µ(hat) = 1/M (M Σ i =1) x(i)
variance of sample mean
var(µ(hat)) = σ^2/M
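The 1/M scaling of var(µ̂) is easy to see by simulation; a Python sketch (σ, the sample size M and the trial count are arbitrary choices):

```python
import random
import statistics

random.seed(1)
sigma, M, trials = 2.0, 50, 5000

# draw many independent sample means, each over M values with sd sigma
means = [statistics.fmean(random.gauss(0.0, sigma) for _ in range(M))
         for _ in range(trials)]

var_of_mean = statistics.pvariance(means)
predicted = sigma**2 / M   # = 0.08
```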
bivariate normal distribution
p(x,y) on formula sheet
quadratic form
Q(x,y) on formula sheet
correlation coefficient
denoted ρ
satisfies E{[x - E(x)][y - E(y)]} = ρσ(x)σ(y)
on formula sheet
covariance
cov(x,y) on formula sheet
Pearson's product-moment correlation coefficient
r on the formula sheet in the correlation and covariance section
ordinary linear least squares
y(i) on formula sheet
likelihood function
L = (n Π i=1) p(x(i))
chi2
on formula sheet
poisson distribution k
= 1
normal distribution k
= 2
number of degrees of freedom
r = N - k - 1
reduced chi squared
on formula sheet
chi2 pdf
p(ν)(χ²)
where ν is the number of degrees of freedom
P-value
1-P(chi2(obs))
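For ν = 2 the χ² survival function is exactly exp(-χ²/2), so the P-value card can be checked by Monte Carlo (a χ² variate with ν dof is a sum of ν squared N(0,1) draws; the seed, χ²(obs) and trial count here are arbitrary):

```python
import math
import random

random.seed(0)
nu, chi2_obs, trials = 2, 4.0, 200000

# P-value = probability of exceeding chi2_obs by chance
exceed = sum(1 for _ in range(trials)
             if sum(random.gauss(0.0, 1.0)**2 for _ in range(nu)) > chi2_obs)
p_value = exceed / trials

exact = math.exp(-chi2_obs / 2)   # exact survival function for nu = 2
```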
variable transformations
p(y)dy = p(x)dx
p(y) = p(x(y))/|dy/dx|
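A quick check of the transformation rule: with x ~ U[0,1] and y = x², |dy/dx| = 2x gives p(y) = 1/(2√y), whose CDF is √y; a Monte Carlo sketch (sample size and test point arbitrary):

```python
import math
import random

random.seed(2)
N, y0 = 100000, 0.25

# p(y) = p(x(y)) / |dy/dx| = 1 / (2*sqrt(y))  =>  P(Y <= y) = sqrt(y)
frac = sum(1 for _ in range(N) if random.random()**2 <= y0) / N
exact = math.sqrt(y0)   # = 0.5
```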
Parseval's theorem
total power = (∞ ∫ -∞) |h(t)|^2 dt = (∞ ∫ -∞) |H(f)|^2 df
|H(f)|^2 is the power spectral density
critical frequency
H(f) = 0 for all |f| ≥ f(c)
alias effect
occurs if f(s) < 2f(c)
Nyquist-Shannon Sampling Theorem
sample interval T < 1/[2f(c)]
or sampling rate f(s) > 2f(c)
S = χ^2(a,b)
S = (N Σ i=1) ε^2(i)
where y(i) = a + bx(i) + ε(i)
so ε(i)^2 = [y(i) - a - bx(i)]^2
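Minimising S over a and b gives the usual normal equations; a self-contained Python fit on made-up data (the points are an illustrative set near y = 1 + 2x, not from the notes):

```python
# made-up data roughly following y = 1 + 2x
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.1, 4.9, 7.2, 8.8]

n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# normal-equation solution of dS/da = dS/db = 0
b_hat = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a_hat = (sy - b_hat * sx) / n
```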
mean =
total counts / area
p(<5) =
p(0) + p(1) + p(2) + p(3) + p(4)
maximum likelihood
l = ln(L)
differentiate and set equal to zero
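For Poisson data this recipe gives µ̂ = sample mean (dl/dµ = Σx(i)/µ - n = 0). A brute-force scan in Python on made-up counts confirms the analytic result (counts and grid are illustrative):

```python
import math

counts = [2, 4, 3, 5, 1, 3, 4, 2]   # hypothetical photon counts

def log_likelihood(mu):
    # l = ln L = Σ [x_i ln(µ) - µ - ln(x_i!)]
    return sum(x * math.log(mu) - mu - math.log(math.factorial(x)) for x in counts)

grid = [0.5 + 0.01 * i for i in range(500)]   # scan µ from 0.5 to ~5.5
mu_hat = max(grid, key=log_likelihood)

sample_mean = sum(counts) / len(counts)   # = 3.0
```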
y ~ N[0,1]
normal pdf with mean zero and standard deviation unity
if we want z ~ N[µ, σ]
z = µ+σy
and p(y) = [1/√(2π)] exp[-y^2/2]
x ~ U[0,1]
and p(x) = { 1   0 < x < 1
           { 0   otherwise
y = a+(b-a)x
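The linear map y = a + (b-a)x turns U[0,1] draws into U[a,b] draws; a short Python check (a, b and sample size are arbitrary):

```python
import random

random.seed(3)
a, b = -1.0, 4.0

# map x ~ U[0,1] to y ~ U[a,b]
ys = [a + (b - a) * random.random() for _ in range(100000)]

in_range = all(a <= y <= b for y in ys)
mean = sum(ys) / len(ys)   # should approach (a+b)/2 = 1.5
```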
P(exp)(x) =
(x ∫ 0) p(exp)(x') dx'
variance =
(standard deviation)^2
σ^2
σ = √variance
fourier transform of d(t) = δ(t-1) + δ(t+1)
exp[2πif] + exp[-2πif] = 2cos(2πf)
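The two exponentials combine into 2cos(2πf); a quick numerical check (using the H(f) = ∫ h(t) exp(-2πift) dt convention, which gives the same symmetric sum):

```python
import cmath
import math

def H(f):
    # FT of δ(t-1) + δ(t+1): one exponential per delta
    return cmath.exp(-2j * math.pi * f) + cmath.exp(2j * math.pi * f)

checks = all(abs(H(f) - 2 * math.cos(2 * math.pi * f)) < 1e-12
             for f in (0.0, 0.1, 0.25, 0.5, 1.3))
```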
standard deviation =
in the Poisson case σ = √µ = √(Rτ)
RMS =
√{A^2[f(upp)-f(lower)]}
where A is a constant amplitude (white noise)