P&R Block 1 Flashcards
What was the “paradox” raised by Saint Augustine?
The present is all that can be said to truly exist, but the present only exists for an instant.
The moment associated with the present turns into the past instantaneously.
What is “temporal reductionism”?
Who is this associated with?
Aristotle and Leibniz.
All temporal statements can be reduced to statements about temporal relations between events.
What is “absolutism with respect to time”?
Who is this associated with?
Plato and Newton.
There is some “true time”, independent of events. Time is like a container in which events are placed. Time exists in the absence of events.
Who is the idea of “time series” associated with?
McTaggart
What are the A and B-series?
What was the supposed conclusion of these arguments?
A-series: events are ordered as past, present, and future (“real change”).
B-series: events are ordered from earlier to later, independently of any “present”.
Conclusion: time is “not real”. McTaggart argued that the A-series is contradictory (every event must be past, present, and future), while the B-series alone cannot account for genuine change, so time itself is unreal.
What is a “presentist” view?
What about “block universe” and “growing block universe”?
Presentist: only the present is real.
Block universe: all times and events are equally real; we experience movement through the 4D block universe.
Growing block universe: the past and present are equally real, but the future is not real.
What kind of laws are time-reversible?
Why are they sometimes not? Why is this not an issue?
The microscopic (fundamental dynamical) laws.
By the CPT theorem, CP-asymmetric weak interactions must also be time-asymmetric (T-violating). However, no connection has been proposed between this microscopic asymmetry and the “arrow of time”.
What is the name of the precursor to the entropic argument for the thermodynamic arrow of time?
Boltzmann’s H-theorem (1872)
A functional H of the molecular velocity distribution that is non-increasing in time, so that -H behaves like an entropy.
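For reference, a standard modern statement (the velocity-distribution notation f is assumed, not from the card):

```latex
% Boltzmann's H-functional for a dilute gas with
% single-particle velocity distribution f(v, t):
\[ H(t) = \int f(\mathbf{v}, t)\,\ln f(\mathbf{v}, t)\,\mathrm{d}^3 v \]
% Under the Stosszahlansatz (molecular chaos), the Boltzmann
% equation implies that H never increases:
\[ \frac{\mathrm{d}H}{\mathrm{d}t} \le 0 , \]
% so S(t) = -k_B H(t) (up to constants and conventions) is
% non-decreasing, mimicking the second law.
```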
What is the Stosszahlansatz, and why was it an issue for Boltzmann’s H-theorem?
The molecular chaos hypothesis.
i.e. Boltzmann assumed that the velocities and positions of colliding particles are uncorrelated before each collision. However, correlations do build up over time due to collisions, and by ignoring them Boltzmann smuggled time-asymmetry into otherwise time-symmetric dynamics.
What is Poincaré recurrence?
Certain closed systems will return to a state arbitrarily close to their initial state given sufficient time.
However, this is usually not taken as a valid objection to the thermodynamic arrow of time, since the recurrence time grows roughly exponentially with particle number and is astronomically large for macroscopic systems.
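A toy finite analogue (hypothetical; far simpler than the measure-theoretic theorem): any invertible map on a finite state space must return exactly to its starting state.

```python
# Toy recurrence: a reversible map on a finite state space must revisit
# its initial state. Here the dynamics is a cyclic rotation of N "particles".
def rotate(state):
    return state[1:] + state[:1]

state0 = (1, 0, 0, 0, 0, 0)
state, steps = rotate(state0), 1
while state != state0:
    state = rotate(state)
    steps += 1
print(steps)  # 6: for this toy map the recurrence time equals the system size
```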
How is Boltzmann entropy defined?
S_B = k ln(Omega)
k is the Boltzmann constant.
Omega is the volume of phase space corresponding to the macrostate M(X), where X is the associated microstate.
- Note: the phase-space definition has no minus sign, unlike the probability-based (Gibbs/Shannon) definition; the two agree for a uniform distribution, since -sum( p_i * ln[p_i] ) = ln(Omega) when p_i = 1/Omega.
How can the argument that Boltzmann entropy always increases be simply stated?
Higher entropy states are associated with an exponentially larger phase space volume.
It is overwhelmingly likely for a system to evolve to a microstate associated with a higher entropy macrostate.
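A toy illustration of both of the last two cards (hypothetical coin-flip model, not from the notes): treat N coin flips as microstates and the number of heads as the macrostate, so Omega(n) = C(N, n) plays the role of phase-space volume.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def omega(N, n):
    """Number of microstates (flip sequences) in the macrostate 'n heads'."""
    return math.comb(N, n)

def S_boltzmann(N, n):
    """Boltzmann entropy S_B = k ln(Omega) of the macrostate 'n heads'."""
    return k_B * math.log(omega(N, n))

N = 100
# The balanced macrostate dominates: Omega(50) is astronomically larger
# than Omega(5), so a system wandering through microstates is overwhelmingly
# likely to end up in (and stay near) high-entropy macrostates.
print(omega(N, 5))                              # ~7.5e7 microstates
print(omega(N, 50))                             # ~1.0e29 microstates
print(omega(N, 50) / omega(N, 5))               # ratio ~1.3e21
print(S_boltzmann(N, 50) - S_boltzmann(N, 5))   # entropy difference, J/K
```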
What is the “past hypothesis”?
The arrow of time is a result of the universe being in a very low-entropy state at the “beginning”.
Who (and when) can the term “arrow of time” be attributed to?
What was this “arrow of time” referring to?
Eddington (1928)
His arrow of time was the thermodynamic one: time’s arrow points in the direction of increasing entropy of an isolated system.
How many different “arrows of time” can you name?
Thermodynamic
Cosmological
Causal? / Psychological
Radiative
Quantum
Who wrote the paper “The Unreasonable Effectiveness of Mathematics in the Natural Sciences”?
When?
Eugene Wigner, 1960
What does Joel Lebowitz (1993) believe about the arrows of time?
All arrows of time are the result of the initial low-entropy state of the universe.
A “measurement” formalism in QM can have a second-law-type time-asymmetric component, such that the quantum arrow of time can be traced to the initial low-entropy state of the universe.
What is the formula for Shannon information?
What is the theoretical basis of this generalisation?
I(p_i) = - log_2( p_i )
It generalises the count of binary (yes/no) questions needed to determine which state (with associated p_i) a system is in: an outcome of probability p_i takes about log_2(1/p_i) questions.
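A quick numeric check of the binary-question reading (a minimal sketch; the function name is mine):

```python
import math

def info_bits(p):
    """Shannon information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

print(info_bits(1 / 8))  # 3.0 bits: 3 yes/no questions pick 1 state out of 8
print(info_bits(1 / 2))  # 1.0 bit: a fair coin flip answers one yes/no question
print(info_bits(0.99))   # ~0.014 bits: near-certain outcomes carry little information
```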
What is the formula for Shannon Entropy?
What can this quantity be thought of?
Why is it called Entropy?
H = - sum_i( p_i * log_2[p_i] )
The average information generated by a system: the weighted average of the information given by each state. Can be thought of as an “average surprise” function.
It has the same form as Gibbs entropy (the generalisation of Boltzmann entropy), as pointed out by von Neumann.
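A minimal sketch of the “average surprise” reading (hypothetical example; the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)): expected surprisal over a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: fair coin, maximal average surprise
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits: biased coin is less surprising on average
print(shannon_entropy([1.0, 0.0]))  # 0.0 bits: a certain outcome carries no surprise
```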
What is the unit of entropy?
Joules per kelvin, J/K (specific entropy, quoted per unit mass, is J/(kg·K)).