Stochastic Processes I Flashcards
What is a stochastic process?
A stochastic process is a collection of random variables indexed by time. It can be discrete-time (variables at specific time points) or continuous-time (variables at any time).
How can a discrete-time stochastic process be described?
A discrete-time stochastic process can be described as a sequence of random variables, such as X_0, X_1, X_2, etc., where each variable represents a state at a discrete time point.
Define a continuous-time stochastic process.
A continuous-time stochastic process assigns a random variable to every point in a continuous interval of time; its sample paths may evolve continuously or exhibit jumps (discontinuities).
What is an alternative definition of a stochastic process?
An alternative definition is viewing a stochastic process as a probability distribution over a space of paths, each path being a possible realization of the process over time.
Describe a simple random walk.
A simple random walk is a discrete-time stochastic process in which each step adds a random ±1 increment (like the outcome of a coin flip) to the current state, producing a path that moves up or down at random over time.
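A minimal simulation sketch in Python (the function name and parameters are illustrative, assuming ±1 steps that go up with probability p_up):

```python
import random

def simple_random_walk(n_steps, start=0, p_up=0.5):
    """Simulate a simple random walk: at each step move +1 with
    probability p_up, otherwise move -1. Returns the full path."""
    path = [start]
    for _ in range(n_steps):
        step = 1 if random.random() < p_up else -1
        path.append(path[-1] + step)
    return path

# One realization of the process: a single random path of 10 steps.
print(simple_random_walk(10))
```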
What is a Markov chain?
A Markov chain is a stochastic process where the future state depends only on the current state and not on the sequence of events that preceded it.
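In symbols, the Markov property for a discrete-time chain reads:

```latex
\Pr(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
    = \Pr(X_{n+1} = j \mid X_n = i).
```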
How can the transition probabilities in a Markov chain be represented?
In a Markov chain with a finite number of states, transition probabilities can be represented using a transition probability matrix, where each element indicates the probability of transitioning from one state to another.
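For a chain with states 1, …, m, the entries of the transition matrix P and the constraint on its rows are:

```latex
P_{ij} = \Pr(X_{n+1} = j \mid X_n = i),
\qquad \sum_{j=1}^{m} P_{ij} = 1 \quad \text{for every state } i.
```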
What is a stationary distribution in a Markov chain?
A stationary distribution in a Markov chain is a probability distribution over states that remains constant over time, meaning if the chain starts in this distribution, it will remain in this distribution at all future times.
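A sketch of finding a stationary distribution numerically, assuming NumPy and a made-up 2-state row-stochastic matrix; the stationary distribution solves πP = π, i.e., it is a left eigenvector of P for eigenvalue 1:

```python
import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Left eigenvectors of P are eigenvectors of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # locate eigenvalue 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalize to a distribution

print(pi)        # [0.8333..., 0.1666...]
print(pi @ P)    # same as pi: the distribution is unchanged by one step
```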
Define a martingale in the context of stochastic processes.
A martingale is a type of stochastic process where the conditional expectation of the next value in the sequence, given all previous values, is equal to the current value, implying a “fair game.”
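In symbols, for a discrete-time process:

```latex
E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = X_n.
```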
What does the Optional Stopping Theorem state for martingales?
The Optional Stopping Theorem states that for a martingale, the expected value at a stopping time (a random time at which the process is halted) equals the initial expected value, under suitable conditions, for example when the stopping time is bounded, or has finite expectation while the process's increments are bounded.
How does the concept of a stopping time apply to martingales?
A stopping time is a random time at which the decision to stop can be made using only the process's history up to the present; it may not depend on future values. For a martingale, stopping according to such a rule cannot change the expected value of the process.
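A simulation sketch illustrating the theorem for a symmetric random walk started at 0 and stopped the first time it hits −a or +b (a gambler's-ruin setup; the bounds a and b are assumed for illustration). The empirical mean at the stopping time should be close to the starting value 0:

```python
import random

def stopped_walk_value(a=3, b=5):
    """Run a symmetric +/-1 random walk from 0 and return its value
    at the stopping time: the first visit to -a or +b."""
    x = 0
    while -a < x < b:
        x += 1 if random.random() < 0.5 else -1
    return x

samples = [stopped_walk_value() for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 0, the initial value
```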
What is the significance of the Optional Stopping Theorem in game theory?
The Optional Stopping Theorem implies that in a fair game modeled by a martingale, no stopping strategy based only on past information can achieve a positive (or negative) expected profit, which formalizes the fairness of the game.
What determines the future state in a Markov chain?
In a Markov chain, the future state is determined solely by the current state, disregarding the sequence of events or states that led to the current state.
What is an example of a real-life system that can be modeled as a Markov chain?
A simple example is a weather model where the future weather condition (like sunny, rainy) depends only on the current condition, not on the sequence of past weather conditions.
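A minimal sketch of such a weather chain in Python, with made-up transition probabilities:

```python
import random

# Hypothetical transition probabilities: outer key = today's weather,
# inner keys = tomorrow's weather.
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.4, "rainy": 0.6}}

def next_weather(today):
    """Sample tomorrow's weather from today's row of the matrix."""
    r = random.random()
    cumulative = 0.0
    for state, prob in P[today].items():
        cumulative += prob
        if r < cumulative:
            return state
    return state  # guard against floating-point round-off

weather = "sunny"
forecast = [weather]
for _ in range(7):
    weather = next_weather(weather)
    forecast.append(weather)
print(forecast)  # e.g. ['sunny', 'sunny', 'rainy', ...]
```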
Explain the transition probability matrix in a finite Markov chain.
The transition probability matrix of a finite Markov chain is a square matrix whose entry in row i, column j is the probability of transitioning from state i to state j; each row sums to 1, since the process must move to some state.
How does the concept of stationary distribution apply to long-term predictions in Markov chains?
If a Markov chain is irreducible and aperiodic, the distribution of its state converges to the stationary distribution regardless of where the chain starts, so the long-run probabilities of being in each state are constant; this is what makes long-term predictions possible.
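This convergence can be seen numerically (assuming NumPy and the same kind of small matrix as in the earlier sketch): every row of P^n approaches the stationary distribution as n grows, so the starting state stops mattering.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Each row of P^n gives the state distribution after n steps for a
# different starting state; all rows converge to the same limit.
for n in (1, 5, 20, 100):
    print(n)
    print(np.linalg.matrix_power(P, n))
```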
What is the significance of the Perron-Frobenius theorem in the context of Markov chains?
The Perron-Frobenius theorem guarantees that a matrix with positive entries has a largest eigenvalue that is real, positive, and simple, with a corresponding eigenvector whose entries are all positive. For a Markov chain's transition matrix this largest eigenvalue is 1, and the associated left eigenvector, normalized to sum to 1, is the stationary distribution.
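A quick numerical check of this eigenvalue picture for the same assumed matrix: the largest eigenvalue is exactly 1, and the others are strictly smaller in modulus, which is what drives convergence to the stationary distribution.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

print(sorted(np.abs(np.linalg.eigvals(P))))  # [0.4, 1.0]
```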
Describe the properties of a simple random walk as a stochastic process.
A simple random walk has independent increments and stationary increments: the steps taken over disjoint time intervals are independent of each other, and the distribution of an increment X_{n+m} − X_n depends only on the lag m, not on the time n at which it is taken.
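Writing the walk as a sum of i.i.d. steps makes both properties visible:

```latex
X_n = X_0 + \sum_{k=1}^{n} \xi_k, \qquad \xi_k \ \text{i.i.d. with}\
\Pr(\xi_k = 1) = \Pr(\xi_k = -1) = \tfrac{1}{2},
```

so the increment X_n − X_m = ξ_{m+1} + ⋯ + ξ_n involves steps disjoint from those before time m (independence), and its distribution depends only on n − m (stationarity).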
What does it mean for a stochastic process to be a martingale in terms of fair games?
A martingale models a fair game in the sense that the expected value of a player's fortune at any future point, given the entire history of play, equals their current fortune, indicating no expected gain or loss.
How does a random walk serve as an example of both a Markov chain and a martingale?
In a simple symmetric random walk, the future state depends only on the current state (the Markov property), and the expected position at the next step equals the current position (the martingale property, which holds because the up and down steps are equally likely).
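For the symmetric walk X_{n+1} = X_n + ξ_{n+1} with E[ξ_{n+1}] = 0, both properties follow from one computation:

```latex
E[X_{n+1} \mid X_0, \ldots, X_n] = X_n + E[\xi_{n+1}] = X_n,
```

and since the conditional distribution of X_{n+1} given the whole history depends only on X_n, the Markov property holds as well.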