318 test 2 Flashcards
What is probabilistic modeling? (3)
- It utilizes the effects of random occurrences or actions to forecast future possibilities.
- This method is quantitative, focusing on projecting various possible outcomes.
- It can predict outcomes that may extend beyond recent events.
What does nondeterminism in algorithms mean in computer science?
In computer science, nondeterminism refers to an algorithm that can exhibit different behaviors on different runs with the same input, as opposed to a deterministic algorithm.
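As a minimal sketch (my own example, not from the cards), a randomized algorithm can return different results on the same input:

```python
import random

def randomized_pick(items):
    # The output can differ between runs on the same input because it
    # depends on a random draw -- nondeterministic behavior in practice.
    return random.choice(items)

# Same input, possibly different outputs across runs:
print(randomized_pick([1, 2, 3]))
```

By contrast, a deterministic algorithm such as `max(items)` returns the same result on every run with the same input.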
What is the Markov property?
- the conditional probability distribution of future states of the process depends only on the present state
- It implies a lack of memory in the process, where past events do not influence future states
What is a Markov chain?
- A Markov chain is a discrete-time stochastic process.
- It satisfies the Markov property.
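A minimal sketch of a discrete-time Markov chain in Python (the weather states and the transition numbers are made-up illustration values, not from the cards):

```python
import random

# Transition probabilities depend only on the current state (Markov property).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random):
    # Draw the next state from the current state's transition distribution.
    r, cumulative = rng.random(), 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, steps):
    # Walk the chain in discrete time steps, recording the visited states.
    path = [start]
    for _ in range(steps):
        path.append(step(path[-1]))
    return path
```

Each step looks only at the current state, never at the earlier path, which is exactly the memoryless property above.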
How does a Markov chain differ from a continuous-time stochastic process?
Unlike continuous-time stochastic processes, a Markov chain specifically operates in discrete time intervals.
Time Series
- a series of data points ordered in time (discrete-time data)
Univariate
: one variable is varying over time
Multivariate
: multiple variables are varying over time
examples of Markov chains
- stock market
- Random walk on a grid (see slide)
- Weather forecast
How can one characterize real signals in terms of signal models?
- provide a basis for a theoretical description of signal processing systems
- enhance understanding of the signal source even if the source is unavailable (through simulation of the real-world process)
- enable building prediction systems, recognition systems, identification systems
what are the 2 approaches to signal modeling
- Deterministic models: use known specific properties of a signal (amplitude, frequency)
- Statistical models: characterize only statistical signal properties (Gaussian, Poisson)
what are the Three fundamental problems in Hidden Markov model (HMM) design and analysis, and what do they each mean:
likelihood, best sequence, adjust parameters to account for signals
1. Evaluation of the probability (or likelihood) of a sequence of observations generated by a given HMM
2. Determination of a “best” sequence of model states
3. Adjustment of model parameters so as to best account for the observed signals
what is a stochastic model
A stochastic model is a type of mathematical or computational model that incorporates randomness and unpredictability as intrinsic elements.
How do simpler Markov models differ from HMMs?
In simpler Markov models, states are directly observable.
In an HMM, the state sequence the model passes through remains hidden.
What are some applications of Hidden Markov Models in temporal pattern recognition?
- speech
- handwriting
- gesture recognition
- part-of-speech tagging
- musical score following
- partial discharges
- bioinformatics
understand the cointoss example for HMM
is a 3-state model for a coin good? how many unknown parameters does a 3-state model have? explain why or why not
- a 3-state (3-coin) model has 9 unknown parameters: 3 coin biases plus 6 free state transition probabilities
- if the actual experiments use only a single coin, the 3-state model would be highly inappropriate, as the model would not correspond to the actual physical event
understand the ball and urn problem
HMM state
- These represent the different conditions or configurations that can produce the observations but are not directly visible.
- Example: In weather modeling, the states might represent “sunny”, “cloudy”, and “rainy”. In the urn example, they represent different urns with specific distributions of colored balls.
state transition probabilities
- these indicate the probability of transitioning from one state to another.
- Example: In the weather example, aij could represent the probability that a sunny day follows a cloudy day. In the urn example, aij would represent the probability of choosing urn j after urn i.
- This creates a total of N^2 transition probabilities, because there are N probabilities for each of the N states.
urn example: initial state probability:
- This is the probability distribution over the initial state, i.e., where the process is likely to start.
- Example: For the weather, it might be the historical likelihood of the weather being sunny, cloudy, or rainy at the start of the observation period. In the urn example, it would be the likelihood of starting from each specific urn.
Likelihood of the observed sequence:
Given the model, how likely is the sequence of observations to occur?
what is “hidden” in the urn problem
- the process of picking the ball
- The Markov process itself cannot be observed, only the sequence of labeled balls
what are the two parameter sets of an HMM
- (i) TRANSITION PROBABILITIES and INITIAL STATE PROBABILITIES
- (ii) OUTPUT PROBABILITIES
TRANSITION PROBABILITIES
The transition probabilities control the way the hidden state at time T is chosen given the hidden state at time T-1.
what are the 5 HMM characteristics
- N number of states
- M number of distinct observation symbols
- state transition probability distribution
- observation symbol probability distribution
- initial state distribution
what are the 3 things you need to fully define a HMM model
- Two parameters: the number of states (N) and the number of observation symbols (M).
- The observation symbols themselves, which can be either discrete or continuous.
- Three sets of probabilities: the transition probabilities (A), the observation probabilities (B), and the initial state probabilities (π).
compact notation: lambda = (A,B,pi)
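As a sketch, the compact notation lambda = (A, B, pi) can be written out with plain Python lists, using the urn/ball example sizes (N = 2 urns, M = 3 ball colors; all numbers are made-up illustration values):

```python
A = [[0.7, 0.3],        # A[i][j]: P(state j at time t | state i at time t-1)
     [0.4, 0.6]]
B = [[0.5, 0.4, 0.1],   # B[i][k]: P(observing symbol k | state i)
     [0.1, 0.3, 0.6]]
pi = [0.6, 0.4]         # pi[i]: P(starting in state i)

# Sanity check: every probability distribution sums to 1.
for row in A + B + [pi]:
    assert abs(sum(row) - 1.0) < 1e-9
```

Together with N, M, and the observation symbols themselves, these three sets of probabilities fully define the model.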
what are the 3 basic problems in HMM:
Efficiency, state, optimize
- given an observation sequence, how do we efficiently compute the probability of the observation sequence given the model? aka scoring: how well a given model matches a given observation sequence
- how do we choose an optimal state sequence? aka attempting to uncover the hidden part of the model
- there is no single correct state sequence; the state sequence is chosen based on the situation
- how do we optimize the model parameters? train the model to best fit the data
what are the ways you can compute the probability of the observation sequence given the model
- brute force: enumerate each and every possible state sequence of length T and sum their probabilities. This is computationally expensive for long sequences.
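The brute-force enumeration can be sketched as follows (a minimal sketch; `brute_force_likelihood` is a hypothetical name, and it sums P(O, q | lambda) over every state sequence q):

```python
from itertools import product

def brute_force_likelihood(obs, A, B, pi):
    """P(O | lambda) by summing over every state sequence of length T.

    There are N^T sequences, each costing about 2T multiplications,
    which is where the ~2T * N^T operation count comes from.
    """
    N, T = len(A), len(obs)
    total = 0.0
    for seq in product(range(N), repeat=T):        # all N^T state sequences
        p = pi[seq[0]] * B[seq[0]][obs[0]]         # start, then emit obs[0]
        for t in range(1, T):
            p *= A[seq[t - 1]][seq[t]] * B[seq[t]][obs[t]]
        total += p
    return total
```

Even a modest N = 5, T = 100 gives 5^100 sequences, which is why this direct approach is impractical beyond toy problems.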
how does HMM generate a sequence of observations?
- first select the initial state based on the initial probability, namely Pi
- emit an observation symbol according to that state's observation probabilities
- using that state and the transition matrix, we can get state 2
- rinse and repeat until T observations are generated
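The generation steps above can be sketched in Python (a minimal sketch; the function name `generate` is my own, not from the cards):

```python
import random

def generate(A, B, pi, T, rng=random):
    """Sample T observation symbols from an HMM lambda = (A, B, pi)."""
    def draw(probs):
        # Pick an index according to a discrete probability distribution.
        r, cumulative = rng.random(), 0.0
        for i, p in enumerate(probs):
            cumulative += p
            if r < cumulative:
                return i
        return len(probs) - 1  # guard against floating-point rounding

    state = draw(pi)                   # 1. initial state from pi
    obs = []
    for _ in range(T):
        obs.append(draw(B[state]))     # 2. emit a symbol from B's row
        state = draw(A[state])         # 3. move to the next state via A
    return obs
```

Only the emitted symbols in `obs` would be visible to an observer; the `state` variable is the hidden part of the model.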
what is the time complexity of brute-force HMM evaluation
on the order of 2T * N^T calculations
- T represents the length of the observation sequence
- N represents the number of possible states