Artificial Intelligence Flashcards
In Bayesian parameter estimation, the ________ distribution is often used as a prior for a real-valued quantity.
Gaussian
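As an illustration, here is a minimal sketch (assuming NumPy; all numbers are toy values) of a Gaussian prior over an unknown real-valued mean being updated with observations from a Gaussian likelihood of known variance, the classic conjugate case where the posterior is again Gaussian.

```python
import numpy as np

# Conjugate update: Gaussian prior over an unknown mean, Gaussian likelihood
# with known observation variance. The posterior is again Gaussian.
prior_mean, prior_var = 0.0, 4.0   # prior belief about the mean (toy values)
obs_var = 1.0                      # assumed known noise variance

rng = np.random.default_rng(0)
data = rng.normal(loc=2.5, scale=np.sqrt(obs_var), size=20)  # toy observations

n = len(data)
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + data.sum() / obs_var)

print(f"posterior mean = {post_mean:.3f}, posterior variance = {post_var:.3f}")
```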
In a Bayes classifier with Gaussian class-conditional densities, the parameters that need to be estimated for each class are the _____ and ________ _______ of the Gaussian distribution
mean, covariance matrix
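A minimal sketch (assuming NumPy and SciPy; the two-class 2-D data are synthetic) of estimating a mean vector and covariance matrix per class and classifying a new point with the resulting Gaussian class-conditional densities plus class priors.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Toy 2-D data for two classes (synthetic, for illustration only).
rng = np.random.default_rng(1)
X0 = rng.multivariate_normal([0, 0], np.eye(2), size=50)
X1 = rng.multivariate_normal([2, 2], [[1.0, 0.3], [0.3, 1.0]], size=50)

classes = {0: X0, 1: X1}
params = {}
for c, Xc in classes.items():
    # Per-class parameters to estimate: mean vector and covariance matrix.
    params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))

priors = {c: len(Xc) / (len(X0) + len(X1)) for c, Xc in classes.items()}

def predict(x):
    # Pick the class with the highest log posterior (up to a constant).
    scores = {c: multivariate_normal.logpdf(x, mean=m, cov=S) + np.log(priors[c])
              for c, (m, S) in params.items()}
    return max(scores, key=scores.get)

print(predict(np.array([1.8, 1.9])))  # expected to favour class 1
```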
The core principle of _________ statistics is using prior knowledge and observed data to update beliefs.
Bayesian
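A small worked example of that principle (pure Python, toy coin-flip data): a Beta prior over a coin's bias is combined with observed flips to give an updated Beta posterior.

```python
# Beta-Bernoulli update: prior belief about a coin's bias, revised with data.
alpha, beta = 2.0, 2.0            # prior pseudo-counts (weak belief in fairness)
flips = [1, 1, 0, 1, 1, 1, 0, 1]  # 1 = heads, 0 = tails (toy data)

heads = sum(flips)
tails = len(flips) - heads

# Posterior is Beta(alpha + heads, beta + tails): prior knowledge + observed data.
post_alpha, post_beta = alpha + heads, beta + tails
print("posterior mean of bias:", post_alpha / (post_alpha + post_beta))
```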
MAP stands for ________ ___ _______ in the context of Bayesian estimation.
Maximum A Posteriori
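A short sketch contrasting the MAP estimate with the maximum-likelihood estimate for a Bernoulli parameter under a Beta prior (toy counts; the MAP value is the mode of the Beta posterior).

```python
# MAP estimate for a Bernoulli parameter with a Beta(alpha, beta) prior.
# MAP maximises the posterior, so the prior pulls the estimate away from the MLE.
alpha, beta = 5.0, 5.0          # prior favouring values near 0.5
heads, tails = 8, 2             # toy observations

mle = heads / (heads + tails)                                       # maximum likelihood
map_est = (heads + alpha - 1) / (heads + tails + alpha + beta - 2)  # posterior mode

print(f"MLE = {mle:.3f}, MAP = {map_est:.3f}")
```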
Having the highest entropy among distributions with a given mean and variance is a property of the Gaussian distribution that is NOT directly related to its use in the central limit theorem. (True / False)
True
The ________ ______ theorem states that sums of many independent random variables tend toward a Gaussian distribution.
central limit
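A quick empirical check (assuming NumPy; the choice of 10,000 sums of 30 uniforms is arbitrary): sums of independent uniform random variables have approximately the Gaussian mean and variance predicted by the theorem.

```python
import numpy as np

# Empirical check of the central limit theorem: sums of many independent
# uniform random variables are approximately Gaussian.
rng = np.random.default_rng(0)
sums = rng.uniform(0, 1, size=(10_000, 30)).sum(axis=1)  # 10,000 sums of 30 uniforms

# A Uniform(0, 1) variable has mean 1/2 and variance 1/12, so the sum of 30
# should be roughly Normal(15, 30/12).
print("sample mean:", sums.mean(), " expected:", 30 * 0.5)
print("sample var: ", sums.var(),  " expected:", 30 / 12)
```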
Kschischang’s algorithm for message passing on a factor graph is used in _________ ________ __________.
Loopy Belief Propagation
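A minimal sketch (assuming NumPy; factor tables are toy values) of sum-product message passing on a tiny chain-structured factor graph, where the graph has no cycles so the messages are exact. Loopy belief propagation applies the same message updates iteratively on factor graphs that do contain cycles.

```python
import numpy as np

# Sum-product message passing on a tiny chain factor graph:
#   f1(x1) -- x1 -- f2(x1, x2) -- x2     (binary variables, no cycles)
f1 = np.array([0.7, 0.3])                    # unary factor over x1
f2 = np.array([[0.9, 0.1],                   # pairwise factor f2[x1, x2]
               [0.2, 0.8]])

# Factor-to-variable and variable-to-factor messages.
msg_f1_to_x1 = f1
msg_x1_to_f2 = msg_f1_to_x1                  # x1 has no other neighbours
msg_f2_to_x2 = (f2 * msg_x1_to_f2[:, None]).sum(axis=0)

belief_x2 = msg_f2_to_x2 / msg_f2_to_x2.sum()

# Brute-force marginal for comparison.
joint = f1[:, None] * f2
exact = joint.sum(axis=0) / joint.sum()

print("BP marginal:   ", belief_x2)
print("exact marginal:", exact)
```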
The role of a regularization term in a machine learning model is to reduce overfitting by penalizing large coefficients. (True / False)
True
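A short sketch of that idea (assuming NumPy; data and the regularization strength are toy choices): ridge regression adds an L2 penalty on the coefficients, shrinking them relative to ordinary least squares.

```python
import numpy as np

# L2 (ridge) regularization: the penalty term lam * ||w||^2 shrinks the
# coefficients and reduces overfitting on noisy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
w_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
y = X @ w_true + rng.normal(scale=0.5, size=30)

lam = 1.0  # regularization strength (a hyperparameter chosen by hand here)

# Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
w_ols = np.linalg.solve(X.T @ X, X.T @ y)      # unregularized least squares

print("OLS weights:  ", np.round(w_ols, 3))
print("ridge weights:", np.round(w_ridge, 3))  # pulled towards zero
```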
Training data is NOT typically a component of a Bayesian network. (True / False)
True
In _______ learning, the goal is to learn a mapping from input X to output Y based on a training set of input-output pairs. This process is primarily focused on prediction.
supervised
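Supervised learning in miniature (assuming NumPy; the four training points are made up): a 1-nearest-neighbour classifier learns the mapping from inputs X to labels Y from input-output pairs and then predicts the label of an unseen input.

```python
import numpy as np

# Training set of input-output pairs (toy values).
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [3.8, 4.0]])
y_train = np.array([0, 0, 1, 1])

def predict(x_new):
    # Prediction: copy the label of the closest training example.
    dists = np.linalg.norm(X_train - x_new, axis=1)
    return y_train[np.argmin(dists)]

print(predict(np.array([3.9, 4.1])))   # -> 1
print(predict(np.array([0.9, 1.1])))   # -> 0
```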
The Distribution rule is a basic rule of probability. (True / False)
False
The Distribution rule is NOT a basic rule of probability.
Bayes’ theorem is a crucial concept in probabilistic machine learning. It relates the conditional and marginal probabilities of random variables.
What is the formula for Bayes’ theorem?
P(A|B) = [P(B|A) P(A)] / P(B)
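A worked numerical example (the prior, sensitivity, and false-positive rate below are hypothetical): computing P(A|B) for a rare condition A given a positive test result B.

```python
# Bayes' theorem with made-up numbers: P(A|B) = P(B|A) * P(A) / P(B),
# where A = "has the condition" and B = "test is positive".
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # sensitivity P(B|A)
p_b_given_not_a = 0.05  # false-positive rate P(B|not A)

# Total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.3f}")   # about 0.161
```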
A Gaussian distribution is also known as ______ ________.
Normal distribution
The parameters that fully define a univariate Gaussian (normal) distribution are the ________ and the __________.
mean (μ), variance (σ²)
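A minimal sketch (assuming NumPy; the sample is synthetic) of estimating those two parameters from data: the sample mean for μ and the sample variance for σ².

```python
import numpy as np

# Estimate the two parameters of a univariate Gaussian from data.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1_000)   # toy sample

mu_hat = x.mean()
sigma2_hat = x.var()        # maximum-likelihood estimate (divides by n)

print(f"mu_hat = {mu_hat:.3f}, sigma2_hat = {sigma2_hat:.3f}")
```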
The _______ distribution represents the updated belief about a parameter after observing data in Bayesian inference.
posterior