Semester 1 Flashcards

1
Q

Accuracy?

A

(TP+TN)/(TP+TN+FP+FN)
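
A minimal Python sketch of the formula, using hypothetical confusion-matrix counts:

    # Hypothetical confusion-matrix counts (made-up numbers for illustration)
    TP, TN, FP, FN = 40, 45, 5, 10

    accuracy = (TP + TN) / (TP + TN + FP + FN)
    print(accuracy)  # 0.85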

2
Q

Bayes Theorem

A

P(A|B) = P(A) * P(B|A) / P(B)

Posterior = Prior * Likelihood / Evidence
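
A small worked example in Python, applying the rule to a hypothetical disease-testing scenario (all numbers are made up):

    # Hypothetical: P(disease) = 0.01, P(positive | disease) = 0.9,
    # P(positive | no disease) = 0.05. What is P(disease | positive)?
    prior = 0.01
    likelihood = 0.9
    evidence = likelihood * prior + 0.05 * (1 - prior)

    posterior = prior * likelihood / evidence
    print(round(posterior, 3))  # ~0.154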

3
Q

Bayesian vs frequentist statistics

A

Bayesian statistics assumes the observed data are fixed (true) and that the parameters don’t have a single value but a range of plausible values (a distribution). Frequentist statistics assumes the opposite: the parameters are fixed and the data are a random sample.

4
Q

Bernoulli distribution?

A

A single trial with only two possible outcomes (success with probability p, failure with probability 1 − p).
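
A minimal sketch, assuming SciPy is available (p is a made-up value):

    from scipy.stats import bernoulli

    p = 0.3                     # hypothetical success probability
    print(bernoulli(p).pmf(1))  # P(success) = 0.3
    print(bernoulli(p).pmf(0))  # P(failure) = 0.7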

5
Q

Convolutional Network?

A

A neural network that applies convolutional filters (kernels) across the input to learn local patterns (e.g., edges and textures in images); convolutional layers are typically followed by pooling layers and fully connected layers. Commonly used for image data.

6
Q

Decision Tree?

A

A tree-structured model that recursively splits the data on feature values (e.g., choosing splits that maximize information gain or minimize Gini impurity); each leaf gives the predicted class or value. Used for classification or regression.

7
Q

DSFI?

A
8
Q

Error due to bias?

A

The model doesn’t learn the patterns in the training data: it oversimplifies them or learns the wrong patterns (underfitting).

9
Q

Error due to variance?

A

The model pays too much attention to the patterns in the training data and “memorizes” them (overfitting), so it fails to generalize to new data.

10
Q

Gamma distribution?

A

Continuous probability distribution, defined for positive values, that models right-skewed data (e.g., waiting times).
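
A quick sketch, assuming SciPy/NumPy (the shape and scale are made-up values), showing the right skew of gamma samples:

    import numpy as np
    from scipy.stats import gamma

    samples = gamma(a=2.0, scale=3.0).rvs(size=10_000, random_state=0)
    print(np.mean(samples))    # close to shape * scale = 6
    print(np.median(samples))  # below the mean, as expected for right-skewed data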

11
Q

Gradient Boosting Machine

A

An ensemble method that builds decision trees sequentially, where each new tree is fitted to the residual errors (the negative gradient of the loss) of the current ensemble; the final prediction is the weighted sum of all trees.

12
Q

How to measure dependence between variables if the variables are continuous? Discrete?

A
  1. Continuous: Pearson correlation.
  2. Discrete (ordinal/ranked): Spearman rank correlation.
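
A minimal sketch, assuming SciPy/NumPy (the data are made up), computing both measures:

    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2 * x + rng.normal(scale=0.5, size=100)

    print(pearsonr(x, y))   # linear correlation, for continuous variables
    print(spearmanr(x, y))  # rank correlation, for discrete/ordinal (or monotonic) data
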
13
Q

KNN?

A

K Nearest Neighbours. Classification or regression. Finds the K nearest points to a new data point and uses the majority vote (classification) or the average (regression) of those neighbours to predict the target variable.
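
A minimal sketch of the classification case, assuming scikit-learn (the toy data are made up); for regression, KNeighborsRegressor averages the neighbours instead:

    from sklearn.neighbors import KNeighborsClassifier

    # Hypothetical toy data: two features, binary target
    X = [[0, 0], [1, 1], [1, 0], [4, 4], [5, 5], [4, 5]]
    y = [0, 0, 0, 1, 1, 1]

    knn = KNeighborsClassifier(n_neighbors=3)
    knn.fit(X, y)
    print(knn.predict([[4.5, 4.5]]))  # majority vote of the 3 nearest points -> [1]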

14
Q

Likelihood estimation method?

A

What is the likelihood that the variable follows a certain distribution (with certain parameter values), given the observed data? Maximum likelihood estimation picks the parameters that make the observed data most likely.
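
A minimal sketch, NumPy only (the coin-flip data are made up), of maximum likelihood estimation by grid search for a Bernoulli parameter p:

    import numpy as np

    data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])  # hypothetical coin flips

    # Log-likelihood of the data for a grid of candidate p values
    p_grid = np.linspace(0.01, 0.99, 99)
    log_lik = data.sum() * np.log(p_grid) + (len(data) - data.sum()) * np.log(1 - p_grid)

    print(p_grid[np.argmax(log_lik)])  # ~0.7, the sample mean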

15
Q

Precision?

A

TP/(TP+FP)

16
Q

Measure of how much of a distribution lies towards the extremes (the tails)?

A

Kurtosis

17
Q

Recall?

A

TP/(TP+FN)

18
Q

Skewness?

A

Whether the distribution is loaded (skewed) towards the left or towards the right.
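
A minimal sketch, assuming SciPy/NumPy (the sample is made up), computing both skewness and the kurtosis from card 16:

    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(0)
    right_skewed = rng.exponential(size=10_000)

    print(skew(right_skewed))      # > 0: tail loaded towards the right
    print(kurtosis(right_skewed))  # excess kurtosis > 0: heavier tails than the normal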

19
Q

Types of Naive Bayes?

A

Gaussian: continuous variables
Multinomial: discrete (count) variables
Bernoulli: binary variables
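
A minimal sketch, assuming scikit-learn (all data are made up), matching each variant to its data type:

    from sklearn.naive_bayes import GaussianNB, MultinomialNB, BernoulliNB

    y = [0, 0, 1, 1]
    X_continuous = [[1.2, 0.5], [0.9, 0.7], [3.1, 2.8], [2.9, 3.0]]
    X_counts     = [[3, 0, 1], [2, 1, 0], [0, 4, 2], [1, 3, 3]]
    X_binary     = [[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 0, 1]]

    GaussianNB().fit(X_continuous, y)   # continuous features
    MultinomialNB().fit(X_counts, y)    # discrete counts (e.g., word counts)
    BernoulliNB().fit(X_binary, y)      # binary features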

20
Q

Normal distribution values for ±1 s.d.? ±2 s.d.? ±3 s.d.?

A

34.1% between the mean and 1 s.d. on each side (68.2% within ±1 s.d.)
13.6% between 1 and 2 s.d. on each side (95.4% within ±2 s.d.)
2.1% between 2 and 3 s.d. on each side (99.7% within ±3 s.d.)
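
A quick check of the cumulative values, assuming SciPy:

    from scipy.stats import norm

    for k in (1, 2, 3):
        print(k, norm.cdf(k) - norm.cdf(-k))  # ~0.682, ~0.954, ~0.997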

21
Q

Poisson distribution?

A

Models the probability of an event happening k times within a given interval of time or space.
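
A minimal sketch, assuming SciPy (the rate is a made-up value):

    from scipy.stats import poisson

    mu = 4                     # hypothetical average of 4 events per interval
    print(poisson(mu).pmf(2))  # P(exactly 2 events)
    print(poisson(mu).cdf(2))  # P(at most 2 events)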

22
Q

False Positive Rate?

A

1-Specificity = FP/(FP+TN)

23
Q

True Positive Rate?

A

TP/(TP+FN)

24
Q

Specificity?

A

TN/(TN+FP)
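
A minimal sketch pulling together the confusion-matrix rates from the last few cards (the counts are hypothetical):

    TP, TN, FP, FN = 40, 45, 5, 10   # hypothetical confusion-matrix counts

    precision   = TP / (TP + FP)     # ~0.89
    recall_tpr  = TP / (TP + FN)     # recall / true positive rate = 0.80
    specificity = TN / (TN + FP)     # 0.90
    fpr         = FP / (FP + TN)     # 1 - specificity = 0.10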

25
Q

Formula for probability? For odds? How to convert odds to probability?

A

a. Probability = F / (F + NF), where F = favourable and NF = non-favourable outcomes.
b. Odds = F / NF.
c. Probability = Odds / (Odds + 1).
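
A minimal sketch of the conversions (the counts are hypothetical):

    F, NF = 3, 1                               # hypothetical favourable / non-favourable counts

    probability = F / (F + NF)                 # 0.75
    odds = F / NF                              # 3.0
    probability_from_odds = odds / (odds + 1)  # back to 0.75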

26
Q

Neural Network?

A

Input layer + hidden layers + output layer.

Each neuron has a weight for each neuron in the previous layer, plus one bias, and an activation function (usually sigmoid).

It calculates a cost function and then looks for the negative gradient, which tells us which weights and biases to change in order to lower the cost; this is called backpropagation.
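
A minimal sketch, NumPy only (the architecture, toy data and learning rate are made-up choices), of one hidden layer trained with backpropagation:

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1 / (1 + np.exp(-z))

    # Hypothetical toy data: 4 samples, 2 features, binary target (logical AND)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [0], [0], [1]], dtype=float)

    # One hidden layer: a weight per previous-layer neuron + one bias per neuron
    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

    lr = 1.0
    for _ in range(5000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backpropagation: gradients of the squared-error cost
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Step against the gradient to lower the cost
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(out.round(2).ravel())  # should approach [0, 0, 0, 1]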

27
Q

Stochastic gradient descent?

A

Stochastic means performing gradient descent without using all samples at the same time: instead you compute the gradient for one batch (or a single sample), take a step, then compute it for the next batch and take another step, and so on.
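
A minimal sketch, NumPy only (the data, batch size and learning rate are made-up choices), of mini-batch SGD for a simple linear model:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: y = 3x + 2 plus noise
    x = rng.normal(size=1000)
    y = 3 * x + 2 + rng.normal(scale=0.1, size=1000)

    w, b, lr, batch_size = 0.0, 0.0, 0.1, 32

    for epoch in range(20):
        order = rng.permutation(len(x))            # shuffle, then walk through batches
        for start in range(0, len(x), batch_size):
            idx = order[start:start + batch_size]
            x_b, y_b = x[idx], y[idx]

            # Gradient of the mean squared error on this batch only
            err = (w * x_b + b) - y_b
            w -= lr * 2 * np.mean(err * x_b)       # one step per batch
            b -= lr * 2 * np.mean(err)

    print(w, b)  # should end up close to 3 and 2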