Test 2 Flashcards

1
Q

!P is true if

A

P is false in the current model (negation)

2
Q

P and Q is true IF

A

Both P & Q are true in the model (conjunction)

3
Q

P or Q is true IF

A

At least one of P or Q is true in the model (disjunction)

4
Q

P -> Q is true UNLESS

A

P is true and Q is false in M (implication)

5
Q

P <-> Q is true IF

A

P & Q are both true or both false in the model (biconditional)

6
Q

Sentences have a truth value with respect to a model. This means that:

A

A model is an assignment of truth values to the variables in the sentence; the sentence is evaluated by slotting those values in. I.e. a sentence might have 3 variables, and a particular model might assign them, say, “true, true, false”
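
A minimal Python sketch of this (my own illustration, not part of the deck): the model supplies the truth values, and the sentence is evaluated against them.

```python
def implies(p, q):
    # P -> Q is true unless P is true and Q is false
    return (not p) or q

def sentence(a, b, c):
    # Example sentence: (A v B) -> !C
    return implies(a or b, not c)

# One particular model: A = true, B = true, C = false
model = {"a": True, "b": True, "c": False}
print(sentence(**model))  # True in this model
```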

7
Q

A ^ B is logically equivalent to

A

B ^ A

8
Q

A v B is logically equivalent to

A

B v A

9
Q

((A ^ B) ^ C) is logically equivalent to

A

(A ^ (B ^ C))

10
Q

((A v B) v C) is logically equivalent to

A

(A v (B v C))

11
Q

!(!A) is logically equivalent to

A

A

12
Q

A -> B is logically equivalent to

A

!B -> !A and also !A v B

13
Q

A <-> B is logically equivalent to

A

((A -> B) ^ (B->A))

14
Q

!(A ^ B) is logically equivalent to

A

!A v !B

15
Q

!(A v B) is logically equivalent to:

A

!A ^ !B
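
A quick way to convince yourself of equivalences like the ones in cards 12, 14, and 15 is a brute-force truth-table check. A small illustrative Python sketch (my own, not part of the deck):

```python
from itertools import product

def implies(p, q):
    return (not p) or q

# Check the equivalences in every model (truth assignment) of A and B.
for a, b in product([True, False], repeat=2):
    assert implies(a, b) == ((not a) or b)            # A -> B   ==  !A v B
    assert (not (a and b)) == ((not a) or (not b))    # !(A ^ B) ==  !A v !B
    assert (not (a or b)) == ((not a) and (not b))    # !(A v B) ==  !A ^ !B

print("All three equivalences hold in every model.")
```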

16
Q

(A ^ (B v C)) is logically equivalent to:

A

((A ^ B) v (A ^ C))

17
Q

(A v (B ^ C)) is logically equivalent to:

A

((A v B) ^ (A v C))

18
Q

A sentence is satisfiable if:

A

It is true in some models.

19
Q

A sentence is valid if:

A

It is true in all models.

20
Q

A sentence is unsatisfiable if:

A

It is true in no models.
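
These three definitions (cards 18-20) can be checked by brute force over all models. A small illustrative Python sketch (my own, with made-up example sentences):

```python
from itertools import product

def classify(sentence, num_vars):
    # Evaluate the sentence in every model (every truth assignment).
    results = [sentence(*model) for model in product([True, False], repeat=num_vars)]
    if all(results):
        return "valid"          # true in all models
    if any(results):
        return "satisfiable"    # true in some model
    return "unsatisfiable"      # true in no model

print(classify(lambda a: a or (not a), 1))    # valid
print(classify(lambda a, b: a and b, 2))      # satisfiable
print(classify(lambda a: a and (not a), 1))   # unsatisfiable
```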

21
Q

Horn Clause:

A

A clause where at most one literal is positive.

E.g.: (!P v !Q v V) and (!P v !W) are Horn clauses.

22
Q

Definite Clause:

A

A clause where exactly one literal is positive.

Ex: (!P v !Q v H v !D) is a definite clause

23
Q

How do you rewrite a Horn Clause to an implication?

Ex: (!C v !B v A)

A

First group the negated literals
(!C v !B) v A

Apply De Morgan's law to the negated group
!(C ^ B) v A

Rewrite !X v Y as an implication X -> Y
(C ^ B) -> A
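
The same conversion as a small Python sketch (my own illustration; the "!name" list encoding of a clause is an assumption of the sketch):

```python
# Convert a definite clause, written as a list of literals such as
# ["!C", "!B", "A"], into an implication.

def horn_to_implication(literals):
    negated = [lit[1:] for lit in literals if lit.startswith("!")]
    positive = [lit for lit in literals if not lit.startswith("!")]
    assert len(positive) == 1, "a definite clause has exactly one positive literal"
    body = " ^ ".join(negated)           # the negated literals become the conjunction
    return f"({body}) -> {positive[0]}"

print(horn_to_implication(["!C", "!B", "A"]))  # (C ^ B) -> A
```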

24
Q

What is Bayes' Theorem?

A

P(b | a) = P(a | b) P(b) / P(a)
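
A tiny numeric sanity check of the formula (the probabilities below are made up, purely for illustration):

```python
# Bayes' theorem: P(b | a) = P(a | b) P(b) / P(a)
p_a_given_b = 0.9
p_b = 0.2
p_a = 0.3

p_b_given_a = p_a_given_b * p_b / p_a
print(p_b_given_a)  # ~0.6
```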

25
Q

What can P(a | b, c) and P(c | b) get us?

A

P(a | b), by conditioning on c and summing over its values. For a Boolean c: P(a | b) = P(a | b, c) P(c | b) + P(a | b, !c) P(!c | b)
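
The same identity written out in code for a Boolean c, with made-up numbers for illustration only:

```python
# P(a | b) = P(a | b, c) P(c | b) + P(a | b, !c) P(!c | b)
p_a_given_b_c    = 0.8
p_a_given_b_notc = 0.2
p_c_given_b      = 0.5

p_a_given_b = p_a_given_b_c * p_c_given_b + p_a_given_b_notc * (1 - p_c_given_b)
print(p_a_given_b)  # 0.5
```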

26
Q

P(a ^ b) = ?

A

P(a | b) P(b)

27
Q

P(x, y) = ? When x & y are independent

A

P(x) P(y)

28
Q

P(x, y) = ? When x & y are dependent on each other (general case)

A

P(x) P(y | x)

29
Q

If you want P(!d) but don't have it directly, what could you use to compute it?

A

If you had P(!d | f), P(!d | !f), and P(f), you could compute: P(!d) = P(!d | f)P(f) + P(!d | !f)P(!f)

Aka: the probability of it being !day given that the aliens are friendly, weighted by P(f), plus the probability of it being !day given that the aliens are !friendly, weighted by P(!f)
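
The same computation in code, with made-up numbers for the aliens example:

```python
# Total probability: P(!d) = P(!d | f) P(f) + P(!d | !f) P(!f)
p_notd_given_f    = 0.1   # P(!d | f)
p_notd_given_notf = 0.7   # P(!d | !f)
p_f               = 0.6   # P(f), so P(!f) = 0.4

p_notd = p_notd_given_f * p_f + p_notd_given_notf * (1 - p_f)
print(p_notd)  # ~0.34
```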

30
Q

What does random forest do and why does it work?

A

Random Forest builds many decision trees, each trained on a random sample of the data (and typically a random subset of the features at each split), and combines their predictions by voting or averaging. The more trees you have with different criteria, the better the forest tends to perform, because the individual trees' errors average out and prediction accuracy improves.

It reduces overfitting and variance compared to a single decision tree.
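
A minimal scikit-learn sketch of the idea (assuming scikit-learn is available; the toy dataset and parameter choices are arbitrary illustrations, not from the deck):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data, just to show the workflow.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 100 trees sees a bootstrap sample of the rows and a random
# subset of the features at each split; predictions are combined by majority vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print(forest.score(X_test, y_test))
```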

31
Q

What is Underfitting & what are its causes?

A

Underfitting usually occurs when your model doesn't have enough training time (or capacity) or enough data. This leaves it unable to recognize general patterns, so it makes bad predictions.

It might also be due to a poorly chosen model.

32
Q

What is overfitting and how does it occur?

A

Overfitting tends to occur when your model is too closely tuned to the training data.

It happens when the model has learned patterns that appear only in the training data.

Predictions on new data will be much less accurate, because the new data doesn't contain those training-only patterns.

If there is a large gap between training and validation performance (high training accuracy but low validation accuracy), you likely have overfitting.

To resolve it, get more training data, add better features, or remove noisy features (see card 38).
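
The train/validation gap described above can be seen with a quick sketch (assuming scikit-learn; the toy data and parameters are arbitrary): an unconstrained decision tree typically scores far higher on the data it memorized than on held-out data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy toy data, so memorizing the training set does not generalize.
X, y = make_classification(n_samples=300, n_features=20, flip_y=0.2, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=1)

# A deep, unconstrained tree can memorize the training set.
tree = DecisionTreeClassifier(random_state=1).fit(X_train, y_train)
print("training accuracy:  ", tree.score(X_train, y_train))  # typically ~1.0
print("validation accuracy:", tree.score(X_val, y_val))      # noticeably lower
```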

33
Q

Suppose you train a classifier and test it on a validation set. It gets 30% classification accuracy on the training set and 30% accuracy on the validation set.

What is your model suffering from?
How can you reasonably improve it?

A

It is likely suffering from Underfitting. To improve the model, you could likely add new features, as well as collect more training data.

34
Q

What does very low error on the training data, but high error on the test data represent?

A

Overfitting (high variance)

35
Q

What does high error on the training data and higher error on the test data suggest?

A

Underfitting

36
Q

If you have high bias, low variance, and low complexity, your model is

A

Underfitting.

37
Q

If you have low bias, high variance, and high complexity, you have…

A

Overfitting

38
Q

What are some ways to reduce overfitting

A

-Get more training data
-Add better features (to better capture the structure)
-Remove some features (to reduce noise)

39
Q

P( a | b) = ?

A

P(a | b) = P(a ^ b) / P(b)

40
Q

P(a ^ b) = ?

A

P(a ^ b) = P(a | b) P(b)

41
Q

What is modus ponens?

A

A -> B, A / B

(From the implication A -> B and its premise A, infer B.)