Evaluation-1 Flashcards

1
Q

what is evaluation?

A

Evaluation is the process of understanding the reliability of an AI model, based on its outputs, by feeding a test data set into the model and comparing its predictions with the actual answers.

2
Q

why is model evaluation important?

A

Model evaluation is an integral part of the model development process. It helps to find the model that best represents our data and to estimate how well the chosen model will work in the future.

3
Q

define underfitting

A

The model’s output does not match the true function at all. Hence the model is said to be underfitting and its accuracy is low.

4
Q

define overfitting

A

When we use the same data that we used to build the model for evaluation, the model simply remembers the whole training set and will therefore always give correct predictions. This is known as overfitting.
An overfitting model tries to cover all the data samples, even those that are out of alignment with the true function. This model too has lower accuracy.

5
Q

what is a perfect fit

A

The model’s performance matches well with the true function, meaning the model has optimum accuracy; such a model is called a perfect fit.

6
Q

what is
i) prediction
ii) reality

A

i) Prediction is the output that is given by the machine.
ii) Reality is the true scenario which takes place, for which the prediction was made.

7
Q

what is a confusion matrix?

A

It is a table setup that helps to understand the prediction results: it shows the comparison between prediction and reality. It is sometimes referred to as a contingency table, error matrix, or matching matrix.

A 2x2 matrix denoting the right and wrong predictions helps us analyse the rate of success. This matrix is termed the Confusion Matrix.
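The 2x2 comparison above can be sketched as a small counting routine. This is a minimal illustration, not a standard library function; the example lists of labels are hypothetical, with 1 meaning positive and 0 meaning negative.

```python
# Count the four confusion-matrix cells by comparing reality with prediction.
def confusion_counts(reality, prediction):
    tp = tn = fp = fn = 0
    for actual, predicted in zip(reality, prediction):
        if actual == 1 and predicted == 1:
            tp += 1   # True Positive: positive case predicted positive
        elif actual == 0 and predicted == 0:
            tn += 1   # True Negative: negative case predicted negative
        elif actual == 0 and predicted == 1:
            fp += 1   # False Positive (Type 1 error)
        else:
            fn += 1   # False Negative (Type 2 error)
    return tp, tn, fp, fn

reality    = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical true labels
prediction = [1, 0, 0, 1, 1, 0, 1, 0]   # hypothetical model outputs
print(confusion_counts(reality, prediction))  # (3, 3, 1, 1)
```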

8
Q

what is true positive

A

The predicted value matches the actual value

The actual value was positive and the model predicted a positive value

9
Q

what is true negative

A

The predicted value matches the actual valueThe actual value was negative and the model predicted a negative value

10
Q

what is false positive

A

The predicted value was falsely predictedThe actual value was negative but the model predicted a positive valueAlso known as the Type 1 error

11
Q

what is false negative

A

The predicted value was falsely predictedThe actual value was positive but the model predicted a negative valueAlso known as the Type 2 error

12
Q

what are the 4 evaluation measures?

A

- Accuracy
- Precision
- Recall
- F1 score

13
Q

define accuracy

A

Accuracy is defined as the percentage of correct predictions out of all the observations. A prediction can be said to be correct if it matches the reality. Here, we have two conditions in which the Prediction matches with the Reality: True Positive and True Negative.
Accuracy = (Correct Predictions / Total Cases) × 100
= (TP + TN) / (TP + TN + FP + FN) × 100
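The formula above can be sketched in a few lines of Python; the TP/TN/FP/FN counts used here are hypothetical.

```python
# Accuracy as a percentage of correct predictions over all cases.
def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn) * 100

print(accuracy(tp=3, tn=3, fp=1, fn=1))  # 75.0
```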

14
Q

why is accuracy not a good measure

A

Assume that the model always predicts that there is no fire, but in reality there is a 2% chance of a forest fire breaking out. In this case, the model will be right for 98 cases, but for the 2 cases in which there actually was a forest fire, the model still predicted no fire.
Here,
True Positives = 0
True Negatives = 98
Total cases = 100
Therefore, accuracy becomes: (98 + 0) / 100 = 98%
This is a fairly high accuracy for an AI model. But this parameter is useless for us as the actual cases where the fire broke out are not taken into account. Hence, there is a need to look at another parameter which takes account of such cases as well.
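The forest-fire scenario can be checked numerically. This sketch assumes 100 days with 2 real fires and a model that always answers "no fire".

```python
# A model that always predicts "no fire" over 100 days, 2 of which had fires.
reality    = [1] * 2 + [0] * 98   # 1 = fire, 0 = no fire
prediction = [0] * 100            # model always says "no fire"

tp = sum(1 for a, p in zip(reality, prediction) if a == 1 and p == 1)
tn = sum(1 for a, p in zip(reality, prediction) if a == 0 and p == 0)
acc = (tp + tn) / len(reality) * 100
print(acc)  # 98.0, despite the model missing every real fire (tp == 0)
```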

15
Q

define precision

A

Precision is defined as the percentage of true positive cases out of all the cases where the prediction is positive. That is, it takes into account the True Positives and False Positives.
Precision = (True Positives / All Predicted Positives) × 100
= TP / (TP + FP) × 100
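As a quick sketch of the formula above, with hypothetical counts:

```python
# Precision: share of predicted positives that were actually positive.
def precision(tp, fp):
    return tp / (tp + fp) * 100

print(precision(tp=3, fp=1))  # 75.0
```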

16
Q

why is precision important

A

You might recall the story of the boy who falsely cries out that there are wolves every time and so when they actually arrive, no one comes to his rescue. Similarly, here if the Precision is low (which means there are more False alarms than the actual ones) then the firefighters would get complacent and might not go and check every time considering it could be a false alarm.
This makes precision an important criterion.

If precision is high, this means the true positive cases are more, giving lesser false alarms.

17
Q

what is recall

A

It can be defined as the fraction of positive cases that are correctly identified. It mainly takes into account the cases where, in reality, there was a fire and the machine either detected it correctly or it didn’t. It considers TP and FN.

Recall = TP / (TP + FN)
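A minimal sketch of the recall formula above, again with hypothetical counts:

```python
# Recall: fraction of real positives that the model caught.
def recall(tp, fn):
    return tp / (tp + fn)

print(recall(tp=3, fn=1))  # 0.75
```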

18
Q

when do we use precision and recall?

A

When a false negative costs more: recall.
When a false positive costs more: precision.

19
Q

what is F1 score?

A

F1 score can be defined as the measure of balance between precision and recall.
F1 = 2 × (Precision × Recall) / (Precision + Recall)
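The balance formula above can be sketched directly; the precision and recall values passed in are hypothetical fractions between 0 and 1.

```python
# F1: harmonic mean of precision and recall.
def f1_score(precision, recall):
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.75, 0.75))  # 0.75
```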

20
Q

what is an ideal f1 score

A

An ideal situation would be when we have a value of 1 (that is, 100%) for both Precision and Recall. In that case, the F1 score would also be an ideal 1 (100%), known as the perfect value for the F1 score. As the values of both Precision and Recall range from 0 to 1, the F1 score also ranges from 0 to 1.