Lecture 4 Flashcards

1
Q

There are many ways to communicate the same thing

A

For example, 75 can be written as the numeral 75, as the words seventy-five, or as 75 tally marks.

2
Q

mark

A

A basic geometric element that depicts items or links

3
Q

channel

A

Channels control the appearance of marks (e.g., position, color, size, shape)
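
Not from the lecture, but a minimal matplotlib sketch of the idea (all data and variable names are made up): each scatter point is a point mark, and position, color, and size are channels that control how that mark appears.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.random(30)                 # horizontal position channel
y = rng.random(30)                 # vertical position channel
group = rng.integers(0, 3, 30)     # categorical attribute -> color channel
value = rng.random(30) * 300       # quantitative attribute -> size channel

# Each point drawn by scatter() is a mark; the keyword arguments
# map data attributes onto visual channels.
plt.scatter(x, y, c=group, s=value, alpha=0.7)
plt.xlabel("x (position channel)")
plt.ylabel("y (position channel)")
plt.title("Point marks styled by position, color, and size channels")
plt.show()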

4
Q

Mark types in table datasets

A

A mark represents an item

5
Q

Mark types in network datasets

A

A mark represents either an item (a node) or a link showing a relationship between items

6
Q

A connection mark shows a pairwise relationship between two items using a line

A

true

7
Q

A containment mark shows hierarchical relationships using areas (area marks nested within each other at multiple levels)

A

True

8
Q

The bootstrap is a way to resample your sample over and over; however, unlike cross-validation, it samples with replacement

A

true
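
A minimal NumPy sketch (not from the lecture; the data and the statistic are placeholders) of the difference: the bootstrap draws samples of size n with replacement, while k-fold cross-validation splits the data into non-overlapping folds.

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)   # the original sample

# Bootstrap: resample the sample over and over, WITH replacement.
n_boot = 1000
boot_means = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_means[b] = resample.mean()
print("bootstrap std. error of the mean:", boot_means.std(ddof=1))

# Cross-validation, by contrast, partitions the data WITHOUT replacement:
# each observation lands in exactly one held-out fold.
folds = np.array_split(rng.permutation(data), 5)
print("CV fold sizes:", [f.size for f in folds])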

9
Q

If n (the number of observations) is much larger than the number of predictors, your model will perform well

A

True

10
Q

If I want to reduce the number of predictors, I can use subset selection

A

True

11
Q

We have 3 ways to do subset selection

A

1- Best subset selection
2- Forward stepwise selection (see the sketch after this list)
3- Backward stepwise selection
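
As an illustration of the second method, here is a hedged Python/scikit-learn sketch of forward stepwise selection (the toy data and the RSS scoring are placeholders, not the course's setup): start with no predictors and, at each step, greedily add the predictor that lowers the training RSS the most.

import numpy as np
from sklearn.linear_model import LinearRegression

def rss(model, X, y):
    # residual sum of squares of a fitted model
    resid = y - model.predict(X)
    return float(resid @ resid)

def forward_stepwise(X, y):
    p = X.shape[1]
    remaining, selected, path = list(range(p)), [], []
    while remaining:
        # try each remaining predictor; keep the one giving the lowest RSS
        scores = []
        for j in remaining:
            cols = selected + [j]
            m = LinearRegression().fit(X[:, cols], y)
            scores.append((rss(m, X[:, cols], y), j))
        best_rss, best_j = min(scores)
        selected.append(best_j)
        remaining.remove(best_j)
        path.append((list(selected), best_rss))
    return path   # best model of each size; compare sizes with adj. R^2 or CV

# toy data: only the first two columns actually matter
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(size=200)
for subset, score in forward_stepwise(X, y):
    print(subset, round(score, 1))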

12
Q

Best Subset Regression (selection):

A

Minimize RSS (the sum of squared differences between the predicted values and the observed values in the data)

Maximize adjusted R-square (the proportion of the variance in the response explained by the predictors, adjusted for the number of predictors); formulas below
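
The two criteria written out (standard definitions; d is the number of predictors in the model and TSS the total sum of squares):

\mathrm{RSS} = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2,
\qquad
\text{Adjusted } R^2 = 1 - \frac{\mathrm{RSS}/(n - d - 1)}{\mathrm{TSS}/(n - 1)},
\qquad
\mathrm{TSS} = \sum_{i=1}^{n} (y_i - \bar{y})^2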

13
Q

For best subset selection (in R) we need the leaps package

A

TRUE
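
The leaps package in R (e.g., its regsubsets() function) is the lecture's tool; purely as an illustration of what best subset selection does, here is a hedged Python sketch that exhaustively fits every subset of predictors and keeps the best model of each size by RSS (toy data, not the course dataset).

import itertools
import numpy as np
from sklearn.linear_model import LinearRegression

def best_subset(X, y):
    p = X.shape[1]
    best_per_size = {}
    for size in range(1, p + 1):
        for combo in itertools.combinations(range(p), size):
            cols = list(combo)
            m = LinearRegression().fit(X[:, cols], y)
            resid = y - m.predict(X[:, cols])
            rss = float(resid @ resid)
            if size not in best_per_size or rss < best_per_size[size][1]:
                best_per_size[size] = (cols, rss)
    return best_per_size   # then pick a size using adjusted R^2 or validation

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 5))
y = 2 * X[:, 3] + X[:, 4] + rng.normal(size=150)
for size, (cols, rss) in best_subset(X, y).items():
    print(size, cols, round(rss, 1))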

14
Q

adjusted r-square

A

R-square adjusted (penalized) for the number of predictors in the model

15
Q

Lasso approach

A

It shrinks coefficients toward zero and can set some of them exactly to zero, so it also performs feature selection
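
A hedged scikit-learn sketch on made-up data (not the course's tools or dataset) showing both effects: the lasso coefficients are shrunk relative to ordinary least squares, and the unrelated ones are set exactly to zero, which is the feature selection.

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 8))
# only 2 of the 8 predictors actually affect the response
y = 4 * X[:, 0] - 3 * X[:, 1] + rng.normal(size=200)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)   # alpha plays the role of lambda

print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Lasso coefficients:", np.round(lasso.coef_, 2))
# the lasso coefficients are shrunk toward zero, and the unrelated
# predictors get a coefficient of exactly zero -> a sparse model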

16
Q

Models generated from Lasso are sparse

A

Sparse means the model uses a smaller number of predictors (many coefficients are exactly zero)

17
Q

In both ridge and lasso we want to minimize RSS plus a penalty on the coefficients

A

Lasso: the penalty uses the absolute values of the coefficients (L1)
Ridge: the penalty uses the squared coefficients (L2)
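
The two objectives written out (standard form; lambda >= 0 is the tuning parameter):

\text{Ridge:} \quad \min_{\beta} \; \mathrm{RSS} + \lambda \sum_{j=1}^{p} \beta_j^2
\qquad\qquad
\text{Lasso:} \quad \min_{\beta} \; \mathrm{RSS} + \lambda \sum_{j=1}^{p} |\beta_j|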

18
Q

When not all of the predictors are related to the response, the lasso will perform better

A

True

19
Q

How to select Lambda

A

We do cross-validation and choose the lambda with the smallest validation error
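
A hedged scikit-learn sketch of that idea on toy data (LassoCV tries a grid of candidate lambdas, called alphas in scikit-learn, with k-fold cross-validation and keeps the one with the lowest validation error):

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10))
y = 2 * X[:, 0] - X[:, 5] + rng.normal(size=300)

# 5-fold cross-validation over a grid of 100 candidate lambdas
model = LassoCV(cv=5, n_alphas=100).fit(X, y)
print("chosen lambda (alpha):", model.alpha_)
print("mean CV error per candidate lambda:", model.mse_path_.mean(axis=1))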