L3_LDA Flashcards

1
Q

Solution for Correlated Data

A

(Fisher’s) Linear Discriminant Analysis (LDA)

2
Q

What does correlation measure?

A

The linear relationship between X and Y:
+1.0: perfectly monotonically increasing
−1.0: perfectly monotonically decreasing
0.0: no linear relationship
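A quick check of the correlation coefficient on hypothetical toy data (the values are chosen for illustration only):

```python
import numpy as np

# Toy data: y is a noisy, increasing function of x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Pearson correlation: covariance of x and y divided by
# the product of their standard deviations
r = np.corrcoef(x, y)[0, 1]
print(r)  # close to +1.0 for this increasing relationship
```

Negating y flips the sign of r, giving a value close to −1.0.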

3
Q

Goal of Linear Discriminant Analysis: view classification in terms of dimensionality reduction

A
Find a projection direction w (the normal vector of a linear decision boundary) that greatly improves class separation by:
  • Maximizing the difference between the class means, and
  • Minimizing the variance within each class
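These two objectives are commonly combined in the Fisher criterion (standard formulation; stated here as an assumption, since the card does not give the formula):

```latex
J(\mathbf{w}) \;=\; \frac{\left(\mathbf{w}^\top(\mathbf{m}_1 - \mathbf{m}_2)\right)^2}{\mathbf{w}^\top \mathbf{S}_W \, \mathbf{w}}
```

where $\mathbf{m}_1, \mathbf{m}_2$ are the class means and $\mathbf{S}_W$ the within-class covariance; $J$ is maximized by $\mathbf{w} \propto \mathbf{S}_W^{-1}(\mathbf{m}_1 - \mathbf{m}_2)$.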
4
Q

When is LDA the optimal classifier?

A

If data is Gaussian with equal class covariances

5
Q

The goal of classification is

A

generalization: Correct categorization/prediction of new data

6
Q

How can we estimate generalization performance?

A

Cross-validation

7
Q

Characteristics of cross-validation:

A
  • Train model on part of data
  • Test model on other part of data
  • Repeat on different cross-validation folds
  • Average performance on test set across all folds
8
Q

Algorithm 1: Cross-Validation (6)

A
Require: Data (x1, y1), ..., (xN, yN); number of CV folds F
1. # Split data into F disjoint folds
2. for folds f = 1, ..., F do
3.   # Train model on folds {1, ..., F} \ f
4.   # Compute prediction error on fold f
5. end for
6. # Average prediction errors
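The algorithm above can be sketched as follows; the nearest-class-mean classifier and the binary labels are illustrative assumptions, since the card does not fix a model:

```python
import numpy as np

def cross_validate(X, y, F=5, seed=0):
    """Average test error of a nearest-class-mean classifier over F folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, F)          # F disjoint folds
    errors = []
    for f in range(F):
        test = folds[f]
        train = np.concatenate([folds[g] for g in range(F) if g != f])
        # "Train" on folds {1,...,F} \ f: class means of the training data
        m0 = X[train][y[train] == 0].mean(axis=0)
        m1 = X[train][y[train] == 1].mean(axis=0)
        # Prediction error on fold f: assign each point to the nearer mean
        d0 = np.linalg.norm(X[test] - m0, axis=1)
        d1 = np.linalg.norm(X[test] - m1, axis=1)
        pred = (d1 < d0).astype(int)
        errors.append(np.mean(pred != y[test]))
    return np.mean(errors)                  # average over all folds
```

On well-separated classes this estimate should be close to zero error.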
9
Q

BCI Based on Event-Related Potentials (ERPs)

A

• User concentrates on a symbol
• Rows and columns are intensified randomly
• Target rows and columns elicit specific ERPs
• BCI detects target ERPs (averaged over a few repetitions)

10
Q

Linear Discriminant Algorithm (4)

A

Computes: normal vector w of the decision hyperplane and threshold β
1. Compute class mean vectors
2. Compute within-class covariance matrix S_W
3. Compute normal vector w
4. Compute threshold β
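A minimal two-class sketch of these four steps (function and variable names are illustrative assumptions; the threshold is placed at the projected midpoint of the class means, one common convention):

```python
import numpy as np

def fit_lda(X1, X2):
    """Two-class Fisher LDA: returns normal vector w and threshold beta."""
    # 1. Class mean vectors
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # 2. Within-class covariance S_W (sum of per-class covariances)
    S_W = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    # 3. Normal vector w proportional to S_W^{-1} (m1 - m2)
    w = np.linalg.solve(S_W, m1 - m2)
    # 4. Threshold: project the midpoint between the class means
    beta = w @ (m1 + m2) / 2
    return w, beta

def predict(X, w, beta):
    # Points projecting above the threshold are assigned to class 1
    return X @ w > beta
```

With well-separated Gaussian classes of equal covariance, this boundary classifies nearly all points correctly.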
