Midterm Study Cards

1
Q

What is a linear combination?

A

A linear combination of two vectors v and w is av + bw, where a and b are scalars

2
Q

Are Point and Vector equivalent in linear algebra?

A

Yes; a point in R^n is identified with the vector from the origin to that point

3
Q

What are the symbol and the formula for the length/magnitude of an n-vector v?

A

The symbol is ||v||, and the formula is
||v|| = sqrt((v_1)^2 + (v_2)^2 + … + (v_n)^2)
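A quick Python sketch of this formula (the function name `norm` is my own, not from the card):

```python
import math

def norm(v):
    # ||v|| = sqrt(v_1^2 + v_2^2 + ... + v_n^2)
    return math.sqrt(sum(x * x for x in v))

print(norm([3, 4]))  # 5.0
```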

4
Q

How do you prove two vectors are linearly independent?

A

Two vectors v and w are linearly independent if there is no way to form a linear combination av + bw = 0 without a and b both being 0
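For 2-vectors in R^2 this condition is equivalent to the 2x2 determinant test, a standard fact not stated on the card; a minimal sketch under that assumption:

```python
def independent_2d(v, w):
    # av + bw = 0 forces a = b = 0 exactly when v and w are not
    # scalar multiples of each other, i.e. v1*w2 - v2*w1 != 0
    return v[0] * w[1] - v[1] * w[0] != 0

print(independent_2d([1, 0], [0, 1]))  # True
print(independent_2d([1, 2], [2, 4]))  # False (second is 2x the first)
```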

5
Q

What is the equation of the distance between two vectors v and w?

A

The equation is
||v - w||
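A Python sketch of the distance formula (helper name `distance` is mine):

```python
import math

def distance(v, w):
    # ||v - w||: the length of the difference vector
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))

print(distance([1, 2], [4, 6]))  # 5.0
```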

6
Q

What is the zero vector?

A

The zero vector in R^n is the vector [0, 0, 0, …, 0] (all n coordinates equal to 0)

7
Q

What is a unit vector?

A

A unit vector is a vector with length 1

8
Q

When are vectors x and y perpendicular?

A

When the dot product of x and y is 0
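A quick dot-product check in Python (the example vectors are my own):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# [1, 2] and [-2, 1] are perpendicular since 1*(-2) + 2*1 = 0
print(dot([1, 2], [-2, 1]))  # 0
print(dot([1, 2], [3, 4]))   # 11, so not perpendicular
```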

9
Q

What is the cosine of the angle between two vectors v and w?

A

The cosine is
(v dot w)/(||v||*||w||)
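A Python sketch of the cosine formula (helper names are mine):

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def norm(v):
    return math.sqrt(dot(v, v))

def cos_angle(v, w):
    # cos(theta) = (v . w) / (||v|| * ||w||)
    return dot(v, w) / (norm(v) * norm(w))

print(cos_angle([1, 0], [1, 1]))  # ~0.7071, i.e. a 45-degree angle
```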

10
Q

When are two vectors orthogonal?

A

When they are perpendicular to each other

11
Q

What 5 rules hold for all vectors?

A

v dot w = w dot v
v dot v = ||v||^2
v dot (cw) = c(v dot w) (where c is a scalar)
v dot (w1 + w2) = (v dot w1) + (v dot w2)
v dot (c1w1 + c2w2) = c1(v dot w1) + c2(v dot w2)

12
Q

What is the correlation coefficient between two vectors v and w?

A

The equation is
(v dot w)/(||v||*||w||), computed after mean-centering v and w so that v_avg = w_avg = 0
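A Python sketch of this recipe, mean-centering first and then taking the cosine (helper name `correlation` is mine):

```python
import math

def correlation(v, w):
    # Mean-center each vector, then take the cosine of the angle between them
    vc = [x - sum(v) / len(v) for x in v]
    wc = [y - sum(w) / len(w) for y in w]
    d = sum(a * b for a, b in zip(vc, wc))
    return d / (math.sqrt(sum(a * a for a in vc)) *
                math.sqrt(sum(b * b for b in wc)))

print(correlation([1, 2, 3], [2, 4, 6]))  # ~1.0 (points lie on a positive-slope line)
```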

13
Q

What does a correlation coefficient close to -1 imply?

A

The data points are close to a line of negative slope

14
Q

What does a correlation coefficient close to 1 imply?

A

The data points are close to a line with positive slope

15
Q

What does a correlation coefficient close to 0 imply?

A

x and y do not have a strong linear correlation

16
Q

What is the basic equation of a plane?

A

ax + by + cz = d where a, b, c, and d are constants

17
Q

What is the parametric form of the equation of a plane?

A

P + te + t’e’, where P is a point on the plane, e and e’ are two non-parallel direction vectors, and t and t’ range over all scalar values

18
Q

What is the normal vector form of the equation of a plane?

A

The plane is defined as perpendicular to the normal vector n and passing through the point w: all points x with n dot (x − w) = 0.

19
Q

How are the normal vector n to the plane W and the plane W, which passes through the origin, related?

A

For the plane W: ax + by + cz = 0 in R^3, the normal is n = [a, b, c]. The same pattern applies in all R^n

20
Q

What is the three point form of the equation of a plane?

A

A plane is described by 3 points which are NOT collinear

21
Q

What is the span of vectors v1, v2, …, vk in R^n

A

The span of the vectors is the set of all points in R^n that can be reached via a linear combination of the vectors
Or
span(v1, …, vk) = {all n-vectors x of the form c1v1 + c2v2 + … + ckvk}

22
Q

What must be true of spans?

A

They pass through the origin

23
Q

What is a linear subspace of R^n?

A

A linear subspace of R^n is a subset of R^n that is the span of a finite collection of vectors in R^n

24
Q

Given a linear subspace V, what are true of the vectors within it?

A

For all vectors that exist in V, any linear combination of those vectors exist in V (aka they belong to the same linear subspace)

25
Q

What is Dim(V)

A

Dim V is the dimension of V aka the smallest number of vectors needed to span V.

26
Q

If W and V are both linear subspaces of R^n, W is contained in V, and dim(W) = dim(V), what is true of W?

A

W = V

27
Q

What is a basis

A

A basis for a linear subspace of dimension n is a set of n vectors that spans it

28
Q

When is a collection of vectors considered orthogonal?

A

When, for v1, …, vk, vi dot vj = 0 whenever i != j

29
Q

When is a collection of vectors orthonormal?

A

A collection of vectors is orthonormal if they’re all orthogonal and unit vectors

30
Q

How can you convert an orthogonal basis to an orthonormal basis?

A

Divide each vector by its length

31
Q

What does the Fourier formula do and what is it?

A

The Fourier formula gives the scalar multiples that combine a collection of orthogonal vectors {v1, …, vk} into a vector v lying in their span: v = the summation from i = 1 to k of
((v dot v_i)/(v_i dot v_i))v_i.
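A Python sketch of the coefficient computation (function name `fourier_coefficients` is mine; the basis is an assumed example):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def fourier_coefficients(v, basis):
    # c_i = (v . v_i) / (v_i . v_i) for each orthogonal basis vector v_i
    return [dot(v, vi) / dot(vi, vi) for vi in basis]

basis = [[1, 1], [1, -1]]                   # orthogonal in R^2
print(fourier_coefficients([3, 1], basis))  # [2.0, 1.0]: [3,1] = 2*[1,1] + 1*[1,-1]
```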

32
Q

What is the projection and what is its formula?

A

The projection of a vector x onto a vector w is the closest point to x on the line spanned by w; the formula is Proj_w(x) = ((x dot w)/(w dot w))w
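A Python sketch of the projection formula (helper names are mine):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def proj(x, w):
    # Proj_w(x) = ((x . w) / (w . w)) w
    c = dot(x, w) / dot(w, w)
    return [c * wi for wi in w]

print(proj([3, 4], [1, 0]))  # [3.0, 0.0]
```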

33
Q

Given a collection of vectors which spans a linear subspace, how do you find the point on that subspace closest to a given point?

A

Let {v1, v2, …, vk} be an orthogonal basis of the linear subspace V; then Proj_V(x) =
Proj_v1(x) + Proj_v2(x) + … + Proj_vk(x)
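A Python sketch of summing the one-vector projections, assuming an orthogonal basis (the xy-plane example is mine):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def proj(x, w):
    c = dot(x, w) / dot(w, w)
    return [c * wi for wi in w]

def proj_subspace(x, basis):
    # Sum the projections onto each orthogonal basis vector
    out = [0.0] * len(x)
    for v in basis:
        out = [o + pi for o, pi in zip(out, proj(x, v))]
    return out

# Project a point in R^3 onto the xy-plane, spanned by an orthogonal basis
print(proj_subspace([3, 4, 5], [[1, 0, 0], [0, 1, 0]]))  # [3.0, 4.0, 0.0]
```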

34
Q

If V is a linear subspace of R^n how can every vector x in R^n be uniquely expressed?

A

x = v + v’ where v = Proj_V(x) and
v’ = x - Proj_V(x)

35
Q

How can you find an orthogonal basis of span(x, y)?

A

The orthogonal basis would be y and x’ = x - Proj_y(x)
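A Python sketch of this one-step Gram-Schmidt idea (function name is mine):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def orthogonal_basis(x, y):
    # Keep y; replace x by its component orthogonal to y: x' = x - Proj_y(x)
    c = dot(x, y) / dot(y, y)
    x_prime = [xi - c * yi for xi, yi in zip(x, y)]
    return y, x_prime

y, x_prime = orthogonal_basis([1, 1], [1, 0])
print(x_prime)          # [0.0, 1.0]
print(dot(y, x_prime))  # 0.0, confirming orthogonality
```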

36
Q

How do you find the line of best fit for a data set?

A

Split the points into a vector X with all the x-values and a vector Y with all the y-values. Calculate x_avg and y_avg. Create the centered vector X’ = [x1 − x_avg, x2 − x_avg, …, xk − x_avg]; note that X’ = X − x_avg·1, where 1 is the all-ones vector. Then plug these into the formula
Proj_X’(Y) + Proj_1(Y);
the coefficients of X’ and 1 in this expression give the slope and intercept of the line.
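A Python sketch of this recipe (function name `best_fit_line` is mine): the slope is the coefficient from projecting Y onto X’, and the intercept comes from the Proj_1(Y) term after undoing the centering.

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def best_fit_line(xs, ys):
    # Center the x-data, then project Y onto X' and onto the all-ones vector
    x_avg = sum(xs) / len(xs)
    xc = [x - x_avg for x in xs]
    slope = dot(xc, ys) / dot(xc, xc)              # coefficient of Proj_X'(Y)
    intercept = sum(ys) / len(ys) - slope * x_avg  # Proj_1(Y) term, re-centered
    return slope, intercept

print(best_fit_line([1, 2, 3], [2, 4, 6]))  # (2.0, 0.0), i.e. y = 2x
```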

37
Q

What is the value of g(f(x)) for a multivariable problem?

A

g applied to f: the output values of f are plugged into g.

38
Q

What is a saddle point?

A

A saddle point is a point where both f_x and f_y are 0, but in one direction the point appears to be a minimum while in the other it appears to be a maximum

39
Q

How do you calculate the linear approximation of a function f?

A

For x near a ∈ R^n, the linear approximation to f is
f(x) ≈ f(a) + ((∇f)(a)) · (x − a).
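A numeric sketch in Python, using f(x, y) = x^2 + y^2 as an assumed example (so ∇f = (2x, 2y)), approximated near a = (1, 1):

```python
def f(x, y):
    return x * x + y * y

def f_approx(x, y, a=(1.0, 1.0)):
    # f(x) ~ f(a) + grad_f(a) . (x - a), with grad f = (2x, 2y) for this f
    gx, gy = 2 * a[0], 2 * a[1]
    return f(*a) + gx * (x - a[0]) + gy * (y - a[1])

print(f(1.1, 1.1))         # ~2.42 (true value)
print(f_approx(1.1, 1.1))  # ~2.4 (close to the true value near a)
```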

40
Q

What is the equation of the subspace tangent to the curve on a contour plot?

A

At the point (a, b), the tangent line is given by ∇f(a, b) dot [x − a, y − b] = 0

41
Q

How can you immediately tell if a set of vectors is not linearly independent?

A

If it contains the zero vector

42
Q

When is a vector non-zero?

A

When it has at least one entry which is non-zero

43
Q

The upper-left to lower-right diagonal of the inverse of a diagonal or triangular square matrix is always what?

A

1 divided by the original value at that position on the diagonal

44
Q

REVIEW the 11/29/2022 LECTURE (specifically for the inverse-matrix review). Also, a reveal about the midterm at around the 30-minute mark (start around 24; it should be on the slide with the blue box at the top) and again around 52 (start around 47).

A

YES

45
Q

What is the determinant of an upper or lower triangular matrix?

A

The product of its diagonal entries
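A Python sketch of this rule (function name and example matrix are mine):

```python
def triangular_det(M):
    # det of a triangular (upper or lower) matrix = product of diagonal entries
    det = 1
    for i in range(len(M)):
        det *= M[i][i]
    return det

upper = [[2, 5, 1],
         [0, 3, 7],
         [0, 0, 4]]
print(triangular_det(upper))  # 24
```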