Homework True or False Flashcards

1
Q

A linear system whose equations are all homogeneous must be consistent.

A

True

2
Q

Multiplying a row of an augmented matrix through by zero is an acceptable elementary
row operation

A

False

3
Q

The linear system
π‘₯ βˆ’ 𝑦 = 3
2π‘₯ βˆ’ 2𝑦 = π‘˜
cannot have a unique solution, regardless of the value of π‘˜.

A

True

4
Q

A single linear equation with two or more unknowns must have infinitely many solutions

A

True

5
Q

If the number of equations in a linear system exceeds the number of unknowns, then the
system must be inconsistent.

A

False

6
Q

If each equation in a consistent linear system is multiplied through by a constant 𝑐, then
all solutions to the new system can be obtained by multiplying solutions from the original
system by 𝑐.

A

False

7
Q

Elementary row operations permit one row of an augmented matrix to be subtracted from
another.

A

True

8
Q

The linear system with corresponding augmented matrix [2 βˆ’1 4; 0 0 βˆ’1] is consistent.

A

False

9
Q

If a matrix is in reduced row echelon form, then it is also in row echelon form.

A

True

10
Q

If an elementary row operation is applied to a matrix that is in row echelon form, the
resulting matrix will still be in row echelon form.

A

False

11
Q

Every matrix has a unique row echelon form.

A

False

12
Q

A homogeneous linear system in 𝑛 unknowns whose corresponding augmented matrix has a reduced row echelon form with π‘Ÿ leading 1’s has 𝑛 βˆ’ π‘Ÿ free variables.

A

True

13
Q

All leading 1’s in a matrix in row echelon form must occur in different columns.

A

True

14
Q

If every column of a matrix in row echelon form has a leading 1, then all entries that are
not leading 1’s are zero.

A

False

15
Q

If a homogeneous linear system of 𝑛 equations in 𝑛 unknowns has a corresponding
augmented matrix with a reduced row echelon form containing 𝑛 leading 1’s, then the
linear system has only the trivial solution.

A

True

16
Q

If the reduced row echelon form of the augmented matrix for a linear system has a row of
zeros, then the system must have infinitely many solutions.

A

False

17
Q

If a linear system has more unknowns than equations, then it must have infinitely many
solutions.

A

False

18
Q

The matrix [1 2 3; 4 5 6] has no main diagonal.

A

True

19
Q

For every matrix 𝐴, it is true that (𝐴^𝑇)^𝑇 = 𝐴.

A

True

20
Q

For every square matrix 𝐴, it is true that π‘‘π‘Ÿ(𝐴^𝑇) = π‘‘π‘Ÿ(𝐴).

A

True

21
Q

If 𝐴 is an 𝑛 Γ— 𝑛 matrix and 𝑐 is a scalar, then π‘‘π‘Ÿ(𝑐𝐴) = 𝑐 π‘‘π‘Ÿ(𝐴).

A

True

22
Q

If 𝐴, 𝐡, and 𝐢 are matrices of the same size such that 𝐴 βˆ’ 𝐢 = 𝐡 βˆ’ 𝐢, then 𝐴 = 𝐡.

A

True

23
Q

An π‘š Γ— 𝑛 matrix has π‘š column vectors and 𝑛 row vectors.

A

False

24
Q

If 𝐴 and 𝐡 are 2 Γ— 2 matrices, then 𝐴𝐡 = 𝐡𝐴.

A

False
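This card can be checked with a small counterexample (a plain-Python sketch; the `matmul2` helper and the sample matrices are ours, chosen only for illustration):

```python
# Counterexample showing that 2x2 matrix multiplication need not commute.
def matmul2(A, B):
    # Product of two 2x2 matrices given as nested lists.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]   # the matrix that swaps the two coordinates

AB = matmul2(A, B)
BA = matmul2(B, A)
print(AB)        # [[2, 1], [4, 3]]
print(BA)        # [[3, 4], [1, 2]]
print(AB != BA)  # True, so AB = BA fails in general
```

Right-multiplying by B swaps the columns of A, while left-multiplying swaps its rows, which is why the two products differ.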

25
The ith row vector of a matrix product 𝐴𝐡 can be computed by multiplying 𝐴 by the ith row vector of 𝐡.
False
26
If 𝐴 and 𝐡 are square matrices of the same order, then π‘‘π‘Ÿ(𝐴𝐡) = π‘‘π‘Ÿ(𝐴)π‘‘π‘Ÿ(𝐡)
False
27
If 𝐴 and 𝐡 are square matrices of the same order, then (𝐴𝐡)^𝑇 = 𝐴^𝑇 𝐡^𝑇.
False
28
If 𝐴 is a 6 Γ— 4 matrix and 𝐡 is an π‘š Γ— 𝑛 matrix such that 𝐡^𝑇 𝐴^𝑇 is a 2 Γ— 6 matrix, then π‘š = 4 and 𝑛 = 2.
True
29
If 𝐴, 𝐡, and 𝐢 are square matrices of the same order such that 𝐴𝐢 = 𝐡𝐢, then 𝐴 = 𝐡.
False
30
If 𝐴𝐡 + 𝐡𝐴 is defined, then 𝐴 and 𝐡 are square matrices of the same size.
True
31
If 𝐡 has a column of zeros, then so does 𝐴𝐡 if this product is defined.
True
32
If 𝐡 has a column of zeros, then so does 𝐡𝐴 if this product is defined.
False
33
Two 𝑛 Γ— 𝑛 matrices, 𝐴 and 𝐡, are inverses of one another if and only if 𝐴𝐡 = 𝐡𝐴 = 0.
False
34
For all square matrices 𝐴 and 𝐡 of the same size, it is true that (𝐴 + 𝐡)^2 = 𝐴^2 + 2𝐴𝐡 + 𝐡^2.
False
35
For all square matrices 𝐴 and 𝐡 of the same size, it is true that 𝐴^2 βˆ’ 𝐡^2 = (𝐴 βˆ’ 𝐡)(𝐴 + 𝐡).
False
36
If 𝐴 and 𝐡 are invertible matrices of the same size, then 𝐴𝐡 is invertible and (𝐴𝐡)^βˆ’1 = 𝐴^βˆ’1 𝐡^βˆ’1
False
37
If 𝐴 and 𝐡 are matrices such that 𝐴𝐡 is defined, then it is true that (𝐴𝐡)^𝑇 = 𝐴^𝑇 𝐡^𝑇
False
38
The matrix [ π‘Ž 𝑏 𝑐 𝑑 ] is invertible if and only if π‘Žπ‘‘ βˆ’ 𝑏𝑐 β‰  0.
True
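The 2 Γ— 2 criterion can be sketched directly with the standard adjugate formula (a plain-Python illustration; `inverse2` and the sample matrices are our own, not part of the deck):

```python
# A 2x2 matrix [[a, b], [c, d]] is invertible iff ad - bc != 0; when it is,
# the inverse is (1 / (ad - bc)) * [[d, -b], [-c, a]].
def inverse2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        return None  # ad - bc == 0: the matrix is singular
    return [[d / det, -b / det], [-c / det, a / det]]

print(inverse2([[1, 2], [3, 4]]))  # [[-2.0, 1.0], [1.5, -0.5]]
print(inverse2([[1, 2], [2, 4]]))  # None: det = 1*4 - 2*2 = 0
```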
39
If 𝐴 and 𝐡 are matrices of the same size and π‘˜ is a constant, then (π‘˜π΄ + 𝐡)^𝑇 = π‘˜π΄^𝑇 + 𝐡^𝑇
True
40
If 𝐴 is an invertible matrix, then so is 𝐴^𝑇
True
41
If 𝑝(π‘₯) = π‘Ž0 + π‘Ž1π‘₯ + π‘Ž2π‘₯^2 + β‹― + π‘Ž_π‘š π‘₯^π‘š and 𝐼 is an identity matrix, then 𝑝(𝐼) = π‘Ž0 + π‘Ž1 + π‘Ž2 + β‹― + π‘Žπ‘š.
False
42
A square matrix containing a row or column of zeros cannot be invertible
True
43
The sum of two invertible matrices of the same size must be invertible.
False
44
The product of two elementary matrices of the same size must be an elementary matrix.
False
45
Every elementary matrix is invertible.
True
46
If 𝐴 and 𝐡 are row equivalent, and if 𝐡 and 𝐢 are row equivalent, then 𝐴 and 𝐢 are row equivalent.
True
47
If 𝐴 is an 𝑛 Γ— 𝑛 matrix that is not invertible, then the linear system 𝐴𝒙 = 𝟎 has infinitely many solutions.
True
48
If 𝐴 is an 𝑛 Γ— 𝑛 matrix that is not invertible, then the matrix obtained by interchanging two rows of 𝐴 cannot be invertible.
True
49
If 𝐴 is invertible and a multiple of the first row of 𝐴 is added to the second row, then the resulting matrix is invertible.
True
50
An expression of an invertible matrix 𝐴 as a product of elementary matrices is unique
False
51
It is impossible for a system of linear equations to have exactly two solutions.
True
52
If 𝐴 is a square matrix, and if the linear system 𝐴𝒙 = 𝒃 has a unique solution, then the linear system 𝐴𝒙 = 𝒄 also must have a unique solution.
True
53
If 𝐴 and 𝐡 are 𝑛 Γ— 𝑛 matrices such that 𝐴𝐡 = 𝐼_𝑛, then 𝐡𝐴 = 𝐼_𝑛.
True
54
If 𝐴 and 𝐡 are row equivalent matrices, then the linear systems 𝐴𝒙 = 𝟎 and 𝐡𝒙 = 𝟎 have the same solution set
True
55
Let 𝐴 be an 𝑛 Γ— 𝑛 matrix and 𝑆 an 𝑛 Γ— 𝑛 invertible matrix. If 𝒙 is a solution to the system (𝑆^βˆ’1𝐴𝑆)𝒙 = 𝒃, then 𝑆𝒙 is a solution to the system π΄π’š = 𝑆𝒃.
True
56
Let 𝐴 be an 𝑛 Γ— 𝑛 matrix. The linear system 𝐴𝒙 = 4𝒙 has a unique solution if and only if 𝐴 βˆ’ 4𝐼 is an invertible matrix.
True
57
Let 𝐴 and 𝐡 be 𝑛 Γ— 𝑛 matrices. If 𝐴 or 𝐡 (or both) are not invertible, then neither is 𝐴𝐡.
True
59
The transpose of an upper triangular matrix is an upper triangular matrix.
False
60
The sum of an upper triangular matrix and a lower triangular matrix is a diagonal matrix.
False
61
All entries of a symmetric matrix are determined by the entries occurring on and above the main diagonal.
True
62
All entries of an upper triangular matrix are determined by the entries occurring on and above the main diagonal.
True
63
The inverse of an invertible lower triangular matrix is an upper triangular matrix
False
64
A diagonal matrix is invertible if and only if all of its diagonal entries are positive.
False
65
The sum of a diagonal matrix and a lower triangular matrix is a lower triangular matrix.
True
66
A matrix that is both symmetric and upper triangular must be a diagonal matrix.
True
67
If 𝐴 and 𝐡 are 𝑛 Γ— 𝑛 matrices such that 𝐴 + 𝐡 is symmetric, then 𝐴 and 𝐡 are symmetric
False
68
If 𝐴 and 𝐡 are 𝑛 Γ— 𝑛 matrices such that 𝐴 + 𝐡 is upper triangular, then 𝐴 and 𝐡 are upper triangular.
False
69
If 𝐴^2 is a symmetric matrix, then 𝐴 is a symmetric matrix.
False
70
If π‘˜π΄ is a symmetric matrix for some π‘˜ β‰  0, then 𝐴 is a symmetric matrix.
True
71
The determinant of the 2 Γ— 2 matrix [π‘Ž 𝑏; 𝑐 𝑑] is π‘Žπ‘‘ + 𝑏𝑐.
False
72
Two square matrices that have the same determinant must have the same size.
False
73
The minor 𝑀_𝑖𝑗 is the same as the cofactor 𝐢_𝑖𝑗 if 𝑖 + 𝑗 is even.
True
74
If 𝐴 is a 3 Γ— 3 symmetric matrix, then 𝐢_𝑖𝑗 = 𝐢_𝑗𝑖 for all 𝑖 and 𝑗.
True
75
The number obtained by a cofactor expansion of a matrix 𝐴 is independent of the row or column chosen for the expansion
True
76
If 𝐴 is a square matrix whose minors are all zero, then 𝑑𝑒𝑑(𝐴) = 0.
True
77
The determinant of a lower triangular matrix is the sum of the entries along the main diagonal.
False
78
For every square matrix 𝐴 and every scalar 𝑐, it is true that 𝑑𝑒𝑑 (𝑐𝐴) = 𝑐 𝑑𝑒𝑑(𝐴)
False
79
For all square matrices 𝐴 and 𝐡, it is true that det(𝐴 + 𝐡) = det(𝐴) + det (𝐡)
False
80
For every 2 Γ— 2 matrix 𝐴 it is true that det(𝐴^2) = (det(𝐴))^2
True
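Since det(AB) = det(A)det(B) for square matrices, det(A^2) = (det(A))^2 follows; a quick check on one hand-multiplied example (the `det2` helper and sample matrix are ours):

```python
# Verify det(A^2) = (det A)^2 on one 2x2 example.
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[3, 1], [2, 5]]
A2 = [[3 * 3 + 1 * 2, 3 * 1 + 1 * 5],
      [2 * 3 + 5 * 2, 2 * 1 + 5 * 5]]  # A*A worked out by hand: [[11, 8], [16, 27]]

print(det2(A))   # 13
print(det2(A2))  # 169, which equals 13**2
```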
81
If 𝐴 is a 4 Γ— 4 matrix and 𝐡 is obtained from 𝐴 by interchanging the first two rows and then interchanging the last two rows, then 𝑑𝑒𝑑(𝐡) = 𝑑𝑒𝑑(𝐴).
True
82
If 𝐴 is a 3 Γ— 3 matrix and 𝐡 is obtained from 𝐴 by multiplying the first column by 4 and multiplying the third column by 3/4 , then det(𝐡) = 3 det(𝐴).
True
83
If 𝐴 is a 3 Γ— 3 matrix and 𝐡 is obtained from 𝐴 by adding 5 times the first row to each of the second and third rows, then det(𝐡) = 25 det(𝐴).
False
84
If 𝐴 is an 𝑛 Γ— 𝑛 matrix and 𝐡 is obtained from 𝐴 by multiplying each row of 𝐴 by its row number, then det(𝐡) = (𝑛(𝑛 + 1)/2) det(𝐴).
False
85
If 𝐴 is a square matrix with two identical columns, then det(𝐴) = 0
True
86
If the sum of the second and fourth row vectors of a 6 Γ— 6 matrix 𝐴 is equal to the last row vector, then 𝑑𝑒𝑑(𝐴) = 0.
True
87
If 𝐴 is a 3 Γ— 3 matrix, then det(2𝐴) = 2 det(𝐴)
False
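For a 3 Γ— 3 matrix, doubling A scales all three rows by 2, so the determinant picks up a factor 2^3 = 8, not 2; a quick numerical check (the `det3` helper and the sample matrix are ours, using cofactor expansion along the first row):

```python
# det(2A) = 2**3 * det(A) for a 3x3 matrix, since each of the 3 rows is scaled by 2.
def det3(M):
    # Cofactor expansion along the first row.
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[2, 0, 1], [1, 3, 0], [0, 1, 1]]
twoA = [[2 * x for x in row] for row in A]

print(det3(A))     # 7
print(det3(twoA))  # 56 == 8 * 7, not 2 * 7
```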
88
If 𝐴 and 𝐡 are square matrices of the same size such that det(𝐴) = det(𝐡), then det(𝐴 + 𝐡) = 2 det(𝐴)
False
89
If 𝐴 and 𝐡 are square matrices of the same size and 𝐴 is invertible, then det(𝐴^βˆ’1𝐡𝐴) = det (𝐡)
True
90
A square matrix 𝐴 is invertible if and only if det(𝐴) = 0.
False
91
If 𝐴 is a square matrix and the linear system 𝐴𝒙 = 𝟎 has multiple solutions for 𝒙, then 𝑑𝑒𝑑(𝐴) = 0.
True
92
If 𝐴 is an 𝑛 Γ— 𝑛 matrix and there exists an 𝑛 Γ— 1 matrix 𝒃 such that the linear system 𝐴𝒙 = 𝒃 has no solutions, then the reduced row echelon form of 𝐴 cannot be 𝐼_𝑛.
True
93
If 𝐸 is an elementary matrix, then 𝐸𝒙 = 𝟎 has only the trivial solution
True
94
If 𝐴 is an invertible matrix, then the linear system 𝐴𝒙 = 𝟎 has only the trivial solution if and only if the linear system 𝐴^βˆ’1 𝒙 = 𝟎 has only the trivial solution.
True
95
Two equivalent vectors must have the same initial point.
False
96
The vectors (π‘Ž, 𝑏) and (π‘Ž, 𝑏, 0) are equivalent.
False
97
If π‘˜ is a scalar and 𝒗 is a vector, then 𝒗 and π‘˜π’— are parallel if and only if π‘˜ β‰₯ 0.
False
98
The vectors 𝒗 + (𝒖 + π’˜) and (π’˜ + 𝒗) + 𝒖 are the same.
True
99
If 𝒖 + 𝒗 = 𝒖 + π’˜, then 𝒗 = π’˜.
True
100
If π‘Ž and 𝑏 are scalars such that π‘Žπ’– + 𝑏𝒗 = 0, then 𝒖 and 𝒗 are parallel vectors.
False
101
Collinear vectors with the same length are equal.
False
102
If (π‘Ž, 𝑏, 𝑐) + (π‘₯, 𝑦, 𝑧) = (π‘₯, 𝑦, 𝑧), then (π‘Ž, 𝑏, 𝑐) must be the zero vector.
True
103
If π‘˜ and π‘š are scalars and 𝒖 and 𝒗 are vectors, then (π‘˜ + π‘š)(𝒖 + 𝒗) = π‘˜π’– + π‘šπ’—
False
104
If the vectors 𝒗 and π’˜ are given, then the vector equation 3(2𝒗 βˆ’ 𝒙) = 5𝒙 βˆ’ 4π’˜ + 𝒗 can be solved for 𝒙.
True
105
The linear combinations π‘Ž1π’—πŸ + π‘Ž2π’—πŸ and 𝑏1π’—πŸ + 𝑏2π’—πŸ can only be equal if π‘Ž1 = 𝑏1 and π‘Ž2 = 𝑏2.
False
106
If each component of a vector in 𝑅^3 is doubled, the norm of that vector is doubled.
True
107
In 𝑅^2, the vectors of norm 5 whose initial points are at the origin have terminal points lying on a circle of radius 5 centred at the origin.
True
108
Every vector in 𝑅^𝑛 has a positive norm
False
109
If 𝒗 is a nonzero vector in 𝑅^𝑛, there are exactly two unit vectors that are parallel to 𝒗.
True
110
If ‖𝒖‖ = 2, ‖𝒗‖ = 1, and 𝒖 β‹… 𝒗 = 1, then the angle between 𝒖 and 𝒗 is πœ‹/3 radians.
True
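The angle comes from cos ΞΈ = (𝒖 β‹… 𝒗)/(‖𝒖‖‖𝒗‖) = 1/(2 Β· 1) = 1/2, so ΞΈ = Ο€/3; a numeric sketch (the `angle` helper and the sample vectors are ours):

```python
import math

# Angle between two vectors from cos(theta) = (u . v) / (||u|| ||v||).
def angle(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.acos(dot / (norm_u * norm_v))

u = (2.0, 0.0)               # ||u|| = 2
v = (0.5, math.sqrt(3) / 2)  # unit vector chosen so that u . v = 1
print(math.isclose(angle(u, v), math.pi / 3))  # True
```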
111
The expressions (𝒖 β‹… 𝒗) + π’˜ and 𝒖 β‹… (𝒗 + π’˜) are both meaningful and equal to each other.
False
112
If 𝒖 β‹… 𝒗 = 𝒖 β‹… π’˜, then 𝒗 = π’˜.
False
113
If 𝒖 β‹… 𝒗 = 0, then either 𝒖 = 𝟎 or 𝒗 = 𝟎.
False
114
In 𝑅^2, if 𝒖 lies in the first quadrant and 𝒗 lies in the third quadrant, then 𝒖 β‹… 𝒗 cannot be positive.
True
115
For all vectors 𝒖, 𝒗, π‘Žπ‘›π‘‘ π’˜ in 𝑅 𝑛, we have ‖𝒖 + 𝒗 + π’˜β€– ≀ ‖𝒖‖ + ‖𝒗‖ + β€–π’˜β€–
True
116
The vectors (3, βˆ’1, 2) and (0, 0, 0) are orthogonal.
True
117
If 𝒖 and 𝒗 are orthogonal vectors, then for all nonzero scalars π‘˜ and π‘š, π‘˜π’– and π‘šπ’— are orthogonal vectors.
True
118
The orthogonal projection of 𝒖 on 𝒂 is perpendicular to the vector component of 𝒖 orthogonal to 𝒂.
True
119
If 𝒂 and 𝒃 are orthogonal vectors, then for every nonzero vector 𝒖, we have π‘π‘Ÿπ‘œπ‘—_π‘Ž(π‘π‘Ÿπ‘œπ‘—_𝑏(𝒖)) = 𝟎
True
120
If 𝒂 and 𝒖 are nonzero vectors, then π‘π‘Ÿπ‘œπ‘—_π‘Ž(π‘π‘Ÿπ‘œπ‘—_π‘Ž(𝒖)) = π‘π‘Ÿπ‘œπ‘—_π‘Ž(𝒖)
True
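The projection proj_a(𝒖) already lies on the line spanned by 𝒂, and projecting a vector on that line leaves it unchanged, so the projection is idempotent; a quick check (the `proj` helper and the sample vectors are ours):

```python
import math

# Orthogonal projection of u onto the line spanned by a nonzero vector a:
# proj_a(u) = ((u . a) / (a . a)) a.
def proj(a, u):
    dot_ua = sum(x * y for x, y in zip(u, a))
    dot_aa = sum(x * x for x in a)
    return tuple((dot_ua / dot_aa) * x for x in a)

a = (1.0, 2.0, 2.0)
u = (3.0, 0.0, 1.0)
p = proj(a, u)      # equals (5/9) * a
again = proj(a, p)  # projecting a second time
print(all(math.isclose(x, y) for x, y in zip(p, again)))  # True
```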
121
If the relationship π‘π‘Ÿπ‘œπ‘—_π‘Ž 𝒖 = π‘π‘Ÿπ‘œπ‘—_π‘Ž 𝒗 holds for some nonzero vector 𝒂, then 𝒖 = 𝒗.
False
122
For all vectors 𝒖 and 𝒗, it is true that ‖𝒖 + 𝒗‖ = ‖𝒖‖ + ‖𝒗‖
False
123
The cross product of two nonzero vectors 𝒖 and 𝒗 is a nonzero vector if and only if 𝒖 and 𝒗 are not parallel.
True
124
A normal vector to a plane can be obtained by taking the cross product of two nonzero and noncollinear vectors lying in the plane.
True
125
The scalar triple product of 𝒖, 𝒗 and π’˜ determines a vector whose length is equal to the volume of the parallelepiped determined by 𝒖, 𝒗 and π’˜.
False
126
If 𝒖 and 𝒗 are vectors in 3‐space, then ‖𝒗 Γ— 𝒖‖ is equal to the area of the parallelogram determined by 𝒖 and 𝒗.
True
127
For all vectors 𝒖, 𝒗 and π’˜ in 3-space, the vectors (𝒖 Γ— 𝒗) Γ— π’˜ and 𝒖 Γ— (𝒗 Γ— π’˜) are the same.
False
128
If 𝒖, 𝒗 and π’˜ are vectors in 𝑅 3, where 𝒖 is nonzero and 𝒖 Γ— 𝒗 = 𝒖 Γ— π’˜, then 𝒗 = π’˜
False
129
A vector is any element of a vector space.
True
130
A vector space must contain at least two vectors.
False
131
The set of positive real numbers is a vector space if vector addition and scalar multiplication are the usual operations of addition and multiplication of real numbers.
False
132
If 𝒖 is a vector and π‘˜ is a scalar such that π‘˜π’– = 𝟎, then it must be true that π‘˜ = 0.
False
133
In every vector space the vectors (βˆ’1)𝒖 and βˆ’π’– are the same.
True
134
In the vector space 𝐹(βˆ’βˆž, ∞), any function whose graph passes through the origin is a zero vector.
False
135
An expression of the form π‘˜_1 𝒗_𝟏 + π‘˜_2 𝒗_𝟐 + β‹― + π‘˜_π‘Ÿ 𝒗_𝒓 is called a linear combination.
True
136
The span of a single vector in 𝑅^2 is a line.
False
137
The span of two vectors in 𝑅^3 is a plane.
False
138
The span of any finite set of vectors in a vector space is closed under addition and scalar multiplication.
True
139
A set containing a single vector is linearly independent.
False
140
No linearly independent set contains the zero vector.
True
141
Every linearly dependent set contains the zero vector.
False
142
If the set of vectors {𝒗_𝟏, 𝒗_𝟐, 𝒗_πŸ‘} is linearly independent, then {π‘˜π’—_𝟏, π‘˜π’—_𝟐, π‘˜π’—_πŸ‘} is also linearly independent for every nonzero scalar π‘˜.
True
143
If 𝒗_𝟏, …, 𝒗_𝒏 are linearly dependent nonzero vectors, then at least one vector 𝒗_π’Œ is a unique linear combination of 𝒗_𝟏, … , 𝒗_π’Œβˆ’πŸ.
True
144
The set of 2 Γ— 2 matrices that contain exactly two 1's and two 0's is a linearly independent set in 𝑀_22.
False
145
The three polynomials (π‘₯ βˆ’ 1)(π‘₯ + 2), π‘₯(π‘₯ + 2), and π‘₯(π‘₯ βˆ’ 1) are linearly independent.
True
146
The functions 𝑓_1 and 𝑓_2 are linearly dependent if there is a real number π‘₯ such that π‘˜_1 𝑓_1(π‘₯) + π‘˜_2 𝑓_2(π‘₯) = 0 for some scalars π‘˜_1 and π‘˜_2.
False
147
If 𝑉 = span{𝒗_𝟏, …, 𝒗_𝒏}, then {𝒗_𝟏, …, 𝒗_𝒏} is a basis for 𝑉.
False
148
Every linearly independent subset of a vector space 𝑉 is a basis for 𝑉.
False
149
If {𝒗_𝟏, …, 𝒗_𝒏} is a basis for a vector space 𝑉, then every vector in 𝑉 can be expressed as a linear combination of 𝒗_𝟏, …, 𝒗_𝒏.
True
150
The coordinate vector of a vector 𝒙 in 𝑅^𝑛 relative to the standard basis for 𝑅^𝑛 is 𝒙.
True
151
Every basis of 𝑃_4 contains at least one polynomial of degree 3 or less.
False
152
The zero vector space has dimension zero
True
153
There is a set of 17 linearly independent vectors in 𝑅^17.
True
154
There is a set of 11 vectors that span 𝑅^17.
False
155
Every linearly independent set of five vectors in 𝑅^5 is a basis for 𝑅^5
True
156
Every set of five vectors that spans 𝑅^5 is a basis for 𝑅^5
True
157
Every set of vectors that spans 𝑅^𝑛 contains a basis for 𝑅^𝑛
True
158
Every linearly independent set of vectors in 𝑅^𝑛 is contained in some basis for 𝑅^𝑛.
True
159
If 𝐴 has size 𝑛 Γ— 𝑛 and 𝐼_𝑛, 𝐴, 𝐴^2, …, 𝐴^(𝑛^2) are distinct matrices, then {𝐼_𝑛, 𝐴, 𝐴^2, …, 𝐴^(𝑛^2)} is a linearly dependent set.
True
160
The span of v_1, …, v_n is the column space of the matrix whose column vectors are v_1, …, v_n.
True
161
The column space of a matrix A is the set of solutions of Ax = b.
False
162
If R is the reduced row echelon form of A, then those column vectors of R that contain the leading 1's form a basis for the column space of A.
False
163
The set of nonzero row vectors of a matrix A is a basis for the row space of A
False
164
If A and B are n Γ— n matrices that have the same row space, then A and B have the same column space
False
165
If E is an m Γ— m elementary matrix and A is an m Γ— n matrix, then the null space of EA is the same as the null space of A.
True
166
If E is an m Γ— m elementary matrix and A is an m Γ— n matrix, then the row space of EA is the same as the row space of A.
True
167
If E is an m Γ— m elementary matrix and A is an m Γ— n matrix, then the column space of EA is the same as the column space of A.
False
168
The system Ax = b is inconsistent if and only if b is not in the column space of A.
True
169
There is an invertible matrix A and a singular matrix B such that the row spaces of A and B are the same.
False
170
Either the row vectors or the column vectors of a square matrix are linearly independent
False
171
A matrix with linearly independent row vectors and linearly independent column vectors is square.
True
172
The nullity of a nonzero m Γ— n matrix is at most m
False
173
Adding one additional column to a matrix increases its rank by one
False
174
The nullity of a square matrix with linearly dependent rows is at least one
True
175
If A is square and Ax = b is inconsistent for some vector b, then the nullity of A is zero.
False
176
If a matrix A has more rows than columns, then the dimension of the row space is greater than the dimension of the column space.
False
177
If rank (A^T) = rank(A), then A is square.
False
178
There is no 3 Γ— 3 matrix whose row space and null space are both lines in 3-space.
True
179
f 𝐴 is an π‘š Γ— 𝑛 matrix, then the codomain of the transformation 𝑇_𝐴 is 𝑅^𝑛
False
180
If 𝐴 is a 2 Γ— 3 matrix, then the domain of the transformation 𝑇_𝐴 is 𝑅^2.
False
181
If 𝐴 is a square matrix and 𝐴𝒙 = πœ†π’™ for some nonzero scalar πœ†, then 𝒙 is an eigenvector of 𝐴.
False
182
If πœ† is an eigenvalue of a matrix 𝐴, then the linear system (πœ†πΌ βˆ’ 𝐴)𝒙 = 𝟎 has only the trivial solution.
False
183
If the characteristic polynomial of a matrix 𝐴 is 𝑝(πœ†) = πœ†^2 + 1 then 𝐴 is invertible.
True
184
If πœ† is an eigenvalue of a matrix 𝐴, then the eigenspace of 𝐴 corresponding to πœ† is the set of eigenvectors of 𝐴 corresponding to πœ†.
False
185
The eigenvalues of a matrix 𝐴 are the same as the eigenvalues of the reduced row echelon form of 𝐴.
False
186
If 0 is an eigenvalue of a matrix 𝐴, then the set of columns of 𝐴 is linearly independent.
False
187
An 𝑛 Γ— 𝑛 matrix with fewer than 𝑛 distinct eigenvalues is not diagonalizable.
False
188
An 𝑛 Γ— 𝑛 matrix with fewer than 𝑛 linearly independent eigenvectors is not diagonalizable.
True
189
If 𝐴 is diagonalizable, then there is a unique matrix 𝑃 such that 𝑃^βˆ’1𝐴𝑃 is diagonal.
False
190
If every eigenvalue of a matrix 𝐴 has algebraic multiplicity 1, then 𝐴 is diagonalizable.
True
191
The dot product on 𝑅^2 is an example of a weighted inner product.
True
192
The inner product of two vectors cannot be a negative real number.
False
193
βŒ©π’–, 𝒗 + π’˜βŒͺ = βŒ©π’—, 𝒖βŒͺ + βŒ©π’˜, 𝒖βŒͺ.
True
194
βŒ©π‘˜π’–, π‘˜π’—βŒͺ = π‘˜2βŒ©π’–, 𝒗βŒͺ.
True
195
If βŒ©π’–, 𝒗βŒͺ = 0, then 𝒖 = 𝟎 or 𝒗 = 𝟎.
False
196
If ‖𝒗‖^2 = 0, then 𝒗 = 𝟎.
True
197
If 𝐴 is an 𝑛 Γ— 𝑛 matrix, then βŒ©π’–, 𝒗βŒͺ = 𝐴𝒖 β‹… 𝐴𝒗 defines an inner product on 𝑅^𝑛.
False
198
If 𝒖 is orthogonal to every vector of a subspace π‘Š, then 𝒖 = 𝟎.
False
199
If 𝒖 is a vector in both π‘Š and π‘ŠβŠ₯, then 𝒖 = 𝟎.
True
200
If 𝒖 and 𝒗 are vectors in π‘ŠβŠ₯, then 𝒖 + 𝒗 is in π‘ŠβŠ₯.
True
201
If 𝒖 is a vector in π‘ŠβŠ₯ and π‘˜ is a real number, then π‘˜π’– is in π‘ŠβŠ₯.
True
202
If 𝒖 and 𝒗 are orthogonal, then |βŒ©π’–, 𝒗βŒͺ| = ‖𝒖‖‖𝒗‖.
False
203
If 𝒖 and 𝒗 are orthogonal, then ‖𝒖 + 𝒗‖ = ‖𝒖‖ + ‖𝒗‖.
False
204
Every linearly independent set of vectors in an inner product space is orthogonal.
False
205
Every orthogonal set of vectors in an inner product space is linearly independent.
False
206
Every nontrivial subspace of 𝑅^3 has an orthonormal basis with respect to the Euclidean inner product.
True
207
Every nonzero finite‐dimensional inner product space has an orthonormal basis.
True
208
π‘π‘Ÿπ‘œπ‘—_π‘Šπ’™ is orthogonal to every vector of π‘Š.
False