Homework True or False Flashcards
A linear system whose equations are all homogeneous must be consistent.
True
Multiplying a row of an augmented matrix through by zero is an acceptable elementary
row operation.
False
The linear system
x − y = 3
2x − 2y = k
cannot have a unique solution, regardless of the value of k.
True
A single linear equation with two or more unknowns must have infinitely many solutions.
True
If the number of equations in a linear system exceeds the number of unknowns, then the
system must be inconsistent.
False
If each equation in a consistent linear system is multiplied through by a constant c, then
all solutions to the new system can be obtained by multiplying solutions from the original
system by c.
False
Elementary row operations permit one row of an augmented matrix to be subtracted from
another.
True
The linear system with corresponding augmented matrix
[ 2 −1  4 ]
[ 0  0 −1 ]
is consistent.
False
If a matrix is in reduced row echelon form, then it is also in row echelon form.
True
If an elementary row operation is applied to a matrix that is in row echelon form, the
resulting matrix will still be in row echelon form.
False
Every matrix has a unique row echelon form.
False
A homogeneous linear system in n unknowns whose corresponding augmented matrix
has a reduced row echelon form with r leading 1's has n − r free variables.
True
All leading 1's in a matrix in row echelon form must occur in different columns.
True
If every column of a matrix in row echelon form has a leading 1, then all entries that are
not leading 1's are zero.
False
If a homogeneous linear system of n equations in n unknowns has a corresponding
augmented matrix with a reduced row echelon form containing n leading 1's, then the
linear system has only the trivial solution.
True
If the reduced row echelon form of the augmented matrix for a linear system has a row of
zeros, then the system must have infinitely many solutions.
False
If a linear system has more unknowns than equations, then it must have infinitely many
solutions.
False
The matrix
[ 1 2 3 ]
[ 4 5 6 ]
has no main diagonal.
True
For every matrix A, it is true that (A^T)^T = A.
True
For every square matrix A, it is true that tr(A^T) = tr(A).
True
If A is an n × n matrix and k is a scalar, then tr(kA) = k tr(A).
True
If A, B, and C are matrices of the same size such that A − C = B − C, then A = B.
True
An m × n matrix has m column vectors and n row vectors.
False
If A and B are 2 × 2 matrices, then AB = BA.
False
The ith row vector of a matrix product AB can be computed by multiplying A by the
ith row vector of B.
False
If A and B are square matrices of the same order, then
tr(AB) = tr(A) tr(B)
False
If A and B are square matrices of the same order, then
(AB)^T = A^T B^T
False
If A is a 6 × 4 matrix and B is an m × n matrix such that B^T A^T is a 2 × 6 matrix,
then m = 4 and n = 2.
True
If A, B, and C are square matrices of the same order such that AC = BC, then A = B.
False
If AB + BA is defined, then A and B are square matrices of the same size.
True
If B has a column of zeros, then so does AB if this product is defined.
True
If B has a column of zeros, then so does BA if this product is defined.
False
Two n × n matrices, A and B, are inverses of one another if and only if
AB = BA = 0.
False
For all square matrices A and B of the same size, it is true that (A + B)^2 = A^2 + 2AB + B^2
False
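Because AB and BA can differ, (A + B)^2 expands to A^2 + AB + BA + B^2, which collapses to A^2 + 2AB + B^2 only when AB = BA. A quick check with a noncommuting pair (hand-rolled helpers; nothing assumed beyond 2 x 2 arithmetic):

```python
def matmul2(A, B):
    # 2 x 2 matrix product over nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add2(*Ms):
    # Entrywise sum of any number of 2 x 2 matrices
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]

S = add2(A, B)
lhs = matmul2(S, S)                               # (A + B)^2
AB = matmul2(A, B)
rhs = add2(matmul2(A, A), AB, AB, matmul2(B, B))  # A^2 + 2AB + B^2
print(lhs)  # [[5, 4], [4, 5]]
print(rhs)  # [[6, 4], [4, 4]]
```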
For all square matrices A and B of the same size, it is true that
A^2 − B^2 = (A − B)(A + B)
False
If A and B are invertible matrices of the same size, then AB is invertible and (AB)^−1 = A^−1 B^−1
False
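The product of invertible matrices is invertible, but the inverse reverses the order: (AB)^−1 = B^−1 A^−1. A sketch using the 2 x 2 adjugate formula (the matrices are arbitrary invertible choices):

```python
def matmul2(A, B):
    # 2 x 2 matrix product over nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    # Inverse of a 2 x 2 matrix via the adjugate formula (assumes det != 0).
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
AB = matmul2(A, B)

reversed_order = matmul2(inv2(B), inv2(A))  # B^-1 A^-1
same_order = matmul2(inv2(A), inv2(B))      # A^-1 B^-1
print(matmul2(AB, reversed_order))  # [[1.0, 0.0], [0.0, 1.0]], the identity
print(matmul2(AB, same_order))      # [[3.0, -1.0], [1.0, 0.0]], not the identity
```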
If A and B are matrices such that AB is defined, then it is true that
(AB)^T = A^T B^T
False
The matrix
[ a b ]
[ c d ]
is invertible if and only if
ad − bc ≠ 0.
True
If A and B are matrices of the same size and k is a constant, then
(kA + B)^T = kA^T + B^T
True
If A is an invertible matrix, then so is A^T.
True
If
p(x) = a_0 + a_1 x + a_2 x^2 + ⋯ + a_n x^n
and I is an identity matrix, then
p(I) = a_0 + a_1 + a_2 + ⋯ + a_n.
False
A square matrix containing a row or column of zeros cannot be invertible.
True
The sum of two invertible matrices of the same size must be invertible.
False
The product of two elementary matrices of the same size must be an elementary matrix.
False
Every elementary matrix is invertible.
True
If A and B are row equivalent, and if B and C are row equivalent, then A and C are row
equivalent.
True
If A is an n × n matrix that is not invertible, then the linear system Ax = 0 has infinitely
many solutions.
True
If A is an n × n matrix that is not invertible, then the matrix obtained by interchanging
two rows of A cannot be invertible.
True
If A is invertible and a multiple of the first row of A is added to the second row, then the
resulting matrix is invertible.
True
An expression of an invertible matrix A as a product of elementary matrices is unique.
False
It is impossible for a system of linear equations to have exactly two solutions.
True
If A is a square matrix, and if the linear system Ax = b has a unique solution, then the
linear system Ax = c also must have a unique solution.
True
If A and B are n × n matrices such that AB = I_n, then BA = I_n.
True
If A and B are row equivalent matrices, then the linear systems Ax = 0 and Bx = 0
have the same solution set.
True
Let A be an n × n matrix and S an n × n invertible matrix. If x is a solution to the
system (S^−1 A S)x = b, then Sx is a solution to the system Ay = Sb.
True
Let A be an n × n matrix. The linear system Ax = 4x has a unique solution if and only
if A − 4I is an invertible matrix.
True
Let A and B be n × n matrices. If A or B (or both) are not invertible, then neither is AB.
True
The transpose of an upper triangular matrix is an upper triangular matrix.
False
The sum of an upper triangular matrix and a lower triangular matrix is a diagonal matrix.
False
All entries of a symmetric matrix are determined by the entries occurring on and above
the main diagonal.
True
All entries of an upper triangular matrix are determined by the entries occurring on and
above the main diagonal.
True
The inverse of an invertible lower triangular matrix is an upper triangular matrix.
False
A diagonal matrix is invertible if and only if all of its diagonal entries are positive.
False
The sum of a diagonal matrix and a lower triangular matrix is a lower triangular matrix.
True
A matrix that is both symmetric and upper triangular must be a diagonal matrix.
True
If A and B are n × n matrices such that A + B is symmetric, then A and B are
symmetric.
False
If A and B are n × n matrices such that A + B is upper triangular, then A and B are
upper triangular.
False
If A^2 is a symmetric matrix, then A is a symmetric matrix.
False
If kA is a symmetric matrix for some k ≠ 0, then A is a symmetric matrix.
True
The determinant of the 2 × 2 matrix
[ a b ]
[ c d ]
is ad + bc.
False
Two square matrices that have the same determinant must have the same size.
False
The minor M_ij is the same as the cofactor C_ij if i + j is even.
True
If A is a 3 × 3 symmetric matrix, then C_ij = C_ji for all i and j.
True
The number obtained by a cofactor expansion of a matrix A is independent of the row or
column chosen for the expansion.
True
If A is a square matrix whose minors are all zero, then det(A) = 0.
True
The determinant of a lower triangular matrix is the sum of the entries along the main
diagonal.
False
For every square matrix A and every scalar c, it is true that det(cA) = c det(A).
False
For all square matrices A and B, it is true that
det(A + B) = det(A) + det(B)
False
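The determinant is not additive. The simplest counterexample takes A = B = I in the 2 x 2 case: det(I + I) = det(2I) = 4, while det(I) + det(I) = 2. Sketch:

```python
def det2(M):
    # Determinant of a 2 x 2 matrix: ad - bc
    (a, b), (c, d) = M
    return a * d - b * c

I2 = [[1, 0], [0, 1]]
S = [[2, 0], [0, 2]]  # I + I
print(det2(S))              # 4
print(det2(I2) + det2(I2))  # 2
```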
For every 2 × 2 matrix A it is true that
det(A^2) = (det(A))^2
True
If A is a 4 × 4 matrix and B is obtained from A by interchanging the first two rows and
then interchanging the last two rows, then det(B) = det(A).
True
If A is a 3 × 3 matrix and B is obtained from A by multiplying the first column by 4 and
multiplying the third column by 3/4, then det(B) = 3 det(A).
True
If A is a 3 × 3 matrix and B is obtained from A by adding 5 times the first row to each
of the second and third rows, then det(B) = 25 det(A).
False
If A is an n × n matrix and B is obtained from A by multiplying each row of A by its
row number, then
det(B) = [n(n + 1)/2] det(A)
False
If A is a square matrix with two identical columns, then det(A) = 0.
True
If the sum of the second and fourth row vectors of a 6 × 6 matrix A is equal to the last
row vector, then det(A) = 0.
True
If A is a 3 × 3 matrix, then det(2A) = 2 det(A).
False
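For an n × n matrix, det(kA) = k^n det(A), so doubling a 3 × 3 matrix multiplies its determinant by 2^3 = 8, not by 2. A quick check by cofactor expansion (the example matrix is an arbitrary choice):

```python
def det3(M):
    # 3 x 3 determinant by cofactor expansion along the first row.
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[1, 2, 0],
     [0, 1, 3],
     [4, 0, 1]]
doubled = [[2 * x for x in row] for row in A]
print(det3(A))        # 25
print(det3(doubled))  # 200, which is 2**3 * 25, not 2 * 25
```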
If A and B are square matrices of the same size such that det(A) = det(B),
then det(A + B) = 2 det(A)
False
If A and B are square matrices of the same size and A is invertible, then
det(A^−1 B A) = det(B)
True
A square matrix A is invertible if and only if det(A) = 0.
False
If A is a square matrix and the linear system Ax = 0 has multiple solutions for x, then
det(A) = 0.
True
If A is an n × n matrix and there exists an n × 1 matrix b such that the linear system
Ax = b has no solutions, then the reduced row echelon form of A cannot be I_n.
True
If E is an elementary matrix, then Ex = 0 has only the trivial solution.
True
If A is an invertible matrix, then the linear system Ax = 0 has only the trivial solution if and
only if the linear system A^−1 x = 0 has only the trivial solution.
True
Two equivalent vectors must have the same initial point.
False
The vectors (a, b) and (a, b, 0) are equivalent.
False
If k is a scalar and v is a vector, then v and kv are parallel if and only if k ≥ 0.
False
The vectors u + (v + w) and (u + v) + w are the same.
True
If u + v = u + w, then v = w.
True
If k and m are scalars such that ku + mv = 0, then u and v are parallel vectors.
False
Collinear vectors with the same length are equal.
False
If (a, b, c) + (x, y, z) = (x, y, z), then (a, b, c) must be the zero vector.
True
If k and m are scalars and u and v are vectors, then
(k + m)(u + v) = ku + mv
False
If the vectors v and w are given, then the vector equation
3(2v − x) = 5x − 4w + v
can be solved for x.
True
The linear combinations a_1 v_1 + a_2 v_2 and b_1 v_1 + b_2 v_2 can only be equal if a_1 = b_1
and a_2 = b_2.
False
If each component of a vector in R^3 is doubled, the norm of that vector is doubled.
True
In R^2, the vectors of norm 5 whose initial points are at the origin have terminal points
lying on a circle of radius 5 centred at the origin.
True
Every vector in R^n has a positive norm.
False
If v is a nonzero vector in R^n, there are exactly two unit vectors that are parallel to v.
True
If ‖u‖ = 2, ‖v‖ = 1, and u · v = 1, then the angle between u and v is π/3 radians.
True
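This follows from cos θ = (u · v)/(‖u‖‖v‖) = 1/(2 · 1) = 1/2, so θ = arccos(1/2) = π/3. Numerically:

```python
import math

# cos(theta) = (u . v) / (||u|| ||v||) = 1 / (2 * 1) = 1/2
theta = math.acos(1 / (2 * 1))
print(math.isclose(theta, math.pi / 3))  # True
```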
The expressions (u · v) + w and u · (v + w) are both meaningful and equal to each
other.
False
If u · v = u · w, then v = w.
False
If u · v = 0, then either u = 0 or v = 0.
False
In R^2, if u lies in the first quadrant and v lies in the third quadrant, then u · v cannot be
positive.
True
For all vectors u, v, and w in R^n, we have
‖u + v + w‖ ≤ ‖u‖ + ‖v‖ + ‖w‖
True
The vectors (3, −1, 2) and (0, 0, 0) are orthogonal.
True
If u and v are orthogonal vectors, then for all nonzero scalars k and m, ku and mv are
orthogonal vectors.
True
The orthogonal projection of u on a is perpendicular to the vector component of u
orthogonal to a.
True
If a and b are orthogonal vectors, then for every nonzero vector u, we have
proj_a(proj_b(u)) = 0
True
If a and u are nonzero vectors, then
proj_a(proj_a(u)) = proj_a(u)
True
If the relationship
proj_a u = proj_a v
holds for some nonzero vector a, then u = v.
False
For all vectors u and v, it is true that
‖u + v‖ = ‖u‖ + ‖v‖
False
The cross product of two nonzero vectors u and v is a nonzero vector if and only if u and
v are not parallel.
True
A normal vector to a plane can be obtained by taking the cross product of two nonzero
and noncollinear vectors lying in the plane.
True
The scalar triple product of u, v, and w determines a vector whose length is equal to the
volume of the parallelepiped determined by u, v, and w.
False
If u and v are vectors in 3-space, then ‖u × v‖ is equal to the area of the parallelogram
determined by u and v.
True
For all vectors u, v, and w in 3-space, the vectors (u × v) × w and u × (v × w) are the
same.
False
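A concrete triple shows the failure of associativity (the three vectors are arbitrary choices):

```python
def cross(u, v):
    # Cross product of two 3-vectors.
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

u, v, w = [1, 0, 0], [1, 1, 0], [0, 0, 1]
print(cross(cross(u, v), w))  # [0, 0, 0]
print(cross(u, cross(v, w)))  # [0, 0, -1], so the two groupings differ
```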
If u, v, and w are vectors in R^3, where u is nonzero and u × v = u × w, then v = w.
False
A vector is any element of a vector space.
True
A vector space must contain at least two vectors.
False
The set of positive real numbers is a vector space if vector addition and scalar
multiplication are the usual operations of addition and multiplication of real numbers.
False
If u is a vector and k is a scalar such that ku = 0, then it must be true that k = 0.
False
In every vector space the vectors (−1)v and −v are the same.
True
In the vector space F(−∞, ∞) any function whose graph passes through the origin is a
zero vector.
False
An expression of the form k_1 v_1 + k_2 v_2 + ⋯ + k_r v_r
is called a linear combination.
True
The span of a single vector in R^2 is a line.
True
The span of two vectors in R^3 is a plane.
False
The span of any finite set of vectors in a vector space is closed under addition and scalar
multiplication.
True
A set containing a single vector is linearly independent.
False
No linearly independent set contains the zero vector.
True
Every linearly dependent set contains the zero vector.
False
If the set of vectors {v_1, v_2, v_3} is linearly independent, then {k v_1, k v_2, k v_3} is also
linearly independent for every nonzero scalar k.
True
If v_1, …, v_n are linearly dependent nonzero vectors, then at least one vector v_k is a
unique linear combination of v_1, …, v_(k−1).
True
The set of 2 × 2 matrices that contain exactly two 1's and two 0's is a linearly
independent set in M_22.
False
The three polynomials (x − 1)(x + 2), x(x + 2), and x(x − 1) are linearly
independent.
True
The functions f_1 and f_2 are linearly dependent if there is a real number x such that
k_1 f_1(x) + k_2 f_2(x) = 0 for some scalars k_1 and k_2.
False
If V = span{v_1, …, v_n}, then {v_1, …, v_n} is a basis for V.
False
Every linearly independent subset of a vector space V is a basis for V.
False
If {v_1, …, v_n} is a basis for a vector space V, then every vector in V can be expressed as
a linear combination of v_1, …, v_n.
True
The coordinate vector of a vector x in R^n relative to the standard basis for R^n is x.
True
Every basis of P_4 contains at least one polynomial of degree 3 or less.
False
The zero vector space has dimension zero.
True
There is a set of 17 linearly independent vectors in R^17.
True
There is a set of 11 vectors that span R^17.
False
Every linearly independent set of five vectors in R^5 is a basis for R^5.
True
Every set of five vectors that spans R^5 is a basis for R^5.
True
Every set of vectors that spans R^n contains a basis for R^n.
True
Every linearly independent set of vectors in R^n is contained in some basis for R^n.
True
If A has size n × n and
I_n, A, A^2, …, A^(n^2)
are distinct matrices, then
{I_n, A, A^2, …, A^(n^2)} is
a linearly dependent set.
True
The span of v_1, …, v_n is the column space of the matrix whose column vectors
are v_1, …, v_n.
True
The column space of a matrix A is the set of solutions of Ax = b.
False
If R is the reduced row echelon form of A, then those column vectors of R that contain
the leading 1's form a basis for the column space of A.
False
The set of nonzero row vectors of a matrix A is a basis for the row space of A.
False
If A and B are n × n matrices that have the same row space, then A and B have the same
column space.
False
If E is an m × m elementary matrix and A is an m × n matrix, then the null space of EA is
the same as the null space of A.
True
If E is an m × m elementary matrix and A is an m × n matrix, then the row space of EA is
the same as the row space of A.
True
If E is an m × m elementary matrix and A is an m × n matrix, then the column space
of EA is the same as the column space of A.
False
The system Ax = b is inconsistent if and only if b is not in the column space of A.
True
There is an invertible matrix A and a singular matrix B such that the row spaces
of A and B are the same.
False
Either the row vectors or the column vectors of a square matrix are linearly
independent.
False
A matrix with linearly independent row vectors and linearly independent column vectors
is square.
True
The nullity of a nonzero m × n matrix is at most m.
False
Adding one additional column to a matrix increases its rank by one.
False
The nullity of a square matrix with linearly dependent rows is at least one.
True
If A is square and Ax = b is inconsistent for some vector b, then the nullity of A is zero.
False
If a matrix A has more rows than columns, then the dimension of the row space is
greater than the dimension of the column space.
False
If rank(A^T) = rank(A), then A is square.
False
There is no 3 × 3 matrix whose row space and null space are both lines in 3-space.
True
If A is an m × n matrix, then the codomain of the transformation T_A is R^n.
False
If A is a 2 × 3 matrix, then the domain of the transformation T_A is R^2.
False
If A is a square matrix and Ax = λx for some nonzero scalar λ, then x is an eigenvector
of A.
False
If λ is an eigenvalue of a matrix A, then the linear system (λI − A)x = 0 has only the
trivial solution.
False
If the characteristic polynomial of a matrix A is
p(λ) = λ^2 + 1
then A is invertible.
True
If λ is an eigenvalue of a matrix A, then the eigenspace of A corresponding to λ is the set
of eigenvectors of A corresponding to λ.
False
The eigenvalues of a matrix A are the same as the eigenvalues of the reduced row echelon
form of A.
False
If 0 is an eigenvalue of a matrix A, then the set of columns of A is linearly independent.
False
An n × n matrix with fewer than n distinct eigenvalues is not diagonalizable.
False
An n × n matrix with fewer than n linearly independent eigenvectors is not
diagonalizable.
True
If A is diagonalizable, then there is a unique matrix P such that P^−1 A P is diagonal.
False
If every eigenvalue of a matrix A has algebraic multiplicity 1, then A is diagonalizable.
True
The dot product on R^2 is an example of a weighted inner product.
True
The inner product of two vectors cannot be a negative real number.
False
⟨u, v + w⟩ = ⟨u, v⟩ + ⟨u, w⟩.
True
⟨ku, kv⟩ = k^2 ⟨u, v⟩.
True
If ⟨u, v⟩ = 0, then u = 0 or v = 0.
False
If ‖v‖^2 = 0, then v = 0.
True
If A is an n × n matrix, then ⟨u, v⟩ = Au · Av defines an inner product on R^n.
False
If u is orthogonal to every vector of a subspace W, then u = 0.
False
If v is a vector in both W and W⊥, then v = 0.
True
If u and v are vectors in W⊥, then u + v is in W⊥.
True
If u is a vector in W⊥ and k is a real number, then ku is in W⊥.
True
If u and v are orthogonal, then |⟨u, v⟩| = ‖u‖‖v‖.
False
If u and v are orthogonal, then ‖u + v‖ = ‖u‖ + ‖v‖.
False
Every linearly independent set of vectors in an inner product space is orthogonal.
False
Every orthogonal set of vectors in an inner product space is linearly independent.
False
Every nontrivial subspace of R^3 has an orthonormal basis with respect to the Euclidean
inner product.
True
Every nonzero finite-dimensional inner product space has an orthonormal basis.
True
proj_W u is orthogonal to every vector of W.
False