Practice Problems Flashcards
In some cases, a matrix may be reduced to more than one matrix in reduced echelon form, using different sequences of row operations
False
The row reduction algorithm applies only to augmented matrices for a linear system
False
A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix
True
Finding a parametric description of the solution set of a linear system is the same as solving the system
True
If one row in an echelon form of an augmented matrix is [ 0 0 0 5 0 ], the associated linear system is inconsistent
False
The Echelon form of a matrix is unique
False
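The two points above (echelon forms are not unique, but the reduced echelon form is) can be checked with a small sketch in plain Python using exact Fraction arithmetic. The helper name rref and the example matrix are my own, not from the text:

```python
from fractions import Fraction

def rref(rows):
    """Row-reduce a matrix (list of row lists) to reduced echelon form."""
    M = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(M), len(M[0])
    r = 0
    for c in range(ncols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, nrows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        M[r] = [x / M[r][c] for x in M[r]]          # scale pivot row
        for i in range(nrows):                       # clear column c elsewhere
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == nrows:
            break
    return M

A = [[1, 2, 3], [2, 5, 7], [1, 1, 2]]
A_swapped = [A[1], A[0], A[2]]  # start from a different row order

# different sequences of row operations, same reduced echelon form
print(rref(A) == rref(A_swapped))  # True
```

Intermediate echelon forms along the two paths differ, but both runs end at the same unique reduced echelon form.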
The pivot positions in a matrix depend on whether row interchanges are used in the row reduction process
False
Reducing a matrix to echelon form is called the forward phase of the row reduction process
True
Whenever a system has free variables, the solution set contains many solutions
False. The existence of at least one solution is not related to the presence or absence of free variables. If the system is inconsistent, the solution set is empty.
A general solution of a system is an explicit description of all solutions of the system
True
Suppose a 3 x 5 coefficient matrix for a system has three pivot columns. Is the system consistent? Why or why not?
Yes, the system is consistent. With three pivots, there must be a pivot in the third (bottom) row of the coefficient matrix, so the reduced echelon form of the augmented matrix cannot contain a row of the form [ 0 0 0 0 0 1 ]
Suppose a system of linear equations has a 3 x 5 augmented matrix whose fifth column is a pivot column. Is the system consistent? Why or why not?
The system is inconsistent, because a pivot in the fifth column means there is a row of the form [ 0 0 0 0 1 ] in the reduced echelon form. Since the matrix is the augmented matrix for the system, this row represents the impossible equation 0 = 1.
Suppose the coefficient matrix of a system of linear equations has a pivot position in every row. Explain why the system is consistent.
If the coefficient matrix has a pivot position in every row, then there is a pivot position in the bottom row, and there is no room for a pivot in the augmented column. So the system is consistent by Theorem 2
Suppose the coefficient matrix of a linear system of three equations in three variables has a pivot in each column explain why the system has a unique solution
Since there are three pivots, one in each row, the augmented matrix must reduce to the form [1 0 0 a, 0 1 0 b, 0 0 1 c]. No matter what the values of a, b, and c, the solution exists and is unique.
Restate the last sentence in Theorem 2 using the concept of pivot columns: "If a linear system is consistent, then the solution is unique if and only if _____"
Every column in the coefficient matrix is a pivot column; otherwise there are infinitely many solutions
What would you have to know about the pivot columns in an augmented matrix in order to know that the linear system is consistent and has a unique solution?
Every column in the augmented matrix except the right most column is a pivot column, and the right most column is not a pivot column.
A system of linear equations with fewer equations than unknowns is sometimes called an underdetermined system. Suppose that such a system happens to be consistent. Explain why there must be an infinite number of solutions.
An underdetermined system always has more variables than equations. There cannot be more basic variables than there are equations, so there must be at least one free variable. Such a variable may be assigned infinitely many different values. If the system is consistent, each different value of a free variable will produce a different solution.
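A quick sketch of this card in plain Python (the system and helper names are my own). The underdetermined system x1 + x2 + x3 = 6, x2 + x3 = 4 has basic variables x1, x2 and one free variable x3; each value of x3 gives a distinct solution:

```python
def solve(t):
    """General solution of x1 + x2 + x3 = 6 and x2 + x3 = 4, with x3 = t free."""
    return (2, 4 - t, t)

def satisfies(x1, x2, x3):
    """Check both equations of the system."""
    return x1 + x2 + x3 == 6 and x2 + x3 == 4

# five different values of the free variable give five distinct solutions
sols = [solve(t) for t in range(5)]
print(all(satisfies(*s) for s in sols), len(set(sols)))  # True 5
```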
A system of linear equations with more equations than unknowns is sometimes called an overdetermined system. Can such a system be consistent?
Yes, a system of linear equations with more equations than unknowns can be consistent
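A minimal illustration in plain Python (the specific equations are my own example, not from the text): three equations in two unknowns that all pass through the same point, so the overdetermined system is consistent:

```python
# an overdetermined but consistent system: 3 equations, 2 unknowns
equations = [
    (1, 1, 2),   # x + y = 2
    (1, -1, 0),  # x - y = 0
    (2, 3, 5),   # 2x + 3y = 5
]
x, y = 1, 1  # all three lines pass through (1, 1)
print(all(a * x + b * y == c for a, b, c in equations))  # True
```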
Another notation for the column vector [-4; 3] is [-4 3]
False, the alternative notation for a column vector is (-4, 3) using parentheses and commas
The points in the plane corresponding to the column vectors (-2, 5) and (-5, 2) lie on a line through the origin
False. Plot the points to verify this.
An example of a linear combination of vectors V1 and V2 is the vector 1/2 V1
True
The solution set of the linear system whose augmented matrix is [a1 a2 a3 b] is the same as the solution set of the equation x1a1 + x2a2 + x3a3 = b
True
The set Span{u, v} is always visualized as a plane through the origin.
False. The statement is often true, but the span can also be a line or the zero vector.
Any list of five real numbers is a vector in R^5
True
The vector u results when a vector u-v is added to the vector v
True
The weights c1, …, cp in a linear combination c1v1 + … + cpvp cannot all be zero
False
When u and v are nonzero vectors, Span{u, v} contains the line through u and the origin
True
Asking whether the linear system corresponding to an augmented matrix [a1 a2 a3 b] has a solution amounts to asking whether b is in Span{a1, a2, a3}
True
The equation Ax = b is referred to as a vector equation
False, that is the matrix equation
A vector b is a linear combination of the columns of a matrix A if and only if the equation Ax = b has at least one solution
True
The equation Ax=b is consistent if the augmented matrix [A b] has a pivot position in every row
False
The first entry in the product Ax is a sum of products
True
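A short sketch in plain Python of the two cards above (the example matrix is my own): each entry of Ax is a sum of products (row view), and the whole product is the linear combination x1*a1 + x2*a2 of the columns of A (column view). Both views give the same vector:

```python
A = [[1, 2], [3, 4], [5, 6]]  # a 3x2 matrix stored as rows
x = [2, -1]

# row view: each entry of Ax is a sum of products (a dot product)
row_view = [sum(a * xi for a, xi in zip(row, x)) for row in A]

# column view: Ax is the linear combination x1*a1 + x2*a2 of the columns
cols = list(zip(*A))
col_view = [sum(x[j] * cols[j][i] for j in range(2)) for i in range(3)]

print(row_view, row_view == col_view)  # [0, 2, 4] True
```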
If the columns of an mxn matrix A span R^m, then the equation Ax = b is consistent for each b in R^m
True
If A is an mxn matrix and if the equation Ax=b is inconsistent for some b in R^m then A cannot have a pivot position in every row
True
Every matrix equation Ax=b corresponds to a vector equation with the same solution set
True
Any linear combination of vectors can always be written in the form Ax for a suitable matrix A and vector x
True
The solution set of a linear system whose augmented matrix is [a1 a2 a3 b] is the same as the solution set of Ax=b if A =[a1 a2 a3]
True
If the equation Ax=b is inconsistent, then b is not in the set spanned by the columns of A
True
If the augmented matrix [A b] has a pivot position in every row, then the equation Ax=b is inconsistent
False
If A is an mxn matrix whose columns do not span R^m, then the equation Ax=b is inconsistent for some b in R^m
True
Let A be a 3x2 matrix. Explain why the equation Ax=b cannot be consistent for all b in R^3. Generalize your argument to the case of an arbitrary A with more rows than columns.
A 3x2 matrix has three rows and two columns. With only two columns, A can have at most two pivot columns, and so A has at most two pivot positions, which is not enough to fill all three rows. By Theorem 4, the equation Ax=b cannot be consistent for all b in R^3. In general, if A is an mxn matrix with m>n, then A can have at most n pivot positions, which is not enough to fill all m rows. Thus the equation Ax=b cannot be consistent for all b in R^m
Could a set of three vectors in R^4 span all of R^4? Explain. What about n vectors in R^m when n is less than m?
A set of three vectors in R^4 cannot span R^4. Reason: the matrix A whose columns are these three vectors has four rows. To have a pivot in each row, A would have to have at least 4 columns, which is not the case. Since A does not have a pivot in every row, its columns do not span R^4, by Theorem 4. The same argument shows that n vectors cannot span R^m when n is less than m.
A homogeneous equation is always consistent
True
The equation Ax=0 gives an explicit description of its solution set.
False. It gives an implicit description of the solution set.
The homogeneous equation Ax=0 has the trivial solution if and only if the equation has at least one free variable
False
The equation x = p+tv describes a line through v parallel to p
False
The solution set of Ax=b is the set of all vectors of the form w=p+vh where vh is any solution of the equation Ax=0
False
If x is a nontrivial solution of Ax=0 then every entry in x is nonzero
False
The equation x= x2u + x3v with x2 and x3 free and neither u nor v a multiple of the other, describes a plane through the origin
True. Since neither u nor v is a multiple of the other, Span{u, v} is a plane through the origin.
The equation Ax=b is homogeneous if the zero vector is a solution
True. If the zero vector is a solution, then b = A0 = 0, so the equation is homogeneous.
The effect of adding p to a vector is to move the vector in a direction parallel to p
True. Adding p translates a vector by p, which moves it in a direction parallel to p.
The solution set of Ax=b is obtained by translating the solution set of Ax=0
False
The columns of a matrix A are linearly independent if the equation Ax=0 has the trivial solution
False
If S is a linearly dependent set then each vector is a linear combination of the other vectors in S.
False
The columns of any 4x5 matrix are linearly dependent
True
If x and y are linearly independent and if {x,y,z} is linearly dependent, then z is in span{x,y}
True
Two vectors are linearly dependent if and only if they lie on a line through the origin
True
If a set contains fewer vectors than there are entries in the vectors then the set is linearly independent
False
If x and y are linearly independent and if z is in Span{x, y}, then {x, y, z} is linearly dependent
True
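A concrete sketch of this card in plain Python (the specific vectors are my own example). If z is built from x and y, then 2x + 3y - z = 0 is a nontrivial dependence relation among {x, y, z}:

```python
x = [1, 0, 2]
y = [0, 1, -1]
z = [2 * a + 3 * b for a, b in zip(x, y)]  # z = 2x + 3y, so z is in Span{x, y}

# the nontrivial relation 2x + 3y - z = 0 certifies linear dependence
relation = [2 * a + 3 * b - c for a, b, c in zip(x, y, z)]
print(z, relation)  # [2, 3, 1] [0, 0, 0]
```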
If a set in R^n is linearly dependent then the set contains more vectors than there are entries in each vector
False
A linear transformation is a special type of function
True
If A is a 3x5 matrix and T is a transformation defined by T(x) =Ax then the domain of T is R^3
False
If A is an mxn matrix then the range of the transformation x-> Ax is R^m
False
Every linear transformation is a matrix transformation
False
A transformation T is linear if and only if T(c1v1 + c2v2) = c1T(v1) + c2T(v2) for all v1 and v2 in the domain of T and for all scalars c1 and c2
True
Every matrix transformation x -> Ax is a linear transformation
True
The codomain of the transformation x -> Ax is the set of all linear combinations of the columns of A
False
If T: R^n -> R^m is a linear transformation and if c is in R^m, then a uniqueness question is “Is c in the range of T?”
False
A linear transformation preserves the operations of vector addition and scalar multiplication
True
The superposition principle is a physical description of a linear transformation
True
A linear transformation T: R^n -> R^m is completely determined by its effect on the columns of the nxn identity matrix
True
If T: R^2 -> R^2 rotates vectors about the origin through an angle theta, then T is a linear transformation
True
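The two cards above (a linear transformation is determined by its effect on the identity columns, and rotation is linear) can be sketched in plain Python; the rotate helper and the sample vector are my own assumptions. The columns of the standard matrix are T(e1) and T(e2), and multiplying by that matrix reproduces the rotation:

```python
import math

def rotate(v, theta):
    """Rotate a vector in R^2 about the origin through angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

theta = math.pi / 2
# columns of the standard matrix are the images of e1 and e2
a1 = rotate((1, 0), theta)
a2 = rotate((0, 1), theta)

x = (3, 4)
Ax = (a1[0] * x[0] + a2[0] * x[1],   # matrix-vector product using the
      a1[1] * x[0] + a2[1] * x[1])   # reconstructed standard matrix
direct = rotate(x, theta)
print(all(math.isclose(p, q, abs_tol=1e-12) for p, q in zip(Ax, direct)))  # True
```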
When two linear transformations are performed one after another, the combined effects may not always be a linear transformation
False
A mapping T: R^n -> R^m is onto R^m if every vector x in R^n maps onto some vector in R^m
False
If A is a 3x2 matrix, then the transformation x -> Ax cannot be one-to-one
False
Not every linear transformation from R^n to R^m is a matrix transformation.
False
The columns of the standard matrix for a linear transformation from R^n to R^m are the images of the columns of the nxn identity matrix
True
The standard matrix of a linear transformation from R^2 to R^2 that reflects points through the horizontal axis, the vertical axis, or the origin has the form [ a 0, 0 d ]
True
A mapping T: R^n -> R^m is one-to-one if each vector in R^n maps onto a unique vector in R^m
False
If A is a 3x2 matrix, then the transformation x -> Ax cannot map R^2 onto R^3
True
The determinant of A is the product of the diagonal entries in A
False
An elementary row operation on A does not change the determinant
False
(det A)(det B) = det(AB)
True
If lambda + 5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A
False
If A is 3x3 with columns a1 a2 and a3 then det A equals the volume of the parallelepiped determined by a1 a2 and a3
False
Det(A^T) = (-1) det(A)
False
The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A
True
The row replacement operation on A does not change the eigenvalues
False
If Ax = lambda x for some vector x, then lambda is an eigenvalue of A.
False. The vector x must be nonzero; lambda is an eigenvalue only if Ax = lambda x has a nontrivial solution.
A matrix A is not invertible if and only if 0 is an eigenvalue of A
True
A number c is an eigenvalue of A if and only if the equation (A - cI)x = 0 has a nontrivial solution
True
Finding an eigenvector of A may be difficult but checking whether a given vector is in fact an eigenvector is easy
True
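The "checking is easy" card can be sketched in a few lines of plain Python (the matrix and vector are my own example): verifying an eigenvector only requires one matrix-vector product, no row reduction at all:

```python
A = [[4, -2], [1, 1]]
v = [2, 1]

# compute Av with a row-by-row dot product
Av = [sum(a * vi for a, vi in zip(row, v)) for row in A]

# Av = [6, 3] = 3 * v, so v is an eigenvector of A with eigenvalue 3
lam = 3
print(Av == [lam * vi for vi in v])  # True
```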
To find eigenvalues of A reduce A to echelon form
False
If Ax = lambda x for some scalar lambda, then x is an eigenvector of A
False
If v1 and v2 are linearly independent eigenvectors then they correspond to distinct eigenvalues
False
A steady state vector of a stochastic matrix is actually an eigenvector
True
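A minimal check of this card in plain Python with exact fractions (the particular stochastic matrix is my own example): the steady-state vector q satisfies Pq = q, so it is an eigenvector of P for the eigenvalue 1:

```python
from fractions import Fraction as F

# column-stochastic matrix: each column sums to 1
P = [[F(3, 5), F(3, 10)],
     [F(2, 5), F(7, 10)]]
q = [F(3, 7), F(4, 7)]  # probability vector: entries sum to 1

Pq = [sum(P[i][j] * q[j] for j in range(2)) for i in range(2)]
print(Pq == q)  # True: Pq = q, so q is an eigenvector for eigenvalue 1
```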
The eigenvalues of a matrix are on its main diagonal
False
An eigenspace of A is a null space of a certain matrix
True
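Concretely, the eigenspace for an eigenvalue lambda is Nul(A - lambda*I). A short sketch in plain Python (the matrix is my own example): for A below with eigenvalue 2, any vector in the null space of A - 2I is an eigenvector:

```python
A = [[2, 1], [0, 2]]  # triangular, eigenvalue 2 on the diagonal
lam = 2
B = [[A[i][j] - (lam if i == j else 0) for j in range(2)] for i in range(2)]
# B = A - 2I = [[0, 1], [0, 0]]; its null space is spanned by (1, 0)
v = [1, 0]
Bv = [sum(b * vi for b, vi in zip(row, v)) for row in B]
Av = [sum(a * vi for a, vi in zip(row, v)) for row in A]
print(Bv == [0, 0], Av == [lam * vi for vi in v])  # True True
```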