We introduced Gaussian elimination in Section 2.2. Here we study whether a given system of linear equations has a solution. To start with, we consider systems in which the number of unknowns and the number of equations differ.
Let $A$ be an $m \times n$ matrix, let $\tilde{A} = (A \mid \mathbf{b})$ be the augmented matrix, let $n$ be the number of unknowns, let $\mathbf{a}_1, \dots, \mathbf{a}_n$ be the column vectors of $A$, and let $\mathbf{b}$ be a column vector of order $m$. Then the system of linear equations is written as $A\mathbf{x} = \mathbf{b}$, and the vector $\mathbf{x}$ is called a solution vector. In general, the following holds.
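As a numerical aside (a minimal sketch using NumPy with a small hypothetical system, not one taken from the text), writing the equations in the matrix form $A\mathbf{x} = \mathbf{b}$ lets a library solve them directly:

```python
import numpy as np

# Hypothetical system (not from the text):
#   x1 + 2*x2 = 5
#   3*x1 -  x2 = 1
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])   # coefficient matrix
b = np.array([5.0, 1.0])      # right-hand side

x = np.linalg.solve(A, b)     # the solution vector of A x = b
print(x)                      # → [1. 2.]
```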
Theorem 2.3
The system $A\mathbf{x} = \mathbf{b}$ has a solution if and only if $\operatorname{rank} A = \operatorname{rank} \tilde{A}$.
Proof
Let $\tilde{A}$ be the matrix of the form $(A \mid \mathbf{b})$. Suppose that $\operatorname{rank} A = \operatorname{rank} \tilde{A}$. Then the dimension of the column space of $A$ and the dimension of the column space of $\tilde{A}$ are equal. Thus, the $(n+1)$st column vector $\mathbf{b}$ of $\tilde{A}$ is a linear combination of $\mathbf{a}_1, \dots, \mathbf{a}_n$; that is, $\mathbf{b} = x_1\mathbf{a}_1 + \cdots + x_n\mathbf{a}_n$ for some scalars $x_1, \dots, x_n$, so $A\mathbf{x} = \mathbf{b}$ has a solution.

Conversely, think of the columns of $A$ as column vectors. Then the equation $A\mathbf{x} = \mathbf{b}$ can be expressed as $x_1\mathbf{a}_1 + \cdots + x_n\mathbf{a}_n = \mathbf{b}$. If a solution $\mathbf{x}$ exists, then this equation implies that $\mathbf{b}$ is a linear combination of $\mathbf{a}_1, \dots, \mathbf{a}_n$, so the column space of $A$ and the column space of $\tilde{A}$ are the same. This shows that $\operatorname{rank} A = \operatorname{rank} \tilde{A}$.
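The rank criterion above can be checked numerically. The sketch below uses NumPy's `matrix_rank` on a hypothetical system (the matrices are illustrative, not from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 1.0]])
b_good = np.array([[3.0], [6.0], [4.0]])   # consistent right-hand side
b_bad = np.array([[3.0], [7.0], [4.0]])    # inconsistent right-hand side

def solvable(A, b):
    # A x = b has a solution iff rank A == rank (A | b)
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.hstack([A, b]))

print(solvable(A, b_good))   # → True
print(solvable(A, b_bad))    # → False
```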
Determine the value of the parameter so that the following system of linear equations has a solution.
Answer
Apply Gaussian elimination to the augmented matrix. A solution exists exactly when the rank of the coefficient matrix equals the rank of the augmented matrix, so the last entry of the bottom row of the reduced augmented matrix must be 0. Solving this condition determines the parameter.
General solution of a system of linear equations
Let $\mathbf{x}_0$ be a solution to the system of linear equations $A\mathbf{x} = \mathbf{b}$. Then every solution of $A\mathbf{x} = \mathbf{b}$ is given by $\mathbf{x} = \mathbf{x}_0 + \mathbf{y}$. Here, $\mathbf{y}$ is a solution of the homogeneous system $A\mathbf{x} = \mathbf{0}$.
Proof
Let $\mathbf{x}_1$ be a solution of $A\mathbf{x} = \mathbf{b}$. Then $\mathbf{y} = \mathbf{x}_1 - \mathbf{x}_0$ is a solution of $A\mathbf{x} = \mathbf{0}$, since $A(\mathbf{x}_1 - \mathbf{x}_0) = A\mathbf{x}_1 - A\mathbf{x}_0 = \mathbf{b} - \mathbf{b} = \mathbf{0}$. Conversely, if $\mathbf{y}$ is a solution of $A\mathbf{x} = \mathbf{0}$, then $\mathbf{x}_0 + \mathbf{y}$ is a solution of $A\mathbf{x} = \mathbf{b}$, since $A(\mathbf{x}_0 + \mathbf{y}) = A\mathbf{x}_0 + A\mathbf{y} = \mathbf{b} + \mathbf{0} = \mathbf{b}$. Thus, we have $\mathbf{x}_1 = \mathbf{x}_0 + \mathbf{y}$.
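The structure "particular solution plus homogeneous solution" can be verified numerically. In this sketch (hypothetical matrices, not from the text), `x0` is one particular solution and `y` is a homogeneous solution:

```python
import numpy as np

# Hypothetical consistent system: 2 equations, 3 unknowns.
A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 3.0]])
b = np.array([1.0, 3.0])

x0, *_ = np.linalg.lstsq(A, b, rcond=None)  # one particular solution of A x = b
y = np.array([2.0, -1.0, 0.0])              # a solution of A y = 0

# Every x0 + t*y solves A x = b, matching the theorem.
for t in (0.0, 1.0, -3.5):
    assert np.allclose(A @ (x0 + t * y), b)
print("every x0 + t*y is a solution")
```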
With this theorem, to solve the system of linear equations $A\mathbf{x} = \mathbf{b}$, we first need to solve the homogeneous system $A\mathbf{x} = \mathbf{0}$. So, we consider the homogeneous system of linear equations $A\mathbf{x} = \mathbf{0}$.

Note that $\operatorname{rank} A = \operatorname{rank}(A \mid \mathbf{0})$. Then by Theorem 2.3, a solution exists. In fact, $\mathbf{x} = \mathbf{0}$ is a solution. This solution is called the trivial solution. Next, let $\mathbf{x}_1$ and $\mathbf{x}_2$ be solutions of the homogeneous system. Then any linear combination $c_1\mathbf{x}_1 + c_2\mathbf{x}_2$ satisfies $A(c_1\mathbf{x}_1 + c_2\mathbf{x}_2) = c_1 A\mathbf{x}_1 + c_2 A\mathbf{x}_2 = \mathbf{0}$. Thus, the set of all solutions of $A\mathbf{x} = \mathbf{0}$ is a subspace. This subspace is called the solution space. A basis of this solution space is called a fundamental solution of the system of linear equations $A\mathbf{x} = \mathbf{0}$.
It is interesting to know the relationship between the dimension of the solution space and the rank of the matrix.
Consider the homogeneous system $A\mathbf{x} = \mathbf{0}$ with $n$ unknowns. Suppose the rank of the coefficient matrix $A$ is $r$. Then the fundamental solution is composed of $n - r$ solution vectors.
Proof
For $\operatorname{rank} A = r$, apply elementary row operations on $A$ to obtain its reduced row echelon form. Solving the reduced system for the $r$ leading (pivot) variables in terms of the remaining $n - r$ free variables, and setting one free variable equal to $1$ and the others equal to $0$ in turn, we obtain $n - r$ solution vectors $\mathbf{x}_1, \dots, \mathbf{x}_{n-r}$, and every solution is a linear combination of them. Also, each $\mathbf{x}_i$ has a $1$ in a position where the others have $0$, so $\{\mathbf{x}_1, \dots, \mathbf{x}_{n-r}\}$ is linearly independent. Therefore the dimension of the solution space is $n - r$.
Degree of freedom
The number $n - r$ of fundamental solutions is called the degree of freedom.
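The count $n - r$ can be checked numerically. In this sketch (a hypothetical coefficient matrix, not from the text), the fundamental solutions are taken from the rows of the singular value decomposition beyond the rank:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 2.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

# Rows of Vt beyond the rank form a basis of the solution space of A x = 0.
_, _, Vt = np.linalg.svd(A)
fundamental = Vt[r:]

print(n - r)                               # degree of freedom → 2
assert fundamental.shape[0] == n - r       # n - r fundamental solutions
assert np.allclose(A @ fundamental.T, 0)   # each one solves A x = 0
```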
Answer
Apply Gaussian elimination to the augmented matrix. The rank of the coefficient matrix equals the rank of the augmented matrix, which implies that this system of linear equations has a solution. Writing the reduced matrix as a system of linear equations and expressing the leading variables in terms of the free variables, the degree of freedom equals the number of unknowns minus the rank. Setting each free variable to an arbitrary parameter, we obtain the general solution.
Inverse Matrix
Consider a square matrix $A$ that is not the zero matrix. Now we ask a question: is there a matrix $B$ such that $AB = BA = E$? Unfortunately, the answer may be no. In the world of matrices, some nonzero matrices do not have an inverse.
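For a concrete (hypothetical, not from the text) instance, the matrix below is nonzero yet has no inverse, which NumPy reports as a `LinAlgError`:

```python
import numpy as np

# Nonzero, but the second row is twice the first, so rank A = 1 < 2.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.matrix_rank(A))   # → 1
try:
    np.linalg.inv(A)              # no B with A B = B A = E exists
except np.linalg.LinAlgError:
    print("A has no inverse")
```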
Regular Matrix
For a square matrix $A$ of order $n$, if there exists a matrix $B$ such that $AB = BA = E$, then the matrix $A$ is called a regular matrix. The matrix $B$ is called an inverse matrix of $A$. Now, how many inverse matrices of $A$ exist? For example, suppose $B$ and $C$ are both inverse matrices of $A$, so that $AB = BA = E$ and $AC = CA = E$. Then $B = BE = B(AC) = (BA)C = EC = C$. Thus, the inverse matrix of $A$ is unique, and we write the inverse matrix of $A$ as $A^{-1}$.
Using this idea, we solve the system of linear equations $A\mathbf{x} = \mathbf{b}$. Let $A$ be a regular matrix. Then we have $A^{-1}(A\mathbf{x}) = A^{-1}\mathbf{b}$. From this, we have $\mathbf{x} = A^{-1}\mathbf{b}$. Therefore, to solve the system of linear equations, we can use the inverse matrix $A^{-1}$. In short, it is enough to find a matrix $X$ so that $AX = E$.

We know a quick way to find $A^{-1}$. Before introducing the technique, we cover the relation between a regular matrix and the rank of the matrix.
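As a quick numerical check of $\mathbf{x} = A^{-1}\mathbf{b}$ (with a hypothetical regular matrix, not one from the text):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # regular: rank 2
b = np.array([3.0, 2.0])

x = np.linalg.inv(A) @ b          # x = A^{-1} b
print(x)                          # → [1. 1.]
assert np.allclose(A @ x, b)
```

In numerical practice, `np.linalg.solve(A, b)` is preferred over forming the inverse explicitly, but the inverse makes the algebra above concrete.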
Suppose $A$ is a square matrix of order $n$. Then the following are equivalent:
(1) $A$ is a regular matrix.
(2) $\operatorname{rank} A = n$.
Proof
By Theorem 2.2, $\operatorname{rank} A = n$ and the condition that the reduced row echelon form of $A$ is $E$ are equivalent. In other words, for $\operatorname{rank} A = n$, we can choose a product of elementary matrices $P$ so that $PA = E$. Then $A = P^{-1}$, so $A$ is regular with $A^{-1} = P$.

Conversely, suppose $A$ is a regular matrix. We can choose a product of elementary matrices $P$ so that $PA$ is the reduced row echelon form of $A$. Note that $P$ is a product of elementary matrices, so $P$ is regular. Then $PA$, a product of regular matrices, is regular. Thus, $(PA)^{-1}$ exists. Suppose that $\operatorname{rank} A < n$. Then the entries of the lowest row of $PA$ have to be 0. But then this matrix is not regular. Thus, $\operatorname{rank} A = n$.
Now we are ready to introduce how to find a matrix $X$ satisfying $AX = E$. We note that since $A$ is regular, for some product $P$ of elementary matrices, $PA = E$. So, we multiply this $P$ into the augmented matrix $(A \mid E)$ from the left. Then $P(A \mid E) = (PA \mid PE) = (E \mid P)$, and $P = A^{-1}$. Thus, to find $A^{-1}$ is the same as to find $P$. This shows that if you can get the identity matrix by applying elementary row operations to $A$, then the same operations applied to $E$ give rise to $A^{-1}$.
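The procedure just described, reducing $(A \mid E)$ until the left block becomes $E$, can be sketched directly in code (a minimal NumPy implementation with an added pivot search for numerical stability; the example matrix is hypothetical, not from the text):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Reduce (A | E) to (E | A^{-1}) by elementary row operations."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # the augmented matrix (A | E)
    for i in range(n):
        p = i + np.argmax(np.abs(M[i:, i]))      # pick a usable pivot row
        if np.isclose(M[p, i], 0.0):
            raise ValueError("A is not regular")
        M[[i, p]] = M[[p, i]]                    # swap rows i and p
        M[i] /= M[i, i]                          # scale row so the pivot is 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]           # clear column i in the other rows
    return M[:, n:]                              # right block is now A^{-1}

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = inverse_by_row_reduction(A)
assert np.allclose(A_inv @ A, np.eye(2))         # A^{-1} A = E
```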
Determine whether the given matrix $A$ is regular. If so, find the inverse matrix $A^{-1}$.
Answer
Apply elementary row operations to the augmented matrix $(A \mid E)$. The left block reduces to the identity matrix $E$, so $A$ is regular, and the right block then equals $A^{-1}$.
To summarize what we have studied so far, we have the following.

Suppose $A$ is a square matrix of order $n$. Then the following are equivalent:
(1) $A$ is regular.
(2) $\operatorname{rank} A = n$.
(3) The system $A\mathbf{x} = \mathbf{b}$ has a unique solution.

Proof
By Theorem 2.2, we have the equivalence of (1) and (2). By Theorem 2.3, we have the equivalence of (2) and (3).
1. Solve the following system of linear equations using Gaussian elimination.
2. Determine the value of the parameter so that the following system of linear equations has a solution.
3. Determine whether each of the following matrices is regular. If so, find the inverse matrix.
(a)
4. Determine the value of the parameter so that the following matrix is regular.
5. Show that the following matrix is regular, and express it as a product of elementary matrices.
6. Suppose that all entries of one row of a square matrix $A$ are 0. Then show that $A$ is not regular.
7. Suppose that $A$ and $B$ are regular matrices of order $n$. Then show that the product $AB$ is also regular and satisfies $(AB)^{-1} = B^{-1}A^{-1}$.