What Does It Mean to Be Row Equivalent

This lecture defines the concept of row equivalence and proves some propositions about row equivalent matrices that lie at the center of many important results in linear algebra.

Table of Contents


  1. Definition

  2. Equivalence relation

  3. Column correspondence property

  4. Dominant columns

  5. Row equivalent matrices in reduced row echelon form

  6. Rank and equivalence

We begin with a definition of row equivalence.

Definition Let $A$ and $B$ be two $K\times L$ matrices. We say that $A$ is row equivalent to $B$ if and only if there exist $K\times K$ elementary matrices $E_1,\ldots,E_n$ such that $B=E_nE_{n-1}\cdots E_1A$.

Recall that pre-multiplying $A$ by an elementary matrix is the same as performing an elementary row operation on $A$. Therefore, $A$ is row equivalent to $B$ if and only if $A$ can be transformed into $B$ by performing a sequence of elementary row operations on $A$.
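To make this concrete, here is a small numeric check (a sketch using sympy, which the lecture itself does not use; the matrices are hypothetical examples):

```python
from sympy import Matrix, eye

# Hypothetical example: the elementary matrix E encodes the row
# operation "add -2 times row 0 to row 1".
A = Matrix([[1, 2, 3],
            [2, 5, 7]])
E = eye(2)
E[1, 0] = -2

# Pre-multiplying A by E performs the row operation on A.
B = E * A
assert B == Matrix([[1, 2, 3],
                    [0, 1, 1]])
```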

Row equivalence is an equivalence relation because it is reflexive, symmetric and transitive.

Proof

Suppose $A$ is row equivalent to $B$, so that $B=E_n\cdots E_1A$. Since an elementary matrix is invertible and its inverse is an elementary matrix, we have that $A=E_1^{-1}\cdots E_n^{-1}B$, where $E_1^{-1},\ldots,E_n^{-1}$ are elementary matrices. Therefore, $B$ is equivalent to $A$. This proves symmetry. If $A$ is equivalent to $B$ and $B$ is equivalent to $C$, then $B=E_n\cdots E_1A$ and $C=F_m\cdots F_1B$, where $E_1,\ldots,E_n$ and $F_1,\ldots,F_m$ are elementary matrices. Now, pre-multiply both sides of the first equation by $F_m\cdots F_1$: $C=F_m\cdots F_1B=F_m\cdots F_1E_n\cdots E_1A$. Then, $A$ is equivalent to $C$, that is, row equivalence is transitive. Finally, for any elementary matrix $E$, we can write $A=E^{-1}EA$. Since $E^{-1}$ is elementary, this means that we can transform $A$ into itself by means of elementary row operations. As a consequence, row equivalence is reflexive.
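The symmetry argument can be illustrated numerically. The sketch below (using sympy, with an arbitrary example matrix) inverts an elementary matrix to undo a row operation:

```python
from sympy import Matrix, eye

# E performs R0 <- R0 + 5*R1; its inverse performs R0 <- R0 - 5*R1
# and is itself elementary, which is the key fact behind symmetry.
A = Matrix([[1, 2],
            [3, 4]])
E = eye(2)
E[0, 1] = 5

B = E * A                # A is row equivalent to B ...
assert E.inv() * B == A  # ... and B is row equivalent to A
```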

The next proposition states an important property of row equivalence, known as the column correspondence property.

Proposition Let $A$ and $B$ be two $K\times L$ matrices. Let $A$ be row equivalent to $B$. Denote by $A_{\bullet l}$ and $B_{\bullet l}$ the $l$-th columns of $A$ and $B$ respectively. Then, $A_{\bullet l}=Av$ for an $L\times 1$ vector $v$ if and only if $B_{\bullet l}=Bv$.

Proof

In other words, when $A$ and $B$ are row equivalent, the $l$-th column of $A$ can be written as a linear combination of a given set of columns of $A$ itself, with coefficients taken from the vector $v$, if and only if the $l$-th column of $B$ is a linear combination of the corresponding set of columns of $B$, with coefficients taken from the same vector $v$.
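A numeric illustration of the property (a sympy sketch; the RREF of $A$ serves as the row equivalent matrix $B$, and the example matrix is hypothetical):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 8]])
B, _ = A.rref()        # B is row equivalent to A

# Column 1 of A equals 2 times column 0, i.e. A_col1 = A*v with
# v = (2, 0, 0); the same v works for the corresponding columns of B.
v = Matrix([2, 0, 0])
assert A[:, 1] == A * v
assert B[:, 1] == B * v
```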

A useful corollary of the previous proposition follows.

Proposition Let $A$ and $B$ be two row equivalent matrices. Then, a set of columns of $A$ is linearly independent if and only if the corresponding set of columns of $B$ is linearly independent.

Proof

This section introduces the concept of dominant columns, which will be used below to study the properties of row equivalent matrices.

Definition Let $A$ be a $K\times L$ matrix. Denote its $l$-th column by $A_{\bullet l}$. We say that $A_{\bullet l}$ is a dominant column if and only if it cannot be written as a linear combination of the columns to its left.

A first simple result about dominant columns follows.

Proposition Two row equivalent matrices $A$ and $B$ have the same set of dominant columns, that is, the set of indices of the dominant columns of $A$ coincides with the set of indices of the dominant columns of $B$.

Proof

For instance, if the dominant columns of $A$ are the second, third and fifth, then the dominant columns of $B$ are the second, third and fifth.
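The following sketch (sympy again; the matrices are hypothetical examples) checks that the dominant column positions survive a row operation:

```python
from sympy import Matrix, eye

A = Matrix([[1, 2, 3],
            [2, 4, 8]])
E = eye(2)
E[1, 0] = 7            # an arbitrary elementary row operation
B = E * A              # B is row equivalent to A

# rref() reports the pivot column indices, which locate the
# dominant columns; both matrices give the same indices.
_, pivots_A = A.rref()
_, pivots_B = B.rref()
assert pivots_A == pivots_B == (0, 2)
```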

The propositions above permit us to prove some properties of matrices in reduced row echelon form.

Recall that a matrix is in reduced row echelon form (RREF) if and only if:

  • all its non-zero rows contain an element, called a pivot, that is equal to 1 and has only zero entries in the quadrant below it and to its left;

  • each pivot is the only non-zero element in its column;

  • all the zero rows (if there are any) are below the non-zero rows.

Furthermore, the Gauss-Jordan elimination algorithm can be used to transform any matrix into an RREF matrix by elementary row operations. Therefore, any matrix is row equivalent to an RREF matrix.
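For instance, sympy's rref() implements Gauss-Jordan elimination (the matrix below is an arbitrary example, not from the lecture):

```python
from sympy import Matrix

A = Matrix([[0, 2, 4],
            [1, 1, 1],
            [2, 4, 6]])

# rref() returns the reduced row echelon form of A together with
# the indices of its basic (pivot) columns.
R, pivots = A.rref()
assert R == Matrix([[1, 0, -1],
                    [0, 1,  2],
                    [0, 0,  0]])
assert pivots == (0, 1)
```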

Remember that a basic column is a column containing a pivot, while a non-basic column does not contain any pivot.

The basic columns of an RREF matrix are vectors of the canonical basis, that is, they have one entry equal to 1 and all the other entries equal to zero. Furthermore, if an RREF matrix has $b$ basic columns, then those columns are the first $b$ vectors of the canonical basis, as stated by the following proposition.

Proposition Let $R$ be a matrix in reduced row echelon form. Then, the $l$-th basic column of $R$, counting from the left, is equal to the $l$-th vector of the canonical basis, that is, it has a 1 in position $l$ and all its other entries are equal to 0.

Proof
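A quick numeric check of this proposition (a sympy sketch with an arbitrary example matrix, not a substitute for the proof):

```python
from sympy import Matrix, eye

# The l-th basic column of an RREF matrix should equal the l-th
# vector of the canonical basis.
R, pivots = Matrix([[1, 3, 0],
                    [2, 6, 1],
                    [3, 9, 1]]).rref()
I = eye(3)
for l, p in enumerate(pivots):
    assert R[:, p] == I[:, l]   # basic column l is e_(l+1)
```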

We now state some simple results concerning basic and non-basic columns.

Proposition A basic column of a matrix in reduced row echelon form is a dominant column.

Proof

A basic column contains a pivot, equal to 1, and all the entries to the left of the pivot are equal to 0. Therefore, the basic column cannot be written as a linear combination of the columns to its left (no linear combination of 0s can be equal to 1). Hence, it is a dominant column.

Proposition A non-basic column of a matrix in reduced row echelon form is not a dominant column.

Proof

If a column $R_{\bullet l}$ is non-basic, that is, it has no pivot, then it can be written as $R_{\bullet l}=R_{1l}e_1+R_{2l}e_2+\cdots+R_{kl}e_k$, where $k$ is the number of basic columns to its left and $e_1,\ldots,e_k$ are the first $k$ vectors of the canonical basis (the entries below the $k$-th must be zero because the $m$-th pivot, with $m>k$, has only 0s to its left). By the previous proposition, $e_1,\ldots,e_k$ are precisely the basic columns to the left of $R_{\bullet l}$. Therefore, the non-basic column $R_{\bullet l}$ can be written as a linear combination of the columns to its left. For example, if $k=3$ and the first, third and fourth columns are basic, then $R_{\bullet l}=R_{1l}R_{\bullet 1}+R_{2l}R_{\bullet 3}+R_{3l}R_{\bullet 4}$. Thus, if a column $R_{\bullet l}$ is non-basic, it is not linearly independent from the columns to its left. Hence, it is not a dominant column.
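The coefficients can be read off directly, as in this sympy sketch (the RREF matrix below is a hypothetical example):

```python
from sympy import Matrix

# An RREF matrix whose basic columns are 0, 1 and 3.
R = Matrix([[1, 0, 2, 0, 5],
            [0, 1, 3, 0, 6],
            [0, 0, 0, 1, 7],
            [0, 0, 0, 0, 0]])

# Column 4 is non-basic: its entries 5, 6, 7 are the coefficients
# of the linear combination of the basic columns to its left.
assert R[:, 4] == 5 * R[:, 0] + 6 * R[:, 1] + 7 * R[:, 3]
```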

By combining the two simple propositions above, we get the following one.

Proposition If a matrix is in reduced row echelon form, then one of its columns is basic if and only if it is dominant, and it is non-basic if and only if it is not dominant.

Proof

By the previous proposition, if a column is dominant, then it cannot be non-basic. Therefore, it is basic. We have already established the opposite implication (basic implies dominant). Therefore, a column is dominant if and only if it is basic. The proof of equivalence for non-dominant columns is analogous.

Thus, when a matrix is in reduced row echelon form, we can use the concepts of basic and dominant column interchangeably.

We are now ready to state the most important proposition of this lecture.

Proposition Any matrix is row equivalent to a unique matrix in reduced row echelon form.

Proof

We have already explained that any matrix $A$ is row equivalent to a matrix in reduced row echelon form, which can be derived by using the Gauss-Jordan elimination algorithm. We need to prove uniqueness. Suppose that two matrices $R_{A}$ and $S_{A}$ are in reduced row echelon form and that they are both row equivalent to $A$. Since row equivalence is transitive and symmetric, $R_{A}$ and $S_{A}$ are row equivalent. Therefore, the positions of their dominant columns coincide. Equivalently, the positions of their basic columns coincide. But we have proved above that the $l$-th basic column of an RREF matrix, counting from the left, is equal to the $l$-th vector of the canonical basis. Therefore, not only do the basic columns of $R_{A}$ and $S_{A}$ have the same positions, but their corresponding entries coincide. The non-basic columns are linear combinations of the basic ones. By the column correspondence property above, the coefficients of the linear combinations are the same for $R_{A}$ and $S_{A}$. But also the vectors being combined linearly coincide, because the basic columns of $R_{A}$ and $S_{A}$ coincide. As a result, each non-basic column of $R_{A}$ is equal to the corresponding non-basic column of $S_{A}$. Thus, $R_{A}=S_{A}$, which proves that the row equivalent RREF of a matrix is unique.
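Uniqueness can be illustrated numerically: different sequences of row operations applied to the same matrix lead to the same RREF (a sympy sketch with arbitrarily chosen operations):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [3, 4],
            [4, 6]])
E1 = eye(3); E1[2, 0] = -1   # R2 <- R2 - R0
E2 = eye(3); E2[1, 2] = 2    # R1 <- R1 + 2*R2

# E1*A and E2*A differ from A, yet all three share one RREF.
assert (E1 * A).rref() == (E2 * A).rref() == A.rref()
```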

A consequence of this uniqueness result is that if two matrices are row equivalent, then they are equivalent to the same RREF matrix.

Proposition Let $A$ be row equivalent to $B$. Then, $A$ and $B$ are equivalent to the same RREF matrix $R_{A}$.

Proof

In this section we present some corollaries of the results we have proved in the previous sections.

Proposition A square matrix is invertible if and only if it is row equivalent to the identity matrix $I$.

Proof

Clearly, since the identity matrix $I$ is a matrix in reduced row echelon form, any invertible matrix is row equivalent to the unique RREF matrix $I$.

An immediate consequence of the previous proposition follows.

Proposition Let $A$ be a $K\times K$ invertible matrix. Then, $A$ can be written as a product of elementary matrices: $A=E_1E_2\cdots E_n$, where $E_1,\ldots,E_n$ are elementary matrices.

Proof
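The construction can be checked numerically in the other direction: multiplying elementary matrices yields an invertible matrix whose RREF is $I$ (a sympy sketch with arbitrarily chosen elementary matrices):

```python
from sympy import Matrix, eye

E1 = eye(2); E1[1, 0] = 3        # R1 <- R1 + 3*R0
E2 = eye(2); E2[0, 0] = 2        # R0 <- 2*R0
E3 = Matrix([[0, 1],
             [1, 0]])            # swap R0 and R1

# A product of elementary matrices is invertible, and its RREF
# is the identity matrix.
A = E1 * E2 * E3
assert A.det() != 0
assert A.rref()[0] == eye(2)
```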

While the previous two propositions concern square invertible matrices, the following proposition applies to matrices that can be non-square and non-invertible.

Proposition Let $R$ be an RREF matrix that is row equivalent to a matrix $A$. Then, $A$ and $R$ have the same rank. The rank is equal to 1) the number of non-zero rows of $R$ or, equivalently, 2) the number of basic columns of $R$.

Proof
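A numeric check of the rank statement (a sympy sketch with an arbitrary example matrix):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])
R, pivots = A.rref()

# rank(A) = number of non-zero rows of R = number of basic columns.
nonzero_rows = sum(1 for i in range(R.rows) if any(R.row(i)))
assert A.rank() == nonzero_rows == len(pivots) == 2
```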

Please cite as:

Taboga, Marco (2021). "Row equivalence", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/row-equivalence.
