Lecture Note: §2 Matrix Algebra

Last Update: 2026-04-09


The Matrix is everywhere. It is all around us. Even now, in this very room.

We have already used matrices—in the form of augmented matrices—to record information and streamline calculations for systems of linear equations.

Matrices appear in many other forms besides augmented matrices, and they possess their own algebraic properties that allow us to compute with them under the rules of matrix algebra.

Furthermore, matrices are not static data-recording objects; they represent functions that act on vectors, transforming them into other vectors. These matrix transformations will play a key role in linear algebra and provide new perspectives on vectors and systems of linear equations.

§2.1 Matrix Operations

  • Definition
    • A matrix is a rectangular array of numbers called the entries, or ­elements, of the matrix, $A = \begin{bmatrix}a_{ij}\end{bmatrix}$.
    • An $m\times n$ matrix is a matrix with $m$ rows and $n$ columns.
    • Column form: $A = \begin{bmatrix} \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n\end{bmatrix}$
    • Diagonal entries: $a_{11}, a_{22}, a_{33},\ldots$.
    • Diagonal matrix, zero matrix, square matrix.
  • Linear Operations
    • Sum: $A + B = \begin{bmatrix} a_{ij} + b_{ij}\end{bmatrix} = \begin{bmatrix} \vec{a}_1 + \vec{b}_1 & \vec{a}_2 + \vec{b}_2 & \cdots & \vec{a}_n + \vec{b}_n\end{bmatrix}$
    • Scalar multiple: $rA = \begin{bmatrix} r a_{ij}\end{bmatrix} = \begin{bmatrix} r\vec{a}_1 & r\vec{a}_2 & \cdots & r\vec{a}_n\end{bmatrix}$
    • Properties of linear operations
  • Matrix Multiplication
    • Multiplication of matrices corresponds to composition of matrix transformations, that is, $\forall \vec{x}, A(B\vec{x}) = (AB)\vec{x}$.
    • If $A$ is an $m\times n$ matrix, and if $B$ is an $n\times p$ matrix with columns $\vec{b}_1,\cdots,\vec{b}_p$, then the product $AB$ is the $m \times p$ matrix, \(AB=A\begin{bmatrix} \vec{b}_1 & \vec{b}_2 & \cdots & \vec{b}_p \end{bmatrix}=\begin{bmatrix} A\vec{b}_1 & A\vec{b}_2 & \cdots & A\vec{b}_p \end{bmatrix}\)
      • Each column of $AB$ is a linear combination of the columns of $A$ using weights from the corresponding column of $B$.
    • Row — Column Rule for Computing $AB$: \((AB)_{ij} = a_{i1}b_{1j}+a_{i2}b_{2j} + \cdots + a_{in}b_{nj}\)
    • Properties of Matrix Multiplication
      • No commutative law: in general, $AB \neq BA$.
    • Power of a Matrix, $A^k$
  • Transpose of a Matrix
    • Given an $m\times n$ matrix $A$, the transpose of $A$ is the $n\times m$ matrix, denoted by ${A^{T}}$, whose columns are formed from the corresponding rows of $A$.
      • $(A^{T})_{ij} = A_{ji}$ for all $i$ and $j$.
    • Inner Product (dot product): $\vec{u}\cdot\vec{v} = u_1 v_1 + \cdots + u_n v_n = \vec{u}^{T}\vec{v} = \vec{v}^{T}\vec{u}$
    • Properties of the Transpose
      • $(A^{T})^{T} = A$, $(A+B)^{T} = A^{T} + B^{T}$, $(rA)^{T} = rA^{T}$
      • $(AB)^{T} = B^{T}A^{T}$ (note the reversed order)
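The operations above can be checked numerically. The following NumPy sketch (with two small matrices made up for illustration) verifies the column definition of the product, the row–column rule, non-commutativity, and the transpose-of-a-product rule:

```python
import numpy as np

# Small illustrative matrices (values chosen arbitrarily for the example).
A = np.array([[1, 2, 0],
              [3, -1, 4]])        # 2x3
B = np.array([[2, 1],
              [0, 3],
              [5, -2]])           # 3x2

# Column definition: each column of AB is A times the corresponding column of B.
AB_cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])
assert np.array_equal(AB_cols, A @ B)

# Row-column rule: (AB)_ij = a_i1 b_1j + ... + a_in b_nj.
i, j = 1, 0
assert (A @ B)[i, j] == sum(A[i, k] * B[k, j] for k in range(A.shape[1]))

# No commutative law: here AB is 2x2 while BA is 3x3, so AB != BA trivially.
print((A @ B).shape, (B @ A).shape)

# Transpose of a product reverses the order: (AB)^T = B^T A^T.
assert np.array_equal((A @ B).T, B.T @ A.T)
```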

§2.2 Inverse of a Matrix

  • Definition
    • An $n\times n$ matrix $A$ is said to be invertible if there is an $n\times n$ matrix $C$ such that $CA=I$ and $AC=I$.
    • If $A$ is invertible, then the inverse is unique, and denoted by ${A^{-1}}$.
    • Singular matrix — not invertible
    • Nonsingular matrix — invertible
  • Properties of Inverses
    • $(A^{-1})^{-1} = A$
    • If $A$ and $B$ are $n\times n$ invertible matrices, then so is $AB$, and $(AB)^{-1} = B^{-1}A^{-1}$
    • If $A$ is an invertible matrix, then so is $A^T$, and $(A^T)^{-1} = (A^{-1})^T$
  • Solution to Linear Systems. If $A$ is an invertible $n \times n$ matrix, then for each $\vec{b} \in \mathbb{R}^n$, the equation $A\vec{x}= \vec{b}$ has the unique solution $\vec{x} = A^{-1}\vec{b}$.
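A quick numerical illustration of $\vec{x} = A^{-1}\vec{b}$ (the $2\times 2$ system here is made up for the example):

```python
import numpy as np

# Illustrative invertible 2x2 system: 2x + y = 5, x + 3y = 10.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Unique solution x = A^{-1} b, since A is invertible.
x = np.linalg.inv(A) @ b          # x = (1, 3)
assert np.allclose(A @ x, b)

# In practice np.linalg.solve is preferred: it factors A rather than inverting it.
assert np.allclose(np.linalg.solve(A, b), x)
```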

  • Elementary Matrices and Matrix Form of A Single Elementary Row Operation
    • Elementary Matrix. An elementary matrix $E$ is obtained by performing a single elementary row operation on an identity matrix.
    • If an elementary row operation is performed on an $m \times n$ matrix $A$, the resulting matrix can be written as ${EA}$, where the $m \times m$ matrix $E$ is created by performing the same row operation on $I_m$.
    • Each elementary matrix $E$ is invertible. The inverse of $E$ is the elementary matrix of the same type that transforms $E$ back into $I$.
  • Theorem: An $n \times n$ matrix $A$ is invertible if and only if $A$ is row equivalent to $I_n$, and in this case, any sequence of elementary row operations that reduces $A$ to $I_n$ also transforms $I_n$ into $A^{-1}$.
    • An Algorithm for Finding $A^{-1}$. Row reduce the augmented matrix ${[A \ \ I ]}$. If $A$ is row equivalent to $I$, then $[A \ \ I]$ is row equivalent to ${[I \ \ A^{-1}]}$. Otherwise, $A$ does not have an inverse.
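The algorithm above can be sketched directly in code: row reduce $[A \ \ I]$ and read off $A^{-1}$ from the right half. This is a teaching sketch, not a production routine (it uses exact zero tests and no partial pivoting for numerical stability):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A  I]; return A^{-1} if A ~ I, else raise ValueError."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])          # the augmented matrix [A  I]
    for col in range(n):
        # Find a nonzero pivot in this column and interchange it into place.
        pivot = next((r for r in range(col, n) if M[r, col] != 0), None)
        if pivot is None:
            raise ValueError("A is singular (not row equivalent to I)")
        M[[col, pivot]] = M[[pivot, col]]  # row interchange
        M[col] /= M[col, col]              # scale pivot row so the pivot is 1
        for r in range(n):                 # replacement: clear the rest of the column
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]                        # right half of [I  A^{-1}]

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2))
```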

§2.3 Characterizations of Invertible Matrices

  • The Invertible Matrix Theorem. Let $A$ be a square $n \times n$ matrix. Then the following statements are equivalent.
    1. $A$ is an invertible matrix.
    2. $A$ is row equivalent to the $n\times n$ identity matrix.
    3. $A$ has $n$ pivot positions.
    4. The equation $A\vec{x}=\vec{0}$ has only the trivial solution.
    5. The equation $A\vec{x}=\vec{b}$ has at least one solution for each $\vec{b} \in \mathbb{R}^n$.
    6. The columns of $A$ span $\mathbb{R}^n$.
    7. There is an $n\times n$ matrix $C$ such that $CA=I$.
    8. There is an $n\times n$ matrix $D$ such that $AD=I$.
    9. $A^T$ is an invertible matrix.
    10. $A$ is the product of a sequence of elementary matrices.
  • Theorem. Let $A$ and $B$ be square matrices. If ${AB=I}$, then $A$ and $B$ are both invertible, with $B=A^{-1}$ and $A=B^{-1}$.
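Several statements of the theorem can be observed numerically. The sketch below (matrices made up for the example) contrasts an invertible matrix with a singular one through statements 3 and 4:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # invertible: det = -2 != 0
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: second row is 2 * first row

# Statement 3: an invertible n x n matrix has n pivot positions (full rank).
assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(S) == 1

# Statement 4: Ax = 0 has only the trivial solution when A is invertible,
# while the singular matrix S has a nontrivial null vector, e.g. (2, -1).
assert np.allclose(np.linalg.solve(A, np.zeros(2)), np.zeros(2))
assert np.allclose(S @ np.array([2.0, -1.0]), 0)
```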

§2.4 Partitioned Matrices

  • It will often be convenient to regard a matrix as being composed of a number of smaller submatrices. By introducing vertical and horizontal lines into a matrix, we can partition it into blocks.

  • Linear Operations
    • If matrices $A$ and $B$ are the same size and are partitioned in exactly the same way, then it is natural to make the same partition of the ordinary matrix sum $A + B$.
    • $rA$
  • Multiplication of Partitioned Matrices.
    • Partitioned matrices can be multiplied by the usual row–column rule as if the block entries were scalars, provided that for a product $AB$, the column partition of $A$ matches the row partition of $B$.

    • Column–Row Expansion of $AB$. If $A$ is $m \times n$ and $B$ is $n \times p$, then \(\begin{aligned} AB &= \begin{bmatrix} \operatorname{col}_1(A) & \operatorname{col}_2(A) & \cdots & \operatorname{col}_n(A) \end{bmatrix} \begin{bmatrix} \operatorname{row}_1(B) \\ \operatorname{row}_2(B) \\ \vdots \\ \operatorname{row}_n(B) \end{bmatrix} \\[4pt] &= \operatorname{col}_1(A)\operatorname{row}_1(B) + \cdots + \operatorname{col}_n(A)\operatorname{row}_n(B) \end{aligned}\)

  • Inverses of Partitioned Matrices
    • Block Elimination: row reduction of partitioned matrix.
    • A block upper triangular matrix is invertible if and only if each block on the diagonal is invertible.
    • A block diagonal matrix is invertible if and only if each block on the diagonal is invertible, and \(A = \begin{bmatrix} A_{11} & O & O & O\\ O & A_{22} & O & O\\ O & O & \ddots & O\\ O & O & O & A_{nn} \end{bmatrix} \Longrightarrow A^{-1} = \begin{bmatrix} A^{-1}_{11} & O & O & O\\ O & A^{-1}_{22} & O & O\\ O & O & \ddots & O\\ O & O & O & A^{-1}_{nn} \end{bmatrix}\)
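Both the column–row expansion and the block-diagonal inverse can be verified numerically. In this sketch (blocks chosen arbitrarily for the example), the product $AB$ is rebuilt as a sum of outer products, and a block diagonal matrix is inverted block by block:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])          # 2x3
B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])               # 3x2

# Column-row expansion: AB = col_1(A) row_1(B) + ... + col_n(A) row_n(B),
# a sum of n rank-one (outer product) matrices.
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.allclose(outer_sum, A @ B)

# Block diagonal inverse: invert each diagonal block separately.
A11 = np.array([[2.0, 1.0],
                [1.0, 3.0]])
A22 = np.array([[4.0]])
D = np.block([[A11, np.zeros((2, 1))],
              [np.zeros((1, 2)), A22]])
D_inv = np.block([[np.linalg.inv(A11), np.zeros((2, 1))],
                  [np.zeros((1, 2)), np.linalg.inv(A22)]])
assert np.allclose(D_inv, np.linalg.inv(D))
```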