1. What is an invertible matrix?
A square matrix $A$ is invertible (also called non-singular) if there exists another matrix $A^{-1}$ such that
$$A A^{-1} = A^{-1} A = I,$$
where $I$ is the identity matrix.
If such an inverse exists, we can "undo" the effect of multiplying by $A$, just like dividing by a nonzero number.
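A minimal sketch in NumPy (the matrix here is illustrative) showing that multiplying by the inverse recovers the identity from both sides:

```python
import numpy as np

# An illustrative invertible 2x2 matrix.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

# Both products give the identity: the inverse "undoes" A.
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```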
2. When is a matrix invertible?
Key conditions:
- Square: $A$ must be $n \times n$.
- Full rank: $\operatorname{rank}(A)$ must be $n$ (its columns/rows are linearly independent).
- Determinant nonzero: $\det(A) \neq 0$.
- Eigenvalues nonzero: none of the eigenvalues of $A$ are 0.
All of these are equivalent ways of saying the same thing.
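A quick sketch (with an illustrative matrix) checking all four conditions at once; since they are equivalent, they should all agree:

```python
import numpy as np

# Illustrative matrix; det = 1*4 - 2*3 = -2, so it is invertible.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

square = A.shape[0] == A.shape[1]
full_rank = np.linalg.matrix_rank(A) == A.shape[0]
det_nonzero = not np.isclose(np.linalg.det(A), 0.0)
eig_nonzero = not np.any(np.isclose(np.linalg.eigvals(A), 0.0))

# The four equivalent conditions agree.
print(square, full_rank, det_nonzero, eig_nonzero)  # True True True True
```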
3. How to check if a matrix is invertible
Practical ways:
- Determinant test:
  If $\det(A) \neq 0$, $A$ is invertible.
  (Quick for small matrices, but numerically unstable for large ones.)
- Rank test:
  If $\operatorname{rank}(A) = n$, it's invertible.
  (Used in practice; the rank can be computed with Gaussian elimination or the SVD.)
- Row reduction (Gaussian elimination):
  If you can reduce $A$ to the identity without hitting a row of zeros, it's invertible.
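These checks can be sketched in NumPy on a deliberately singular matrix (the example matrix is illustrative; its second row is twice the first):

```python
import numpy as np

# A singular 2x2 matrix: row 2 = 2 * row 1.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Determinant test: det(S) is 0, so S is not invertible.
print(np.isclose(np.linalg.det(S), 0.0))        # True

# Rank test: rank 1 < n = 2, so S is rank-deficient.
print(np.linalg.matrix_rank(S) < S.shape[0])    # True

# Elimination-based inversion fails outright on singular input.
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("singular")
```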
4. Why does it matter in regression?
In linear regression, the matrix we want to invert is $X^\top X$, which appears in the normal-equations solution:
$$\hat{\beta} = (X^\top X)^{-1} X^\top y.$$
- If the features (columns of $X$) are linearly independent, then $X^\top X$ is invertible.
- If some features are redundant (collinear), then $X^\top X$ is singular, and we cannot invert it; we use the pseudoinverse instead.
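A sketch of the collinear case, using synthetic data (all names and values here are illustrative): one feature is an exact multiple of another, so $X^\top X$ is singular, but the Moore-Penrose pseudoinverse still produces a fit.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=20)
# Design matrix with a redundant column: column 3 = 2 * column 2.
X = np.column_stack([np.ones(20), x, 2.0 * x])
y = 1.0 + 3.0 * x + rng.normal(scale=0.1, size=20)

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # 2 < 3, so X^T X is singular

# The pseudoinverse gives the minimum-norm least-squares solution instead.
beta = np.linalg.pinv(XtX) @ X.T @ y
print(beta)
```

In practice, `np.linalg.lstsq(X, y)` solves the least-squares problem directly without forming $X^\top X$ at all, which is both more stable and handles the singular case automatically.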