1. What is an invertible matrix?
A square matrix $A$ for which there exists a matrix $A^{-1}$ such that

$$A A^{-1} = A^{-1} A = I.$$

If such an inverse exists, we can “undo” the effect of multiplying by $A$.
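A minimal NumPy sketch of this “undo” idea, using a small $2 \times 2$ matrix chosen here just for illustration:

```python
import numpy as np

# Example matrix (chosen for illustration; any invertible matrix works).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

x = np.array([3.0, -1.0])
y = A @ x            # transform x by A
x_back = A_inv @ y   # "undo" the transform with A^{-1}

print(np.allclose(A @ A_inv, np.eye(2)))  # A A^{-1} = I
print(np.allclose(x_back, x))             # we recovered the original x
```

Multiplying by $A^{-1}$ reverses the transformation, which is exactly what $A A^{-1} = I$ promises.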
2. When is a matrix invertible?
Key conditions:
- Square: must be $n \times n$.
- Full rank: $\operatorname{rank}(A) = n$ (its columns/rows are linearly independent).
- Determinant nonzero: $\det(A) \neq 0$.
- Eigenvalues nonzero: none of the eigenvalues of $A$ are 0.
All of these are equivalent ways of saying the same thing.
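A quick sketch of that equivalence: for an invertible matrix and a singular one (both made up for this example), the rank, determinant, and eigenvalue tests all give the same verdict.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # invertible
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: second row is 2x the first

results = []
for M in (A, B):
    n = M.shape[0]
    full_rank = np.linalg.matrix_rank(M) == n
    det_nonzero = not np.isclose(np.linalg.det(M), 0.0)
    eigs_nonzero = not np.any(np.isclose(np.linalg.eigvals(M), 0.0))
    results.append((full_rank, det_nonzero, eigs_nonzero))

print(results)  # all three tests agree for each matrix
```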
3. How to check if a matrix is invertible
Practical ways:
- Determinant test: if $\det(A) \neq 0$, $A$ is invertible. (Quick for small matrices, but numerically unstable for large ones.)
- Rank test: if $\operatorname{rank}(A) = n$, it’s invertible. (Used in practice; can be computed with Gaussian elimination or SVD.)
- Row reduction (Gaussian elimination): if you can reduce $A$ to the identity without hitting a row of zeros, it’s invertible.
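The row-reduction test can be sketched directly. This is a minimal Gaussian elimination with partial pivoting (the function name and tolerance are my own choices, not from the original): if every pivot is numerically nonzero, the matrix reduces to the identity and is invertible.

```python
import numpy as np

def is_invertible_by_elimination(A, tol=1e-12):
    """Sketch: Gauss-Jordan elimination with partial pivoting.
    Returns True if A reduces to the identity (all pivots nonzero)."""
    M = np.array(A, dtype=float)
    n = M.shape[0]
    for col in range(n):
        # Pick the largest remaining entry in this column as the pivot.
        pivot_row = col + np.argmax(np.abs(M[col:, col]))
        if abs(M[pivot_row, col]) < tol:
            return False  # hit a (near-)zero pivot: singular
        M[[col, pivot_row]] = M[[pivot_row, col]]  # swap rows
        M[col] /= M[col, col]                      # normalize pivot row
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]         # eliminate the column
    return True

print(is_invertible_by_elimination([[2, 1], [1, 1]]))  # True
print(is_invertible_by_elimination([[1, 2], [2, 4]]))  # False
```

In practice you would rely on library routines (e.g. `np.linalg.matrix_rank` or an SVD) rather than hand-rolled elimination, but the logic is the same.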
4. Why does it matter in regression?
In linear regression, the matrix we want to invert is $X^\top X$, from the normal equations $\hat{\beta} = (X^\top X)^{-1} X^\top y$:

- If the features (columns of $X$) are linearly independent, then $X^\top X$ is invertible.
- If some features are redundant (collinear), then $X^\top X$ is singular, and we cannot invert it → we use the pseudoinverse instead.
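A sketch of the collinear case, with made-up data: one feature is an exact multiple of another, so $X^\top X$ is singular and inverting it fails, but the pseudoinverse still yields the (minimum-norm) least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 2 * x1  # perfectly collinear with x1 (redundant feature)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 3.0 * x1 + rng.normal(scale=0.1, size=n)

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # 2 < 3: X^T X is singular
# np.linalg.inv(XtX) would fail or produce garbage here.

# The pseudoinverse gives the minimum-norm least-squares solution,
# the same one np.linalg.lstsq computes via SVD.
beta_pinv = np.linalg.pinv(X) @ y
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_pinv, beta_lstsq))
```

Dropping the redundant column (or regularizing) is the usual practical fix; the pseudoinverse is the clean mathematical fallback.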