On Least Squares Inversion

A problem of importance that we will see appear often in optimal estimation is the concept of least-squares inversion. Here we will look at the basic idea behind the least-squares estimation hoopla. Consider a system of $m$ linear equations in $n$ unknowns:
\[ Ax = b. \]
Our goal is to look at two specific situations of interest:

1. $\operatorname{rank}(A) = m$, i.e., $A$ is of full row-rank.
2. $\operatorname{rank}(A) = n$, i.e., $A$ is of full column-rank.

Let us first look at the case where the system has full row-rank. Since $\operatorname{rank}(A) = m$, we have $b \in \operatorname{range}(A)$ and solutions exist; but with $n > m$ unknowns the null space of $A$ is nontrivial, so the rectangular system described above does not have a unique solution. If the matrix $A$ is of full row-rank then the matrix $AA^H$ is self-adjoint and furthermore:
\[ \operatorname{rank}(AA^H) = \operatorname{rank}(A) = m. \]
Upon making the substitution $x = A^H y$ we obtain a modified system of linear equations:
\[ AA^H y = b. \]
Since the matrix $AA^H$ is invertible we can obtain the solution to the modified system as:
\[ y = (AA^H)^{-1} b. \]
Substituting this solution back into the original system we obtain:
\[ x_{\mathrm{ls}} = A^H y = A^H (AA^H)^{-1} b = A^{\dagger_r} b. \]
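As a quick numerical sanity check, here is a minimal sketch (the random test matrices and variable names are our own assumptions, not part of the notes): it computes $x_{\mathrm{ls}} = A^H (AA^H)^{-1} b$ for an underdetermined system and confirms it both solves the system and agrees with the pseudoinverse solution.

```python
import numpy as np

# Hypothetical data: a wide system with m = 3 equations, n = 5 unknowns,
# so a random A has full row-rank almost surely.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
b = rng.standard_normal(3)

# x_ls = A^H (A A^H)^{-1} b, using solve() rather than an explicit inverse.
y = np.linalg.solve(A @ A.conj().T, b)
x_ls = A.conj().T @ y

print(np.allclose(A @ x_ls, b))                  # x_ls solves Ax = b exactly
print(np.allclose(x_ls, np.linalg.pinv(A) @ b))  # agrees with the pseudoinverse solution
```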
The matrix $A^{\dagger_r}$ is referred to as the least-squares generalized right inverse. Note that this is a true right inverse because:
\[ A A^{\dagger_r} = AA^H (AA^H)^{-1} = I. \]
Note further that the following relation holds:
\[ A A^{\dagger_r} A = A. \]
Moreover, the matrix $P = A^{\dagger_r} A$ is idempotent since:
\[ P^2 = \left[ A^H (AA^H)^{-1} A \right] \left[ A^H (AA^H)^{-1} A \right] = A^H (AA^H)^{-1} A = P. \]
This relation implies that the eigenvalues of $P$ are either zero or one: if $Pv = \lambda v$, then
\[ P^2 v = P v \;\Longrightarrow\; \lambda^2 v = \lambda v. \]
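These identities are easy to verify numerically. The sketch below (again with a hypothetical random full row-rank $A$) checks that $A^{\dagger_r}$ is a true right inverse, that $A A^{\dagger_r} A = A$, and that $P$ is idempotent with eigenvalues in $\{0, 1\}$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))  # full row-rank almost surely

Adag_r = A.conj().T @ np.linalg.inv(A @ A.conj().T)  # A^{dagger_r} = A^H (A A^H)^{-1}
P = Adag_r @ A                                       # projector onto the row space of A

print(np.allclose(A @ Adag_r, np.eye(3)))   # true right inverse: A A^{dagger_r} = I
print(np.allclose(A @ Adag_r @ A, A))       # A A^{dagger_r} A = A
print(np.allclose(P @ P, P))                # idempotent: P^2 = P
print(np.sort(np.linalg.eigvals(P).real))   # eigenvalues are (numerically) 0 or 1
```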
The matrix $P$ is in particular the projection matrix that projects vectors onto the row space of the matrix $A$. Further note that this projection matrix is a self-adjoint projection matrix, i.e.,
\[ P^H = \left( A^H (AA^H)^{-1} A \right)^H = A^H (AA^H)^{-H} A = A^H \left( (AA^H)^H \right)^{-1} A = A^H (AA^H)^{-1} A = P. \]
The projection matrix onto the orthogonal complement is given by:
\[ \tilde{P} = I - P = I - A^{\dagger_r} A = I - A^H (AA^H)^{-1} A. \]
This constitutes the projection onto the null space of the matrix $A$ because for any $b \in \mathbb{R}^n$:
\[ A \tilde{P} b = A (I - A^{\dagger_r} A) b = (A - A A^{\dagger_r} A) b = 0. \]
The error in the least-squares solution is evaluated as:
\[ e_1 = \tilde{P} b = b - A^{\dagger_r} A b. \]
The energy in the least-squares error is therefore given by:
\[ \epsilon_1^2 = \| e_1 \|_2^2 = b^H \tilde{P}^H \tilde{P} b = b^H \tilde{P} b, \]
where the last equality holds because $\tilde{P}$ is a self-adjoint idempotent matrix.
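A short sketch of these two facts, under the same random-matrix assumption as before (a generic vector $v$ stands in for the projected vector): it verifies that $\tilde{P}$ maps into the null space of $A$, is self-adjoint, and that the error energy collapses to $v^H \tilde{P} v$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 5))  # full row-rank almost surely

P = A.conj().T @ np.linalg.solve(A @ A.conj().T, A)  # P = A^H (A A^H)^{-1} A
P_tilde = np.eye(5) - P                              # projector onto null(A)

v = rng.standard_normal(5)
e1 = P_tilde @ v
print(np.allclose(A @ e1, 0))                  # P_tilde v lies in the null space of A
print(np.allclose(P_tilde, P_tilde.conj().T))  # P_tilde is self-adjoint
print(np.isclose(e1 @ e1, v @ P_tilde @ v))    # ||e1||^2 = v^H P_tilde v
```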
When the matrix $A$ has full column-rank then the following observations hold:

• The matrix $A^H A$ is self-adjoint.
• The matrix $A^H A$ is of full rank, i.e., $\operatorname{rank}(A) = \operatorname{rank}(A^H A) = n$.
Multiplying both sides of the original equation system by $A^H$ yields:
\[ A^H A x_{\mathrm{ls}} = A^H b. \]
The solution to this modified system is given by:
\[ x_{\mathrm{ls}} = (A^H A)^{-1} A^H b = A^{\dagger_l} b. \]
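A minimal sketch of the full column-rank case, assuming a random tall $A$: it solves the normal equations directly and cross-checks the result against NumPy's own least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((7, 3))  # tall system: m = 7 equations, n = 3 unknowns
b = rng.standard_normal(7)

# Normal-equations solution: x_ls = (A^H A)^{-1} A^H b.
x_ls = np.linalg.solve(A.conj().T @ A, A.conj().T @ b)

# Cross-check against the library least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_ls, x_ref))  # both give the same minimizer of ||Ax - b||
```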
The matrix $A^{\dagger_l}$ is referred to as the least-squares generalized left-inverse because:
\[ A^{\dagger_l} A = (A^H A)^{-1} A^H A = I, \qquad A A^{\dagger_l} A = A. \]
The projection matrix $P_r$ onto the range space of the matrix $A$ is given by:
\[ P_r = A A^{\dagger_l} = A (A^H A)^{-1} A^H. \]
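The following sketch (hypothetical random tall $A$, as above) verifies the left-inverse identity and the defining properties of the range-space projector $P_r$.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((7, 3))  # full column-rank almost surely

Adag_l = np.linalg.inv(A.conj().T @ A) @ A.conj().T  # A^{dagger_l} = (A^H A)^{-1} A^H
P_r = A @ Adag_l                                     # projector onto range(A)

print(np.allclose(Adag_l @ A, np.eye(3)))  # true left inverse: A^{dagger_l} A = I
print(np.allclose(P_r @ P_r, P_r))         # idempotent
print(np.allclose(P_r @ A, A))             # fixes every vector in range(A)
```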
The corresponding projection operator onto the left-null space of the matrix $A$ is given by:
\[ \tilde{P}_{\mathrm{ln}} = I - A A^{\dagger_l} = I - A (A^H A)^{-1} A^H. \]
Furthermore, this is indeed a projection onto the left-null space because for any $b \in \mathbb{R}^m$:
\[ b^H \tilde{P}_{\mathrm{ln}} A = b^H \left( I - A (A^H A)^{-1} A^H \right) A = b^H (A - A) = 0. \]
The error in the corresponding least-squares estimate is given via:
\[ e_2 = \tilde{P}_{\mathrm{ln}} b, \qquad \epsilon_2^2 = \| e_2 \|_2^2 = b^H \tilde{P}_{\mathrm{ln}}^H \tilde{P}_{\mathrm{ln}} b. \]
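Finally, a short sketch (same hypothetical random data) confirming that $e_2 = \tilde{P}_{\mathrm{ln}} b$ is exactly the least-squares residual $b - A x_{\mathrm{ls}}$, and that the error energy simplifies to $b^H \tilde{P}_{\mathrm{ln}} b$ since $\tilde{P}_{\mathrm{ln}}$ is a self-adjoint projector.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((7, 3))  # full column-rank almost surely
b = rng.standard_normal(7)

x_ls = np.linalg.solve(A.conj().T @ A, A.conj().T @ b)
P_ln = np.eye(7) - A @ np.linalg.solve(A.conj().T @ A, A.conj().T)

e2 = P_ln @ b
print(np.allclose(e2, b - A @ x_ls))      # e2 is exactly the least-squares residual
print(np.isclose(e2 @ e2, b @ P_ln @ b))  # epsilon_2^2 = b^H P_ln b
```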