Orthogonal Decompositions¶

Orthogonal transformations¶
Let $Q\in\mathbb{R}^{n\times n}$. $Q$ is called orthogonal if
$$Q^TQ=I.$$

Hence, the columns of $Q$ are pairwise orthogonal and have unit norm, i.e. they are orthonormal; since $Q$ is square, $QQ^T=I$ holds as well, so the same is true for the rows.

Orthogonal transformations are angle and norm preserving. Consider the inner product $\langle x,y\rangle:=x^Ty$ of two vectors $x$ and $y$. We have
$$ \langle Qx, Qy\rangle = x^TQ^TQy = x^Ty =\langle x, y\rangle. $$
In particular $$ \|Qx\|_2^2 = \langle Qx,Qx\rangle = \|x\|_2^2. $$
Designing algorithms using orthogonal transformations is a stable process since norms and angles are preserved, so rounding errors are not amplified.
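As a quick numerical check (a minimal sketch in NumPy; generating an orthogonal $Q$ as the Q-factor of a random square matrix is our own choice, not part of the text above):

In [ ]:
import numpy as np

rng = np.random.default_rng(0)

# An orthogonal matrix: the Q-factor of a random square matrix.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

x = rng.standard_normal(5)
y = rng.standard_normal(5)

# <Qx, Qy> = <x, y> and ||Qx||_2 = ||x||_2 up to rounding error.
print(abs((Q @ x) @ (Q @ y) - x @ y))
print(abs(np.linalg.norm(Q @ x) - np.linalg.norm(x)))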

The QR factorization¶
Consider a matrix $A\in\mathbb{R}^{m\times n}$, $m\geq n$. We want to orthogonalize the columns of $A$. The following algorithm is called the Gram-Schmidt process.
Let $\tilde{q}_1:=a_1$ be the first column of $A$. Define $q_1:=\tilde{q}_1/\|\tilde{q}_1\|_2$. We now proceed as follows:
\begin{eqnarray} \tilde{q}_2 &=& a_2 - \langle a_2,q_1\rangle q_1,\quad q_2 = \tilde{q}_2/\|\tilde{q}_2\|_2\nonumber\\ \tilde{q}_3 &=& a_3 - \langle a_3,q_2\rangle q_2-\langle a_3,q_1\rangle q_1, \quad q_3 = \tilde{q}_3/\|\tilde{q}_3\|_2\nonumber\\ &\vdots&\nonumber\\ \tilde{q}_k &=& a_k - \sum_{j=1}^{k-1} \langle a_k,q_j\rangle q_j, \quad q_k = \tilde{q}_k/\|\tilde{q}_k\|_2. \end{eqnarray}

Rearranging the above equations we arrive at the QR factorization $$ A = QR $$ with $Q\in\mathbb{R}^{m\times n}$ and $R\in\mathbb{R}^{n\times n}$. The columns of $Q$ are the vectors $q_k$, and $R$ is an upper triangular matrix with $r_{ij} = \langle a_j,q_i\rangle$ for $j \gt i$ and $r_{ii} = \|\tilde{q}_i\|_2$.
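The construction above translates directly into code. The following is a minimal sketch of classical Gram-Schmidt (the function name gram_schmidt and the use of NumPy are our own choices):

In [ ]:
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: factor A (m x n, m >= n) as A = QR."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        # Coefficients r_{jk} = <a_k, q_j> for j < k.
        R[:k, k] = Q[:, :k].T @ A[:, k]
        # q~_k = a_k - sum_{j<k} <a_k, q_j> q_j.
        q_tilde = A[:, k] - Q[:, :k] @ R[:k, k]
        R[k, k] = np.linalg.norm(q_tilde)
        Q[:, k] = q_tilde / R[k, k]
    return Q, R

A = np.random.rand(6, 4)
Q, R = gram_schmidt(A)
print(np.linalg.norm(A - Q @ R))  # close to machine precision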
In floating point arithmetic the classical Gram-Schmidt process is not always stable: the computed columns of $Q$ can lose orthogonality. A stable way to compute the factors $Q$ and $R$ can be implemented using so-called Householder reflectors.
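To see the instability concretely, one can compare the loss of orthogonality $\|Q^TQ-I\|_2$ of classical Gram-Schmidt with that of a Householder-based QR on an ill-conditioned matrix. A sketch, reusing the gram_schmidt function from above (np.linalg.qr computes a Householder-based QR via LAPACK; the Hilbert matrix is our own choice of test problem):

In [ ]:
import numpy as np
from scipy.linalg import hilbert

A = hilbert(10)                 # notoriously ill-conditioned
Q_gs, _ = gram_schmidt(A)       # classical Gram-Schmidt from above
Q_hh, _ = np.linalg.qr(A)       # Householder-based QR

I = np.eye(10)
print(np.linalg.norm(Q_gs.T @ Q_gs - I))  # large: orthogonality is lost
print(np.linalg.norm(Q_hh.T @ Q_hh - I))  # near machine precision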

Theorem [QR Factorization]
Let $A\in\mathbb{R}^{m\times n}$, $m\geq n$, have full rank $n$. Then the QR factorization $A=QR$ with $Q\in\mathbb{R}^{m\times n}$, $Q^TQ=I$, $R\in\mathbb{R}^{n \times n}$ upper triangular exists and is unique up to the choice of signs of the diagonal elements $r_{ii}$.

We can use the $QR$ factorization to solve linear systems of equations. Let $Ax=b$ with invertible $A\in\mathbb{R}^{n\times n}$, and let $A=QR$. Multiplying both sides by $Q^T$ we obtain
$$ Rx = Q^Tb. $$
Now solve $Rx = Q^Tb$ via backward substitution.
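A minimal sketch of this solver in NumPy/SciPy (scipy.linalg.solve_triangular performs the backward substitution; the random test system is our own example):

In [ ]:
import numpy as np
from scipy.linalg import solve_triangular

A = np.random.rand(5, 5)
b = np.random.rand(5)

Q, R = np.linalg.qr(A)
# Backward substitution for the upper triangular system Rx = Q^T b.
x = solve_triangular(R, Q.T @ b, lower=False)

print(np.linalg.norm(A @ x - b))  # residual close to 0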

Using the $QR$ decomposition computed via Householder reflectors, it can be shown that the above procedure is backward stable. $LU$ decomposition is not backward stable in general. Why don't we always use $QR$? There are two main arguments:
1.) Computing the QR decomposition is about twice as costly as computing the $LU$ decomposition (see the flop counts below). Stability here comes with a price!
2.) The cases where $LU$ with pivoting is unstable are so rare that in practice we do not worry about this too much.
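The factor of two can be made explicit with the standard flop counts for a square $n\times n$ matrix (well-known counts, not derived in this notebook): $$ \text{Householder QR: } \tfrac{4}{3}n^3 + O(n^2), \qquad \text{LU with pivoting: } \tfrac{2}{3}n^3 + O(n^2). $$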

Reduced and full QR decomposition¶

The definition of the QR decomposition we have given above is also called the reduced QR decomposition. One can also define a full QR decomposition as $$ A = \begin{bmatrix}\tilde{Q} & V\end{bmatrix}\begin{bmatrix}\tilde{R}\\ 0\end{bmatrix}=: QR. $$

The columns of the matrix $V$ are orthogonal to the columns of $\tilde{Q}$ and span the orthogonal complement of the range of $A$. The matrices $\tilde{Q}$ and $\tilde{R}$ form the reduced QR decomposition as defined above, that is $A=\tilde{Q}\tilde{R}$. But now ${Q}\in\mathbb{R}^{m\times m}$ and ${R}\in\mathbb{R}^{m\times n}$. Switching between the full and the reduced QR decomposition is often a technicality. For computations the reduced QR decomposition is usually preferred, but there are cases where the full QR decomposition is needed, for example when an orthonormal basis of the orthogonal complement of the range of $A$ is required.
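In NumPy both variants are available through the mode argument of np.linalg.qr; a short sketch:

In [ ]:
import numpy as np

A = np.random.rand(6, 4)

# Reduced QR: Q is 6 x 4, R is 4 x 4.
Q_red, R_red = np.linalg.qr(A, mode='reduced')
print(Q_red.shape, R_red.shape)

# Full QR: Q is 6 x 6 square orthogonal, R is 6 x 4 with zero bottom rows.
Q_full, R_full = np.linalg.qr(A, mode='complete')
print(Q_full.shape, R_full.shape)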