
김영길 / Linear Algebra / Factorization of Covariance Matrix for Random Process

Covariance matrix

The covariance matrix $K_{z}$ of a random vector $z(u)$ is given by
$$
\begin{aligned}
K_{z} &= E\{[z(u) - m_{z}][z(u) - m_{z}]^{\dagger}\} \\
&= E\{z(u)z^{\dagger}(u) - m_{z}z^{\dagger}(u) - z(u)m_{z}^{\dagger} + m_{z}m_{z}^{\dagger}\} \\
&= E\{z(u)z^{\dagger}(u)\} - m_{z}E\{z^{\dagger}(u)\} - E\{z(u)\}m_{z}^{\dagger} + m_{z}m_{z}^{\dagger} \\
&= R_{z} - m_{z}m_{z}^{\dagger}
\end{aligned}
$$
$m_{z}$ is the mean vector.
$\dagger$ denotes the conjugate transpose (the matrix obtained by taking the complex conjugate of each entry and then transposing).
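As a quick numerical sanity check, the identity $K_{z} = R_{z} - m_{z}m_{z}^{\dagger}$ can be verified on simulated samples. The sketch below is a minimal illustration assuming NumPy; the dimensions, the mean vector, and the mixing matrix `A` are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: estimate m_z, R_z, K_z from samples of a complex random vector
# and verify K_z = R_z - m_z m_z^dagger.
n, num_samples = 3, 1000
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
mean = np.array([1.0 + 2.0j, -0.5j, 0.3])
z = mean + (rng.standard_normal((num_samples, n)) + 1j * rng.standard_normal((num_samples, n))) @ A.T

m_z = z.mean(axis=0)                        # mean vector m_z
R_z = z.T @ z.conj() / num_samples          # correlation matrix R_z = E{ z z^dagger }
z0 = z - m_z
K_z = z0.T @ z0.conj() / num_samples        # covariance matrix K_z = E{ (z - m_z)(z - m_z)^dagger }

print(np.allclose(K_z, R_z - np.outer(m_z, m_z.conj())))   # True
```

The identity holds exactly for sample moments as well, so the check passes regardless of the number of samples.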

Pseudo-correlation, Pseudo-covariance matrix

Pseudo-correlation
$$\tilde{R}_{z} = E\{z(u)z^{t}(u)\}$$
Pseudo-covariance
$$\tilde{K}_{z} = E\{[z(u) - m_{z}][z(u) - m_{z}]^{t}\}$$
For an arbitrary vector $a \in \mathbb{C}^{n}$,
$$a^{\dagger}K_{z}a \geq 0$$
Because of this property, the covariance matrix $K_{z}$ is said to be non-negative definite. The same holds for the correlation matrix $R_{z}$.
(Non-negative definite is also called positive semi-definite.)
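A minimal sketch of this property, assuming NumPy and an arbitrary randomly generated data set: the sample covariance matrix has non-negative eigenvalues, and the quadratic form $a^{\dagger}K_{z}a$ is non-negative for any test vector $a$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: a sample covariance matrix is Hermitian and non-negative definite.
n, num_samples = 4, 1000
z = rng.standard_normal((num_samples, n)) + 1j * rng.standard_normal((num_samples, n))
z = z @ (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))   # correlate the components

z0 = z - z.mean(axis=0)
K_z = z0.T @ z0.conj() / num_samples        # sample covariance matrix

print(np.all(np.linalg.eigvalsh(K_z) >= -1e-10))            # eigenvalues non-negative (up to rounding)

a = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # arbitrary complex test vector
print((a.conj() @ K_z @ a).real >= 0)                       # quadratic form a^dagger K_z a >= 0
```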

Theorem

The correlation function is non-negative definite.
Proof)
$$w(u) \triangleq \sum_{j=1}^{n} a_{j} z(u, t_{j})$$
$$
\begin{aligned}
E\{|w(u)|^{2}\} &= E\Big\{\Big|\sum_{j=1}^{n} a_{j} z(u, t_{j})\Big|^{2}\Big\} = E\Big\{\sum_{j=1}^{n}\sum_{k=1}^{n} a_{j} z(u, t_{j})\, a_{k}^{*} z^{*}(u, t_{k})\Big\} \\
&= \sum_{j=1}^{n}\sum_{k=1}^{n} a_{j}\, E\{z(u, t_{j})\, z^{*}(u, t_{k})\}\, a_{k}^{*} \\
&= \sum_{j=1}^{n}\sum_{k=1}^{n} a_{j}\, R_{z}(t_{j}, t_{k})\, a_{k}^{*} \geq 0
\end{aligned}
$$
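The double-sum identity in the proof can also be checked numerically. The sketch below samples a hypothetical process at $n$ time points, estimates $R_{z}(t_{j}, t_{k})$, and compares $E\{|w(u)|^{2}\}$ against the quadratic form; the cumulative-sum construction is just an arbitrary way to make the time samples correlated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical example: n time samples of a complex process, arbitrarily correlated.
n, num_samples = 5, 4000
z = rng.standard_normal((num_samples, n)) + 1j * rng.standard_normal((num_samples, n))
z = np.cumsum(z, axis=1)                       # correlate the time points

R = z.T @ z.conj() / num_samples               # R[j, k] estimates R_z(t_j, t_k)

a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = z @ a                                      # w(u) = sum_j a_j z(u, t_j)

lhs = np.mean(np.abs(w) ** 2)                  # E{ |w(u)|^2 }
rhs = (a @ R @ a.conj()).real                  # sum_j sum_k a_j R[j,k] a_k^*
print(np.isclose(lhs, rhs), lhs >= 0)
```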

Linear transformation of random vectors

$$y(u) = \begin{bmatrix} y(u, 1) \\ y(u, 2) \\ \vdots \\ y(u, m) \end{bmatrix}
= \begin{bmatrix} h_{11} & h_{12} & \cdots & h_{1n} \\ h_{21} & h_{22} & \cdots & h_{2n} \\ \vdots & & & \vdots \\ h_{m1} & h_{m2} & \cdots & h_{mn} \end{bmatrix}
\begin{bmatrix} z(u, 1) \\ z(u, 2) \\ \vdots \\ z(u, n) \end{bmatrix} = Hz(u)$$
$y(u)$ is an $m$-dimensional random vector, $z(u)$ is an $n$-dimensional random vector, and $H$ is a linear transformation (an $m \times n$ matrix).
$$E\{y(u, t)\} = E\Big\{\sum_{\tau=1}^{n} h_{t\tau} z(u, \tau)\Big\} = \sum_{\tau=1}^{n} h_{t\tau} E\{z(u, \tau)\} = \sum_{\tau=1}^{n} h_{t\tau} m_{z}(\tau)$$
$$
\begin{aligned}
E\{y(u, t_{1})\, y^{*}(u, t_{2})\} &= E\Big\{\Big[\sum_{\tau_{1}=1}^{n} h_{t_{1}\tau_{1}} z(u, \tau_{1})\Big]\Big[\sum_{\tau_{2}=1}^{n} h_{t_{2}\tau_{2}}^{*} z^{*}(u, \tau_{2})\Big]\Big\} \\
&= \sum_{\tau_{1}=1}^{n}\sum_{\tau_{2}=1}^{n} h_{t_{1}\tau_{1}} h_{t_{2}\tau_{2}}^{*}\, E\{z(u, \tau_{1})\, z^{*}(u, \tau_{2})\} \\
&= \sum_{\tau_{1}=1}^{n}\sum_{\tau_{2}=1}^{n} h_{t_{1}\tau_{1}} h_{t_{2}\tau_{2}}^{*}\, R_{z}(\tau_{1}, \tau_{2})
\end{aligned}
$$
Thus,
$$m_{y} = E\{y(u)\} = E\{Hz(u)\} = HE\{z(u)\} = Hm_{z}$$
$$R_{y} = E\{y(u)\, y^{\dagger}(u)\} = E\{Hz(u)(Hz(u))^{\dagger}\} = E\{Hz(u)\, z^{\dagger}(u)\, H^{\dagger}\} = HE\{z(u)\, z^{\dagger}(u)\}H^{\dagger} = HR_{z}H^{\dagger}$$
$$\tilde{R}_{y} = E\{Hz(u)\, z^{t}(u)\, H^{t}\} = HE\{z(u)\, z^{t}(u)\}H^{t} = H\tilde{R}_{z}H^{t}$$
Centered output vector
$$y_{0}(u) = y(u) - m_{y} = Hz(u) - Hm_{z} = H[z(u) - m_{z}] = Hz_{0}(u)$$
Covariance matrix
$$K_{y} = HK_{z}H^{\dagger}$$
Pseudo-covariance matrix
$$\tilde{K}_{y} = H\tilde{K}_{z}H^{t}$$
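The three matrix relations $m_{y} = Hm_{z}$, $K_{y} = HK_{z}H^{\dagger}$, and $\tilde{K}_{y} = H\tilde{K}_{z}H^{t}$ can be checked directly on samples, and in fact they hold exactly for sample moments too. A minimal sketch assuming NumPy; the helper `moments` and all dimensions are arbitrary choices for this illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical example: push samples of z(u) through y(u) = H z(u) and compare the
# sample moments of y with H m_z, H K_z H^dagger, and H K~_z H^t.
n, m, num_samples = 4, 3, 5000
H = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
z = rng.standard_normal((num_samples, n)) + 1j * 2.0 * rng.standard_normal((num_samples, n))
z += np.array([1.0, 1j, -1.0, 0.5])           # give z a nonzero mean

def moments(x):
    mx = x.mean(axis=0)
    x0 = x - mx
    K = x0.T @ x0.conj() / len(x)             # covariance        E{ x0 x0^dagger }
    Kt = x0.T @ x0 / len(x)                   # pseudo-covariance E{ x0 x0^t }
    return mx, K, Kt

m_z, K_z, Kt_z = moments(z)
m_y, K_y, Kt_y = moments(z @ H.T)             # y(u) = H z(u), stacked row-wise

print(np.allclose(m_y, H @ m_z))
print(np.allclose(K_y, H @ K_z @ H.conj().T))
print(np.allclose(Kt_y, H @ Kt_z @ H.T))
```

The real and imaginary parts of `z` are given different variances on purpose, so the pseudo-covariance is nonzero and the third check is not vacuous.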

Simulation problem

Real white random vector w(u)w(u)
$$m_{w} = 0, \quad K_{w} = \sigma^{2} I$$
Complex white random vector wc(u)w_{c}(u)
$$m_{w_{c}} = 0, \quad K_{w_{c}} = 2\sigma^{2} I, \quad \tilde{K}_{w_{c}} = 0$$
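A common way to realize a complex white vector with these moments is to give the real and imaginary parts independent zero-mean entries of variance $\sigma^{2}$ each, which yields $K_{w_{c}} = 2\sigma^{2}I$ and a vanishing pseudo-covariance. A minimal sketch of that construction, assuming NumPy and an arbitrary $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical example: complex white noise with m = 0, K = 2*sigma^2 I, pseudo-covariance 0.
n, sigma, num_samples = 3, 1.5, 200_000
w_c = sigma * (rng.standard_normal((num_samples, n)) + 1j * rng.standard_normal((num_samples, n)))

K_wc = w_c.T @ w_c.conj() / num_samples        # should approach 2*sigma^2 * I
Kt_wc = w_c.T @ w_c / num_samples              # pseudo-covariance, should approach 0

print(np.allclose(K_wc, 2 * sigma ** 2 * np.eye(n), atol=0.05))
print(np.allclose(Kt_wc, 0, atol=0.05))
```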
Simulation block diagram (diagram image omitted)
$$z(u) = Hw(u) + c$$
$$m_{z} = E\{Hw(u) + c\} = E\{Hw(u)\} + E\{c\} = HE\{w(u)\} + c = c$$
$$z_{0}(u) = z(u) - m_{z} = Hw(u)$$
$$K_{z} = R_{z_{0}} = HH^{\dagger}$$
(taking unit-variance white noise, $\sigma^{2} = 1$, so that $K_{w} = I$)

Covariance Matrix Structure and Factorization

$$Ke = \lambda e, \quad e \neq 0$$
$\lambda$ is an eigenvalue and $e$ is an eigenvector.
The eigenvalues $\lambda$ of a Hermitian matrix $K$ are always real:
$$(e^{\dagger}Ke)^{\dagger} = e^{\dagger}K^{\dagger}e = e^{\dagger}Ke = \lambda|e|^{2}$$
$e^{\dagger}Ke$ equals its own conjugate, and a quantity that is unchanged by conjugation must be real; hence $\lambda$ is real.
The eigenvalues $\lambda$ of a non-negative definite matrix $K$ are always non-negative.
If $K$ is positive definite, then $\lambda$ is strictly positive.
Eigenvectors of a Hermitian matrix $K$ corresponding to distinct eigenvalues are orthogonal: suppose $\lambda_{1} \neq \lambda_{2}$, then
$$\lambda_{1} e_{1}^{\dagger} e_{2} = (Ke_{1})^{\dagger} e_{2} = e_{1}^{\dagger} K^{\dagger} e_{2} = e_{1}^{\dagger} K e_{2} = \lambda_{2} e_{1}^{\dagger} e_{2}$$
Since $\lambda_{1} \neq \lambda_{2}$, it follows that $e_{1}^{\dagger} e_{2} = 0$.
The set of eigenvectors sharing the same eigenvalue (together with the zero vector) forms a subspace of the linear space:
$$K(e_{1} + e_{2}) = Ke_{1} + Ke_{2} = \lambda(e_{1} + e_{2})$$
$$K(ae_{1}) = aKe_{1} = \lambda(ae_{1})$$
Hermitian matrices are always diagonalizable.
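These properties can be observed directly with an eigendecomposition routine for Hermitian matrices. A minimal sketch assuming NumPy; the matrix $K = AA^{\dagger}$ is just an arbitrary Hermitian, non-negative definite example.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical example: random Hermitian non-negative definite matrix K = A A^dagger.
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
K = A @ A.conj().T

lam, E = np.linalg.eigh(K)                      # eigh assumes a Hermitian input
print(lam.dtype)                                # float64: the eigenvalues come back real
print(np.all(lam >= -1e-10))                    # non-negative (up to rounding)
print(np.allclose(E.conj().T @ E, np.eye(n)))   # eigenvectors are orthonormal: E^dagger E = I
print(np.allclose(E @ np.diag(lam) @ E.conj().T, K))   # diagonalization K = E Lambda E^dagger
```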

Unitary matrix

If $E^{\dagger}E = I = EE^{\dagger}$, then $E$ is called a unitary matrix.
$$K = E\Lambda E^{\dagger}$$
Here the columns of $E$ are the orthonormal eigenvectors of $K$, and $\Lambda$ is the diagonal matrix of the corresponding eigenvalues.
Taking the square root of the diagonal matrix $\Lambda$,
$$\Lambda^{1/2} = \begin{bmatrix} \lambda_{1}^{1/2} & & 0 \\ & \ddots & \\ 0 & & \lambda_{n}^{1/2} \end{bmatrix}$$
$$K = E\Lambda^{1/2}\Lambda^{1/2}E^{\dagger} = (E\Lambda^{1/2})(E\Lambda^{1/2})^{\dagger}$$
Other factorizations of $K$ are also possible:
$$K = (E\Lambda^{1/2}U)(E\Lambda^{1/2}U)^{\dagger}$$
where $U$ is any unitary matrix.
Simulation Solution
$$z_{0}(u) = E\Lambda^{1/2}Uw(u)$$
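Putting the pieces together, here is a minimal simulation sketch, assuming NumPy, unit-variance white noise ($\sigma^{2} = 1$), and the simplest choice $U = I$: eigendecompose a target covariance $K$, form $H = E\Lambda^{1/2}$, and generate $z(u) = Hw(u) + c$; the sample mean and sample covariance should then reproduce $c$ and $K$ up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical example: target covariance K (Hermitian, non-negative definite) and mean c.
n, num_samples = 4, 500_000
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
K = A @ A.conj().T
c = np.array([1.0, -1.0j, 0.5, 0.0])

lam, E = np.linalg.eigh(K)
H = E @ np.diag(np.sqrt(np.clip(lam, 0, None)))  # H = E Lambda^{1/2}  (U = I); clip guards rounding

# Unit-variance real white noise w(u): m_w = 0, K_w = I  (sigma^2 = 1).
w = rng.standard_normal((num_samples, n))
z = w @ H.T + c                                  # z(u) = H w(u) + c

z0 = z - z.mean(axis=0)
K_hat = z0.T @ z0.conj() / num_samples
print(np.allclose(z.mean(axis=0), c, atol=0.02))
print(np.allclose(K_hat, K, atol=0.1))
```

Any other unitary $U$ would give a different $H$ with the same $HH^{\dagger} = K$, so the simulated second-order statistics are unchanged.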