
Introduction to Machine Learning (张志华): Principal Component Analysis


Preface

These are study notes for the course taught by that professor at Peking University. The concepts are explained in a plain, easy-to-follow way, which makes it much easier to grasp the basics and, from there, the related techniques.

Basic Concepts

$\exp(-t z^{1/2}) = \int \exp(-tuz)\,dF(u)$
Taking $z = \|x\|^2$ gives
$\exp(-t\|x\|) = \int \exp(-tu\|x\|^2)\,dF(u)$,
i.e. $\exp(-t\|x\|)$ is a mixture of Gaussian kernels $\exp(-tu\|x\|^2)$ and is therefore P.D.
The product of two P.D. kernels is P.D.
The Euclidean distance can be transformed into another (feature) space, where distances are computed via the kernel:
$\|\phi(x) - \phi(y)\|_2^2 = k(x,x) - 2k(x,y) + k(y,y)$
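To make this concrete, here is a minimal numpy sketch (my own illustration, not from the lecture; the function names are made up) that computes the feature-space distance from kernel evaluations alone, using the Laplacian kernel $\exp(-t\|x-y\|)$ shown above to be P.D.:

```python
import numpy as np

# Laplacian kernel k(x, y) = exp(-t * ||x - y||); P.D. by the mixture
# argument above. (Hypothetical helper names, for illustration only.)
def laplacian_kernel(x, y, t=1.0):
    return np.exp(-t * np.linalg.norm(x - y))

# ||phi(x) - phi(y)||_2^2 = k(x, x) - 2 k(x, y) + k(y, y):
# the feature-space distance needs only kernel evaluations.
def feature_space_sq_dist(x, y, kernel=laplacian_kernel):
    return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
print(feature_space_sq_dist(x, y))  # ~1.79 for t = 1
```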
Part 2: Unsupervised Learning
It begins with dimensionality reduction.

PCA (Principal Component Analysis)

Population PCA
Def. If $\overline{x} \in \mathbb{R}^p$ is a random vector with mean $\mu$ and covariance matrix $\Sigma$,
then the PCA transformation is
$\overline{x} \mapsto \overline{y} = U^T(\overline{x} - \mu)$
where $U$ is orthogonal.

Spectral Decomposition

Thm. If $x \sim N(\mu, \Sigma)$, then for $y = U^T(x - \mu)$ with $\Sigma = U \Lambda U^T$:
(1) $y \sim N(0, \Lambda)$, where $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_p)$;
(2) $E(y_i) = 0$;
(3) $\mathrm{Cov}(y_i, y_j) = 0$ for $i \neq j$;
(4) $y$ is an orthogonal transformation of $x$ whose components are uncorrelated;
(5) $\mathrm{Var}(y_i) = \lambda_i$.
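A quick numerical sanity check of the theorem (a sketch of my own with assumed test data, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((3, 3))
Sigma = A @ A.T                       # a valid (PSD) covariance matrix

lam, U = np.linalg.eigh(Sigma)        # spectral decomposition: Sigma = U diag(lam) U^T
X = rng.multivariate_normal(mu, Sigma, size=100_000)
Y = (X - mu) @ U                      # row-wise y = U^T (x - mu)

print(Y.mean(axis=0))                 # ~0            -> (2)
print(np.cov(Y, rowvar=False))        # ~diag(lam)    -> (1), (3), (5)
```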

Sample Principal Component

Let $X = [\overline{x}_1, \ldots, \overline{x}_n]^T$ be an $n \times p$ sample data matrix.

$\overline{x} = \frac{1}{n}\sum_{i=1}^{n} \overline{x}_i$
$S = \frac{1}{n} X^T H X$
where $H = I_n - \frac{1}{n}\mathbf{1}_n\mathbf{1}_n^T$ is the centering matrix.
To reduce the data to $k$ dimensions, keep the first $k$ principal components (the eigenvectors of $S$ with the largest eigenvalues); among all linear projections, this keeps the most information (variance), which is the point of PCA.
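Putting the formulas above together, a minimal sample-PCA sketch (the function name and the random test data are my own, not from the lecture):

```python
import numpy as np

def sample_pca(X, k):
    """Project the rows of an n x p data matrix onto the top-k principal axes."""
    n, p = X.shape
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix H = I - (1/n) 1 1^T
    S = X.T @ H @ X / n                    # sample covariance S = (1/n) X^T H X
    lam, U = np.linalg.eigh(S)             # eigenvalues in ascending order
    order = np.argsort(lam)[::-1][:k]      # indices of the k largest eigenvalues
    Y = (H @ X) @ U[:, order]              # centered data projected to k dimensions
    return Y, lam[order]

X = np.random.default_rng(1).standard_normal((200, 5))
Y, top = sample_pca(X, k=2)
print(Y.shape, top)                        # (200, 2) and the two largest variances
```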

SVD

For the SVD $A = U D V^T$:
$U$ = eigenvectors of $AA^T$
$D$ = $\operatorname{diag}(\sqrt{\lambda_i})$, where $\lambda_i$ are the eigenvalues of $AA^T$ (equivalently of $A^TA$)
$V$ = eigenvectors of $A^TA$
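These relations are easy to verify numerically; a small check of my own using `numpy.linalg.svd` on an arbitrary matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
U, d, Vt = np.linalg.svd(A, full_matrices=False)   # thin SVD: A = U diag(d) V^T

# The squared singular values are the non-zero eigenvalues of AA^T (and A^T A).
print(np.sort(d**2), np.linalg.eigvalsh(A @ A.T)[-3:])

# Columns of U / V are eigenvectors of AA^T / A^T A (up to sign).
print(np.allclose(A @ A.T @ U, U * d**2))
print(np.allclose(A.T @ A @ Vt.T, Vt.T * d**2))
```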

PCO (Principal Coordinate Analysis)

$S = X^T H X$ (the unnormalized sample covariance matrix)
$H$ is idempotent: $HH = H$
$B = H X X^T H$ (the centered Gram matrix)
Since $S = (HX)^T(HX)$ and $B = (HX)(HX)^T$, and since $AB$ and $BA$ always have the same non-zero eigenvalues, $S$ and $B$ share their non-zero eigenvalues.
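A short check of my own that $S$ and $B$ do share their non-zero eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 6, 3
X = rng.standard_normal((n, p))
H = np.eye(n) - np.ones((n, n)) / n   # idempotent centering matrix: H @ H == H

S = X.T @ H @ X                       # p x p, equals (HX)^T (HX)
B = H @ X @ X.T @ H                   # n x n, equals (HX)(HX)^T

print(np.sort(np.linalg.eigvalsh(S))[::-1])      # p non-zero eigenvalues
print(np.sort(np.linalg.eigvalsh(B))[::-1][:p])  # the same values; the rest are ~0
```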
