KERNEL FUNCTIONS
FEATURE MAPS IN RIDGE REGRESSION
minimize the mean squared error
$$ L(w) = \frac {1}{2} \sum_{i=1}^{N_{Tr} }(y_i - w^{T} \phi (x_i))^2 + \frac {\lambda}{2} w^{T} w $$
$$ \phi = \text{feature map} $$
it can be a problem
to depend on it (on the dimensionality of phi)
$$ w^{*} = (\Phi^{T} \Phi + \lambda I_n)^{-1} \Phi^{T} y $$
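A minimal NumPy sketch of this closed-form primal solution; the degree-2 polynomial feature map, the toy 1-D data, and the value of lambda are all illustrative assumptions:

```python
import numpy as np

def phi(X):
    # Hypothetical degree-2 polynomial feature map for 1-D inputs (illustrative).
    return np.column_stack([np.ones(len(X)), X, X**2])

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=20)              # toy training inputs
y = np.sin(3 * X) + 0.1 * rng.normal(size=20)    # toy training targets
lam = 0.1                                        # regularization weight lambda

Phi = phi(X)                                     # N_Tr x (n. features) design matrix
n_feat = Phi.shape[1]

# w* = (Phi^T Phi + lambda I)^{-1} Phi^T y  -- the identity is (n. features) x (n. features)
w_star = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_feat), Phi.T @ y)
print(w_star)
```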
$$e = y - \Phi w $$
$$ L(w, e, \alpha) = \frac{1}{2} e^{T} e + \frac{\lambda}{2} w^{T} w + \alpha^{T} (y - \Phi w - e) $$
DUAL
$$ e^{*} = \alpha $$
$$ L(w^{*}, \alpha, e^{*}) = -\frac{1}{2} \alpha^{T} \alpha - \frac{1}{2 \lambda} \alpha^{T} \Phi \Phi^{T} \alpha + \alpha^{T} y $$
$$ \alpha^{*} = \lambda (K + \lambda I_N)^{-1} y $$
THE NUMBER OF FEATURES NO LONGER MATTERS
$$K=\Phi \Phi^T $$
GRAM MATRIX
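A sketch of the dual route on the same illustrative setup (toy data, hypothetical feature map): K is N_Tr x N_Tr, so its size is set by the number of training points, and recovering the weights through w* = (1/lambda) Phi^T alpha* (the relation used again in the kernel-trick branch below) matches the primal closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=20)              # toy training inputs
y = np.sin(3 * X) + 0.1 * rng.normal(size=20)    # toy training targets
lam = 0.1                                        # regularization weight lambda

# Same hypothetical degree-2 polynomial feature map as in the primal sketch.
Phi = np.column_stack([np.ones(len(X)), X, X**2])

# Gram matrix K = Phi Phi^T is N_Tr x N_Tr: its size depends only on the
# number of training points, not on the number of features.
K = Phi @ Phi.T
N = K.shape[0]

# alpha* = lambda (K + lambda I_N)^{-1} y
alpha_star = lam * np.linalg.solve(K + lam * np.eye(N), y)

# w* = (1/lambda) Phi^T alpha*  agrees with the primal closed-form solution.
w_dual = Phi.T @ alpha_star / lam
w_primal = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print(np.allclose(w_dual, w_primal))             # True
```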
KERNEL TRICK
do not define phi explicitly; define the kernel k directly
$$ w^{*} = \frac{1}{\lambda} \Phi^{T} \alpha $$
Prediction
$$ w^{*} = \Phi^{T} (K + \lambda I_N)^{-1} y $$
$$ y' = (\Phi \phi(x'))^{T} (K + \lambda I_N)^{-1} y $$
$$ v= \Phi \phi(x') $$
$$ \text{vector with components } v_i = \phi(x_i)^{T} \phi(x') = k(x_i, x') $$
$$ \text{we no longer need } \Phi \text{ itself, only the dot products of the } \phi \text{'s} $$
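A sketch of prediction done purely through kernel evaluations, never building phi or Phi; the Gaussian kernel, its bandwidth sigma, and the toy data are illustrative assumptions:

```python
import numpy as np

def k(a, b, sigma=0.5):
    # Gaussian kernel between every point of a and every point of b (1-D inputs).
    return np.exp(-np.subtract.outer(a, b) ** 2 / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=20)              # training inputs
y = np.sin(3 * X) + 0.1 * rng.normal(size=20)    # training targets
lam = 0.1

K = k(X, X)                                      # Gram matrix, N_Tr x N_Tr
c = np.linalg.solve(K + lam * np.eye(len(X)), y) # (K + lambda I_N)^{-1} y

x_new = np.linspace(-1.0, 1.0, 5)                # query points x'
v = k(X, x_new)                                  # v_i = k(x_i, x') for each query
y_new = v.T @ c                                  # y' = v^T (K + lambda I_N)^{-1} y
print(y_new)
```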
KERNEL
must yield a positive semi-definite Gram matrix
Polynomial
$$k^p(a,b) = (a^T b +1)^p $$
Gaussian
$$ k(a,b)= e^{- \frac {|| a-b ||^2} {2 \sigma^2}} $$
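A sketch of these two kernels with an empirical positive semi-definiteness check on random points (the smallest eigenvalue of each Gram matrix should be non-negative up to numerical noise); the degree, bandwidth, and data are illustrative assumptions:

```python
import numpy as np

def k_poly(a, b, p=3):
    # Polynomial kernel: k_p(a, b) = (a^T b + 1)^p
    return (np.dot(a, b) + 1.0) ** p

def k_gauss(a, b, sigma=0.5):
    # Gaussian kernel: k(a, b) = exp(-||a - b||^2 / (2 sigma^2))
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))                     # 10 random points in R^2

for kern in (k_poly, k_gauss):
    K = np.array([[kern(a, b) for b in X] for a in X])   # Gram matrix
    eigvals = np.linalg.eigvalsh(K)
    print(kern.__name__, "min eigenvalue:", eigvals.min())
```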