The integrated mean square error for the conditional KDE.
Parameters
----------
bw : array_like
    The bandwidth values at which the objective is evaluated.

Returns
-------
CV : float
    The value of the cross-validation objective function.

Notes
-----
For more details, see pp. 156-166 in [1]. For details on how to handle mixed variable types, see [3].
The formula for the cross-validation objective function for mixed variable types is:
.. math::

    CV(h, \lambda) = \frac{1}{n}\sum_{l=1}^{n}
        \frac{G_{-l}(X_{l})}{\left[\mu_{-l}(X_{l})\right]^{2}} -
        \frac{2}{n}\sum_{l=1}^{n}\frac{f_{-l}(X_{l}, Y_{l})}{\mu_{-l}(X_{l})}
where
.. math::

    G_{-l}(X_{l}) = n^{-2}\sum_{i\neq l}\sum_{j\neq l}
        K_{X_{i},X_{l}} K_{X_{j},X_{l}} K_{Y_{i},Y_{j}}^{(2)}
where :math:`K_{X_{i},X_{l}}` is the multivariate product kernel, :math:`\mu_{-l}(X_{l})` is the leave-one-out estimator of the pdf, and :math:`K_{Y_{i},Y_{j}}^{(2)}` is the convolution kernel.
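As a rough illustration of how this objective could be evaluated, the sketch below computes the leave-one-out sums for the purely continuous case with Gaussian kernels. The function name ``cv_loo``, the bandwidth ordering, and the exact normalization constants are assumptions made for illustration; they are not the library's internal implementation.

.. code-block:: python

    # Minimal sketch of the CV objective above: continuous y and X only,
    # Gaussian kernels, illustrative normalization constants.
    import numpy as np
    from scipy.stats import norm


    def cv_loo(bw, X, y):
        """Leave-one-out CV objective for a conditional KDE f(y | x).

        bw : bandwidths [h_y, h_x1, ..., h_xk]  (assumed ordering)
        X  : (n, k) array of conditioning variables
        y  : (n,) array of the dependent variable
        """
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        n, k = X.shape
        h_y, h_x = bw[0], np.asarray(bw[1:], dtype=float)

        # Product kernels between all pairs of observations: K_x[i, l] = K_{X_i, X_l}
        u = (X[:, None, :] - X[None, :, :]) / h_x          # (n, n, k)
        K_x = np.prod(norm.pdf(u) / h_x, axis=2)            # (n, n)

        # Ordinary and convolution kernels on y; a Gaussian convolved with
        # itself is a Gaussian with scale sqrt(2) * h_y.
        dy = y[:, None] - y[None, :]
        K_y = norm.pdf(dy, scale=h_y)                        # K_{Y_i, Y_j}
        K_y2 = norm.pdf(dy, scale=np.sqrt(2.0) * h_y)        # K^{(2)}_{Y_i, Y_j}

        cv = 0.0
        for l in range(n):
            keep = np.arange(n) != l                         # leave observation l out
            kx_l = K_x[keep, l]                              # K_{X_i, X_l}, i != l
            mu_l = kx_l.mean()                               # leave-one-out pdf of X at X_l
            f_l = np.mean(kx_l * K_y[keep, l])               # leave-one-out joint pdf at (X_l, Y_l)
            # G_{-l}(X_l) = n^{-2} sum_{i!=l} sum_{j!=l} K_{X_i,X_l} K_{X_j,X_l} K^{(2)}_{Y_i,Y_j}
            G_l = kx_l @ K_y2[np.ix_(keep, keep)] @ kx_l / n ** 2
            cv += G_l / mu_l ** 2 - 2.0 * f_l / mu_l
        return cv / n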
This function is minimized by the ``_cv_ls`` method of the ``GenericKDE`` class, which returns the bw estimates that minimize the distance between the estimated and “true” probability densities.
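In practice the objective is not usually called directly. As a hedged usage sketch on synthetic data, passing ``bw='cv_ls'`` when constructing the conditional KDE asks statsmodels to minimize this least-squares cross-validation criterion over the bandwidths; the variable names below are illustrative.

.. code-block:: python

    import numpy as np
    from statsmodels.nonparametric.kernel_density import KDEMultivariateConditional

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 2.0 * x + rng.normal(size=200)

    # 'cv_ls' selects the bandwidths by minimizing the objective above;
    # both variables are continuous ('c').
    kde = KDEMultivariateConditional(endog=y, exog=x, dep_type='c',
                                     indep_type='c', bw='cv_ls')
    print(kde.bw)                                            # selected bandwidths
    print(kde.pdf(endog_predict=[0.0], exog_predict=[0.0]))  # f(y=0 | x=0)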