Speaker: Maya Gupta
Title: Functional Bregman
Divergence, Bayesian Estimation of Distributions, and Completely Lazy
Classifiers
Abstract:
We generalize Bregman divergences to a functional Bregman divergence
and show that direct Bayesian estimation of a distribution such that
the expected functional Bregman risk is minimized leads to the mean
distribution, generalizing the well-known result that "the mean
minimizes the average squared error." This result, together with some
intuition from multiresolution theory, leads to the first effective
Bayesian quadratic discriminant analysis (QDA) classifier. We propose a
new approach
to reduce the bias of Bayesian QDA that avoids the difficulties of
Gaussian mixture models, and extend the Bayesian framework to create a
completely lazy classifier that has average-performance guarantees and
in practice achieves state-of-the-art classification performance for
high-dimensional problems without requiring the cross-validation of any
parameters.
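
For context, a brief sketch of the objects involved (standard
definitions, stated here for orientation; the talk's exact formulation
may differ). For a strictly convex, differentiable
$\phi : \mathbb{R}^d \to \mathbb{R}$, the ordinary Bregman divergence is
$$ d_\phi(x, y) = \phi(x) - \phi(y) - \nabla \phi(y)^\top (x - y), $$
and squared error is the special case $\phi(x) = \|x\|^2$. The quoted
result "the mean minimizes the average squared error" is the fact that
$\arg\min_c \, E\|X - c\|^2 = E[X]$. A functional Bregman divergence
replaces $\phi$ with a convex functional defined on a space of functions
and the gradient with a Fréchet-style derivative,
$$ d_\phi(f, g) = \phi(f) - \phi(g) - \delta\phi(g; f - g), $$
so that distributions themselves can be compared directly.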