Gauss mixtures are a popular class of models in statistics and statistical signal processing: they provide good fits to smooth densities, they have a rich theory, they yield good results in applications such as classification and image segmentation, and they can be well estimated by existing algorithms such as the EM algorithm. We here use high-rate quantization theory to develop a variation of an information-theoretic extremal property for Gaussian sources and its extension to Gauss mixtures. This extends a method originally used for LPC speech vector quantization to provide a clustering approach to the design of Gauss mixture models. The theory provides formulas relating minimum discrimination information (MDI) selection of the Gaussian components of a Gauss mixture to the mean squared error that results when the MDI criterion is used in an optimized robust classified vector quantizer. It also motivates the use of Gauss mixture models in robust compression systems for random vectors with estimated second-order moments but unknown distributions.
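As a minimal illustration of the MDI selection step described above, the sketch below picks the Gaussian component of a mixture that minimizes the discrimination information (Kullback-Leibler divergence) against a source known only through its estimated second-order moments. The component parameters and moment estimates here are hypothetical placeholders, not values from the paper, and the closed-form Gaussian KL divergence stands in for the general MDI computation.

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL divergence D(N(mu0,S0) || N(mu1,S1))
    between two multivariate Gaussians."""
    d = mu0.shape[0]
    S1inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1inv @ S0)
                  + diff @ S1inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Hypothetical Gauss mixture components: (mean, covariance) pairs.
components = [
    (np.array([0.0, 0.0]), np.eye(2)),
    (np.array([3.0, 3.0]), 0.5 * np.eye(2)),
]

# Source described only by estimated second-order moments
# (mean and covariance), with its distribution otherwise unknown.
mu_hat = np.array([2.8, 3.1])
S_hat = 0.6 * np.eye(2)

# MDI selection: choose the component whose Gaussian model
# minimizes the divergence from the estimated source moments.
divs = [kl_gauss(mu_hat, S_hat, m, S) for m, S in components]
best = int(np.argmin(divs))
print(best)  # index of the MDI-selected component
```

In a classified vector quantizer, the selected index would route the input vector to the codebook designed for that Gaussian component; the paper's formulas relate this selection rule to the resulting mean squared error at high rate.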