Kernel mixture model for probability density estimation in Bayesian classifiers
Estimating a reliable class-conditional probability is the prerequisite for implementing Bayesian classifiers, and how to estimate the probability density functions (PDFs) is also a fundamental problem for other probabilistic induction algorithms. The finite mixture model (FMM) is able to represent arbitrarily complex PDFs by using a mixture of multimodal distributions, but it assumes that the component mixtures follow a given distribution, which may not be satisfied for real-world data.
Journal Title:  Data Mining and Knowledge Discovery, 2018, Vol.32(3), pp.675-707 
Main Author:  Zhang, Wenyu 
Other Authors:  Zhang, Zhenjiang ; Chao, Han-Chieh ; Tseng, Fan-Hsun 
Format:  Electronic Article 
Language:  English 
ID:  ISSN: 1384-5810 ; EISSN: 1573-756X ; DOI: 10.1007/s10618-018-0550-5 
Link:  http://dx.doi.org/10.1007/s10618-018-0550-5 
Staff View
recordid:  springer_jour10.1007/s10618-018-0550-5 
title:  Kernel mixture model for probability density estimation in Bayesian classifiers 
format:  Article 
creator:  Zhang, Wenyu ; Zhang, Zhenjiang ; Chao, Han-Chieh ; Tseng, Fan-Hsun 
ispartof:  Data Mining and Knowledge Discovery, 2018, Vol.32(3), pp.675-707 
description:  Estimating a reliable class-conditional probability is the prerequisite for implementing Bayesian classifiers, and how to estimate the probability density functions (PDFs) is also a fundamental problem for other probabilistic induction algorithms. The finite mixture model (FMM) is able to represent arbitrarily complex PDFs by using a mixture of multimodal distributions, but it assumes that the component mixtures follow a given distribution, which may not be satisfied for real-world data. This paper presents a nonparametric kernel mixture model (KMM) based probability density estimation approach, in which the data sample of a class is assumed to be drawn from several unknown independent hidden subclasses. Unlike traditional FMM schemes, we simply use the k-means clustering algorithm to partition the data sample into several independent components, and the regional density diversities of the components are combined using Bayes' theorem. On the basis of the proposed kernel mixture model, we present a three-step Bayesian classifier, which includes partitioning, structure learning, and PDF estimation. Experimental results show that KMM is able to improve the quality of the estimated PDFs of the conventional kernel density estimation (KDE) method, and also show that KMM-based Bayesian classifiers outperform existing Gaussian-, GMM-, and KDE-based Bayesian classifiers. 
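The abstract describes combining k-means partitioning with per-component kernel density estimates weighted by component priors. A minimal sketch of that idea in NumPy follows; the synthetic data, the plain 1-D k-means, and the Silverman bandwidth rule are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D sample drawn from two hidden subclasses (synthetic, for illustration).
x = np.concatenate([rng.normal(-2.0, 0.5, 150), rng.normal(3.0, 1.0, 150)])

def kmeans_1d(x, k=2, iters=50):
    """Plain k-means on a 1-D sample; returns per-point cluster labels."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread-out init
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels

def gaussian_kde(sample, grid):
    """Gaussian KDE with Silverman's rule-of-thumb bandwidth."""
    h = 1.06 * sample.std() * len(sample) ** (-1 / 5)
    z = (grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

# Step 1: partition the class sample into independent components.
labels = kmeans_1d(x, k=2)
grid = np.linspace(-6.0, 8.0, 1000)

# Step 2: mix the component KDEs, weighting each by its sample fraction
# (the prior P(component), so p(x) = sum_j P(j) * p(x | j)).
density = sum(
    (labels == j).mean() * gaussian_kde(x[labels == j], grid)
    for j in range(2)
)
```

Because each component KDE integrates to one and the weights sum to one, `density` remains a valid PDF while adapting its bandwidth to each region of the sample.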
language:  eng 
identifier:  ISSN: 1384-5810 ; EISSN: 1573-756X ; DOI: 10.1007/s10618-018-0550-5 
fulltext:  fulltext 
issn:  1384-5810 
url:  http://dx.doi.org/10.1007/s10618-018-0550-5 