Show simple item record

Field / Value / Language
dc.contributor.author: Yu, Weichang
dc.date.accessioned: 2019-10-24
dc.date.available: 2019-10-24
dc.date.issued: 2019-08-23
dc.identifier.uri: https://hdl.handle.net/2123/21262
dc.description.abstract: Discriminant analysis (DA) based classifiers have several advantages over alternative classifiers. However, the models are built under strong assumptions, and in high-dimensional settings they face major computational obstacles and reduced classification accuracy. A Bayesian treatment of DA has the potential to resolve these problems but is commonly hindered by poor choices of priors and intractable distributions. In Chapter 2, we consider appropriate choices of priors for hypothesis testing and several paradoxes that threaten the validity of the resulting decisions. We propose a class of priors, known as cake priors, that can be made arbitrarily diffuse and yet avoid some of these drawbacks. This proposed class of priors can be constructed from a general recipe. We derive a general expression for the Bayesian test statistic arising from cake priors and show how it achieves a property called Chernoff consistency for a pair of hypotheses. In Chapters 3 and 4 we assign cake priors to the Gaussian parameters of the naive Bayes linear DA and naive Bayes quadratic DA classifiers. We fit these models using a novel variant of the variational Bayes algorithm and call the resulting classifiers VLDA and VQDA, respectively. These proposed models perform both variable selection and classification in a unified framework to improve classification performance. In Chapter 5 we propose variational nonparametric discriminant analysis (VNPDA), which is compatible with a diverse range of continuous distributions among the features. This version of DA fuses variable selection and classification in a similar manner to VLDA and VQDA, but treats the group-conditional distributions as unknown and assigns them Polya tree priors. In Chapter 6, we conclude with a summary, limitations of our proposed work, and future directions. (en_US)
dc.rights: The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission. (en_AU)
dc.subject: Bayesian hypothesis testing (en_US)
dc.subject: discriminant analysis (en_US)
dc.title: Bayesian Variable Selection for High-Dimension Discriminant Analysis (en_US)
dc.type: Thesis (en_AU)
dc.type.thesis: Doctor of Philosophy (en_AU)
usyd.faculty: Faculty of Science, School of Mathematics and Statistics (en_AU)
usyd.degree: Doctor of Philosophy Ph.D. (en_AU)
usyd.awardinginst: The University of Sydney (en_AU)
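The VLDA classifier described in the abstract builds Bayesian variable selection on top of naive Bayes linear DA, i.e. Gaussian class-conditional densities with a shared diagonal covariance. As a point of reference only, here is a minimal NumPy sketch of that underlying (non-Bayesian) naive Bayes linear DA classifier; the function names `fit_naive_bayes_lda` and `predict` are illustrative and not taken from the thesis, which instead fits the model with cake priors and variational Bayes.

```python
import numpy as np

def fit_naive_bayes_lda(X, y):
    """Fit naive Bayes linear DA: per-class feature means, a shared
    per-feature (diagonal) variance, and empirical class priors."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    # Pooled within-class variance per feature (shared across classes),
    # with a small jitter to avoid division by zero.
    resid = X - means[np.searchsorted(classes, y)]
    var = resid.var(axis=0) + 1e-9
    priors = np.array([(y == c).mean() for c in classes])
    return classes, means, var, priors

def predict(X, classes, means, var, priors):
    """Assign each row of X to the class with the highest log posterior
    (up to an additive constant shared by all classes)."""
    # Gaussian log-likelihood with diagonal covariance, per (sample, class).
    loglik = -0.5 * (((X[:, None, :] - means[None, :, :]) ** 2) / var).sum(-1)
    loglik += np.log(priors) - 0.5 * np.log(var).sum()
    return classes[np.argmax(loglik, axis=1)]
```

Because the covariance is shared across classes, the resulting decision boundaries are linear in the features; the quadratic variant (the basis of VQDA) would instead estimate a separate variance vector per class.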

