Title: Random Effects Models with Deep Neural Network Basis Functions: Methodology and Computation
Publisher: The University of Sydney Business School
Abstract: Deep neural networks (DNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models that incorporate basis functions formed by a deep neural network. Neural networks with random effects appear to be little used in the literature, perhaps because of the computational challenges of incorporating subject-specific parameters into already complex models. We develop efficient computational methods for Bayesian inference based on Gaussian variational approximation. A parsimonious but flexible factor parametrization of the covariance matrix is used in the Gaussian variational approximation. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix to perform fast matrix-vector multiplications within the iterative conjugate gradient linear solvers used in the natural gradient computations. The method can be implemented in high dimensions, and the use of the natural gradient allows faster and more stable convergence of the variational algorithm. In the random effects case, we compute unbiased estimates of the gradient of the lower bound for the model with the random effects integrated out by making use of Fisher's identity. The proposed methods are illustrated in several examples for DNN random effects models and for high-dimensional logistic regression with sparse signal shrinkage priors.
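The abstract's central computational idea is that a factor parametrization of the variational covariance, Sigma = B B^T + D^2 with B a tall p x k loading matrix and D diagonal, lets matrix-vector products with Sigma run in O(pk) rather than O(p^2), which is what makes conjugate-gradient solves inside the natural gradient step feasible in high dimensions. The sketch below illustrates only that mechanism; all variable names and values are hypothetical, not taken from the paper's implementation.

```python
import numpy as np

# Illustrative sketch: matrix-vector products with a factor-structured
# covariance Sigma = B B^T + diag(d)^2, used inside a plain conjugate
# gradient solver, without ever forming the p x p matrix.
rng = np.random.default_rng(0)
p, k = 1000, 5                       # variational dimension, number of factors
B = rng.standard_normal((p, k))      # factor loadings (made-up values)
d = rng.uniform(0.5, 1.5, size=p)    # diagonal standard deviations

def sigma_matvec(v):
    """Compute (B B^T + diag(d)^2) v in O(p*k) flops."""
    return B @ (B.T @ v) + (d ** 2) * v

def cg_solve(matvec, b, tol=1e-8, max_iter=200):
    """Conjugate gradient for Sigma x = b using only matvec calls."""
    x = np.zeros_like(b)
    r = b - matvec(x)          # initial residual
    pvec = r.copy()            # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(pvec)
        alpha = rs / (pvec @ Ap)
        x += alpha * pvec
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        pvec = r + (rs_new / rs) * pvec
        rs = rs_new
    return x

b = rng.standard_normal(p)
x = cg_solve(sigma_matvec, b)
```

Because the diagonal part keeps Sigma positive definite and the low-rank term contributes only k outlying eigenvalues, CG typically converges in far fewer than p iterations, so each natural gradient step stays cheap even when p is large.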
Type of Work: Article
Type of Publication: Pre-print
Appears in Collections: Working Papers - Business Analytics
File: BAWP-2018-01.pdf (790.27 kB, Adobe PDF)