Conservative Sparsification for Efficient Approximate Estimation
Access status:
Open Access
Type
Thesis
Thesis type
Doctor of Philosophy
Author/s
Vial, John Francis Stephen
Abstract
Linear Gaussian systems often exhibit sparse structure. For systems that grow as a function of time, marginalisation of past states eventually introduces extra non-zero elements into the information matrix of the Gaussian distribution. These extra non-zeros can lead to dense problems as the system progresses through time. This thesis proposes a method that can delete elements of the information matrix while maintaining guarantees about the conservativeness of the resulting estimate, with a computational complexity that is a function of the connectivity of the graph rather than the problem dimension. This sparsification can be performed iteratively and minimises the Kullback-Leibler divergence (KLD) between the original and approximate distributions. This new technique is called Conservative Sparsification (CS). For large sparse graphs employing a Junction Tree (JT) for estimation, efficiency is related to the size of the largest clique. Conservative Sparsification can be applied to clique splitting in JTs, enabling approximate and efficient estimation in JTs with the same conservative guarantees as CS for information matrices. In distributed estimation scenarios that use JTs, CS can be performed in parallel and asynchronously on JT cliques. This approach usually results in a larger KLD than the optimal CS approach, but an upper bound on this increased divergence can be calculated from information locally available to each clique. This work has applications in large-scale distributed linear estimation problems where the size of the problem or communication overheads make optimal linear estimation difficult.
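The two effects the abstract describes can be illustrated numerically. The sketch below uses a hypothetical 4-state Gaussian chain and NumPy to show (1) how marginalising a state fills in the information matrix via the Schur complement, and (2) one simple way to delete the fill-in edge conservatively, so that the approximate covariance upper-bounds the exact one. The diagonal-compensation rule used here is an illustrative assumption, not the thesis's Conservative Sparsification algorithm.

```python
import numpy as np

# Hypothetical chain x0 - x1 - x2 - x3: a tridiagonal information matrix Y.
Y = np.array([
    [ 2.0, -1.0,  0.0,  0.0],
    [-1.0,  3.0, -1.0,  0.0],
    [ 0.0, -1.0,  3.0, -1.0],
    [ 0.0,  0.0, -1.0,  2.0],
])

# (1) Marginalise x1 via the Schur complement: x0 and x2, both neighbours of
#     x1, become directly coupled in the reduced matrix (fill-in).
keep, marg = [0, 2, 3], [1]
Y_red = (Y[np.ix_(keep, keep)]
         - Y[np.ix_(keep, marg)]
         @ np.linalg.inv(Y[np.ix_(marg, marg)])
         @ Y[np.ix_(marg, keep)])
print("fill-in at (x0, x2):", Y_red[0, 1])  # nonzero after marginalisation

# (2) Delete the fill-in edge conservatively: zero the off-diagonal entry and
#     shrink the two affected diagonals so Y_red - Y_sparse stays positive
#     semidefinite, i.e. the sparse covariance upper-bounds the exact one.
#     (Illustrative compensation rule, not the thesis's CS method.)
d = abs(Y_red[0, 1])
Y_sparse = Y_red.copy()
Y_sparse[0, 1] = Y_sparse[1, 0] = 0.0
Y_sparse[0, 0] -= d
Y_sparse[1, 1] -= d
assert np.all(np.linalg.eigvalsh(Y_red - Y_sparse) >= -1e-12)  # conservative

# KLD between the two zero-mean Gaussians, D(exact || sparse), in nats.
S_exact, S_sparse = np.linalg.inv(Y_red), np.linalg.inv(Y_sparse)
kld = 0.5 * (np.trace(np.linalg.solve(S_sparse, S_exact)) - len(keep)
             + np.log(np.linalg.det(S_sparse) / np.linalg.det(S_exact)))
print("KLD(exact || sparse):", kld)
```

The positive-semidefiniteness check is what "conservative" means here: the sparsified estimate never claims more certainty than the exact one, at the cost of the nonzero KLD printed at the end.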
Date
2013-03-31
Faculty/School
Faculty of Engineering and Information Technologies, School of Aerospace, Mechanical and Mechatronic Engineering
Awarding institution
The University of Sydney