Robust Multiple-Sensing-Modality Data Fusion for Reliable Perception in Outdoor Scenarios
Access status: USyd Access
Type: Thesis
Thesis type: Doctor of Philosophy
Author/s: Gerardo-Castro, Marcos Paul
Abstract:
To be reliable in outdoor operations, perception systems should use a combination of sensing modalities, such as laser, radar, visual camera, or IR camera, since these respond differently to distinct environmental conditions. This thesis presents a novel multi-sensor data fusion framework designed to appropriately combine data acquired by multiple exteroceptive sensing modalities in field robotics scenarios. In contrast to traditional data fusion methods, the proposed framework accounts for the fact that different sensors perceive the environment in different ways. To this end, commonalities and discrepancies between the data sources are detected automatically before data fusion is performed. A consistency test evaluates dependencies between data from multiple sensing modalities or representations and intelligently selects the data most likely to lead to synergy. The data fusion methods described in this thesis create high-fidelity representations that exploit the full potential of each sensing modality, leading to a resilient perception system. The thesis first introduces a data fusion framework with a parametric consistency test for multiple sensing modalities using Gaussian process data fusion; the parametric consistency test avoids the data-quality degradation inherent in fusing multiple types of sensor data. A non-parametric consistency test for multiple sensing modalities using Gaussian process data fusion is then presented; this approach avoids local geometric threshold parameters and can be more discriminative because it takes the global model into account. An experimental analysis was carried out in simulation and in real-world implementations using laser and radar sensors under different environmental conditions. More specifically, the performance of 3D surface reconstruction in the context of field robotics was tested in a variety of scenarios.
It was demonstrated that, by identifying discrepancies and exploiting commonalities between the data of each sensing modality, the framework yields a perception system that is resilient to adverse environmental conditions.
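The general idea of a consistency test before Gaussian process fusion can be illustrated with a minimal, hypothetical sketch (not the thesis's actual method: the kernel, threshold rule, and all names below are illustrative assumptions). Two simulated 1-D "modalities" observe the same surface; a GP is fit to one source, and points from the second source that deviate from the GP prediction by more than a few standard deviations are discarded before fusing the remainder:

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    # Squared-exponential covariance between 1-D input arrays a and b
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.01):
    # Standard GP regression posterior mean and predictive std at x_test
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    sd = np.sqrt(np.clip(np.diag(Kss - v.T @ v), 0.0, None))
    return mu, sd

# Two simulated sensing modalities observing the same 1-D surface
x = np.linspace(0.0, 5.0, 40)
true_surface = np.sin(x)
laser = true_surface + 0.05 * np.random.default_rng(0).normal(size=x.size)
radar = true_surface + 0.10 * np.random.default_rng(1).normal(size=x.size)
radar[25:30] += 1.5  # inject a modality-specific artefact into the radar data

# Consistency test: fit a GP to the laser data and keep only radar points
# that lie within k predictive standard deviations of the GP mean
mu, sd = gp_posterior(x, laser, x, noise=0.05**2)
k = 3.0
consistent = np.abs(radar - mu) < k * np.sqrt(sd**2 + 0.10**2)

# Fuse: combine the laser data with the consistent radar points only
x_fused = np.concatenate([x, x[consistent]])
y_fused = np.concatenate([laser, radar[consistent]])
mu_fused, _ = gp_posterior(x_fused, y_fused, x, noise=0.05**2)
```

In this sketch the artefact region is rejected by the test, so the fused GP is driven only by data on which the two sources agree; a naive fusion of all points would instead be pulled toward the corrupted radar readings.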
Date: 2017-06-30
Licence: The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission.
Faculty/School: Faculty of Engineering and Information Technologies, School of Aerospace, Mechanical and Mechatronic Engineering
Awarding institution: The University of Sydney