Robust Multiple-Sensing-Modality Data Fusion for Reliable Perception in Outdoor Scenarios
Field | Value | Language |
--- | --- | --- |
dc.contributor.author | Gerardo-Castro, Marcos Paul | |
dc.date.accessioned | 2018-05-01 | |
dc.date.available | 2018-05-01 | |
dc.date.issued | 2017-06-30 | |
dc.identifier.uri | http://hdl.handle.net/2123/18135 | |
dc.description.abstract | To be reliable in outdoor operations, perception systems should use a combination of sensing modalities, such as laser, radar, visual cameras, or infrared cameras, since these respond differently to distinct environmental conditions. This thesis presents a novel multi-sensor data fusion framework designed to appropriately combine data acquired by multiple exteroceptive sensing modalities in field robotics scenarios. In contrast to traditional data fusion methods, the proposed framework accounts for the fact that different sensors perceive the environment in different ways. To this end, commonalities and discrepancies between the data sources are detected automatically before data fusion or combination is performed. A consistency test evaluates dependencies between data from multiple sensing modalities or representations and intelligently selects the data most likely to lead to synergy. The data fusion methods described in this thesis create high-fidelity representations that exploit the full potential of each sensing modality, leading to a resilient perception system. The thesis first introduces a data fusion framework with a parametric consistency test for multiple sensing modalities based on Gaussian process data fusion; the parametric consistency test avoids the data quality degradation inherent in fusing multiple types of sensor data. A non-parametric consistency test for multiple sensing modalities using Gaussian process data fusion is then presented; this approach avoids local geometric threshold parameters and can be more discriminatory because it takes the global model into account. An experimental analysis was carried out in simulation and in real-world implementations using laser and radar sensors under different environmental conditions. More specifically, the performance of 3D surface reconstruction in the context of field robotics was tested in a variety of scenarios. It is demonstrated that, by identifying discrepancies and exploiting commonalities between the data of each sensing modality, the framework yields a perception system that is resilient to adverse environmental conditions. | en_AU |
dc.rights | The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission. | en_AU |
dc.subject | Field robotics | en_AU |
dc.subject | data-fusion | en_AU |
dc.subject | machine learning | en_AU |
dc.subject | perception | en_AU |
dc.subject | multiple-sensing-modalities | en_AU |
dc.subject | object reconstruction | en_AU |
dc.title | Robust Multiple-Sensing-Modality Data Fusion for Reliable Perception in Outdoor Scenarios | en_AU |
dc.type | Thesis | en_AU |
dc.type.thesis | Doctor of Philosophy | en_AU |
usyd.faculty | Faculty of Engineering and Information Technologies, School of Aerospace, Mechanical and Mechatronic Engineering | en_AU |
usyd.degree | Doctor of Philosophy (Ph.D.) | en_AU |
usyd.awardinginst | The University of Sydney | en_AU |
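
The abstract describes Gaussian process data fusion gated by a consistency test between sensing modalities. The thesis itself is not reproduced in this record, so the sketch below is only a rough illustration under stated assumptions, not the author's method: two hypothetical 1-D range profiles ("laser" and "radar") are each modelled with a plain NumPy Gaussian process, fused by inverse-variance weighting where they agree, and where the posteriors disagree beyond a z-score threshold the more certain modality is kept, a crude stand-in for the parametric and non-parametric consistency tests named in the abstract. Every function name, parameter, and data value here is an assumption made for illustration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between 1-D input arrays A and B."""
    d = A[:, None] - B[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X_train, y_train, X_query, noise_var=0.01, length_scale=1.0):
    """Standard GP regression posterior mean and variance at X_query."""
    K = rbf_kernel(X_train, X_train, length_scale) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_query, length_scale)
    K_ss = rbf_kernel(X_query, X_query, length_scale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.maximum(np.diag(K_ss) - np.sum(v ** 2, axis=0), 1e-9)
    return mean, var

def fuse_modalities(mu_a, var_a, mu_b, var_b, z=2.0):
    """Inverse-variance (product-of-Gaussians) fusion with a simple
    consistency gate: where the two posteriors disagree by more than z
    combined standard deviations, keep only the lower-variance source."""
    consistent = np.abs(mu_a - mu_b) <= z * np.sqrt(var_a + var_b)
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused_mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    # Fall back to the more certain single modality at inconsistent points.
    prefer_a = var_a <= var_b
    fused_mu = np.where(consistent, fused_mu, np.where(prefer_a, mu_a, mu_b))
    fused_var = np.where(consistent, fused_var, np.where(prefer_a, var_a, var_b))
    return fused_mu, fused_var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x_q = np.linspace(0.0, 10.0, 200)
    truth = np.sin(x_q)
    # Hypothetical 1-D range profiles: "laser" is accurate, "radar" is noisier
    # and corrupted on one interval (e.g. dust), mimicking a modality failure.
    x_l = rng.uniform(0, 10, 40); y_l = np.sin(x_l) + 0.05 * rng.standard_normal(40)
    x_r = rng.uniform(0, 10, 40); y_r = np.sin(x_r) + 0.20 * rng.standard_normal(40)
    y_r[(x_r > 4) & (x_r < 6)] += 1.5  # injected discrepancy between modalities
    mu_l, var_l = gp_posterior(x_l, y_l, x_q, noise_var=0.05 ** 2)
    mu_r, var_r = gp_posterior(x_r, y_r, x_q, noise_var=0.20 ** 2)
    mu_f, var_f = fuse_modalities(mu_l, var_l, mu_r, var_r)
    print("RMSE laser :", np.sqrt(np.mean((mu_l - truth) ** 2)))
    print("RMSE radar :", np.sqrt(np.mean((mu_r - truth) ** 2)))
    print("RMSE fused :", np.sqrt(np.mean((mu_f - truth) ** 2)))
```

The inverse-variance combination corresponds to multiplying two Gaussian posteriors at each query point, so the fused estimate is pulled toward whichever modality is more certain. The gate here is a per-point threshold test, closer in spirit to the parametric approach; the thesis's non-parametric test reportedly avoids such local thresholds by taking the global model into account.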