Show simple item record

dc.contributor.author: Gerardo-Castro, Marcos Paul
dc.date.accessioned: 2018-05-01
dc.date.available: 2018-05-01
dc.date.issued: 2017-06-30
dc.identifier.uri: http://hdl.handle.net/2123/18135
dc.description.abstract (en_AU): To be reliable in outdoor operations, perception systems should combine sensing modalities such as laser, radar, visual cameras, and infrared cameras, since these modalities respond differently to distinct environmental conditions. This thesis presents a novel multi-sensor data fusion framework designed to appropriately combine data acquired by multiple exteroceptive sensing modalities in field robotics scenarios. Unlike traditional data fusion methods, the proposed framework accounts for the fact that sensors perceive the environment in different ways. To this end, commonalities and discrepancies between the data sources are detected automatically before the data are fused or combined. A consistency test evaluates dependencies between data from multiple sensing modalities or representations and intelligently selects the data most likely to lead to synergy. The data fusion methods described in this thesis create high-fidelity representations that exploit the full potential of each sensing modality, leading to a resilient perception system. The thesis first introduces a data fusion framework with a parametric consistency test for multiple sensing modalities based on Gaussian process data fusion; the parametric consistency test avoids the data quality degradation inherent in fusing multiple types of sensor data. A non-parametric consistency test for multiple sensing modalities using Gaussian process data fusion is then presented; this approach avoids local geometric threshold parameters and can be more discriminating because it takes the global model into account. The framework was evaluated experimentally in simulation and in real-world deployments using laser and radar sensors under different environmental conditions. Specifically, the performance of 3D surface reconstruction in the context of field robotics was tested in a variety of scenarios. The results demonstrate that, by identifying discrepancies and exploiting commonalities between the data of each sensing modality, the framework yields a perception system resilient to adverse environmental conditions.
dc.rights (en_AU): The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission.
dc.subject (en_AU): Field robotics
dc.subject (en_AU): data-fusion
dc.subject (en_AU): machine learning
dc.subject (en_AU): perception
dc.subject (en_AU): multiple-sensing-modalities
dc.subject (en_AU): object reconstruction
dc.title (en_AU): Robust Multiple-Sensing-Modality Data Fusion for Reliable Perception in Outdoor Scenarios
dc.type (en_AU): Thesis
dc.type.thesis (en_AU): Doctor of Philosophy
usyd.faculty (en_AU): Faculty of Engineering and Information Technologies, School of Aerospace, Mechanical and Mechatronic Engineering
usyd.degree (en_AU): Doctor of Philosophy Ph.D.
usyd.awardinginst (en_AU): The University of Sydney
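
The abstract above centres on Gaussian process data fusion gated by a consistency test between sensing modalities. As an illustration of that general idea only, the following minimal Python sketch fuses synthetic 1D range profiles from a "laser" and a "radar" using independent Gaussian processes and a simple predictive-interval agreement check; the data, kernels, threshold k, and fallback rule are all assumptions made for this sketch, not the framework actually developed in the thesis.

```python
# Illustrative sketch: fit one GP per sensing modality, test where the two
# modalities' predictions are mutually consistent, and fuse only there.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)


def truth(x):
    """Hypothetical ground-truth terrain profile (for synthetic data only)."""
    return np.sin(x).ravel() + 0.3 * x.ravel()


# Synthetic observations: a low-noise "laser" and a noisier "radar" whose
# readings are deliberately corrupted in one region (e.g. clutter).
x_laser = rng.uniform(0.0, 10.0, size=(40, 1))
x_radar = rng.uniform(0.0, 10.0, size=(40, 1))
y_laser = truth(x_laser) + 0.05 * rng.standard_normal(40)
y_radar = truth(x_radar) + 0.20 * rng.standard_normal(40)
y_radar[x_radar.ravel() > 7.0] += 1.5  # discrepant region


def fit_gp(x, y):
    # RBF captures the smooth surface; WhiteKernel models sensor noise.
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
    return GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x, y)


gp_laser, gp_radar = fit_gp(x_laser, y_laser), fit_gp(x_radar, y_radar)

# Evaluate both GPs on a common query grid.
xq = np.linspace(0.0, 10.0, 200).reshape(-1, 1)
mu_l, sd_l = gp_laser.predict(xq, return_std=True)
mu_r, sd_r = gp_radar.predict(xq, return_std=True)

# Consistency test (a simple stand-in): the modalities agree wherever their
# predictive means differ by less than k combined standard deviations.
k = 2.0
consistent = np.abs(mu_l - mu_r) < k * np.sqrt(sd_l**2 + sd_r**2)

# Product-of-Gaussians (precision-weighted) fusion where consistent; where
# the test fails, keep the lower-variance modality instead of averaging in
# the discrepant data.
w_l, w_r = 1.0 / sd_l**2, 1.0 / sd_r**2
mu_fused = np.where(
    consistent,
    (w_l * mu_l + w_r * mu_r) / (w_l + w_r),
    np.where(sd_l <= sd_r, mu_l, mu_r),
)

print(f"{consistent.mean():.0%} of query points passed the consistency test")
```

Where the modalities disagree (here, the artificially corrupted region), the sketch falls back to the lower-variance model rather than fusing the discrepant data, which reflects the motivation the abstract gives for gating fusion with a consistency test: avoiding the quality degradation that blind averaging of heterogeneous sensor data would cause.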



There are no previous versions of the item available.