Event-based Satellite Docking
Access status: Open Access
File/s
dataset_0307__calibration__calibration1_jul27.bag.tar
dataset_0307__calibration__calibration2_jul27.bag.tar
dataset_0307__camera-to-robot-calibration_dvs.yaml.tar
dataset_0307__training_no_port__T1_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_no_port__T1_RA_0_ES_0_no_lights_non-co-located.bag.tar
dataset_0307__training_no_port__T1_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_no_port__T1_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_no_port__T1_RA_100_ES_1_full_sun_and_earthshine_co-located.bag.tar
dataset_0307__training_no_port__T2_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_no_port__T2_RA_0_ES_0_no_lights_non-co-located.bag.tar
dataset_0307__training_no_port__T2_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_no_port__T2_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_no_port__T2_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_no_port__T2_RA_100_ES_1_full_sun_and_earthshine_co-located.bag.tar
dataset_0307__training_no_port__T3_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_no_port__T3_RA_0_ES_0_no_lights_non-co-located.bag.tar
dataset_0307__training_no_port__T3_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_no_port__T3_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_no_port__T3_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_no_port__T4_RA_0_ES_0_no_lights_non-co-located.bag.tar
dataset_0307__training_texture_1__T1_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_1__T1_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_1__T1_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_1__T1_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_1__T1_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_texture_1__T2_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_1__T2_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_1__T2_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_1__T2_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_1__T2_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_texture_1__T3_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_1__T3_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_1__T3_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_1__T3_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_1__T3_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T1_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_2__T1_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_2__T1_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T1_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_2__T1_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T2_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_2__T2_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_2__T2_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T2_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_2__T2_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T3_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307__training_texture_2__T3_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307__training_texture_2__T3_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307__training_texture_2__T3_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307__training_texture_2__T3_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_0_ES_0_no_lights_co-located_proph.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T1_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_0_ES_0_no_lights_co-located_proph.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T2_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_0_ES_0_no_lights_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_0_ES_0_no_lights_co-located_proph.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_0_ES_1_earthshine_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_0_ES_1_earthshine_only_non-co-located.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_100_ES_0_full_sun_only_co-located.bag.tar
dataset_0307_notexture__training_no_texture__T3_RA_100_ES_0_full_sun_only_non-co-located.bag.tar
filter_bag_files.sh.tar
filter_single_bag_file.sh.tar
Generated_Data_Davis__T2_RA_01_ES_0_ambient_co-located.tar.xz.tar
Generated_Data_Davis__T4_RA_100_ES_1_full_sun_and_earthshine_co-located.tar.xz.tar
Generated_Data_Davis__T4_t1b2_RA_01_ES_0_C_0_ambient.tar.xz.tar
hard_cases_additional__reflect_direct.bag.tar
hard_cases_additional__sharp_shadow.bag.tar
hard_cases_additional__slow_push_ll.bag.tar
hard_cases_additional__slow_push_llearth.bag.tar
hard_cases_additional__slreflected_shad.bag.tar
hard_cases_benchmark__colight_deadslow.bag.tar
hard_cases_benchmark__full_reflect.bag.tar
hard_cases_benchmark__nolight_deadslow.bag.tar
hard_cases_benchmark__verylowlight_deadslow.bag.tar
hard_cases_benchmark__verylowlight_slow.bag.tar
srn_setup_3_projection_calibration_fixed-2024-08-03_14.10.12.mp4.tar
srn_setup_3_projection-2024-07-31_14.32.50.mp4.tar
test_port_1408.zip.tar
testing_2108.zip.tar
training_data_davis.zip.tar
training_data_davis_1408.zip.tar
training_data_rgb_2108.zip.tar
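The training bag filenames above follow a regular pattern: collection, subset, trajectory (T1–T4), an `RA` value, an `ES` flag, a lighting description, and the camera placement. The sketch below is a hypothetical parser for that convention; the field meanings are inferred from the descriptive suffixes (e.g. `RA_100` appears alongside "full_sun" and `ES_1` alongside "earthshine") and are not documented by the dataset itself.

```python
import re

# Hypothetical parser for the training bag filenames listed above.
# Field interpretations (RA = solar-simulator intensity in percent,
# ES = earthshine lamp on/off) are assumptions inferred from the
# descriptive suffixes, not documented facts.
PATTERN = re.compile(
    r"(?P<collection>.+?)__(?P<subset>.+?)__"
    r"(?P<trajectory>T\d+)_RA_(?P<ra>\d+)_ES_(?P<es>\d+)_"
    r"(?P<lighting>.+?)_(?P<camera>(?:non-)?co-located)\.bag\.tar$"
)

def parse_bag_name(name: str) -> dict:
    """Split a training bag filename into its labelled fields."""
    m = PATTERN.match(name)
    if m is None:
        raise ValueError(f"unrecognised filename: {name}")
    d = m.groupdict()
    d["ra"] = int(d["ra"])  # lighting intensity value (percent, assumed)
    d["es"] = int(d["es"])  # earthshine on/off flag (assumed)
    return d

fields = parse_bag_name(
    "dataset_0307__training_no_port__T1_RA_100_ES_1_"
    "full_sun_and_earthshine_co-located.bag.tar"
)
```

The `hard_cases_*` and calibration files use different naming schemes and would need separate handling. Since every file is wrapped as `.tar`, the rosbags themselves presumably need to be unpacked (e.g. with `tar -xf`) before use.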
Permalink
https://hdl.handle.net/2123/33938
Type
Dataset
Author/s
Le Gentil, Cedric
Naylor, Jack
Munasinghe, Nuwan
Mehami, Jasprabhijit
Dai, Benny
Asavkin, Mikhail
Dansereau, Donald G.
Vidal-Calleja, Teresa
Abstract
Dataset to accompany Le Gentil et al., "Mixing Data-driven and Geometric Models for Satellite Docking Port State Estimation using an RGB or Event Camera", IEEE International Conference on Robotics and Automation (ICRA) 2025.

In-orbit automated servicing is a promising path towards lowering the cost of satellite operations and reducing the amount of orbital debris. For this purpose, we present a pipeline for automated satellite docking port detection and state estimation using monocular vision data from standard RGB sensing or an event camera. Rather than taking snapshots of the environment, an event camera has independent pixels that asynchronously respond to light changes, offering advantages such as high dynamic range, low power consumption, and low latency. This work focuses on satellite-agnostic operations (only geometric knowledge of the actual port is required) using the recently released Lockheed Martin Mission Augmentation Port (LM-MAP) as the target. By leveraging shallow data-driven techniques to preprocess the incoming data and highlight the LM-MAP's reflective navigational aids, then using basic geometric models for state estimation, we present a lightweight and data-efficient pipeline that can be used independently with either RGB or event cameras. We demonstrate the soundness of the pipeline and perform a quantitative comparison of the two modalities based on data collected with a photometrically accurate test bench that includes a robotic arm to simulate the target satellite's uncontrolled motion.
Date
2025-05-27
Publisher
The University of Sydney
Funding information
NSW-SRN/RP220201
Licence
Creative Commons Attribution-NonCommercial 4.0
Faculty/School
Australian Centre for Robotics, Faculty of Engineering, School of Aerospace, Mechanical and Mechatronic Engineering