This work presents an Online Supervised Training (OST) method to enable robust vision-based navigation about a non-cooperative spacecraft. Spaceborne Neural Networks (NNs) are susceptible to the domain gap because they are primarily trained on synthetic images due to the inaccessibility of space. OST aims to close this gap by training a pose estimation NN online using incoming flight images during Rendezvous and Proximity Operations (RPO). The pseudo-labels are provided by an adaptive unscented Kalman filter in which the NN is used in the loop as a measurement module. Specifically, the filter tracks the target's relative orbital and attitude motion, and its accuracy is ensured by robust on-ground training of the NN using only synthetic data. Experiments on real hardware-in-the-loop trajectory images show that OST can improve the NN's performance on the target image domain, provided that OST is performed on images of the target viewed from a diverse set of directions during RPO.
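As a rough illustration of the OST loop described above, the following Python sketch pairs a stand-in pose-estimation network with a placeholder filter: the network supplies a pose measurement for each incoming flight image, the filter's fused state serves as the pseudo-label, and a single gradient step fine-tunes the network on that pseudo-label. The names PoseNet, AdaptiveUKF, and ost_step are hypothetical, and the filter is a heavily simplified stand-in for the paper's adaptive unscented Kalman filter; this is not the authors' implementation.

# Minimal sketch of the OST loop (not the authors' code).
# Assumptions: a PyTorch pose-regression NN pre-trained on synthetic data, and a
# placeholder AdaptiveUKF standing in for the paper's adaptive unscented Kalman
# filter; both the network architecture and the filter interface are hypothetical.
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    """Stand-in pose estimator: maps an image to a 7-vector (position + quaternion)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 7),
        )

    def forward(self, img):
        return self.backbone(img)

class AdaptiveUKF:
    """Placeholder for the adaptive UKF tracking relative orbital/attitude motion."""
    def predict(self):
        pass  # propagate relative orbit and attitude dynamics (omitted here)

    def update(self, measured_pose):
        # The real filter adapts the measurement noise online and returns the
        # fused state estimate; this stub simply echoes the measurement.
        return measured_pose

def ost_step(net, filt, optimizer, image):
    """One online supervised training step on a single incoming flight image."""
    net.eval()
    with torch.no_grad():
        measured_pose = net(image)             # NN acts as the measurement module
    filt.predict()
    pseudo_label = filt.update(measured_pose)  # filtered pose serves as pseudo-label

    net.train()
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(net(image), pseudo_label)  # simple pose loss
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    net, filt = PoseNet(), AdaptiveUKF()
    opt = torch.optim.SGD(net.parameters(), lr=1e-4)
    flight_image = torch.rand(1, 3, 128, 128)  # stand-in for an incoming RPO image
    print(ost_step(net, filt, opt, flight_image))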
Online Supervised Training of Spaceborne Vision during Proximity Operations using Adaptive Kalman Filtering
Tae Ha Park, Simone D'Amico
Please send feedback and questions to Tae Ha "Jeff" Park
@INPROCEEDINGS{park_2024_icra_ost,
  author={Park, Tae Ha and D'Amico, Simone},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  title={Online Supervised Training of Spaceborne Vision during Proximity Operations using Adaptive Kalman Filtering},
  year={2024},
  pages={11744-11752},
  doi={10.1109/ICRA57147.2024.10610138}
}
This work is supported by the US Space Force SpaceWERX Orbital Prime Small Business Technology Transfer (STTR) contract number FA8649-23-P-0560 awarded to TenOne Aerospace in collaboration with SLAB.
This page was created using the Instant NGP template.