Aubrey Dunne, CTO, Ubotica Technologies
Juan Romero-Cañas, Ubotica Technologies
Peter O’Connor, Ubotica Technologies
Rosana Rodriguez-Bobada, Ubotica Technologies
Sandi Romih, Ubotica Technologies
David Rijlaarsdam, Ubotica Technologies
Satellite autonomy promises to reduce operational latency and costs, whilst simultaneously enabling reactivity and improved duty cycles, by converting satellites into intelligent robotic agents. Multiple approaches to achieving this are being investigated, and this paper explores vision-based autonomous operations. Vision is a rich medium that has been shown terrestrially to benefit hugely from the application of AI, in particular from Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs). This work explores a practical realisation of one element of this visual autonomy: an AI-enabled intelligent space camera for satellite in-orbit operations. It aims to demonstrate the feasibility and utility of deploying a COTS-based camera with integrated AI capabilities directly on a satellite.

The intelligent space camera has been designed for dynamic AI updates, and supports a hardware Image Signal Processor (ISP), hardware AI acceleration, and a dedicated hardware block for video encoding. Firmware and software have been developed for efficient image capture, processing and delivery over Ethernet, as well as for boot, control and configuration from the satellite’s On-Board Computer (OBC). A RESTful API enables dynamic configuration of the camera from the host OBC, supporting changes to the AI network, the ISP and the encoder, whilst additionally enabling the querying of telemetry and status information. This first flight camera is configured with a full HD CMOS sensor and a wide Field of View (FOV) optical stack for wide-field operations.

The camera was integrated as a payload on the “Call to Adventure” mission from Apex Space. This mission consists of an Apex Aries SmallSat bus (SN1) with a 150 kg payload capacity. Mounted on the bus exterior with an unobstructed field of view, the camera is controlled from the OBC via an operations manager script that calls a lightweight camera controller binary.
On power-up, the operations manager invokes the binary to boot the camera over Ethernet, and then updates the camera’s parameters by sending a JSON configuration file to the appropriate camera endpoint. The satellite was launched in March 2024 onboard SpaceX Transporter-10, and camera commissioning and early operations have been ongoing since then. The mission enables validation of the intelligent space camera’s operation in an orbital LEO environment, over a multi-year duration and under operational conditions, on a state-of-the-art SmallSat bus.

Through pre-flight hardware-in-the-loop validations and in-flight commissioning phases, the camera unit and its operational paradigm have been evaluated and benchmarked. Results are reported for the pre-flight phase, and early results from mission commissioning and operations are additionally presented. Power, latency, AI inference performance, and image and video acquisition are addressed and discussed. In-flight camera firmware updates to add new features were successfully performed and are also discussed, together with further outcomes around the operational flow and general benchmarking.

The results of this ongoing work demonstrate the utility of targeted applications for which processing at the extreme edge, directly on camera, can be leveraged to reduce system load and to distribute processing efficiently within a spacecraft. This has applicability to a range of use cases, in particular Resident Space Object (RSO) detection, autonomous threat detection (driving reactivity), Rendezvous and Proximity Operations (RPO), and visual Fault Detection, Isolation and Recovery (FDIR). A practical RSO use case that uses the intelligent space camera on the Aries SN1 spacecraft is discussed, including in-flight reconfigurations applied to improve the accuracy of operations.
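The host-side boot-then-configure flow described above can be sketched as follows. This is a minimal illustration only: the endpoint path (`/config`), the JSON payload structure, the parameter names, and the camera address are assumptions made for the sketch, not the camera’s documented API.

```python
import json
from urllib import request

# Assumed camera address on the spacecraft Ethernet network (illustrative).
CAMERA_BASE_URL = "http://192.168.1.100:8080"


def build_config_payload(network, isp_settings, encoder_settings):
    """Compose a JSON configuration document covering the three
    reconfigurable elements named in the text: the AI network, the ISP,
    and the video encoder. Field names here are illustrative."""
    return {
        "ai": {"network": network},
        "isp": isp_settings,
        "encoder": encoder_settings,
    }


def configure_camera(payload, base_url=CAMERA_BASE_URL):
    """POST the JSON configuration to a hypothetical /config endpoint,
    as the operations manager would do after booting the camera."""
    req = request.Request(
        url=f"{base_url}/config",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req, timeout=10) as resp:
        return resp.status


if __name__ == "__main__":
    payload = build_config_payload(
        network="rso_detector",              # assumed network identifier
        isp_settings={"gain": 2.0},          # assumed ISP parameter
        encoder_settings={"bitrate_kbps": 4000},  # assumed encoder parameter
    )
    print(json.dumps(payload))
    # configure_camera(payload)  # would issue the POST on a live network
```

In the same style, telemetry and status queries map naturally onto GET requests against further endpoints, so the whole operational flow reduces to a short sequence of HTTP calls driven by the operations manager script.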