ESA explores cognitive computing in space with FDL breakthrough experiments
In a series of world firsts, the Frontier Development Lab (FDL) programme in collaboration with ESA has achieved significant results in the field of Cognitive Cloud Computing in Space (3CS). In experiments on a D-Orbit InOrbit NOW satellite carrier mission, FDL has shown how a Machine Learning (ML) payload can reduce downlink latency, easily adapt to different optical instruments and be updated directly in space, while enabling fast information extraction and delivery to end users. FDL has also created an ML payload for third-party application hosting, launched on the latest D-Orbit mission in January.
It can take considerable time to fuse and extract actionable insights from space-derived data streams, which are often enormous. Just one image tile of Earth from ESA’s Sentinel-2 spacecraft, covering a 100 km by 100 km square, is 2.5 GB – the same size as a movie download. Flood information for emergency response is a prime example of where updated satellite-derived insights are indispensable for directing relief efforts, but bottlenecks in the downloading and analysis of images can lead to delays of several hours or even days in making actionable insights available.
Enter Cognitive Cloud Computing in Space (3CS), which has the potential to ease such bottlenecks by drawing on advances in federated Machine Learning combined with the provisioning of high-powered computational hardware in orbit. 3CS envisages large-scale intelligent swarm systems in space, where multiple spacecraft and instruments come together to empower Earth observation (EO), returning relevant and verified actionable insights to the ground within seconds. ESA is giving significant attention to 3CS and has already made inroads into understanding its onboard computing requirements through the Φ-sat concept, with the Φ-sat-1 experiment launching in 2020. ESA Discovery is funding 12 projects related to 3CS, and in the commercial EO environment, ESA InCubed is co-funding the AI-express (AIX) activity being developed by Planetek, D-Orbit and AIKO.
To prove the viability of some elements of the 3CS vision, FDL set up its NIO (Networked Intelligence in Orbit) experiments with funding and support from ESA Φ-lab. The NIO trials are based around the WorldFloods ML payload, which was developed by young data scientists in partnership with ESA Φ-lab in a 2019 FDL Europe sprint and subsequently published in Nature. The payload was launched in June of last year on the D-Orbit InOrbit NOW (ION) Wild Ride mission and runs on D-Orbit’s own Nebula cloud environment in tandem with Unibap’s SpaceCloud computer.
Firstly, in an emulation of onboard intelligent data processing, the WorldFloods payload took a pre-loaded Sentinel-2 tile of flood images and converted the pixel data to bounded polygons of flood areas. This resulted in a 10 000-fold reduction in data packet size, with the processed tile then rapidly downlinked and shown to perform comparably with conventionally produced flood maps.
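To give a feel for where a reduction of that magnitude comes from, the back-of-envelope sketch below compares the raw tile size quoted above with a hypothetical vectorised flood product. The polygon counts and vertex sizes are illustrative assumptions, not figures from the WorldFloods payload.

```python
# Illustrative comparison of downlink sizes: a raw Sentinel-2 tile
# versus a compact vector flood map. Only the 2.5 GB tile size comes
# from the article; the polygon figures below are assumptions chosen
# to show how a ~10 000-fold reduction can arise.

RAW_TILE_BYTES = 2.5e9  # ~2.5 GB per 100 km x 100 km tile

# Hypothetical vector product: flood outlines as lat/lon polygons
num_polygons = 2_000          # assumed number of flood polygons in the tile
vertices_per_polygon = 15     # assumed average outline complexity
bytes_per_vertex = 8          # two 32-bit floats (lat, lon)

vector_bytes = num_polygons * vertices_per_polygon * bytes_per_vertex
reduction = RAW_TILE_BYTES / vector_bytes
print(f"vector product: {vector_bytes / 1e3:.0f} kB, "
      f"reduction factor: {reduction:,.0f}x")
```

Even with generous assumptions about scene complexity, the vector representation stays in the hundreds of kilobytes, which is what makes rapid downlinking feasible.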
Next, the WorldFloods ML payload was adapted to a different instrument. Instead of the high-resolution images from Sentinel-2, the ML payload received images from the Wild Ride onboard RGB D-Sense camera. Although the camera has a fairly coarse resolution and was not designed for EO, the ML payload was successfully fine-tuned with only a few images and generated reasonably accurate vector maps of waterbodies, land and clouds. The maps were then downloaded in just 36 seconds.
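Fine-tuning a segmentation model on only a few images typically means freezing the pretrained feature extractor and retraining a small output head. The sketch below shows that pattern in PyTorch; the tiny architecture and random stand-in data are invented for illustration and are not the WorldFloods model, though the three output classes (water, land, cloud) follow the article.

```python
import torch
import torch.nn as nn

# Toy few-image fine-tuning: freeze a pretrained "backbone" and retrain
# only the final per-pixel classifier on a handful of labelled frames
# from the new sensor. All shapes and layers here are assumptions.

backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
head = nn.Conv2d(16, 3, 1)  # per-pixel logits: water / land / cloud

for p in backbone.parameters():
    p.requires_grad = False  # keep the pretrained features fixed

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# "Only a few images" from the new camera: random stand-ins here
images = torch.rand(4, 3, 64, 64)
labels = torch.randint(0, 3, (4, 64, 64))

for _ in range(20):  # short fine-tuning loop
    opt.zero_grad()
    logits = head(backbone(images))
    loss = loss_fn(logits, labels)
    loss.backward()
    opt.step()
```

Because only the head's parameters receive gradients, the number of trainable weights stays small enough that a few labelled images can adapt the model without overfitting.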
Finally, the NIO team achieved another breakthrough by deploying an updated ML payload in orbit. Any ML pipeline needs to be maintained and refined, and so the ability to upload new model parameters or ‘weights’ to an onboard processor is a key component of any future 3CS infrastructure. The new model weights were uplinked to the D-Orbit/Unibap platform without a hitch, and subsequent functional testing demonstrated increased flood detection performance.
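A weight update of this kind amounts to serialising the new parameters on the ground, uplinking the packet, verifying its integrity on board, and swapping the arrays into the running model. The round trip below sketches that flow; the packet format and checksum step are assumptions for illustration, not the actual D-Orbit/Unibap uplink protocol.

```python
import hashlib
import io
import numpy as np

# Sketch of a model-weight update round trip, loosely modelled on
# uplinking new parameters to an onboard processor. The packet format
# and SHA-256 integrity check are invented for this example.

def pack_weights(weights: dict) -> tuple:
    """Serialise named weight arrays; return (payload, sha256 digest)."""
    buf = io.BytesIO()
    np.savez(buf, **weights)
    payload = buf.getvalue()
    return payload, hashlib.sha256(payload).hexdigest()

def unpack_weights(payload: bytes, expected_digest: str) -> dict:
    """Verify integrity after 'uplink', then restore the arrays."""
    if hashlib.sha256(payload).hexdigest() != expected_digest:
        raise ValueError("corrupted weight packet")
    with np.load(io.BytesIO(payload)) as archive:
        return {name: archive[name] for name in archive.files}

# Ground segment: refined weights ready for uplink
new_weights = {"conv1": np.random.rand(16, 3, 3, 3).astype(np.float32)}
packet, digest = pack_weights(new_weights)

# Onboard: verify the packet and swap in the update
onboard_weights = unpack_weights(packet, digest)
```

Separating serialisation from verification means a corrupted or truncated uplink is rejected before it can replace a working model.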
James Parr, whose company Trillium Technologies runs the FDL programme, sums up the value of the NIO results: “Taken together, these experiments give a tantalising glimpse of the promise of 3CS – how spacecraft working together with in-orbit cloud infrastructure can enable hybrid observation and adaptive space services. We have the potential to revolutionise how we respond to disasters, manage emissions and pollution, improve weather forecasts and foster next-gen space situational awareness.”
In a further NIO development, a set of advanced services tests has been funded by ESA for another D-Orbit ION satellite carrier. Launched in January, the Dashing Through the Stars mission includes a miniaturised hyperspectral camera, designed by research institution VTT. As part of the ESA-commissioned tests, FDL has created a customisable ML payload that allows third parties to upload and run Deep Learning models for onboard processing of the VTT camera images, delivering insights tailored to many different applications directly to the ground.
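Hosting third-party models on a shared payload generally requires a common interface: each application registers a callable that turns a camera frame into a compact, downlink-ready insight. The registry below is a minimal sketch of that idea; the interface, band layout and vegetation-index example are all invented for illustration and are not FDL's actual hosting API.

```python
from typing import Callable, Dict
import numpy as np

# Minimal sketch of third-party application hosting: each uploaded
# model registers a callable mapping a (bands, H, W) frame to a small
# dictionary of insights. The whole interface is an assumption.

registry: Dict[str, Callable[[np.ndarray], dict]] = {}

def register(name: str):
    """Decorator that adds an application's model to the registry."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@register("vegetation-index")
def ndvi_summary(frame: np.ndarray) -> dict:
    # Assume band 0 = red, band 1 = near-infrared (illustrative only)
    red, nir = frame[0].astype(float), frame[1].astype(float)
    ndvi = (nir - red) / (nir + red + 1e-6)
    return {"mean_ndvi": float(ndvi.mean())}

# Onboard dispatch: run a registered application on a new frame
frame = np.random.rand(2, 32, 32)
insight = registry["vegetation-index"](frame)
```

Keeping the per-application output small and structured is what lets many different downstream users share one camera and one downlink.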
“The idea of intelligent, federated satellites operating in concert to provide insight faster and more accurately is a fundamental enabler for the future of EO, opening up new avenues for shaping the future of software-defined missions,” says Φ-lab data scientist Nicolas Longépé. “The NIO experiments on Wild Ride have served as a compelling proof of concept for 3CS, and with the launch of Dashing Through the Stars we’ll be able to see how access to such a system can be extended to new downstream applications.”
David Steenari, a data processing engineer from the ESA Directorate of Technology, Engineering and Quality, underlines the broad-based ESA support for Dashing Through the Stars: “The camera, along with its integration onto the ION carrier and in-flight calibration, has been developed under a combination of GSTP and TDE funding. Coupling the imager to FDL’s ML payload will amply demonstrate exactly what tomorrow’s commercially accessible payloads will be able to deliver.”