Systematic data processing: from raw data to data products
Data processing is one of the largest fields of responsibility in the IT industry, because it takes place in the most diverse areas of our everyday life. The photo filter of a messaging app processes the data of our picture to present it at its best. When we send an e-mail, the data are encrypted before being dispatched to the recipient. The vehicle control system in a car, too, constantly processes data in order to identify dangerous situations and to optimize operating procedures.
Wherever and whenever data are periodically produced in large amounts, you will need an efficient mechanism that ensures the smooth execution of your processing tasks.
Publication in "Proceedings of the 2023 conference on Big Data from Space": Processing framework for scientific Earth observation missions
A processing framework for scientific Earth Observation missions is described and presented. This software system was developed for a number of Earth Explorer missions and has matured into a multi-mission processing framework. This micro-service-based framework supports the transition of computational environments from on-premise clusters to public-cloud environments. The framework's flexibility allows it to address a number of use cases, ranging from classical systematic processing facilities and on-demand processing facilities to archive services and data analytics environments. The cooperation with stakeholders from the scientific missions' communities and industrial IT experts ensures the further development of this stable and flexible processing environment.
Processing framework for scientific Earth observation missions (see pages 309-312)
Richard Hofmeister, Knut Bernhardt, Alexander Strecker, Torben Keßler and Christophe Caspar
Proceedings of the 2023 conference on Big Data from Space, Soille, P., Lumnitz, S. and Albani, S. (eds.), Publications Office of the European Union, Luxembourg, 2023, doi:10.2760/46796, JRC135493.
Special challenges of satellite data processing
Especially in scientific areas, there are many use cases that call for data processing: in order to understand the Earth's ecosystems and climate, scientists collect huge amounts of data all over the world. In space, for example, satellites record data with sensors or cameras and send them to Earth as so-called payload data.
Every satellite produces up to 1 TB of raw data per day, all of which need to be processed, stored and supplied to downstream systems. Every single day, these raw data are processed and turned into data products by applying scientific algorithms with the support of auxiliary data. Because these data volumes are so enormous, efficient control of the workflows of systematic data processing is of utmost importance. The resulting large and still growing demand for storage capacity makes extensible data centers or cloud systems suitable locations to store the data.
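To make this idea concrete, the following Python sketch shows how one systematic processing step could conceptually look: a raw payload granule is combined with auxiliary data, a (placeholder) algorithm is applied, and the resulting product is written out together with traceability metadata. All names, the file layout and the placeholder algorithm are illustrative assumptions and not part of any actual mission software.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from pathlib import Path
import hashlib
import json

@dataclass
class DataProduct:
    """Minimal description of a generated data product (hypothetical schema)."""
    source: str
    auxiliary: str
    created: str
    checksum: str

def process_granule(raw_file: Path, aux_file: Path, out_dir: Path) -> DataProduct:
    """Turn one raw payload granule into a data product.

    The 'algorithm' here is only a stand-in: real missions apply
    mission-specific scientific processors at this point.
    """
    raw_bytes = raw_file.read_bytes()
    aux_bytes = aux_file.read_bytes()

    # Placeholder for the scientific algorithm: here we simply combine
    # raw and auxiliary data into one output payload.
    product_bytes = raw_bytes + aux_bytes

    out_dir.mkdir(parents=True, exist_ok=True)
    product_path = out_dir / f"{raw_file.stem}_product.bin"
    product_path.write_bytes(product_bytes)

    product = DataProduct(
        source=raw_file.name,
        auxiliary=aux_file.name,
        created=datetime.now(timezone.utc).isoformat(),
        checksum=hashlib.sha256(product_bytes).hexdigest(),
    )
    # Store metadata next to the product so downstream systems can trace it.
    (out_dir / f"{raw_file.stem}_product.json").write_text(
        json.dumps(product.__dict__, indent=2)
    )
    return product
```

In a systematic processing facility, a step like this runs automatically for every incoming granule, which is exactly why efficient workflow control matters at these data volumes.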
This is how you accomplish efficient execution of processing chains on Earth observation data
Efficient control of workflows is guaranteed by suitable software platforms, which offer specific benefits depending on the individual field of application and are optimized for the system infrastructures and file structures typically encountered in satellite data processing. Even when the processing load varies strongly, e.g. during periods with exceptionally many data retrievals or during the re-processing of data products, the software can scale its performance by flexibly configuring cloud resources. This ensures fast and safe processing of large data quantities while avoiding unnecessary consumption of energy and infrastructure when they are not needed.
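As a rough illustration of this scaling behaviour, the sketch below shows one simple way a platform could decide how many cloud workers to provision based on the current backlog of processing jobs. The function name, thresholds and throughput figures are illustrative assumptions, not the actual logic of any specific product.

```python
import math

def workers_needed(queued_jobs: int,
                   jobs_per_worker_per_hour: int = 20,
                   max_workers: int = 50,
                   min_workers: int = 1) -> int:
    """Estimate how many cloud workers to provision for the current backlog.

    Purely illustrative heuristic: aim to clear the queue within one hour,
    but never exceed the configured cluster limits.
    """
    if queued_jobs <= 0:
        return min_workers  # keep a small baseline, avoid idle over-provisioning
    needed = math.ceil(queued_jobs / jobs_per_worker_per_hour)
    return max(min_workers, min(needed, max_workers))

# Example: a re-processing campaign suddenly queues 400 products.
print(workers_needed(400))   # -> 20 workers
print(workers_needed(5000))  # -> capped at 50 workers
print(workers_needed(0))     # -> scale back down to 1 worker
```

Scaling up during re-processing campaigns and back down afterwards is what keeps both turnaround times and resource consumption low.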
We are well aware of the special challenges of systematic data processing and have been providing reliable solutions in many projects. A customer project that is a telling example of the use of our proprietary systematic data processing framework is the ground segment software for some of ESA's Earth Explorer missions such as Biomass, EarthCARE or SWARM. On behalf of ESA, we operate the Copernicus Long-Term Archive to back up Sentinel satellite data in a European cloud system. The actual archiving of files does not involve particularly complex processing operations, but with the large quantities of data involved, the archive service requires substantial computing and processing capacity to verify file contents, compress data and generate metadata for traceability purposes.
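The three archiving steps mentioned above, content verification, compression and metadata generation, can be illustrated with a short, hypothetical sketch. The checksum scheme, the choice of gzip and the metadata fields are assumptions made for illustration only and do not describe the actual archive implementation.

```python
import gzip
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def archive_file(src: Path, archive_dir: Path, expected_sha256: str | None = None) -> Path:
    """Verify, compress and describe one file for long-term archiving.

    Illustrative only: real archive services follow mission-specific
    integrity and metadata standards.
    """
    data = src.read_bytes()

    # 1. Verify the file content against a checksum delivered with the data, if any.
    digest = hashlib.sha256(data).hexdigest()
    if expected_sha256 is not None and digest != expected_sha256:
        raise ValueError(f"checksum mismatch for {src.name}")

    # 2. Compress the data before writing it to long-term storage.
    archive_dir.mkdir(parents=True, exist_ok=True)
    archived = archive_dir / (src.name + ".gz")
    archived.write_bytes(gzip.compress(data))

    # 3. Generate metadata so the file remains traceable in the archive.
    metadata = {
        "original_name": src.name,
        "original_size_bytes": len(data),
        "sha256": digest,
        "archived_at": datetime.now(timezone.utc).isoformat(),
        "archive_file": archived.name,
    }
    (archive_dir / (src.name + ".meta.json")).write_text(json.dumps(metadata, indent=2))
    return archived
```

Applied to millions of Sentinel files, even such simple steps add up to the substantial computing and processing capacity mentioned above.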
Learn more about our EO Processing Service: