Delivering a data-driven, open-source platform that integrates cloud, edge and HPC technologies for trustworthy, accurate, fair and green data mining workflows yielding high-quality, actionable knowledge
Objectives
Enable the development of complex and secure data mining workflows
Develop novel data-driven orchestration mechanisms to efficiently deploy and execute data mining workflows (see the sketch after this list)
Deliver the EXTRACT software platform and demonstrate its benefits in two use cases
Fully exploit the performance capabilities of the compute continuum to holistically address the characteristics of extreme data (high volume, variety, velocity and veracity)
Foster the adoption of EXTRACT technology by industrial and academic communities
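To make the workflow and orchestration objectives more concrete, the sketch below shows, in plain Python, one way a data mining workflow could be described as a set of dependent tasks, each annotated with a preferred tier of the compute continuum. The Task class, tier labels and scheduling logic are illustrative assumptions only, not the EXTRACT platform's actual API.

```python
# Hypothetical sketch only; the EXTRACT workflow API is not shown here.
# A workflow is modelled as a DAG of tasks, each pinned to a continuum tier.
from dataclasses import dataclass, field


@dataclass
class Task:
    name: str
    tier: str                      # "edge", "cloud" or "hpc" (illustrative labels)
    depends_on: list = field(default_factory=list)


def topological_order(tasks):
    """Return the tasks in an order that respects their dependencies."""
    done, order = set(), []
    pending = {t.name: t for t in tasks}
    while pending:
        ready = [t for t in pending.values() if set(t.depends_on) <= done]
        if not ready:
            raise ValueError("cyclic dependencies in workflow")
        for t in ready:
            order.append(t)
            done.add(t.name)
            del pending[t.name]
    return order


# A toy workflow: ingest at the edge, fuse in the cloud, train on HPC.
workflow = [
    Task("ingest-sensors", "edge"),
    Task("clean-and-fuse", "cloud", depends_on=["ingest-sensors"]),
    Task("train-model", "hpc", depends_on=["clean-and-fuse"]),
]

for task in topological_order(workflow):
    print(f"schedule {task.name!r} on the {task.tier} tier")
```

A real data-driven orchestrator would also weigh data locality, energy use and security constraints when placing each task, which is what the orchestration objective above targets.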
Use cases
Personalized Evacuation Routing (PER) System
A Personalized Evacuation Routing (PER) System will guide citizens in an urban environment (the city of Venice) along a safe route in real time.
The EXTRACT platform will be used to develop, deploy and execute a data mining workflow that generates a personalized evacuation route for each citizen, displayed in a mobile phone app. The workflow will process and analyse extreme data composed of Copernicus and Galileo satellite data, IoT sensors installed across the city, 5G mobile signals, and a semantic data lake that fuses all this information.
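As a purely illustrative example of the routing idea behind PER, the sketch below weights each street segment of a toy city graph by a hazard score (as could be derived from sensor readings) and picks the safest short path. The graph, hazard values and weighting formula are assumptions for illustration, not the project's actual data model or workflow.

```python
# Hypothetical illustration of hazard-aware evacuation routing (not the PER workflow).
import networkx as nx

# Toy street graph: nodes are intersections; edges carry a length in metres and
# a hazard score in [0, 1] (0 = safe, 1 = impassable).
streets = [
    ("home", "bridge", 120, 0.9),   # flooded bridge
    ("home", "alley", 200, 0.1),
    ("alley", "square", 150, 0.2),
    ("bridge", "square", 80, 0.4),
    ("square", "shelter", 100, 0.0),
]

g = nx.Graph()
for a, b, length, hazard in streets:
    # Penalise risky segments so the shortest path prefers safer streets.
    g.add_edge(a, b, weight=length * (1 + 10 * hazard))

route = nx.shortest_path(g, "home", "shelter", weight="weight")
print(" -> ".join(route))   # home -> alley -> square -> shelter (avoids the flooded bridge)
```

In the real system the hazard information would come from the fused satellite, IoT and 5G data described above and would be refreshed in real time as conditions change.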
Transient Astrophysics with a Square Kilometer Array Pathfinder (TASKA)
The Transient Astrophysics with a Square Kilometer Array Pathfinder (TASKA) use case will apply EXTRACT technology to develop data mining workflows that reduce the huge volume of raw data produced by the NenuFAR radio telescope by a factor of 100. This will make it possible to populate high-quality datasets, openly accessible to the astronomy community through the EOSC portal, that can be leveraged for multiple research activities.
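One common way to reach reduction factors of this order on radio data is to average observations over blocks of time samples and frequency channels. The sketch below illustrates that idea on a synthetic dynamic spectrum; the array shape and the 50 x 2 block size are arbitrary assumptions, not NenuFAR's actual pipeline parameters.

```python
# Hypothetical sketch: 100-fold reduction of a dynamic spectrum by block averaging.
import numpy as np

rng = np.random.default_rng(0)
raw = rng.normal(size=(10_000, 400))          # (time samples, frequency channels)

t_factor, f_factor = 50, 2                    # 50 x 2 = 100-fold reduction
n_t, n_f = raw.shape[0] // t_factor, raw.shape[1] // f_factor

reduced = (
    raw[: n_t * t_factor, : n_f * f_factor]
    .reshape(n_t, t_factor, n_f, f_factor)
    .mean(axis=(1, 3))                        # average each time/frequency block
)

print(raw.size // reduced.size)               # -> 100
```

A production workflow would typically combine such averaging with calibration, interference flagging and quality checks before the reduced datasets are published.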
EXTRACT deliverables now available
Nine deliverables detailing the project's work have been published in the results section of the website. The EXTRACT deliverables offer essential insights into the project and its progress. The following deliverables, created for public dissemination, have been...
TASKA “C” Use Case: Fast-paced data calibration and imaging on the “C”loud
In the TASKA use case “C”, radio telescopes are used for imaging solar activity, particularly at very low radio frequencies. At these frequencies, the Sun appears as a low-resolution disk with blurry, blobby solar ejecta. Imaging these phenomena is crucial as it...
TASKA “A” Use Case: “A”gile detection and analysis of solar activity using NenuFAR radio telescopes
The TASKA “A” use case seeks to efficiently detect and model the solar activity received by the NenuFAR radio telescope in Nançay, France. The goal is to develop an automatic data processing and identification system that will allow for real-time data processing and...
French astrophysics at radio frequencies, towards the SKA
The Action Spécifique SKA-LOFAR is hosting a national conference on radio astronomy from November 12 to 15, 2024, at the École Normale Supérieure in Paris. This event offers a platform to share scientific results, methodologies, and projects from French teams focused...
EXTRACT @SCEWC ’24: Big Data & AI for urban decision-making
The EXTRACT project will play an active role in the 2024 Smart City Expo World Congress, the biggest and most influential event for cities and urban innovation. Every year, leaders from global companies, governments and organizations participate to move cities...
EBDVF 2024: EXTRACT joins DataNexus for booth & session
The EXTRACT project will share the technology behind its solutions for harnessing extreme data across the compute continuum at the 2024 European Big Data Value Forum. We are joining six fellow innovative projects that make up the DataNexus cluster of EU-funded...