A national federated Science Data Platform for FAIR fundamental physics research at large research infrastructures
- Room: 80/1-001 - Globe of Science and Innovation - 1st Floor
- Speaker:
- Christiane Schneide (DESY, PUNCH4NFDI), Project Manager of PUNCH4NFDI. Christiane Schneide was a PhD student at Hamburg Observatory and received her PhD from the University of Hamburg in 2017. Afterwards she worked as a postdoc at Leuphana University Lüneburg. Since 2022 she has been working at DESY as project manager of PUNCH4NFDI and in this role regularly represents the consortium at scientific events as well as within the NFDI.
This demo shows a joint analysis of ATLAS and CMS Open Data on the Science Data Platform developed by PUNCH4NFDI, which provides access to federated compute and storage resources and a workflow engine via single sign-on. This makes complex particle physics analyses accessible with a very low entry barrier.
The objective of Open Data is, on the one hand, to build trust in science and scientists among the general public and, on the other hand, to enable the reuse of data for further analyses and better exploitation. Publishing Open Data alone, however, does not guarantee that the data can be reused: this also requires resources such as storage and compute, as well as appropriate software and metadata.
PUNCH4NFDI, a consortium within the German National Research Data Infrastructure (NFDI), is developing a Science Data Platform that integrates the resources and services needed to reuse data from astronomy, astrophysics, astroparticle physics, particle physics, and hadron and nuclear physics. Wherever possible, the consortium builds on the experience, existing services, and tools of the individual scientific communities, extending them or developing overarching layers where necessary.
In this demo we show how different elements of the Science Data Platform, provided and in part also developed by PUNCH4NFDI partner institutions in Germany, can be used, taking a joint particle physics analysis of ATLAS and CMS Open Data as an example. The analysis workflow is submitted via the REANA workflow engine, executed on Compute4PUNCH (federated compute resources), and reads the data, previously transformed into a common format, from Storage4PUNCH (federated storage resources).
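To give a rough idea of what a single analysis step in such a workflow might look like, the Python sketch below reads dimuon data from (hypothetical) ATLAS and CMS Open Data files in a shared format and histograms both with the same binning so they can be compared and combined. The storage URLs, tree name, and branch name are illustrative assumptions, not the actual PUNCH4NFDI configuration; the submission to REANA and the scheduling on Compute4PUNCH are handled outside this script by the workflow specification.

```python
"""Minimal sketch of one analysis step of a joint ATLAS/CMS Open Data workflow.
All paths, the tree name, and the branch name are hypothetical placeholders."""
import os
import numpy as np
import uproot  # reads ROOT files, the usual format for LHC open data

# Hypothetical locations of the pre-transformed Open Data on Storage4PUNCH;
# a real workflow would receive these as declared inputs.
INPUT_FILES = {
    "ATLAS": "root://storage4punch.example.org//opendata/atlas_dimuon.root",
    "CMS":   "root://storage4punch.example.org//opendata/cms_dimuon.root",
}


def dimuon_masses(url: str) -> np.ndarray:
    """Read the assumed common-format 'Events' tree and return the dimuon invariant mass."""
    with uproot.open(f"{url}:Events") as tree:
        arrays = tree.arrays(["mass_mumu"], library="np")
    return arrays["mass_mumu"]


def main() -> None:
    os.makedirs("results", exist_ok=True)
    # Identical binning for both experiments so the histograms can be
    # overlaid or summed directly in the joint analysis.
    bins = np.linspace(60.0, 120.0, 61)  # GeV, around the Z peak
    for experiment, url in INPUT_FILES.items():
        counts, _ = np.histogram(dimuon_masses(url), bins=bins)
        np.savetxt(f"results/{experiment.lower()}_mass_hist.csv", counts, fmt="%d")


if __name__ == "__main__":
    main()
```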