Convergent data-driven workflows for open radiation calculations: an exportable methodology to any field
- Authors:
Núñez Chongo, Osiris
Asorey, Hernán
Rubio Montero, Antonio Juan
Suárez Durán, Mauricio
Mayo García, Rafael
Carretero, Manuel
- Resource type:
- Research article
- Publication date:
- 2025
- Institution:
- Corporación Universidad de la Costa
- Repository:
- REDICUC - Repositorio CUC
- Language:
- eng
- OAI Identifier:
- oai:repositorio.cuc.edu.co:11323/14156
- Online access:
- https://hdl.handle.net/11323/14156
https://repositorio.cuc.edu.co/
- Keywords:
- Astroparticles
Cloud
Convergence
FAIR
HPC-HTC
Radiation doses
Radiation therapies
- Rights
- closedAccess
- License
- Attribution 4.0 International (CC BY 4.0)
Summary: The fast worldwide growth of linkable scientific datasets poses significant challenges for their management and reuse. Large experiments, such as the Latin American Giant Observatory, generate volumes of data that can benefit other kinds of studies. In this sense, there is a modular ecosystem of external radiation tools that should harvest and supply datasets without being part of the main pipeline. Workflows for personal dose estimation, muography in volcanology or mining, or aircraft dose calculations are built under different privacy policies and exploitation licenses. Each numerical method has its own requirements, and only some parts can make use of the Collaboration's resources, which implies convergence with other computing infrastructures. Our work focuses on developing an agnostic methodology that addresses these challenges while promoting open science. By encapsulating software in nested containers, where the inner layers carry out specific standardization slices and calculations, the outer wrapper compiles the generated data and metadata and publishes them. All this allows researchers to build a data-driven computing continuum that complies with the findable, accessible, interoperable, and reusable (FAIR) principles. The approach has been successfully tested in the computationally demanding field of radiation-matter interaction with humans, showing its orchestration with the regular pipeline for diverse applications. Moreover, it has been integrated into public and federated cloud environments as well as into local clusters and personal computers to ensure the portability and scalability of the simulations. We postulate that this successful use case can be adapted to any other field.
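The wrapper-around-inner-container pattern described in the summary can be pictured with a minimal sketch. This is not the LAGO Collaboration's actual pipeline: the image name, working directory, `*.dat` output pattern, and metadata fields below are all hypothetical, and Docker stands in for whatever container engine (e.g., Apptainer) a site provides. The sketch only illustrates the idea that an outer layer runs the inner calculation, then compiles and publishes a FAIR-style metadata record for the outputs.

```python
#!/usr/bin/env python3
"""Illustrative wrapper layer: run an inner calculation container,
then compile and publish metadata for its outputs.

All names, paths, and fields are assumptions for illustration only."""

import hashlib
import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path

INNER_IMAGE = "registry.example.org/radiation/dose-slice:latest"  # hypothetical image
WORKDIR = Path("run-001")


def run_inner_container() -> None:
    """Launch the inner container that performs one standardization
    slice or calculation, mounting the working directory as /data."""
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{WORKDIR.resolve()}:/data",
         INNER_IMAGE],
        check=True,
    )


def compile_metadata(outputs: list[Path]) -> dict:
    """Build a findable/reusable record: checksums make the datasets
    verifiable, timestamps and image tags make their provenance traceable."""
    return {
        "producedBy": INNER_IMAGE,
        "createdAt": datetime.now(timezone.utc).isoformat(),
        "license": "CC-BY-4.0",
        "files": [
            {"name": p.name,
             "sha256": hashlib.sha256(p.read_bytes()).hexdigest(),
             "bytes": p.stat().st_size}
            for p in outputs
        ],
    }


def publish(record: dict) -> None:
    """Stand-in for publication: write a JSON sidecar that a repository
    harvester (e.g., via OAI-PMH) could later pick up."""
    (WORKDIR / "metadata.json").write_text(json.dumps(record, indent=2))


if __name__ == "__main__":
    WORKDIR.mkdir(exist_ok=True)
    run_inner_container()
    publish(compile_metadata(sorted(WORKDIR.glob("*.dat"))))
```

Because the wrapper only sees the inner container's mounted outputs, the same outer layer can, in principle, orchestrate any inner calculation and run unchanged on a laptop, a local cluster, or a cloud deployment, which is the portability argument the summary makes.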