Link electronic laboratory notebooks (ELNs) and automatic metadata capture to data collection, adding developments that address the needs of users and instruments in a technically advanced and efficient manner, with the aim of collecting information in a structured, digital form early in the data-generation process.
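A minimal sketch of what "structured, digital metadata captured at collection time" can mean in practice. The schema and field names below (instrument, sample, settings) are illustrative assumptions, not a prescribed standard; real deployments would follow an agreed community schema.

```python
import json
from datetime import datetime, timezone

def capture_metadata(instrument, sample_id, settings):
    """Bundle instrument settings and experimental context into one
    structured record at collection time (hypothetical schema)."""
    return {
        "instrument": instrument,          # assumed field names for illustration
        "sample_id": sample_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "settings": settings,
    }

# Hypothetical beamline identifiers, for illustration only.
meta = capture_metadata("beamline-P08", "S-0042",
                        {"energy_keV": 12.4, "exposure_s": 0.5})
print(json.dumps(meta, indent=2))
```

Because the record is machine-readable from the start, it can flow directly into an ELN entry or catalogue without manual transcription.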
Create repositories of processed (reduced) data and analysis code to accompany each publication, in order to maximise data reuse and transparency. Also implement structured databases and catalogues for user experiments, samples, experimental data and calculations, so that information can be searched and found before, during and after the experiments.
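The kind of search such catalogues enable can be sketched with a toy in-memory example. The records and field names (proposal, technique, sample) are invented for illustration; a production catalogue would sit behind a database and an API.

```python
# Toy experiment catalogue; entries and fields are assumptions for illustration.
records = [
    {"proposal": "P-001", "technique": "SAXS", "sample": "lysozyme"},
    {"proposal": "P-002", "technique": "XRD", "sample": "perovskite"},
]

def search(catalogue, **criteria):
    """Return records whose fields match all of the given criteria."""
    return [r for r in catalogue
            if all(r.get(key) == value for key, value in criteria.items())]

matches = search(records, technique="SAXS")
print(matches)
```

Structured records make such field-based queries trivial; the same search over free-text notes would be unreliable, which is why the catalogue must be populated early and consistently.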
Create, curate and foster analysis software that can be deployed on ‘cloud-like’ services so that ordinary users can repeat and benefit from the work of power users, and so that the analysis of ‘big data’ becomes technically simple, reproducible and sustainable, including access to machine-learning strategies.
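One ingredient of reproducible, repeatable analysis is recording provenance alongside each result. The sketch below, with an invented toy reduction step, shows the idea: every run returns not just its output but a record of the parameters, input checksum and environment needed to repeat it.

```python
import hashlib
import json
import platform

def run_analysis(data, params):
    """Toy reduction step (a mean) standing in for a real analysis
    pipeline, returning the result together with a provenance record."""
    result = sum(data) / len(data)
    provenance = {
        "params": params,
        # Checksum of the input lets a later run verify it saw the same data.
        "input_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
        "python": platform.python_version(),
    }
    return result, provenance

result, prov = run_analysis([1.0, 2.0, 3.0], {"method": "mean"})
print(result, prov["input_sha256"][:8])
```

Deployed on a cloud service, such self-describing runs let an ordinary user rerun a power user's analysis and check that the inputs and parameters match.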
Develop a common data policy between users and large-scale facilities which addresses common needs such as data curation, archiving standards and embargo policies. Align data policies and standards also at the European level through cooperation with our European partners. Develop and promote efficient data flows and metadata definitions in agreement with other communities and NFDI consortia which enable and foster the reuse of all photon and neutron data in the NFDI.
Establish and enhance awareness of FAIR principles and of the need for research data management in our community, especially within university curricula.
Manage and oversee the development and curation of software packages. Manage and coordinate the financial and organisational aspects of the consortium.