Climate Informatics 2024 Artefact Evaluation

This repository describes the process for artefact evaluation (AE) for full papers accepted at the Climate Informatics 2024 conference.

The Problem

Climate Informatics, like many other communities and fields, has software at its heart. Underlying most publications is a novel piece of software playing some critical role, e.g., embodying a model, processing or analysing data, or producing a visualisation.

In order for such software artefacts to have the most impact, they should be available, functional, and reusable, so that other researchers can benefit from the work, verify the claims of the paper, and build upon the software in their own research. These ideals are captured by the FAIR principles, originally formulated for data and equally applicable to software: research software should be Findable, Accessible, Interoperable, and Reusable (FAIR).

The practicalities of achieving FAIR software are non-trivial, and require resourcing for both authors and reviewers, as well as community consensus on requirements and standards.
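To make this concrete, here is a minimal, hypothetical sketch (in Python) of a self-check an author might run to see whether a repository carries some of the basic files that support FAIR software. The file list and function name are illustrative assumptions, not an official FAIR or AE requirement.

```python
# Hypothetical FAIR self-check: the file list below is illustrative only,
# not an official FAIR or Climate Informatics AE requirement.
from pathlib import Path

FAIR_FILES = {
    "README.md": "describes the software (Findable, Reusable)",
    "LICENSE": "states the terms of reuse (Reusable)",
    "CITATION.cff": "makes the software citable (Findable)",
    "environment.yml": "pins dependencies (Interoperable, Reusable)",
}


def fair_self_check(repo: Path = Path(".")) -> None:
    """Report which FAIR-supporting files are present in the repository."""
    for name, why in FAIR_FILES.items():
        status = "found" if (repo / name).exists() else "missing"
        print(f"{status:>7}: {name} - {why}")


if __name__ == "__main__":
    fair_self_check()
```

A check like this is only a starting point; genuine FAIRness also depends on the quality of documentation, metadata, and adherence to community standards.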

The Solution

In order to help promote FAIR software in our community, Climate Informatics is embarking, for the first time, on an optional Artefact Evaluation (AE) phase for accepted full paper submissions. These submissions will be published as the conference proceedings in Environmental Data Science following the traditional peer-review process.

AE provides an opportunity to embed the values of reproducibility into the publication process in a lightweight, opt-in fashion, thus encouraging authors to make their software available and the results of the paper reproducible. Submitted artefacts will be assessed by a skilled team of reviewers, who will work with authors to help them develop and share their materials with the highest practical level of computational reproducibility.
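To make "computational reproducibility" concrete, the sketch below (in Python) shows two common author-side habits: fixing random seeds so runs repeat exactly, and recording the software environment alongside the results. The function names and output file are hypothetical, not prescribed by the AE process.

```python
# Hypothetical reproducibility helpers; names and paths are illustrative.
import json
import platform
import random
import sys

import numpy as np  # assumed to be part of a typical analysis environment


def set_seeds(seed: int = 42) -> None:
    """Fix random seeds so repeated runs yield identical results."""
    random.seed(seed)
    np.random.seed(seed)


def record_provenance(path: str = "provenance.json") -> None:
    """Write interpreter, platform, and library versions next to the outputs."""
    info = {
        "python": sys.version,
        "platform": platform.platform(),
        "numpy": np.__version__,
    }
    with open(path, "w") as f:
        json.dump(info, f, indent=2)


if __name__ == "__main__":
    set_seeds()
    record_provenance()
    print("Seeded run complete; provenance written to provenance.json")
```

In practice, an author would pair habits like these with a pinned environment specification (e.g., a requirements or environment file) so that reviewers can recreate the run.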

What are we doing?

We have adopted the AE process of the Association for Computing Machinery (ACM) Artifact Review and Badging, Version 1.1, and adapted it to fit the specific context of the Climate Informatics community, with minimal changes so as to retain alignment with this well-accepted standard (we do not need to reinvent the wheel!). We have worked with our partners at Cambridge University Press to deliver this process in a way that complements their publication workflow and does not delay dissemination of the work.

The next stage is to design and deliver training and resources for authors, so that AE is a positive learning experience, and to recruit and train a committee of reviewers to undertake the evaluation work, with careful attention to the workload involved and the incentives for contributing their labour to this effort.

Take a look at published materials and decisions made to date:

  • process.md describes in detail the rationale, process, and evaluation criteria in place;
  • badges.md provides the checklist that will be used in review to assess whether artefacts are available, functional, and reusable;
  • Issues is where we are openly recording our decision-making processes.

What do we need?

We need authors who are keen to submit their artefacts for evaluation, and reviewers who would like to contribute to the growth of reproducibility in our community!

We will be recruiting reviewers in the coming weeks, and working with them to deliver the training and support they need to undertake this work. Take a look at the benefits for reviewers to understand how participating as a reviewer will be a valuable opportunity for you, and stay tuned on the Turing Environment and Sustainability Slack for invitations to join the review committee!

Who are we?

This work is being led by the Reproducibility working group of the Climate Informatics 2024 organisers. We would be pleased to connect with you if you would like to participate in the leadership or delivery of this work!

Contact us

Slack

Connect with us via the Turing Environment and Sustainability Slack - tag or DM Cassandra Gould van Praag (Turing Environment and Sustainability Senior Research Community Manager) or Alejandro Coca-Castro (CI2024 Reproducibility Chair).

Email

  • Cassandra Gould van Praag (Turing Environment and Sustainability Senior Research Community Manager): [email protected]
  • Alejandro Coca-Castro (CI2024 Reproducibility Chair): [email protected]

Acknowledgements and citation

The AE process is developed following the Association for Computing Machinery (ACM) Artifact Review and Badging, Version 1.1.

This repo and README follow the best practice for community participation of The Turing Way:

The Turing Way Community. (2022). The Turing Way: A handbook for reproducible, ethical and collaborative research (1.0.2). Zenodo. https://doi.org/10.5281/zenodo.7625728
