MODELS 2023
Sun 1 - Fri 6 October 2023 Västerås, Sweden

About

MODELS will once again implement a separate evaluation process to assess the quality of the artifacts supporting the work presented in accepted papers. The purpose of the artifact evaluation process is to acknowledge the considerable effort required to obtain high-quality artifacts, to foster a culture of experimental reproducibility, and to provide a peer review and archiving process for artifacts analogous to that of research papers. The goal of artifact archiving is to ensure that the artifacts stay available for a long time, that they can be located easily, and can be reused by other researchers. Additionally, archiving allows designating exactly the version of the artifact that was used to produce the research results.

We focus on assessing the artifacts themselves and helping to improve them rather than evaluating the quality of the research linked to the artifact. This process assumes that the quality of the research has already been assessed and approved for MODELS by the respective program committees. Thus, the main goal of our review process is constructive: to improve the submitted artifacts, not to reject or filter them. An artifact evaluation rejection may happen if we determine that improving the artifact to sufficient quality is impossible in the given time frame, that the artifact is not consistent with the paper’s results, or that the artifact itself is not of sufficient relevance to the scope of the main research paper or to the MODELS community at large.

To summarize, a good artifact is:

  • Consistent with the paper
  • As complete as possible
  • Well-documented
  • Easy to (re)use
  • Publicly available and archived

Benefits

The ACM badges fall into three categories:

  • Artifacts Evaluated: Functional, Reusable
  • Artifacts Available: Available
  • Results Validated: Reproduced, Replicated

  • Artifacts Evaluated – Functional: The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.

  • Artifacts Evaluated – Reusable: The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.

  • Artifacts Available: Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.

  • Results Validated – Reproduced: The main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts.

  • Results Validated – Replicated: The main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the author.

Important dates

  • Monday, June 26th, 2023 - Call for artifacts (corresponds to Technical Track notification)
  • Wednesday, July 5th, 2023 - Artifact submission deadline
  • Saturday, July 22nd, 2023 - “Kick-the-tires”: 1st review period (PC only)
  • Monday, July 24th - Wednesday, July 26th, 2023 - AE rebuttal: 2nd review period (PC/authors discussion)
  • Wednesday, August 2nd, 2023 - Notification to authors
  • Wednesday, August 9th, 2023 - Camera ready (including AEC badges)

Submission process

Once your paper has been accepted for MODELS 2023, the AEC chairs will invite you to submit the artifacts related to your work. This invitation will contain detailed instructions on how to submit your artifacts.

For the Reusable and Available badges, authors must provide download information showing how reviewers can access and, where appropriate, easily execute their artifact. The authors need to make the packaged artifact (installation package or simple archive) available so that the Evaluation Committee can access it. We suggest a link to a public repository or to a single archive file in a widely available archive format.

If the authors are aiming for the “available” badge and beyond, the artifact needs to be publicly accessible, and a DOI is required. In other cases, the artifacts do not necessarily have to be publicly accessible for the review process; instead, the authors are asked to provide a private link or a password-protected link. In either case, we encourage authors to ensure that artifacts can be accessed with the link alone (e.g., no registration is necessary). Note that GitHub/GitLab are not archival repositories, as required by ACM; however, a GitHub repository can be archived on Zenodo with little effort: https://docs.github.com/en/repositories/archiving-a-github-repository/referencing-and-citing-content
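
For authors who prefer to script the deposit, the sketch below uses Zenodo’s REST deposit API from Python. It is a minimal sketch, not part of the official submission process: the ZENODO_TOKEN environment variable, the artifact.zip file name, and the metadata values are all placeholders to adapt.

    import os
    import requests

    BASE = "https://zenodo.org/api/deposit/depositions"
    token = {"access_token": os.environ["ZENODO_TOKEN"]}  # placeholder env variable

    # Create an empty deposition; Zenodo pre-reserves a DOI for it.
    dep = requests.post(BASE, params=token, json={}).json()
    print("Reserved DOI:", dep["metadata"]["prereserve_doi"]["doi"])

    # Upload the packaged artifact into the deposition's file bucket.
    with open("artifact.zip", "rb") as fp:  # placeholder archive name
        requests.put(dep["links"]["bucket"] + "/artifact.zip", data=fp, params=token)

    # Attach minimal metadata, then publish to make the DOI resolvable.
    meta = {"metadata": {"title": "MODELS 2023 artifact",
                         "upload_type": "software",
                         "description": "Artifact accompanying the accepted paper.",
                         "creators": [{"name": "Doe, Jane"}]}}
    requests.put(BASE + "/" + str(dep["id"]), params=token, json=meta)
    requests.post(BASE + "/" + str(dep["id"]) + "/actions/publish", params=token)

The same deposit can, of course, be created entirely through the Zenodo web interface or the GitHub integration linked above.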

The authors need to write and submit documentation explaining how to obtain the artifact package, how to unpack it, how to get started, and how to use the artifacts in more detail. The artifact submission should describe only the technicalities of the artifacts and those uses of the artifact that are not already described in the paper.

The submission should contain the following documents (in plain text or PDF format) in a zip archive (a small packaging sketch follows the list):

  • A README main file describing what the artifact does and where it can be obtained (with hidden links and access password if necessary). It should also clearly describe how to repeat/replicate/reproduce the results presented in the paper. Artifacts that focus on data should, in principle, cover aspects relevant to understanding the context, data provenance, ethical and legal statements (where relevant), and storage requirements. Artifacts that focus on software should, in principle, cover aspects relevant to how to install and use it (and be accompanied by a small example).
  • A REQUIREMENTS file for artifacts that focus on software. This file should, in principle, cover hardware environment requirements (e.g., performance, storage, or non-commodity peripherals) and software environments (e.g., Docker, VM, and operating system), but also, if relevant, a requirements.txt with explicit versioning information (e.g., for Python-only environments). Any deviation from standard environments needs to be reasonably justified. It is strongly recommended to make the execution as smooth as possible for the artifact evaluation committee.
  • A STATUS file stating what kind of badge(s) the authors are applying for as well as the reasons why the authors believe that the artifact deserves that badge(s).
  • A LICENSE file describing the distribution rights. Note that to score “available” or higher, the license needs to be some form of open-source license. Details are also under the respective badges and the open science policies as adopted by ACM SIGSOFT.
  • A copy of the accepted paper in PDF format.
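
As referenced above, a small self-check before packaging can catch missing files early. The sketch below is a hypothetical Python helper: it verifies that the documents listed above are present, warns about unpinned entries in a Python-style REQUIREMENTS file, and bundles everything into a zip archive. The names paper.pdf and submission.zip are illustrative, not prescribed by the call.

    import sys
    import zipfile
    from pathlib import Path

    # The documents required by the call; paper.pdf is an illustrative name.
    REQUIRED = ["README", "REQUIREMENTS", "STATUS", "LICENSE", "paper.pdf"]

    missing = [name for name in REQUIRED if not Path(name).exists()]
    if missing:
        sys.exit("Missing required files: " + ", ".join(missing))

    # Warn about unpinned dependencies (e.g., "numpy" instead of "numpy==1.24.3").
    for line in Path("REQUIREMENTS").read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "==" not in line:
            print("Warning: dependency not pinned to an exact version:", line)

    # Bundle everything into the submission archive.
    with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for name in REQUIRED:
            zf.write(name)
    print("Wrote submission.zip")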

Evaluation Process

Each submitted artifact will be evaluated by at least two members of the AEC. The artifacts will be treated as confidential, just like the submitted paper. The evaluation consists of two steps:

  • Kicking-the-tires: Reviewers will check the artifact’s integrity and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that won’t start, or immediate crashes on the simplest example). In case of any problems, authors will be given 4 days to read and respond to the kick-the-tires reports on their artifacts and to solve any issues preventing the artifact evaluation (a small self-check sketch follows this list).
  • Artifact assessment: Reviewers evaluate the artifacts and decide on the approval of the artifact.
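
As referenced above, a quick self-check can catch many kick-the-tires problems before submission. The sketch below is a minimal Python example, assuming the archive is named submission.zip (an illustrative name); the checksum step is simply a convenient way to let reviewers confirm they received the same bytes, not an official requirement.

    import hashlib
    import zipfile

    ARCHIVE = "submission.zip"  # illustrative name

    # testzip() re-reads every member and returns the first corrupted one, if any.
    with zipfile.ZipFile(ARCHIVE) as zf:
        bad = zf.testzip()
        if bad is not None:
            raise SystemExit("Corrupted member: " + bad)
        print("All members readable:", ", ".join(zf.namelist()))

    # Record a checksum of the archive itself.
    with open(ARCHIVE, "rb") as fp:
        print("SHA-256:", hashlib.sha256(fp.read()).hexdigest())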

As the artifact evaluation notification will come after the camera-ready deadline, we will ensure that the published article carries the corresponding ACM artifact evaluation badge(s). Moreover, we advise authors to provide a stable link to their artifact already in the camera-ready version, for instance, a DOI link to a Zenodo record.