MODELS 2022
Sun 23 - Fri 28 October 2022 Montréal, Canada

About


MODELS will once again implement a separate evaluation process to assess the quality of the artifacts supporting the work presented in accepted papers. The purpose of the artifact evaluation process is to acknowledge the considerable effort required to obtain high-quality artifacts, to foster a culture of experimental reproducibility, and to provide a peer review and archiving process for artifacts analogous to that of research papers.

The goal of artifact archiving is to ensure that the artifacts stay available for a long time, that they can be located easily, and that they can be reused by other researchers. Additionally, archiving makes it possible to designate exactly the version of the artifact that was used to produce the research results.

We aim to assess the artifacts themselves and help improve them rather than evaluate the quality of the research linked to the artifact. This process assumes that the quality of the research has already been assessed and approved for MODELS by the respective program committees. Thus, the main goal of our review process is constructive: to improve the submitted artifacts, not to reject or filter them. An artifact evaluation rejection may happen if we determine that improving the artifact to sufficient quality is impossible in the given time frame, that the artifact is not consistent with the paper’s results, or that the artifact itself is not of sufficient relevance to the scope of the main research paper or to the MODELS community at large.

To summarize, a good artifact is:

  • Consistent with the paper
  • As complete as possible
  • Well-documented
  • Easy to (re)use
  • Publicly available and persisted

For more detailed information on the expectations for an artifact, please check the Submissions tab. Submission to the artifact evaluation committee is optional and the result of the artifact evaluation process will not influence the existing decision on already accepted papers.

Benefits


Potential badges: detailed explanations can be found on the ACM Artifact Review Badging website.

  • Artifacts Evaluated – Reusable (v1.1)
  • Artifacts Available (v1.1)
  • Results Reproduced (v1.1)
  • Results Replicated (v1.1)

Authors of papers with accepted artifacts will be invited to include an official ACM Artifact Evaluation badge on the first page of the camera-ready version of their paper. This badge explicitly communicates to the paper’s readers that the artifact has undergone the evaluation process described here.

Questions?


For any other questions, please e-mail us at models2022ae@easychair.org.

Important Dates AoE (UTC-12h)


  • Tuesday, July 12th 2022
    • Call for artifacts (corresponds to Technical Track notification)
  • Friday, July 22nd 2022
    • Artifact submission deadline
  • Monday, August 8th 2022
    • “Kick-the-tires” review and response deadline
  • Friday, August 19th 2022
    • Notification
  • Friday, August 26th 2022
    • Camera-ready (corresponds to Technical Track Camera Ready)

Submission


Together with your notification of acceptance, you will be invited by the AEC chairs to submit the artifacts related to your work. The invitation will contain detailed instructions on how to submit your work.

Expected contents and formats


A submission will comprise the following:

  • A preprint version of the accepted paper, in PDF format. This paper will be used to evaluate the consistency of the artifact with the paper.
  • A list of authors of the artifact, which may include people who are not authors of the paper but contributed to the artifact.
  • An abstract (250 words) providing a short description of the artifact, used for assigning artifacts to AEC members. This is likely to differ from the paper’s abstract.
  • An archive file (.tgz, .tar.gz, or .zip) containing everything needed to support a full evaluation of the artifact, including a README document (in .txt or .md format) with the following minimal information (a sketch is given after this list):
    • An overview of the contents of the archive.
    • A step-by-step setup / installation guide.
    • Step-by-step instructions on how to reproduce the experiments or any other activities supporting the conclusions in the paper.
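
To illustrate, a minimal README.md covering these three points might be structured as in the sketch below; the directory names, commands, and the image tag “artifact” are hypothetical placeholders, not requirements of the evaluation process:

    # Artifact for “<paper title>” (MODELS 2022)

    ## Contents
    - code/     prototype implementation described in the paper
    - data/     input data sets used in the evaluation
    - scripts/  scripts that reproduce the experiments

    ## Setup
    1. Install Docker (or import the provided VM image).
    2. Build the image: docker build -t artifact .

    ## Reproducing the results
    1. Run the experiments: docker run --rm artifact
    2. Compare the outputs written to results/ with the corresponding tables and figures in the paper.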

If an accepted paper has multiple artifacts (e.g. a prototype tool, a basic example, and a user study), please collect them into a single archive for submission.

Please ensure that your submission is as accessible to the AEC as possible by including some simple scenarios describing the intended use of the artifact. For example, in the case of a tool, provide a scenario with concrete steps, inputs, and the expected outputs. In general, we encourage authors to consider the artifact quality guidelines set out for MDE research by Damasceno and Strüber in their MODELS 2021 paper, especially the top 23 guidelines. A full list of MDE-specific guidelines can be found on the MDE Artifacts website.

If you are submitting a tool as an artifact, we strongly recommend providing a virtualized / containerized environment with everything needed. A Docker image is preferred, as the Dockerfile documents the necessary setup steps. Alternatively, a virtual machine image is acceptable (e.g. built with VirtualBox, VMware, or any solution that produces an Open Virtualization Format export). Other similar, widely available platforms are also acceptable.
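
As an illustration only, a Dockerfile for a small Python-based artifact might look like the sketch below; the base image, the dependency file (requirements.txt), and the entry-point script (run_experiments.py) are assumptions made for this example, not names prescribed by the evaluation process:

    # Hypothetical Dockerfile sketch; adapt the base image and the
    # installed packages to what your artifact actually needs.
    FROM ubuntu:22.04

    # Install the (assumed) toolchain required to run the experiments.
    RUN apt-get update && apt-get install -y --no-install-recommends \
            python3 python3-pip \
        && rm -rf /var/lib/apt/lists/*

    # Copy the artifact (README, code, data, scripts) into the image.
    WORKDIR /artifact
    COPY . /artifact

    # Install the declared Python dependencies (hypothetical file name).
    RUN pip3 install --no-cache-dir -r requirements.txt

    # Default command: reproduce the paper’s experiments
    # (run_experiments.py stands in for your actual entry point).
    CMD ["python3", "run_experiments.py"]

Such a Dockerfile doubles as setup documentation: building and running the image (docker build -t artifact . followed by docker run --rm artifact) records every step a reviewer needs to perform.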

Please use widely supported open formats in your artifacts. Here are some suggested formats:

  • Documents: PDF, HTML, Markdown, plain text.
  • Data: CSV, JSON.

Evaluation process


The submitted artifacts will be evaluated by the AEC against the following criteria (following the ACM “Artifacts Evaluated - Functional” badge):

  • Documented: At minimum, an inventory of artifacts is included, and sufficient description provided to enable the artifacts to be exercised.
  • Consistent: The artifacts are relevant to the associated paper, and contribute in some inherent way to the generation of its main results.
  • Complete: To the extent possible, all components relevant to the paper in question are included. (Proprietary artifacts need not be included. If they are required to exercise the package then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
  • Exercisable: Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.

Each submitted artifact will be evaluated by at least two AEC members and treated with the same confidentiality as the accepted paper.

Artifacts that pass the evaluation will receive the above-mentioned “Artifacts Evaluated - Functional” badge and will be invited for inclusion in the electronic conference proceedings. High-quality artifacts that exceed these expectations, and that are so carefully documented and well-structured that reuse and repurposing are facilitated, will receive an “Artifacts Evaluated - Reusable” badge.

In addition, if the artifact has been placed in a publicly accessible archival repository (e.g. Zenodo) with a persistent DOI or URL, it will receive the “Artifacts Available” badge. If the main results have been obtained by a person or team other than the authors, the “Results Reproduced” and “Results Replicated” badges will be considered.

For more detailed information on the badges, please visit the ACM Artifact Review Badging website.

The process will consist of two steps:

  • Kick-the-tires: reviewers will check the integrity of the artifact and look for any setup problems that may impede their fair evaluation (e.g. crashes or corrupted files). If there are problems, authors will be notified and will be given several days to respond and solve the issues.
  • Artifact assessment: reviewers will decide if the artifact meets the conditions for the ACM “Artifacts Evaluated - Functional v1.1” badge. If so, the reviewers will also consider whether the artifact meets the conditions for the other ACM badges.