MSR 2019
Sun 26 - Mon 27 May 2019 Montreal, QC, Canada
co-located with ICSE 2019

The International Conference on Mining Software Repositories (MSR) has hosted a mining challenge since 2006. With this challenge, we call upon everyone interested to apply their tools to a common dataset. The challenge is for researchers and practitioners to bravely use their mining tools and approaches on a dare.

The important dates for the Mining Challenge are:

  • Abstracts due: February 1, 2019 (AOE)

  • Papers due: February 6, 2019 (AOE)

  • Author notification: March 1, 2019 (AOE)

  • Camera-ready: March 15, 2019 (AOE)

Please see the Call for Mining Challenge Papers for all details.

Dates
Sun 26 May 2019
Tracks
MSR Data Showcase
MSR Keynote
MSR Mining Challenge
MSR Paper Presentations
MSR Technical Papers

Sun 26 May

Displayed time zone: Eastern Time (US & Canada)

16:00 - 18:00
Mining Challenge presentations (MSR 2019 Mining Challenge) at Place du Canada
16:00
10m
Talk
SOTorrent: Studying the Origin, Evolution, and Usage of Stack Overflow Code Snippets
MSR 2019 Mining Challenge
Sebastian Baltes University of Trier, Christoph Treude The University of Adelaide, Stephan Diehl Computer Science, University of Trier, Germany
Pre-print
16:10
7m
Talk
Mining Rule Violations in JavaScript Code Snippets
MSR 2019 Mining Challenge
Pre-print
16:17
7m
Talk
Snakes in Paradise?: Insecure Python-related Coding Practices in Stack Overflow
MSR 2019 Mining Challenge
Akond Rahman North Carolina State University, Effat Farhana, Nasif Imtiaz North Carolina State University
Pre-print
16:24
7m
Talk
Man vs Machine -- A Study into language identification of Stackoverflow code snippets
MSR 2019 Mining Challenge
Jens Dietrich Victoria University of Wellington, Markus Luczak-Roesch, Elroy Dalefield
Pre-print
16:31
7m
Talk
Python Coding Style Compliance on Stack Overflow
MSR 2019 Mining Challenge
Nikolaos Bafatakis, Niels Boecker, Wenjie Boon, Martin Cabello Salazar, Jens Krinke University College London, Gazi Oznacar, Robert White University College London, UK
Pre-print Media Attached
16:38
7m
Talk
Towards Mining Answer Edits to Extract Evolution Patterns in Stack Overflow
MSR 2019 Mining Challenge
Themistoklis Diamantopoulos Electrical and Computer Engineering Dept, Aristotle University of Thessaloniki, Maria-Ioanna Sifaki Electrical and Computer Engineering Dept, Aristotle University of Thessaloniki, Andreas Symeonidis Aristotle University of Thessaloniki
Pre-print Media Attached
16:45
7m
Talk
Analyzing Comment-induced Updates on Stack Overflow
MSR 2019 Mining Challenge
Abhishek Soni, Sarah Nadi University of Alberta
Pre-print
16:52
7m
Talk
What Edits Are Done on Highly Answered Stack Overflow Questions? An Empirical Study
MSR 2019 Mining Challenge
Xianhao Jin Virginia Tech, USA, Francisco Servant Virginia Tech
Pre-print
16:59
7m
Talk
Can Duplicate Posts on Stack Overflow Benefit the Software Development Community?
MSR 2019 Mining Challenge
Durham Abric McGill University, Oliver Clark, Matthew Caminiti, Keheliya Gallaba McGill University, Shane McIntosh McGill University
Pre-print
17:06
7m
Talk
How Often and What StackOverflow Posts Do Developers Reference in Their GitHub Projects?
MSR 2019 Mining Challenge
Saraj Singh Manes, Olga Baysal Carleton University
Pre-print
17:13
7m
Talk
Characterizing Duplicate Code Snippets between Stack Overflow and Tutorials
MSR 2019 Mining Challenge
Manziba Nishi, Agnieszka Ciborowska, Kostadin Damevski Virginia Commonwealth University
Pre-print
17:20
7m
Talk
Challenges with Responding to Static Analysis Tool Alerts
MSR 2019 Mining Challenge
Nasif Imtiaz North Carolina State University, Akond Rahman North Carolina State University, Effat Farhana, Laurie Williams North Carolina State University
Pre-print
17:27
7m
Talk
Impact of stack overflow code snippets on software cohesion: a preliminary study
MSR 2019 Mining Challenge
DOI Pre-print
17:34
7m
Talk
We Need to Talk about Microservices: an Analysis from the Discussions on StackOverflow
MSR 2019 Mining Challenge
Alan Bandeira, Carlos Filho, Matheus Paixao State University of Ceara, Brazil, Paulo Maia State University of Ceará
Pre-print Media Attached
17:41
7m
Talk
What do developers know about machine learning: a study of ML discussions on StackOverflow
MSR 2019 Mining Challenge
Hareem-e-Sahar, Abdul Ali Bangash University of Alberta, Canada, Alexander William Wong, Shaiful Chowdhury University of Alberta, Abram Hindle University of Alberta, Karim Ali University of Alberta
17:48
12m
Recap + voting
MSR 2019 Mining Challenge

Accepted Papers

Title
Analyzing Comment-induced Updates on Stack Overflow
MSR 2019 Mining Challenge
Pre-print
Can Duplicate Posts on Stack Overflow Benefit the Software Development Community?
MSR 2019 Mining Challenge
Pre-print
Challenges with Responding to Static Analysis Tool Alerts
MSR 2019 Mining Challenge
Pre-print
Characterizing Duplicate Code Snippets between Stack Overflow and Tutorials
MSR 2019 Mining Challenge
Pre-print
How Often and What StackOverflow Posts Do Developers Reference in Their GitHub Projects?
MSR 2019 Mining Challenge
Pre-print
Impact of stack overflow code snippets on software cohesion: a preliminary study
MSR 2019 Mining Challenge
DOI Pre-print
Man vs Machine -- A Study into language identification of Stackoverflow code snippets
MSR 2019 Mining Challenge
Pre-print
Mining Rule Violations in JavaScript Code Snippets
MSR 2019 Mining Challenge
Pre-print
Python Coding Style Compliance on Stack Overflow
MSR 2019 Mining Challenge
Pre-print Media Attached
Snakes in Paradise?: Insecure Python-related Coding Practices in Stack Overflow
MSR 2019 Mining Challenge
Pre-print
SOTorrent: Studying the Origin, Evolution, and Usage of Stack Overflow Code Snippets
MSR 2019 Mining Challenge
Pre-print
Towards Mining Answer Edits to Extract Evolution Patterns in Stack Overflow
MSR 2019 Mining Challenge
Pre-print Media Attached
We Need to Talk about Microservices: an Analysis from the Discussions on StackOverflow
MSR 2019 Mining Challenge
Pre-print Media Attached
What do developers know about machine learning: a study of ML discussions on StackOverflow
MSR 2019 Mining Challenge
What Edits Are Done on Highly Answered Stack Overflow Questions? An Empirical Study
MSR 2019 Mining Challenge
Pre-print

Call for Mining Challenge Papers

This year, the challenge is about mining SOTorrent, a dataset providing the version history of Stack Overflow posts at the level of whole posts and individual text and code blocks. Moreover, the dataset connects Stack Overflow posts to other platforms by aggregating URLs from text blocks and comments, and by collecting references from GitHub files to Stack Overflow posts. Analyses can be based on SOTorrent alone or expanded to also include data from other resources such as GHTorrent. The overall goal is to study the origin, evolution, and usage of Stack Overflow code snippets. Questions that are, to the best of our knowledge, not sufficiently answered yet include:

  • How are code snippets on Stack Overflow maintained?
  • How many clones of code snippets exist inside Stack Overflow?
  • How can we detect buggy versions of Stack Overflow code snippets and find them in GitHub projects?
  • How frequently are code snippets copied from external sources into Stack Overflow, and do they then co-evolve there?
  • How do snippets copied from Stack Overflow to GitHub co-evolve?
  • Does the evolution of Stack Overflow code snippets follow patterns?
  • Do these patterns differ between programming languages?
  • Are the licenses of external sources compatible with Stack Overflow’s license (CC BY-SA 3.0)?
  • How many code blocks on Stack Overflow do not contain source code (and are only used for markup)?
  • Can we reliably predict bug-fixing edits to code on Stack Overflow?
  • Can we reliably predict popularity of Stack Overflow code snippets on GitHub?

These are just some of the questions that could be answered using SOTorrent. We encourage challenge participants to adapt the above questions or formulate their own research questions about the origin, evolution, and usage of content on Stack Overflow.
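As an illustration, the first question (how code snippets on Stack Overflow are maintained) could be approached by counting how many versions each code block has accumulated. The following Python sketch queries the public SOTorrent dataset on Google BigQuery; the release name (2018_12_09) and the table and column names (PostBlockVersion, PostBlockTypeId, RootPostBlockVersionId) follow a recent SOTorrent release and should be checked against the current database layout on the project page.

# Sketch: how often are Stack Overflow code blocks edited?
# Assumes access to the public SOTorrent dataset on Google BigQuery.
# Release, table, and column names follow a recent SOTorrent release
# (e.g., sotorrent-org.2018_12_09) and may differ for newer releases.
from google.cloud import bigquery

client = bigquery.Client()  # requires a Google Cloud project with BigQuery enabled

QUERY = """
SELECT
  RootPostBlockVersionId,
  COUNT(*) AS version_count        -- number of versions of this code block
FROM `sotorrent-org.2018_12_09.PostBlockVersion`
WHERE PostBlockTypeId = 2          -- 2 = code block, 1 = text block
GROUP BY RootPostBlockVersionId
"""

edit_counts = client.query(QUERY).to_dataframe()
print(edit_counts["version_count"].describe())  # distribution of edits per code block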

How to Participate in the Challenge

First, familiarize yourself with the SOTorrent dataset:

  • Read our MSR 2018 paper about SOTorrent and the preprint of our mining challenge proposal, which contains exemplary queries.
  • Study the project page of SOTorrent, which includes the most recent database layout and links to the online and download versions of the dataset.
  • Create a new issue here in case you have problems with the dataset or want to suggest ideas for improvements.

Then, use the dataset to answer your research questions, report your findings in a four-page challenge paper (see information below), submit your abstract before February 1, 2019, and your final paper before February 6, 2019. If your paper is accepted, present your results at MSR 2019 in Montreal, Canada!
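For questions about the usage of snippets outside Stack Overflow, the table linking GitHub files to Stack Overflow posts (PostReferenceGH in recent releases) is a natural starting point. The sketch below assumes that table has been exported to a local CSV file from the downloaded dataset; the file name and column names (PostReferenceGH.csv, PostId, Repo) are assumptions based on a recent release and may differ in newer ones.

# Sketch: which Stack Overflow posts are referenced most often from GitHub files?
# Assumes PostReferenceGH.csv from a downloaded SOTorrent release in the
# working directory; column names (PostId, Repo) may differ in newer releases.
import pandas as pd

refs = pd.read_csv("PostReferenceGH.csv")  # one row per GitHub file -> SO post reference

top_posts = (
    refs.groupby("PostId")
        .agg(references=("Repo", "size"), distinct_repos=("Repo", "nunique"))
        .sort_values("references", ascending=False)
        .head(20)
)
print(top_posts)  # most-referenced posts and how many distinct repositories cite them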

Submission

A challenge paper should describe the results of your work by providing an introduction to the problem you address and why it is worth studying, the version of the dataset you used, the approach and tools you used, your results and their implications, and conclusions. Make sure your report highlights the contributions and the importance of your work. See also our open science policy regarding the publication of software and additional data you used for the challenge.

Challenge papers must not exceed 4 pages, plus 1 additional page containing only references, and must conform to the MSR 2019 format and submission guidelines. Each submission will be reviewed by at least three members of the program committee. Submissions should follow the IEEE Conference Proceedings Formatting Guidelines, with title in 24pt font and full text in 10pt type. LaTeX users must use \documentclass[10pt,conference]{IEEEtran} without including the compsoc or compsocconf option.

IMPORTANT: The mining challenge track of MSR 2019 follows the double-blind submission model. Submissions should not reveal the identity of the authors in any way. This means that authors should:

  • leave out author names and affiliations from the body and metadata of the submitted pdf
  • ensure that any citations to related work by themselves are written in the third person, for example “the prior work of XYZ” as opposed to “our prior work [2]”
  • not refer to their personal, lab or university website; similarly, care should be taken with personal accounts on github, bitbucket, Google Drive, etc.
  • not upload unblinded versions of their paper to archival websites during bidding/reviewing; however, uploading unblinded versions prior to submission is allowed and sometimes unavoidable (e.g., for a thesis)

Authors who have further questions about double-blind reviewing are encouraged to contact the Mining Challenge Chairs via email.

Papers must be submitted electronically through EasyChair, should not have been published elsewhere, and should not be under review or submitted for review elsewhere for the duration of consideration. ACM plagiarism policy and procedures shall be followed for cases of double submission. The submission must also comply with the IEEE Policy on Authorship.

Upon notification of acceptance, all authors of accepted papers will be asked to complete a copyright form and will receive further instructions for preparing their camera ready versions. At least one author of each accepted paper is expected to register and present the results at MSR 2019 in Montreal, Canada. All accepted contributions will be published in the electronic conference proceedings.

The official publication date is the date the proceedings are made available in the ACM or IEEE Digital Libraries. This date may be up to two weeks prior to the first day of ICSE 2019. The official publication date affects the deadline for any patent filings related to the published work. The purchase of additional pages in the proceedings is not allowed.

If you use the SOTorrent dataset, please cite our challenge proposal:

@inproceedings{msr2019challenge,
  title={SOTorrent: Studying the Origin, Evolution, and Usage of Stack Overflow Code Snippets},
  author={Baltes, Sebastian and Treude, Christoph and Diehl, Stephan},
  year={2019},
  booktitle={Proceedings of the 16th International Conference on Mining Software Repositories (MSR 2019)},
  preprint={http://empirical-software.engineering/assets/pdf/msr19-sotorrent.pdf}
}

Important Dates

Abstracts due: February 1, 2019 (AOE)

Papers due: February 6, 2019 (AOE)

Author notification: March 1, 2019 (AOE)

Camera-ready: March 15, 2019 (AOE)

Open Science Policy

Openness in science is key to fostering progress via transparency, reproducibility and replicability. Our steering principle is that all research output should be accessible to the public and that empirical studies should be reproducible. In particular, we actively support the adoption of open data and open source principles. To increase reproducibility and replicability, we encourage all contributing authors to disclose:

  • the source code of the software they used to retrieve and analyze the data
  • the (anonymized and curated) empirical data they retrieved in addition to the SOTorrent dataset
  • a document with instructions for other researchers describing how to reproduce or replicate the results

Already upon submission, authors can privately share their anonymized data and software on preserved archives such as Zenodo or Figshare (tutorial available here). Zenodo accepts up to 50GB per dataset (more upon request). There is no need to use Dropbox or Google Drive. After acceptance, data and software should be made public so that they receive a DOI and become citable. Zenodo and Figshare accounts can easily be linked with GitHub repositories to automatically archive software releases. In the unlikely case that authors need to upload terabytes of data, Archive.org may be used.

We encourage authors to self-archive pre- and postprints of their papers in open, preserved repositories such as arXiv.org. This is legal and allowed by all major publishers, including ACM and IEEE, and it lets anybody in the world reach your paper. Note that you are usually not allowed to self-archive the PDF of the published article (that is, the publisher proof or the Digital Library version).

Please note that the success of the open science initiative depends on the willingness (and possibilities) of authors to disclose their data and that all submissions will undergo the same review process independent of whether or not they disclose their analysis code or data. We encourage authors who cannot disclose industrial or otherwise non-public data, for instance due to non-disclosure agreements, to provide an explicit (short) statement in the paper.

Best Mining Challenge Paper Award

As mentioned above, all submissions will undergo the same review process independent of whether or not they disclose their analysis code or data. However, only accepted papers for which code and data are available on preserved archives, as described in the open science policy, will be considered by the program committee for the best mining challenge paper award.

Best Student Presentation Award

As in previous years, there will be a public vote during the conference to select the best mining challenge presentation. This award often goes to authors of compelling work who present an engaging story to the audience. To increase student involvement, starting with MSR 2019, only students can compete for this award.

Organization

Sebastian Baltes, University of Trier, Germany

Christoph Treude, The University of Adelaide, Australia

Stephan Diehl, University of Trier, Germany
