The SBSE Challenge Track, held in coordination with the SSBSE Symposium, is an exciting opportunity for SBSE researchers to demonstrate the efficacy of their methods on real-world software. Participants target one of a selected suite of open source development projects and apply SBSE methods; the principal criterion is to produce interesting results.
Entrants will compete for cash prizes totaling €1000. We thank the CREST Centre for its generous sponsorship of this competition.
The deadline for submissions is May 25th 2017.
The winner will be announced at SSBSE 2017, which is co-located with FSE and will take place in Paderborn, Germany, September 9-11th 2017.
In order to participate, you should:
It is not mandatory for submissions to the SBSE Challenge track to implement a new tool or technique. However, we do expect that applying your existing or new tools/techniques to one of the challenge programs leads to practically interesting results.
The criteria for paper acceptance are the following:
Participants are invited to investigate and report upon one of the following open source projects. You are free to focus on any particular version or to compare different versions; you may also choose to analyse, test, improve, or apply any other SBSE-based activities to parts or the whole of a project, including source code, documentation, or any other related resources (bug database, versioning histories, online discussions, etc.) that are freely available online.
A wiki with links to some artefacts of the challenge systems is available here. Note that this wiki is being crowdsourced by challenge enthusiasts and has not been officially validated by the challenge chairs.
LibreOffice
LibreOffice is a large open-source productivity suite, implemented in several languages including C++, with a total of 8 MLOC. The project incorporates three levels of regression testing.
SQLite
SQLite is arguably the most popular database in the world. It is designed for efficiency and simplicity, and can be deployed as a single C source code file. The project incorporates 338 KLOC and three separately developed test suites.
Guava
https://github.com/google/guava
Guava is a widely adopted and extensive Java collections library developed by Google. It includes over 252 KLOC, and its test suite includes a thorough set of performance tests.
Please note that Guava was also used in the SBSE 2015 Challenge, and you should ensure that your work is not simply a duplication of previous effort. Details of that year’s papers and presentations can be found in the SSBSE 2015 programme.
Flask
Flask is a very popular minimalist web framework for Python, with 9KLOC. It comes with a full test suite.
Your paper should describe your method and findings. It should provide a brief introduction to the problem being addressed, the program that you used, your technique or tool, followed by your results, their implications, and your conclusions.
Papers must be at most 6 pages long in PDF format and should conform at the time of submission to the SSBSE/Springer LNCS format and submission guidelines. They must not have been previously published in, nor be under consideration by, any journal, book, or other conference. Please submit your challenge paper to EasyChair on or before the Challenge track deadline (May 25th 2017). At least one author of each paper is expected to present the results at SSBSE 2017. Papers for the Challenge track are not required to follow the double-blind restrictions. All accepted contributions will be published in the conference electronic proceedings.
Details are available via the SSBSE website.
If you have any questions about the challenge, please e-mail the Challenge Track chairs: