P-RECS 2018: First International Workshop on Practical Reproducible Evaluation of Computer Systems
Link: https://p-recs.github.io/2018/

Call For Papers
# P-RECS '18
First International Workshop on Practical Reproducible Evaluation of Computer Systems. June 11, 2018. In conjunction with HPDC'18 (http://hpdc.org/2018/). In cooperation with SIGHPC (pending).

Independent evaluation of experimental results in the area of computer and networking systems is a challenging task. Recreating the environment where an experiment originally ran is commonly considered impractical or even impossible. This workshop will focus heavily on practical, actionable aspects of reproducibility in broad areas of computational science and data exploration, with special emphasis on issues where community collaboration is essential for adopting novel methodologies, techniques, and frameworks aimed at addressing the challenges we face today. The workshop will bring together researchers and experts to share experiences and advance the state of the art in the reproducible evaluation of computer systems, featuring contributed papers and invited talks.

## Topics

We expect submissions on topics such as, but not limited to:

* Experiment dependency management.
* Software citation and persistence.
* Data versioning and preservation.
* Provenance of data-intensive experiments.
* Tools and techniques for incorporating provenance into publications.
* Automated experiment execution and validation.
* Experiment portability for code, performance, and related metrics.
* Experiment discoverability for re-use.
* Cost-benefit analysis frameworks for reproducibility.
* Usability and adaptability of reproducibility frameworks into already-established domain-specific tools.
* Long-term artifact archiving for future reproducibility.
* Frameworks for sociological constructs to incentivize paradigm shifts.
* Policies around publication of articles/software.
* Blinding and selecting artifacts for review while maintaining history.
* Reproducibility-aware computational infrastructure.

## Submission

Submit via EasyChair (https://easychair.org/conferences/?conf=precs18). We solicit two categories of submissions:

* **Position papers**. This category is for papers that propose solutions (or scope the work that needs to be done) to address some of the issues outlined above. We hope that a research agenda comes out of this, and that we can create a community that meets yearly to report on its progress in addressing these problems.
* **Experience papers**. This category consists of papers reporting on the authors' experience in automating one or more experimentation pipelines. The committee will look for submissions that reflect on this experience: what worked? What aspects of experiment automation and validation are hard in your domain? What can be done to improve the tooling for your domain? As part of the submission, authors need to provide a URL to the automation service they use (e.g., [TravisCI](https://travis-ci.org), [GitLab CI](https://about.gitlab.com/gitlab-ci/), [CircleCI](https://circleci.com), [Jenkins](https://jenkins-ci.org), etc.) so reviewers can verify that one or more automated pipelines are associated with the submission. A minimal sketch of such a pipeline appears after this list.
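To make the expectation concrete, here is a minimal, purely illustrative sketch of the entry point such an automated pipeline might have. The experiment stub `run_trials`, the `baseline.json` file, and the 5% tolerance are hypothetical choices, not anything the workshop prescribes; a CI service (TravisCI, GitLab CI, etc.) would simply execute the script and surface its exit status.

```python
#!/usr/bin/env python3
"""Illustrative experiment pipeline: run the experiment, then validate.

Everything here is a stand-in sketch; replace run_trials() with the real
workload and baseline.json with your committed reference results.
"""
import json
import random
import statistics
import sys


def run_trials(n):
    # Stand-in for the actual experiment; a fixed seed keeps reruns comparable.
    random.seed(42)
    return [random.gauss(100.0, 5.0) for _ in range(n)]


def main():
    observed = statistics.mean(run_trials(30))
    # Validate against a committed baseline so regressions fail the CI build.
    with open("baseline.json") as f:
        expected = json.load(f)["mean_latency_ms"]
    tolerance = 0.05 * expected  # hypothetical 5% acceptance band
    if abs(observed - expected) > tolerance:
        print("FAIL: mean %.2f outside %.2f +/- %.2f"
              % (observed, expected, tolerance))
        return 1
    print("OK: mean %.2f within tolerance" % observed)
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Wiring this into, e.g., TravisCI is then a one-line `script:` entry that runs the file, which is what gives reviewers a URL where the automated runs can be checked.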
### Format

Authors are invited to submit manuscripts in English not exceeding 5 pages of content. The 5-page limit includes figures, tables, and appendices, but does not include references, for which there is no page limit. Submissions must use the [ACM Master Template](https://www.acm.org/publications/proceedings-template) (please use the `sigconf` format with default options).

### Proceedings

The proceedings will be archived in both the ACM Digital Library and IEEE Xplore through SIGHPC.

### Tools

These tools can optionally be used to automate your experiments: [CWL](http://commonwl.org), [Popper](https://github.com/systemslab/popper), [ReproZip](http://reprozip.org), [Sciunit](http://sciunit.run), [Sumatra](https://github.com/open-research/sumatra). A sketch of how one of them might wrap an experiment run follows.
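As one illustration (again hypothetical, not a workshop requirement), the sketch below wraps an experiment run with ReproZip's documented `trace` and `pack` subcommands so the resulting bundle can be replayed with `reprounzip`; the driver script `./run.sh` and the bundle name `experiment.rpz` are placeholders.

```python
#!/usr/bin/env python3
"""Sketch: trace an experiment with ReproZip and pack it for reviewers.

Assumes ReproZip is installed and the experiment is driven by ./run.sh
(a placeholder); note that ReproZip tracing runs on Linux only.
"""
import subprocess
import sys


def main():
    # `reprozip trace CMD` records the files, processes, and environment
    # touched by the run.
    trace = subprocess.run(["reprozip", "trace", "./run.sh"])
    if trace.returncode != 0:
        print("experiment failed; skipping pack", file=sys.stderr)
        return trace.returncode
    # `reprozip pack FILE.rpz` turns the recorded trace into a
    # self-contained bundle that reviewers can replay with reprounzip.
    return subprocess.run(["reprozip", "pack", "experiment.rpz"]).returncode


if __name__ == "__main__":
    sys.exit(main())
```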
## Important Dates

* Submissions due: *April 9*, 2018
* Acceptance notification: April 30, 2018
* Camera-ready paper submission: May 6, 2018
* Workshop: June 11, 2018

## Organizers

* Ivo Jimenez, UC Santa Cruz
* Carlos Maltzahn, UC Santa Cruz
* Jay Lofstead, Sandia National Laboratories

## Program Committee

* Divyashri Bhat, UMass Amherst
* Michael Crusoe, CWL Project Lead
* Anja Feldmann, TU Berlin
* Todd Gamblin, LLNL
* Mike Heroux, Sandia National Laboratories
* Torsten Hoefler, ETH Zürich
* Neil Chue Hong, Software Sustainability Institute / University of Edinburgh, UK
* Dan Katz, NCSA
* Kate Keahey, Argonne National Lab / ChameleonCloud
* Ignacio Laguna, LLNL
* Arnaud Legrand, CNRS / LIG, Grenoble
* Reed Milewicz, Sandia National Laboratories
* Robert Ricci, University of Utah / CloudLab
* Victoria Stodden, UIUC
* Violet R. Syrotiuk, ASU
* Michela Taufer, University of Delaware
* Michael Zink, UMass Amherst

## Contact

Please address workshop questions to Ivo Jimenez (ivo@cs.ucsc.edu).