posted by user: xdevroey

ISSTA 2021 : International Symposium on Software Testing and Analysis



Conference Series : International Symposium on Software Testing and Analysis
 
Link: https://conf.researchr.org/home/issta-2021
 
When Jul 12, 2021 - Jul 16, 2021
Where Aarhus, Denmark
Submission Deadline Jan 29, 2021
Notification Due Apr 19, 2021
Categories: software testing, software analysis, software validation, software verification
 

Call For Papers

Technical Papers

Authors are invited to submit research papers describing original contributions in testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, methods for emerging systems, in-depth case studies, infrastructures of testing and analysis, or tools are welcome.

Experience Papers

Authors are invited to submit experience papers describing a significant experience in applying software testing and analysis methods or tools. Such papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience. Of special interest are experience papers that report on industrial applications of software testing and analysis methods or tools.

Reproducibility Studies

ISSTA would like to encourage researchers to reproduce results from previous papers. A reproducibility study must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided by the original paper. It should at the very least apply the approach to new, significantly broadened inputs. In particular, reproducibility studies are encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs.

A reproducibility study should clearly report on results that the authors were able to reproduce, as well as on aspects of the work that were irreproducible. In the latter case, authors are encouraged to communicate or collaborate with the original paper’s authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artifact only, but instead to perform a comparative experiment across multiple related approaches.

Reproducibility studies should follow the ACM guidelines on reproducibility (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, using a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts which they develop completely independently. It is therefore insufficient to focus on repeatability (i.e., the same experiment) alone. Reproducibility studies will be evaluated according to the following standards:
- Depth and breadth of experiments
- Clarity of writing
- Appropriateness of conclusions
- Amount of useful, actionable insights
- Availability of artifacts
We expect reproducibility studies to clearly identify the artifacts the study is built on, and to submit those artifacts to artifact evaluation (see below). Artifacts evaluated positively will be eligible for the prestigious Results Replicated or Results Reproduced badges.
