
A-TEST 2022 : 13th Workshop on Automating Test Case Design, Selection and Evaluation


Link: https://a-test.org
 
When Nov 17, 2022 - Nov 18, 2022
Where Singapore
Abstract Registration Due Jul 24, 2022
Submission Deadline Jul 28, 2022
Notification Due Sep 2, 2022
Final Version Due Sep 9, 2022
Categories: software testing, test automation, test tools
 

Call For Papers

**********************************************************************
CALL FOR PAPERS
**********************************************************************
A-TEST 2022
13th Workshop on Automating Test Case Design, Selection and Evaluation
https://a-test.org

November 17-18, 2022
Singapore
Co-located with ESEC/FSE 2022
**********************************************************************


IMPORTANT DATES
***************
Abstract submission deadline: July 24, 2022 (optional)
Submission deadline: July 28, 2022
Author notification: September 2, 2022
Camera-ready submission: September 9, 2022 (hard)
(All dates are 23:59:59 AoE)

AIMS AND SCOPE
**************
Testing is currently the most important and most widely used quality assurance
technique in the software industry. However, the complexity of software, and
hence the effort required to develop and test it, keeps increasing.

Even though many test automation tools are currently available to aid test
planning and control as well as test case execution and monitoring, all these
tools share a similar passive philosophy towards test case design, selection of
test data and test evaluation. They leave these crucial, time-consuming and
demanding activities to the human tester. This is not without reason; test case
design and test evaluation through oracles are difficult to automate with the
techniques available in current industrial practices. The domain of possible
inputs (potential test cases), even for a trivial method, program, model, user
interface or service, is typically too large to be exhaustively explored.
Consequently, one of the major challenges associated with test case design is
the selection of test cases that are effective at finding flaws without
requiring an excessive number of tests to be carried out. Automation of the
entire test process requires new thinking that goes beyond test design or
specific test execution tools. These are the problems that this workshop aims to
attack.

For the past twelve years, the Workshop on Automating Test Case Design,
Selection and Evaluation (A-TEST) has provided a venue for researchers and
industry members alike to exchange and discuss trending views, ideas, state of
the art, work in progress, and scientific results on automated testing.

Following the success of past years, the 13th edition of A-TEST will again be
co-located with and organised at ESEC/FSE 2022. A-TEST 2022 is planned to take
place over two days, preferably in person in Singapore (if the COVID-19
situation allows).

Topics of interest include, but are not limited to:

- Techniques and tools for automating test case design, generation, and
selection, e.g., model-based approaches, mutation approaches, metamorphic
approaches, combinatorial-based approaches, search-based approaches,
symbolic-based approaches, chaos testing, machine learning testing.
- New trends in the use of machine learning (ML) and artificial intelligence
(AI) to improve test automation, and new approaches to test ML/AI-based
systems.
- Test case and test process optimization.
- Test case evolution, repair, and reuse.
- Test case evaluation and metrics.
- Test case design, selection, and evaluation in emerging domains like graphical
user interfaces, social networks, the cloud, games, security, cyber-physical
systems, or extended reality.
- Case studies and empirical studies that evaluate an existing technique or
tool on real systems, not only toy problems, to show the quality of the
resulting test cases compared to other approaches.
- Experience/industry reports.
- Education in (automated) testing.

SUBMISSIONS
***********
Authors are invited to submit papers on topics related to automated software
testing, and to present and discuss them at the event. Paper submissions can
be of the following types:

- Full papers (max. 8 pages) describing original, complete, and validated
research - either empirical or theoretical - in A-TEST related techniques,
tools, or industrial case studies.
- Work-in-progress papers (max. 4 pages) that describe novel, interesting, and
high-potential work in progress that has not necessarily reached full
completion (e.g., is not yet fully validated).
- Tool papers (max. 4 pages) presenting a testing tool in a way that it could
be introduced to industry as the start of a successful technology transfer.
- Technology transfer papers (max. 4 pages) describing industry-academia
co-operation.
- Position papers (max. 2 pages) that analyse trends and raise issues of
importance. Position papers are intended to generate discussion and debate
during the workshop.

A-TEST also offers an opportunity to introduce novel testing techniques or tools
to the audience in active hands-on sessions. The strong focus on tools in these
sessions complements the traditional conference style of presenting research
papers, in which a deep discussion of the technical components of an
implementation is usually missing. Furthermore, presenting the actual tools and
implementations of testing techniques in the hands-on sessions allows other
researchers and practitioners to reproduce research results and to apply the
latest testing techniques in practice. Proposals should take the form of:

- Hands-on proposals (max. 2 pages) that describe how the session (with a
preferred time frame of 3 hours) will be conducted.

All submissions must be in English and in PDF format. At the time of submission,
all papers must conform to the ESEC/FSE 2022 Format and Submission Guidelines.
A-TEST 2022 will employ a single-blind review process.

Papers and hands-on proposals must be submitted electronically through EasyChair:
https://easychair.org/conferences/?conf=atest2022

Each submission will be reviewed by at least three members of the program
committee. Full papers will be evaluated on the basis of originality, importance
of contribution, soundness, evaluation, quality of presentation, and appropriate
comparison to related work. Work-in-progress and position papers will be
reviewed with respect to relevance and their ability to start up fruitful
discussions. Tool and technology transfer papers will be evaluated based on
improvement on the state-of-the-practice and clarity of lessons learned.

Submitted papers must not have been published elsewhere and must not be under
review or submitted for review elsewhere while under consideration for A-TEST 2022.
To prevent double submissions, the chairs may compare the submissions with
related conferences that have overlapping review periods. The double submission
restriction applies only to refereed journals and conferences, not to unrefereed
pre-publication archive servers (e.g., arXiv.org). Submissions that do not
comply with the foregoing instructions will be desk rejected without being
reviewed.

PROCEEDINGS
***********
All accepted contributions will appear in the ACM Digital Library, providing a
lasting archived record of the workshop proceedings. At least one author of each
accepted paper must register and present the paper in person at A-TEST 2022 in
order for the paper to be published in the proceedings.

ORGANISING COMMITTEE
********************
Ákos Kiss, General Chair (University of Szeged, Hungary)
Beatriz Marín, Program Co-Chair (Universitat Politècnica de València, Spain)
Mehrdad Saadatmand, Program Co-Chair (RISE Research Institutes of Sweden, Sweden)
Niels Doorn, Student Competition Co-Chair (Open Universiteit, The Netherlands)
Jeroen Pijpker, Student Competition Co-Chair (NHL Stenden, The Netherlands)
Antony Bartlett, Publicity & Web Chair (TU Delft, The Netherlands)

All questions about submissions should be emailed to the workshop organisers at
atest2022@easychair.org
