
MET 2016: 1st International Workshop on Metamorphic Testing


Link: http://www.cs.montana.edu/met16/
 
When: May 16, 2016
Where: Austin, Texas
Submission Deadline: Jan 22, 2016
Notification Due: Feb 19, 2016
Final Version Due: Feb 26, 2016
Categories: software engineering, software testing, metamorphic testing
 

Call For Papers

*************** MET 2016 Call for Papers ******************

The 1st International Workshop on Metamorphic Testing (MET 2016) - http://www.cs.montana.edu/met16/

in conjunction with the 38th International Conference on Software Engineering (ICSE), Austin, TX, May 14-22, 2016


*IMPORTANT DATES*

Paper submissions due: January 22, 2016
Notification to authors: February 19, 2016
Camera ready copies due: February 26, 2016
Workshop: May 16, 2016


*SCOPE OF THE WORKSHOP*

Metamorphic testing (MT) is a testing technique that exploits relationships among the inputs and outputs of multiple executions of the program under test, known as metamorphic relations (MRs). MT has proven highly effective for testing programs that face the oracle problem, i.e., programs for which the correctness of individual outputs is difficult to determine. Since the introduction of MT in 1998, interest in this testing methodology has grown immensely, with numerous applications in domains such as machine learning, bioinformatics, computer graphics, simulation, search engines, decision support, cloud computing, databases, and compilers.
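
As an illustration of the idea (not part of the original call), a minimal metamorphic test in Python is sketched below. The program under test and the two relations for sine, sin(pi - x) = sin(x) and sin(-x) = -sin(x), are assumptions chosen purely for this sketch; math.sin stands in for a numerical routine whose exact output would otherwise be hard to verify.

    import math
    import random

    def program_under_test(x):
        # Hypothetical stand-in for a numerical routine with no easy oracle;
        # math.sin is used only so the sketch runs as-is.
        return math.sin(x)

    def check_metamorphic_relations(trials=1000, tol=1e-9):
        # Check two classic MRs for sine without knowing the expected value of sin(x):
        #   MR1: sin(pi - x) == sin(x)
        #   MR2: sin(-x)     == -sin(x)
        # A violation reveals a fault even though no exact oracle is consulted.
        for _ in range(trials):
            x = random.uniform(-10.0, 10.0)  # source test case
            y = program_under_test(x)
            assert abs(program_under_test(math.pi - x) - y) <= tol  # MR1 follow-up
            assert abs(program_under_test(-x) + y) <= tol           # MR2 follow-up

    if __name__ == "__main__":
        check_metamorphic_relations()
        print("No MR violations detected.")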

The First International Workshop on Metamorphic Testing (MET 2016) will bring together researchers and practitioners from academia and industry to discuss research results and experiences in MT. The ultimate goal of MET is to provide a platform for discussing novel ideas, new perspectives, new applications, and the state of research related to or inspired by MT.


*TOPICS OF INTEREST*

The topics of interest include, but are not limited to:

- Guidelines and techniques for the construction of MRs or MT test cases.
- Prioritization and minimization of MRs or MT test cases.
- Quality assessment mechanisms for MRs or MT test cases (e.g., metrics).
- Automated generation of likely MRs.
- Combination of MRs.
- Generation of source test cases.
- Formal methods involving MRs.
- Case studies and applications.
- Tools.
- Surveys.
- Empirical studies.
- Integration/comparison with other techniques.
- Novel applications, perspectives, or theories inspired by MT, which can be beyond conventional software testing topics.


*SUBMISSION AND PUBLICATION*

Authors are invited to submit original, previously unpublished research papers. Papers should be written in English, strictly following the ICSE 2016 formatting and submission instructions: http://2016.icse.cs.txstate.edu/formatInstr

The following types of submissions are accepted:

- Full research papers with a maximum length of 7 pages, including references and appendices.
- Short papers with a maximum length of 4 pages, including references and appendices.

Papers must be submitted in PDF format via the electronic submission system, which is available at [TBD]

Submitted papers will be evaluated by at least three members of an international program committee according to their rigor, significance, originality, technical quality, and exposition.

At least one author of each accepted paper must register for and participate in the workshop. Registration is subject to the terms, conditions, and procedures of the main ICSE conference, which can be found on its website: http://2016.icse.cs.txstate.edu/

Accepted papers will be published in the ACM Digital Library.


*KEYNOTE SPEAKER*

Prof. T.Y. Chen, Swinburne University of Technology, Australia


*ORGANIZERS*

Upulee Kanewala, Montana State University, USA
Laura L. Pullum, Oak Ridge National Laboratory, USA
Sergio Segura, University of Seville, Spain
Dave Towey, The University of Nottingham Ningbo China, China
Zhi Quan (George) Zhou, University of Wollongong, Australia


*PROGRAM COMMITTEE (To be completed)*

James Bieman, Colorado State University, USA
Giovanni Denaro, University of Milano Bicocca, Italy
Phyllis Frankl, Polytechnic Institute of New York University, USA
Arnaud Gotlieb, Simula Research Laboratory, Norway
Mark Harman, University College London, UK
Robert M. Hierons, Brunel University London, UK
Gail Kaiser, Columbia University, USA
F.-C. Kuo, Swinburne University of Technology, Australia
Mikael Lindvall, Fraunhofer Center for Experimental Software Engineering, USA
Huai Liu, RMIT University, Australia
Chris Murphy, University of Pennsylvania, USA
Alberto Núñez, Universidad Complutense de Madrid, Spain
T. H. Tse, The University of Hong Kong, Hong Kong
Xiaoyuan Xie, Wuhan University, China


*CONTACT*

Upulee Kanewala. Email: upulee.kanewala@cs.montana.edu
