BARS 2013 : ACM SIGIR Workshop on Benchmarking Adaptive Retrieval and Recommender Systems
Link: http://www.bars-workshop.org

Call For Papers
In recent years, immense progress has been made in the development of recommendation, retrieval, and personalisation techniques. The evaluation of these systems, however, is still based on traditional information retrieval and statistical metrics, e.g. precision, recall, and RMSE, which often do not take the system's use case and situation into consideration. The rapid evolution of recommender and adaptive IR systems, in both their goals and their application domains, fosters the need for new evaluation methodologies and environments.
The workshop will be followed by a special issue of ACM TIST on Recommender System Benchmarking. Authors of high-quality papers from the workshop will be encouraged to submit extended versions of their work to the journal.

Submission topics
We invite the submission of papers reporting relevant research in the area of benchmarking and evaluation of recommender and adaptive IR systems. We welcome submissions presenting contributions in this scope, addressing the following topics:
- New metrics and methods for quality estimation of recommender and adaptive IR systems
- Novel frameworks for the user-centric evaluation of adaptive systems
- Validation of offline methods with online studies
- Comparison of evaluation metrics and methods
- Comparison of recommender and IR approaches across multiple systems and domains
- Measuring technical constraints vs. accuracy
- New datasets for the evaluation of recommender and adaptive IR systems
- Benchmarking frameworks
- Multiple-objective benchmarking

Submissions
We invite the submission of papers reporting original research, studies, advances, or experiences in this area. Two submission types are accepted: long papers of up to 8 pages and short papers of up to 4 pages, both in the standard ACM SIG proceedings format. Paper submissions and reviews will be handled electronically. Each paper will be evaluated by at least three reviewers from the Program Committee and judged on originality, significance of contribution, soundness, clarity, and overall quality. The interest of a contribution will be assessed in terms of its technical and scientific findings, its contribution to the knowledge and understanding of the problem, its methodological advances, or its applicative value. Submission instructions can be found on the Submissions page.

Important dates
Paper submission: June 14, 2013
Notification: June 28, 2013
Camera ready: July 13, 2013
Workshop: August 1, 2013