
BARS 2013 : ACM SIGIR Workshop on Benchmarking Adaptive Retrieval and Recommender Systems


Link: http://www.bars-workshop.org
 
When Aug 1, 2013 - Aug 1, 2013
Where Dublin, Ireland
Submission Deadline Jun 14, 2013
Categories    data mining   information retrieval
 

Call For Papers

In recent years, immense progress has been made in the development of recommendation, retrieval, and personalisation techniques. The evaluation of these systems is still based on traditional information retrieval and statistical metrics, e.g. precision, recall, and/or RMSE, which often do not take the use case and situation of the system into consideration. However, the rapid evolution of recommender and adaptive IR systems, in both their goals and their application domains, fosters the need for new evaluation methodologies and environments.
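For readers less familiar with the traditional offline measures named above, the following is a minimal sketch (Python, toy data; not part of the call itself) of how precision, recall, and RMSE are commonly computed for a recommender's output. All item names and rating values here are illustrative assumptions.

from math import sqrt

def precision_recall(recommended, relevant):
    # Precision and recall of a recommended item list against a set of relevant items.
    hits = len(set(recommended) & set(relevant))
    precision = hits / len(recommended) if recommended else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

def rmse(predicted, actual):
    # Root mean squared error between predicted and observed ratings.
    return sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Toy example: 2 of the 3 recommended items are among the 4 relevant ones.
p, r = precision_recall(["a", "b", "c"], ["b", "c", "d", "e"])
print(f"precision={p:.2f} recall={r:.2f}")                    # precision=0.67 recall=0.50
print(f"rmse={rmse([4.0, 3.5, 2.0], [5.0, 3.0, 2.0]):.2f}")   # rmse=0.65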

The workshop will be followed by a special issue of ACM TIST (Transactions on Intelligent Systems and Technology) on Recommender System Benchmarking. Authors of high-quality papers from the workshop will be encouraged to submit extended versions of their work to the journal.
Submission topics

We invite the submission of papers reporting relevant research in the area of benchmarking and evaluation of recommendation and adaptive IR systems. We welcome submissions presenting contributions in this scope, addressing the following topics:

New metrics and methods for quality estimation of recommender and adaptive IR systems
Novel frameworks for the user-centric evaluation of adaptive systems
Validation of off-line methods with online studies
Comparison of evaluation metrics and methods
Comparison of recommender and IR approaches across multiple systems and domains
Measuring technical constraints vs. accuracy
New datasets for the evaluation of recommender and adaptive IR systems
Benchmarking frameworks
Multiple-objective benchmarking

Submissions

We invite the submission of papers reporting original research, studies, advances, or experiences in this area. Two submission types are accepted: long papers of up to 8 pages and short papers of up to 4 pages, both in the standard ACM SIG proceedings format. Paper submissions and reviews will be handled electronically.

Each paper will be evaluated by at least three reviewers from the Program Committee. Papers will be assessed for their originality, significance of contribution, soundness, clarity, and overall quality. The interest of a contribution will be judged in terms of its technical and scientific findings, its contribution to the knowledge and understanding of the problem, its methodological advancements, or its practical value.

Submission instructions can be found on the Submissions page.
Important dates

Paper submission: June 14, 2013
Notification: June 28, 2013
Camera ready: July 13, 2013
Workshop: August 1, 2013
