
CSE 2010 : SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation


When Jul 23, 2010 - Jul 23, 2010
Where Geneva
Submission Deadline Jun 15, 2010
Notification Due Jun 30, 2010
Final Version Due Jul 7, 2010
Categories    information retrieval   artificial intelligence   NLP

Call For Papers

The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE2010)
solicits submissions on topics including but not limited to the
following areas:

* Novel applications of crowdsourcing for evaluating search systems
(see examples below)

* Novel theoretical, experimental, and/or methodological
developments advancing state-of-the-art knowledge of crowdsourcing for
search evaluation

* Tutorials on how the different forms of crowdsourcing might be
best suited to or best executed in evaluating different search tasks

* New software packages that simplify or otherwise improve general
support for crowdsourcing, or particular support for crowdsourced search

* Reflective or forward-looking vision on use of crowdsourcing in
search evaluation as informed by prior and/or ongoing studies

* How crowdsourcing technology or process can be adapted to
encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of
search evaluation involving significant use of a crowdsourcing platform
such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel
applications of crowdsourcing are of particular interest. This includes
but is not restricted to the following tasks:

* cross-vertical search (video, image, blog, etc.) evaluation

* local search evaluation

* mobile search evaluation

* realtime/news search evaluation

* entity search evaluation

* discovering representative groups of rare queries, documents, and
events in the long-tail of search

* detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing
enable better assessment of a query's local intent, its local-specific
facets, or diversity of returned results? Could crowdsourcing be
employed in near real-time to better assess query intent for breaking
news and relevant information?
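A common building block behind such evaluations is aggregating redundant worker labels into a single judgment per query-document pair. The following is a minimal majority-vote sketch; the function and data names are hypothetical illustrations, not part of any platform's API:

```python
from collections import Counter


def aggregate_judgments(labels_by_item):
    """Majority-vote aggregation of crowd relevance labels.

    labels_by_item: dict mapping (query, doc_id) -> list of worker labels.
    Returns a dict mapping (query, doc_id) -> consensus label.
    Ties are broken by the lexicographically smallest label, for determinism.
    """
    consensus = {}
    for item, labels in labels_by_item.items():
        counts = Counter(labels)
        top = max(counts.values())
        # Among the most frequent labels, pick a deterministic winner.
        consensus[item] = min(label for label, c in counts.items() if c == top)
    return consensus


# Example: three workers judge two documents for one query.
judgments = {
    ("geneva hotels", "doc1"): ["relevant", "relevant", "not_relevant"],
    ("geneva hotels", "doc2"): ["not_relevant", "not_relevant", "relevant"],
}
print(aggregate_judgments(judgments))
```

Real deployments typically go further (worker quality weighting, gold questions, EM-style models), but a simple majority vote is often the baseline that such refinements are compared against.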

Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive for participation, authors of the most novel and
innovative crowdsourcing-based search evaluation techniques (e.g. using
Amazon's Mechanical Turk, Livework, Crowdflower, etc.) will be
recognized with "Most Innovative Awards" as judged by the workshop
organizers. Selection will be based on the creativity, originality, and
potential impact of the described proposal, and we expect the winners to
describe risky, ground-breaking, and unexpected ideas. The provision of
awards is thanks to generous support from Microsoft Bing, and the number
and nature of the awards will depend on the quality of the submissions
and overall availability of funds. All valid submissions to the workshop
will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing
research. Long paper submissions (up to 8 pages) will primarily
target oral presentations. Short paper submissions can be up to 4 pages
long and will primarily target poster presentations. Papers should be
formatted in the double-column ACM SIG proceedings format and
submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 15, 2010
Notification of acceptance: June 30, 2010
Camera-ready submission: July 7, 2010
Workshop date: July 23, 2010

Workshop Organizers

Vitor Carvalho, Microsoft Bing

Matthew Lease, University of Texas at Austin

Emine Yilmaz, Microsoft Research

Program Committee

Eugene Agichtein, Emory University

Ben Carterette, University of Delaware

Charlie Clarke, University of Waterloo

Gareth Jones, Dublin City University

Michael Kaisser, University of Edinburgh

Jaap Kamps, University of Amsterdam

Gabriella Kazai, Microsoft Research

Winter Mason, Yahoo! Research

Stefano Mizzaro, University of Udine

Gheorghe Muresan, Microsoft Bing

Iadh Ounis, University of Glasgow

Mark Sanderson, University of Sheffield

Mark Smucker, University of Waterloo

Siddharth Suri, Yahoo! Research

Fang Xu, Saarland University


Email the organizers at
