
CSE 2010 : SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation


Link: http://www.ischool.utexas.edu/~cse2010/call.htm
 
When Jul 23, 2010 - Jul 23, 2010
Where Geneva
Submission Deadline Jun 15, 2010
Notification Due Jun 30, 2010
Final Version Due Jul 7, 2010
Categories    information retrieval   artificial intelligence   NLP
 

Call For Papers

SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation

Call For Papers: http://www.ischool.utexas.edu/~cse2010/call.htm

================
The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE2010)
solicits submissions on topics including but not limited to the
following areas:

* Novel applications of crowdsourcing for evaluating search systems
(see examples below)

* Novel theoretical, experimental, and/or methodological
developments advancing state-of-the-art knowledge of crowdsourcing for
search evaluation

* Tutorials on how the different forms of crowdsourcing might be
best suited to or best executed in evaluating different search tasks

* New software packages which simplify or otherwise improve general
support for crowdsourcing, or particular support for crowdsourced search
evaluation

* Reflective or forward-looking vision on the use of crowdsourcing in
search evaluation, as informed by prior and/or ongoing studies

* How crowdsourcing technology or process can be adapted to
encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of
search evaluation involving significant use of a crowdsourcing platform
such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel
applications of crowdsourcing are of particular interest. This includes
but is not restricted to the following tasks:

* cross-vertical search (video, image, blog, etc.) evaluation

* local search evaluation

* mobile search evaluation

* realtime/news search evaluation

* entity search evaluation

* discovering representative groups of rare queries, documents, and
events in the long-tail of search

* detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing
enable better assessment of a query's local intent, its local-specific
facets, or the diversity of returned results? Could crowdsourcing be
employed in near real time to better assess query intent for breaking
news and relevant information?
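
As a concrete illustration of the kind of end-to-end pipeline such
submissions might describe, the sketch below aggregates redundant crowd
relevance labels into TREC-style qrels by majority vote. The input file
name, column names, and three-point relevance scale are illustrative
assumptions, not anything prescribed by the workshop.

    # Sketch: aggregate redundant crowd relevance judgments into TREC-style qrels.
    # Assumed input: a CSV exported from a crowdsourcing platform with one row per
    # worker judgment and columns query_id, doc_id, worker_id, label
    # (0 = non-relevant, 1 = relevant, 2 = highly relevant). All names are illustrative.
    import csv
    from collections import Counter, defaultdict

    def load_judgments(path):
        """Group worker labels by (query_id, doc_id)."""
        votes = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                votes[(row["query_id"], row["doc_id"])].append(int(row["label"]))
        return votes

    def majority_vote(labels):
        """Most common label; ties broken toward the lower (more conservative) grade."""
        counts = Counter(labels)
        best = max(counts.values())
        return min(label for label, count in counts.items() if count == best)

    def write_qrels(votes, path):
        """Write aggregated judgments in the standard qrels format: qid 0 docid rel."""
        with open(path, "w") as out:
            for (qid, docid), labels in sorted(votes.items()):
                out.write(f"{qid} 0 {docid} {majority_vote(labels)}\n")

    if __name__ == "__main__":
        write_qrels(load_judgments("crowd_judgments.csv"), "crowd.qrels")

The resulting qrels file can then be scored against system runs with
standard tools such as trec_eval; submissions exploring more
sophisticated aggregation (worker quality weighting, filtering, or
adjudication) would replace the simple majority vote above.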

Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive to participate, authors of the most novel and
innovative crowdsourcing-based search evaluation techniques (e.g. using
Amazon's Mechanical Turk, LiveWork, Crowdflower, etc.) will be
recognized with "Most Innovative Awards" as judged by the workshop
organizers. Selection will be based on the creativity, originality, and
potential impact of the described proposal, and we expect the winners to
describe risky, ground-breaking, and unexpected ideas. The provision of
awards is thanks to generous support from Microsoft Bing, and the number
and nature of the awards will depend on the quality of the submissions
and overall availability of funds. All valid submissions to the workshop
will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing
research. Long paper submissions (up to 8 pages) will primarily target
oral presentations. Short paper submissions can be up to 4 pages long
and will primarily target poster presentations. Papers should be
formatted in double-column ACM SIG proceedings format
(http://www.acm.org/sigs/publications/proceedings-templates). Papers
must be submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 15, 2010
Notification of acceptance: June 30, 2010
Camera-ready submission: July 7, 2010
Workshop date: July 23, 2010

Organizers



Vitor Carvalho, Microsoft Bing

Matthew Lease, University of Texas at Austin

Emine Yilmaz, Microsoft Research



Program Committee



Eugene Agichtein, Emory University

Ben Carterette, University of Delaware

Charlie Clarke, University of Waterloo

Gareth Jones, Dublin City University

Michael Kaisser, University of Edinburgh

Jaap Kamps, University of Amsterdam

Gabriella Kazai, Microsoft Research

Winter Mason, Yahoo! Research

Stefano Mizzaro, University of Udine

Gheorghe Muresan, Microsoft Bing

Iadh Ounis, University of Glasgow

Mark Sanderson, University of Sheffield

Mark Smucker, University of Waterloo

Siddharth Suri, Yahoo! Research

Fang Xu, Saarland University


Questions?

Email the organizers at cse2010@ischool.utexas.edu
