
ENQOIR 2009 : First International Workshop on Aspects in Evaluating Holistic Quality of Ontology-based Information Retrieval


Link: http://events.idi.ntnu.no/enqoir09
 
When Apr 1, 2009 - Apr 4, 2009
Where Suzhou, China
Submission Deadline Nov 16, 2008
Notification Due Dec 10, 2008
Final Version Due Jan 5, 2009
Categories: ontology, information retrieval
 

Call For Papers

========================================================
Call for Papers

*** ENQOIR 2009 ***
First International Workshop on
Aspects in Evaluating Holistic Quality of
Ontology-based Information Retrieval
http://events.idi.ntnu.no/enqoir09/

To be held with the joint APWeb-WAIM 2009 conferences
April 1-4, 2009 | Suzhou, China
========================================================

The ENQOIR workshop aims to deepen understanding of and disseminate
knowledge on advances in the evaluation and application of ontology-based
information retrieval (ObIR). The main area of the workshop is the
overlap between three evaluation aspects in ObIR, namely, evaluation of
information retrieval, evaluation of the impact of ontology quality on ObIR
results, and evaluation of user interaction complexity. The main objective
is to contribute to the optimization of ObIR by systematizing the existing
body of knowledge on ObIR and defining a set of metrics for the evaluation
of ontology-based search. The long-term goal of the workshop is to establish
a forum for analyzing and working towards a holistic method for evaluating
ontology-driven information retrieval systems.


CALL FOR PAPERS:

In recent years, a significant research effort has been devoted to
ontology-based information retrieval (ObIR). The progress and results in
this area offer a promising prospect of improving the performance of current
information retrieval (IR) systems. Furthermore, the existing, sparse
evaluations of ObIR tools report improvements compared to traditional
IR systems. However, the results give no indication of whether this
improvement is optimal, which makes it difficult to benchmark different ObIR
systems. Moreover, the majority of IR evaluation methods are based mainly on
the relevance of retrieved information, while the additional sophistication
of ObIR tools adds complexity to the user interaction needed to reach
improved results. Therefore, standard IR metrics such as recall and
precision do not suffice on their own to measure user satisfaction, because
of the complexity of and effort needed to use ObIR systems. We need to
investigate which ontology properties can enhance IR even further, to assess
whether this improvement comes at the cost of interaction simplicity and
user satisfaction, etc.
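For reference, the standard metrics mentioned above are straightforward to
compute. The following minimal Python sketch (with made-up document
identifiers, not taken from any particular ObIR system) illustrates set-based
precision and recall for a single query, and thereby also shows what these
metrics leave out, such as the interaction effort spent reformulating queries
or navigating the ontology.

    # Minimal sketch: set-based precision and recall for one query.
    # Document identifiers below are illustrative only.

    def precision_recall(retrieved, relevant):
        """Return (precision, recall) for a retrieved set vs. a relevant set."""
        retrieved, relevant = set(retrieved), set(relevant)
        hits = len(retrieved & relevant)
        precision = hits / len(retrieved) if retrieved else 0.0
        recall = hits / len(relevant) if relevant else 0.0
        return precision, recall

    if __name__ == "__main__":
        retrieved = ["d1", "d3", "d7", "d9"]   # documents returned by the system
        relevant = ["d1", "d2", "d3", "d5"]    # documents judged relevant
        p, r = precision_recall(retrieved, relevant)
        print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.50 recall=0.50
        # Note: neither number reflects how many query refinements or
        # ontology-browsing steps the user needed to obtain this result set.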

Furthermore, evaluation methods based on recall and precision do not
indicate the causes of variation between different retrieval results. There
are many other factors that influence the performance of ontology-based
information retrieval, such as query quality, ontology quality, complexity
of user interaction, and the difficulty of a search topic with respect to
the retrieval, indexing, searching, and ranking methods. A detailed analysis
of how these factors and their interactions affect the retrieval process can
help to dramatically improve retrieval methods and processes.

On the other hand, the ontology's ability to capture the content of the
universe of discourse at the appropriate level of granularity and precision,
and to offer the application understandable and correct information, is
equally important. A substantial body of work already exists in the field of
ontology quality assessment. However, most ontology evaluation methods are
generic quality evaluation frameworks that do not take the application of
the ontology into account. There is therefore a need for task- and
scenario-based quality assessment methods that, in this particular case,
would target and optimize ontology quality for use in information retrieval
systems.

In order to promote more efficient and effective ontology usage in IR,
there is a need to analyze ontology quality and value-added aspects for this
domain, summarize use cases, and identify best practices. Several issues
have been raised by current research, such as the annotation workload,
scalability, and the balance between expressive power and reasoning
capability. A holistic evaluation approach should assess performance from
both technological and economic viewpoints. The value created by
semantics-based systems is an important aspect in demonstrating that the
benefits of the new technology will outweigh its cost.

The purpose of this workshop is to bring together researchers, developers,
and practitioners to discuss experiences and lessons learned, identify
problems solved and caused, synergize different views, analyse the interplay
between ontology quality and IR performance, and brainstorm future
research and development directions. In particular, we strongly encourage
submissions dealing with ontology quality aspects and their impact on IR
results, evaluation of the usability of ObIR systems, analysis of user
behaviour, new evaluation methods enabling thorough and fine-grained
analysis of the technological and financial performance of ObIR, etc.


TOPICS:

All submissions that focus on different aspects of a holistic evaluation
of ontology-based information retrieval are invited. The topics of
interest are as follows:
- Evaluation of Ontology-based Information Retrieval
* Information retrieval evaluation
* Assessment of annotation quality/labour-load
* Evaluation and benchmarking techniques and datasets
* Quantitative / qualitative evaluation methods
* Cost/ utility ratio
- Ontology quality aspects in Information Retrieval
* Ontology quality evaluation
* Ontology utility
* Ontology maintenance
* Quantitative / qualitative evaluation methods
- User acceptance of semantic technology
* Usability evaluation
* Quantitative / qualitative evaluation methods
* Evaluation of human-computer interaction


SUBMISSIONS:

We invite submissions of two types: regular papers and research-in-progress
papers. Papers are restricted to a maximum length of 12 pages. Submissions
must conform to Springer's LNCS format. All accepted papers will be
published in a combined APWeb-WAIM'09 workshops volume of the Lecture Notes
in Computer Science series by Springer.


ORGANIZING COMMITTEE:

- Darijus Strasunskas (Dept. of Industrial Economics & Technology
Management, NTNU, Norway)
- Stein L. Tomassen (Dept. of Computer & Information Science, NTNU,
Norway),
- Jinghai Rao (AOL, China).

Contact at: enqoir09 [at] gmail.com


PROGRAM COMMITTEE (*TENTATIVE*):

- Per Gunnar Auran (Yahoo! Technologies, Norway)
- Xi Bai (University of Edinburgh, UK)
- Robert Engels (ESIS, Norway)
- Avigdor Gal (Technion, Israel)
- Jon Atle Gulla (Norwegian Univ. of Science and Technology, Norway)
- Monika Lanzenberger (Vienna University of Technology, Austria)
- Kin Fun Li (University of Victoria, Canada)
- James C. Mayfield (Johns Hopkins University, USA)
- Gabor Nagypal (disy Informationssysteme GmbH, Germany)
- David Norheim (Computas, Norway)
- Jaana Kekalainen (Univ. of Tampere, Finland)
- Atanas Kiryakov (Ontotext Lab, Sirma Group, Bulgaria)
- Iadh Ounis (Univ. of Glasgow, UK)
- Tetsuya Sakai (NewsWatch, Inc., Japan)
- Amanda Spink (Queensland Univ. of Technology, Australia)
- Peter Spyns (Vrije Universiteit Brussel, Belgium)
- Aleksander Øhrn (FAST Search and Transfer, Norway)


DATES:

November 16, 2008 Submission of papers
December 10, 2008 Notification about decision
January 5, 2009 Camera-ready versions due


FURTHER INFORMATION:

http://events.idi.ntnu.no/enqoir09/
enqoir09 [at] gmail.com
