
Metrics for Visual Analytics 2007: InfoVis Workshop: Metrics for the Evaluation of Visual Analytics


Link: http://www.cs.umd.edu/hcil/InfoVisworkshop/
 
When Oct 28, 2007 - Oct 28, 2007
Where Sacramento, CA
Submission Deadline Sep 21, 2007
Notification Due Sep 30, 2007
Categories: HCI, visualization, information technology
 

Call For Papers

The workshop is a full day and is scheduled for Sunday, October 28th.

The field of visual analytics is now recognized as a research area in many universities and organizations. As new fields develop, ways of assessing progress in those fields also expand. In visual analytics we are fortunate to already have lessons learned about evaluating visualizations. Unfortunately, these lessons still point out that this is a difficult problem. Visual analytics compounds the problem by adding more dimensions: not only are we concerned with some measure of the visualizations themselves, but we are also concerned with evaluating the impact these visualizations have in helping analysts in their work. User-centered evaluations are vital in visual analytics, as they contribute greatly to the adoption of research software. The issues we face in developing user-centered evaluations for visual analytics include selecting:
- The task: the tradeoff is between using simple tasks that are easily evaluated and developing more realistic tasks that consume more time and are much less straightforward to evaluate.
- The corresponding dataset: the same issues as above, plus the challenge of developing a publicly releasable dataset that resembles a realistic one.
- The system and environment: how much does the system or environment contribute to the utility or success of the task?
- The participants: securing access to senior or junior analysts for evaluations, and ensuring that the analysts are open to new technology.
- Training: how much training should be provided to analysts prior to evaluations, and whether analysts should be paired with technologists to operate the software.
- The metrics: what combination of quantitative and qualitative measures will be accepted? How can we ensure that qualitative measures are collected with, and meet, some standard of rigor? How can we measure insights derived from the visualization and from interactions with it? This is especially problematic because not all analysts approach problems in the same fashion. Most importantly, what measures are most helpful to the analytic community and to the research community?

Selected participants will receive copies of all accepted position papers. These participants will present their ideas or current research during the morning (about 10-15 minutes each). Based on the position papers and these presentations, the organizers will develop a list of possible metrics. An initial list will be distributed to the participants prior to the workshop. After all the presentations, this list will be discussed and refined by the participants.

In the afternoon session, the organizers will provide representative examples of different types of visual analytics systems (the VAST 2007 contest winners have agreed to let us use their submissions), and the workshop participants will test the list of metrics by evaluating these systems using the metrics. A discussion session will follow to identify successes and difficulties, and to refine the list of metrics. The organizers will generate a report evaluating the metrics based on the participants' usage and the discussions.

We will produce a poster from the workshop which will be included in the poster session. The poster will focus on the metrics used during the workshop and the lessons learned for each.

We will also consider a joint journal paper or future conference paper, with the workshop participants contributing to the various metrics proposed at the workshop.


Submission of position papers:
Submissions should be no longer than 4 pages and should focus on metrics and methods for evaluating visual analysis environments. If participants have already used these methods, please include lessons learned and references. If the proposed method has not yet been tried, please provide an estimate of the effort that would be needed to implement it. Position papers should be submitted to the organizers (see e-mail addresses below) no later than September 15th. Please see http://www.cs.umd.edu/hcil/InfoVisworkshop/ for details, to be posted in early September. Participants will be notified of acceptance no later than September 30th.


Organizers:
Jean Scholtz
Pacific Northwest National Laboratory, 340 Northslope Way, Rockaway Beach, OR 97136
Jean.scholtz@pnl.gov

Georges Grinstein
University of Massachusetts Lowell, Lowell MA 01854
grinstein@cs.uml.edu

Catherine Plaisant
University of Maryland, College Park, MD 20742, U.S.A.
plaisant@cs.umd.edu

