BER 2017 : Behavior, Emotion and Representation: Building Blocks of Interaction
Link: https://project.inria.fr/berworkshop/

Call For Papers
Due to several requests, the deadline has been extended to August 15 (final extension), midnight Paris time.
**********************************************************************
Workshop ***“Behavior, Emotion and Representation: Building Blocks of Interaction”***
5th International Conference on Human-Agent Interaction
https://project.inria.fr/berworkshop/
Bielefeld (Germany), October 17, 2017
**********************************************************************

The 1st Workshop on ***“Behavior, Emotion and Representation: Building Blocks of Interaction”*** will be held in Bielefeld (Germany) on October 17, 2017, as a satellite event of the HAI 2017 conference.

This full-day workshop sits at the crossroads of the human perception, affective computing, psychology and cognitive representation research domains. Its first objective is to stimulate multidisciplinary exchange between these research communities on the underlying tools and models that can serve as new building blocks for natural and intuitive interaction. Its second objective is to discuss how the cognitive abilities and physiological parameters of users can be used to provide assistance in human-machine interactions.

*Workshop context*

Natural behavior skills based on cognitive abilities are a key challenge for robots, virtual agents and intelligent machines interacting with humans. This becomes particularly evident given the expected increase, within the coming decade, in intelligent interaction partners designed to support humans in everyday situations (such as virtual coaches, companion robots, assistive systems and autonomous cars). These systems will need to act autonomously and to elicit social interaction and social synchrony. To achieve these goals, their perception of humans, as well as their own behavior, must build on richer inputs about emotion, mental state and models of the human partners than the mostly low-level approaches currently in use.

Recent advances in multidisciplinary research on behavior, emotional states, visual behavior, neurofeedback, physiological parameters and mental memory representations help us understand the cognitive background of action and interaction in everyday situations, and therefore pave the way for the design of new building blocks for more natural and intuitive human-machine interaction. Collecting and analyzing multimodal data from different measurements also makes it possible to construct solid computational models. These building blocks of interaction will serve as a basis for artificial cognitive systems able to interact with humans intuitively and to acquire new skills by learning from the user. This will result in new forms of human-computer interaction, such as individualized, adaptive assistance systems for scaffolding cognitive, emotional and attentive learning processes.

In this context, it is clearly advantageous for intelligent robots and virtual agents to know how these cognitive representations and physiological parameters are formed, stabilized and adapted during different phases of daily actions. This knowledge enables a technical system to perceive an individual’s current level of learning and performance, and therefore to shape the interaction. These interactions must be (socially) appropriate, not excessive. Such systems can assist users in developing (interaction) skills in a variety of domains and during different phases of daily-life actions. At the same time, interactive systems should fulfill constraints such as usability, acceptability and ethics.
*Topics*

Topics of interest include all aspects of affective computing dedicated to robotics and interactive systems, including, but not limited to:
- Acoustic, visual or multimodal processing for affect recognition
- Real-time and embedded perception in the wild
- Human behaviour analysis
- Anticipation and imitation of human behaviour
- Affect and social interaction modeling
- Affective computing in the human/robot interaction loop
- Affect rendering and synthesis
- Acceptability and usability while interacting
- Affect in developmental robotics
- Computational modeling of cognitive interaction components
- Case studies and applications in real-life contexts
- Social and psychological studies of human/robot interaction
- Eye-hand coordination in interactive / dyadic situations

*Paper submission*

As for the main HAI 2017 conference, we will accept only online submissions of PDF files in the ACM SIGCHI format (https://sigchi.org/templates/). Initial submissions can be a 1-page abstract or a 4-6 page paper (references do not count toward the page limits). Submissions may be anonymized. The submission page will open in due time. All submissions will be reviewed by the program committee members. Papers will be evaluated on the basis of research originality, novelty, quality and relevance to the workshop. A special issue of the Multimodal Technologies and Interaction journal, featuring extended versions of selected papers, is envisioned.

*Online submission*

Your article must be submitted as a PDF file using the BER2017 submission webpage (https://easychair.org/conferences/?conf=ber2017). You may need to create an EasyChair account if you do not already have one.

*Important dates*
- August 15, 2017: 1-page abstract or 4-6 page paper submission
- September 4, 2017: Notification of acceptance
- October 2, 2017: Camera-ready paper submission deadline (4-6 pages)
- October 17, 2017: Workshop day

*Organization committee*
- Thomas Küchelmann, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University (Germany)
- Dr. Kai Essig, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University (Germany)
- Assoc. Prof. Dominique Vaufreydaz, University Grenoble-Alpes, LIG, Inria (France)
- Thomas Guntz, University Grenoble-Alpes, LIG, Inria (France)

*Scientific committee*
- Prof. Dr. James Crowley, University Grenoble-Alpes, Grenoble INP, LIG, Inria (France)
- Prof. Dr. Thomas Schack, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University (Germany)

*Program committee*
- Dr. Raffaela Balzarini, Inria, Grenoble (France)
- Prof. Dr. Maurizio Bertollo, University of Chieti-Pescara, Pescara (Italy)
- Dr. François Bremond, Inria, Grenoble (France)
- Prof. Dr. Mohamed Chetouani, ISIR/Université Pierre et Marie-Curie, Paris (France)
- Prof. Dr. Kerstin Dautenhahn, University of Hertfordshire (UK)
- Dr. Wafa Johal, CHILI/LSRO Labs, EPFL (Switzerland)
- Dr. Andre Krause, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University (Germany)
- Prof. Dr. Marc Pomplun, UMass Boston (USA)
- Prof. Dr. Gershon Tenenbaum, Florida State University (USA)
- Dr. Kostas Velentzas, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University (Germany)
- Prof. Dr. Matthias Weigelt, Department of Sports and Health, Paderborn University (Germany)