posted by user: claudiogalicchio

Reservoir Computing @ IJCNN 2020 : Challenges in Reservoir Computing - Special Session of IJCNN 2020


Link: https://sites.google.com/view/reservoir-computing-ijcnn2020/home
 
When Jul 19, 2020 - Jul 24, 2020
Where Glasgow, UK
Submission Deadline Jan 30, 2020
Notification Due Mar 15, 2020
Final Version Due Apr 15, 2020
Categories    neural networks   machine learning   echo state network   reservoir computing
 

Call For Papers

** Organizers **
Claudio Gallicchio (University of Pisa), Lukas Gonon (University of St. Gallen), Josef Teichmann (ETH Zurich), Juan-Pablo Ortega (University of St. Gallen, Switzerland and CNRS, France)

** Aim and Scope **
Reservoir Computing (RC) defines a class of recurrent neural systems in which the dynamical memory component is left untrained after initialization. Only a simple, typically linear, readout layer is adapted on a set of training examples, allowing the use of simple learning strategies. The overall approach has intriguing features that have attracted researchers over the last decade. First, it gives a refreshing perspective on the use of dynamical systems in machine learning for time-series data. Moreover, its ease of implementation and fast training compared to fully trained architectures have made it greatly appealing for experimental usage, mostly in academia. Yet, at the current stage of neural networks/deep learning research, RC-based methods present several downsides that prevent extensive (e.g., industrial) application to problems of Artificial Intelligence scale with human-level performance. One fundamental downside is that, in real-world applications, the training efficiency of RC risks vanishing completely, colliding with the complexity of possibly gigantic reservoir spaces and the cost-intensive hyper-parameter search often required to obtain state-of-the-art results. The difficulty of effectively dealing with huge input-output spaces is a related, well-known RC issue that complicates matters further. Overcoming complexities of this kind represents a major challenge in RC research today. From a different angle, methodological, architectural and theoretical studies on RC have the potential both to develop a deeper understanding of the operation of (fading memory) dynamical neural systems and to foster progress in their training algorithms. Besides, novel ways to control the organization of neural dynamics, such as conceptors, can originate in RC and transfer to more general ML setups.
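The training scheme described above can be sketched in a minimal Echo State Network example. This is an illustrative assumption on our part (numpy-based, with arbitrary toy hyper-parameters and a toy sine-prediction task), not code from the organizers: the input and recurrent weights are drawn at random and never trained, and only the linear readout is fit, here in closed form by ridge regression.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100           # input and reservoir dimensions (toy choices)
spectral_radius = 0.9          # rescales recurrent weights toward stable dynamics
ridge = 1e-6                   # readout regularization strength

# Fixed (untrained) input and recurrent weight matrices.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)   # untrained dynamical memory
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])      # reservoir states, shape (1999, n_res)
y = u[1:]                      # targets: the next input value

# Train only the readout, in closed form (ridge regression).
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

The hyper-parameters shown (reservoir size, spectral radius, regularization) are exactly the kind of settings whose search cost the paragraph above identifies as a central RC challenge.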
A further research-attractive dimension of RC systems is that they are inherently amenable to implementation in neuromorphic hardware. In this regard, photonic reservoirs are certainly among the most exciting possibilities to have emerged in recent years, promising both ultra-fast processing and very low energy consumption. However, designing fully optical RC networks for real-world applications still requires pursuing primary goals, such as implementing non-linear reservoirs with optical readout training.

This session intends to give new impetus to RC research within the international neural networks community. We therefore invite submissions on both the theoretical and application sides of RC. In particular, this session calls for novel, potentially groundbreaking contributions that specifically address open challenges in the RC field.

A list of relevant topics for this session includes, without being limited to, the following:

- Reservoir Computing for Artificial Intelligence problems (e.g., vision, natural language processing, etc.)
- Reservoir Computing methods for fully trained Recurrent Neural Networks (including hybrid approaches)
- Neuromorphic Reservoir Computing
- Novel Reservoir Computing architectures, models and training algorithms
- Theory of dynamical systems in neural networks, including stability of input-driven temporal embeddings
- Statistical Learning Theory of Reservoir Computing networks
- Ensemble learning and Reservoir Computing
- Advancements in Reservoir Computing models, e.g. Echo State Networks and Liquid State Machines
- Conceptors
- Deep Reservoir Computing
- Reservoir dimensionality reduction, efficient reservoir hyper-parameter search and learning
- New applications of Reservoir Computing

** Papers Submission **
Paper submission for this Special Session follows the same process as for the regular sessions of WCCI 2020. When submitting your paper, please choose "Challenges in Reservoir Computing" as the (main) research topic (among the Special Sessions topics).

For further information and news in this regard, please refer to the WCCI 2020 website: https://wcci2020.org/submissions/

** Important Dates **
15 Jan 2020 Paper Submission Deadline
15 Mar 2020 Paper Acceptance Notification Date
15 Apr 2020 Final Paper Submission and Early Registration Deadline
19-24 July 2020 IEEE WCCI 2020, Glasgow, Scotland, UK
