
Reservoir Computing IJCNN 2023: Special Session on Reservoir Computing: theory, models, and applications


Link: https://sites.google.com/view/reservoir-computing-tf/activities/ijcnn-2023-special-session?pli=1
 
When Jun 18, 2023 - Jun 23, 2023
Where Gold Coast Queensland Australia
Submission Deadline Feb 7, 2023
Notification Due Mar 31, 2023
Final Version Due Apr 28, 2023
Categories    reservoir computing   neural networks   artificial intelligence   machine learning
 

Call For Papers

SPECIAL SESSION ON RESERVOIR COMPUTING: THEORY, MODELS, AND APPLICATIONS

18-23 June 2023, Gold Coast Convention and Exhibition Centre, Queensland, Australia



IMPORTANT DATES:

- Paper submission EXTENDED DEADLINE: February 7, 2023 (11:59 pm AoE)

- Decision notification: March 31, 2023



LINKS:

More info at: https://sites.google.com/view/reservoir-computing-tf/activities/ijcnn-2023-special-session?pli=1

Paper submission Guidelines: https://2023.ijcnn.org/authors/paper-submission

Submission link: https://edas.info/newPaper.php?c=30081&track=116064



ORGANISERS:

Andrea Ceni (University of Pisa, Italy), Claudio Gallicchio (University of Pisa, Italy), Gouhei Tanaka (University of Tokyo, Japan).



DESCRIPTION:

Reservoir Computing (RC) is a popular approach for efficiently training Recurrent Neural Networks (RNNs), based on (i) constraining the recurrent hidden layers to develop stable dynamics, and (ii) restricting the training algorithms to operate solely on an output (readout) layer.
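For illustration, a minimal Echo State Network sketch along the lines of points (i) and (ii) above might look as follows (Python with NumPy; the reservoir size, spectral radius, washout length, and ridge parameter are illustrative assumptions, not prescriptions of this call):

import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

# Fixed random input and recurrent weights; the recurrent matrix is rescaled
# to spectral radius 0.9 (< 1) so the reservoir dynamics stay stable (point i).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    # Drive the fixed reservoir with a sequence of shape (T, n_in)
    # and collect its states, shape (T, n_res).
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    # Only the linear readout is trained, here by ridge regression (point ii).
    return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                           states.T @ targets)

# Toy usage: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))[:, None]
states = run_reservoir(u[:-1])
W_out = train_readout(states[200:], u[201:])  # discard an initial washout
pred = states @ W_out                         # readout predictions, shape (1999, 1)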

Over the years, the field of RC has attracted considerable research attention, for several reasons. Besides the striking efficiency of its training algorithms, RC neural networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates, such as those studied in photonics and materials science), enable clean mathematical analysis (rooted, e.g., in random matrix theory), and find natural engineering applications in resource-constrained contexts, such as edge AI systems. Moreover, in the broader picture of Deep Learning development, RC is a breeding ground for testing innovative ideas, e.g., biologically plausible training algorithms beyond gradient back-propagation. Notably, although established within the Machine Learning field, RC lends itself naturally to interdisciplinary research, where ideas and inspiration from diverse areas such as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.

This special session is intended to be a hub for discussion and collaboration within the Neural Networks community. We therefore invite researchers to submit papers on all aspects of RC, with contributions spanning theory, new models, and emerging applications.



TOPICS OF INTEREST:

Relevant topics for this session include, but are not limited to, the following:

- New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines

- Hardware, physical and neuromorphic implementations of Reservoir Computing systems

- Learning algorithms in Reservoir Computing

- Reservoir Computing in Computational Neuroscience

- Reservoir Computing on edge systems

- Novel learning algorithms rooted in Reservoir Computing concepts

- Novel applications of Reservoir Computing, e.g., to images, video and structured data

- Federated and Continual Learning in Reservoir Computing

- Deep Reservoir Computing neural networks

- Theory of complex and dynamical systems in Reservoir Computing

- Extensions of the Reservoir Computing framework, such as Conceptors



SUBMISSION GUIDELINES AND INSTRUCTIONS

Paper submission for this Special Session follows the same process as for the regular sessions of IJCNN 2023, which uses EDAS as the submission system.

The review process for IJCNN 2023 will be double-blind; prospective authors must therefore anonymize their manuscripts. Each paper should be 6 to at most 8 pages long, including figures, tables, and references. Please refer to the Submission Guidelines at https://2023.ijcnn.org/authors/paper-submission for full information.

Submit your paper at https://edas.info/N30081 and choose the track "Special Session: Reservoir Computing: theory, models, and applications", or use the direct link: https://edas.info/newPaper.php?c=30081&track=116064.



Note that anonymizing your paper is mandatory, and papers that explicitly or implicitly reveal the authors' identities may be rejected.



Sincerely,
Organising Team
