
Random-Weights Neural Networks @ IWANN 2019: Special Session on Random-Weights Neural Networks at IWANN 2019


Link: http://iwann.uma.es/?page_id=290#SS08
 
When: Jun 12, 2019 - Jun 14, 2019
Where: Gran Canaria, Spain
Submission Deadline: Feb 1, 2019
Notification Due: Mar 18, 2019
Final Version Due: Mar 26, 2019
Categories: neural networks, machine learning
 

Call For Papers

Random-weights Neural Networks denote a class of artificial neural models that employ a form of randomization in both their architectural and training design. Typically, the connections to the hidden layer(s) are left untrained after a random initialization, and only the output weights are adjusted through learning, usually by means of non-iterative methods. The extreme efficiency of the resulting training algorithms, together with their ease of implementation, has made the randomized approach to Neural Network design a widespread and popular methodology among both researchers and practitioners. Moreover, from a theoretical perspective, randomization enables an effective study of the inherent properties of various kinds of Neural Network architectures, even in the absence of (or prior to) the training of internal weight connections.

In the literature, the approach has been instantiated in several forms, both for feed-forward models (e.g., Random Vector Functional Link, Extreme Learning Machine, No-prop, and Stochastic Configuration Networks) and for recurrent architectures (e.g., Echo State Networks, Liquid State Machines). Furthermore, the rise of the Deep Learning era in Machine Learning research has recently given further impetus to the study of hierarchically organized neural architectures with multiple random-weights components. In this regard, the potential of combining the advantages of deep architectures with the efficiency of randomized Neural Network approaches remains largely unexplored.
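As a concrete illustration of the training scheme described above, the following minimal Python/NumPy sketch builds an ELM-style random-weights network for regression. The function names, hyperparameters, and the toy sine-fitting task are illustrative assumptions, not taken from any particular paper: the hidden weights are drawn at random and kept fixed, and only the linear readout is fitted, non-iteratively, by a single ridge-regularized least-squares solve.

import numpy as np

rng = np.random.default_rng(0)

def fit_random_network(X, y, n_hidden=100, ridge=1e-6):
    # Hidden weights and biases are drawn once at random and never trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # random hidden-layer features
    # Non-iterative learning: solve the ridge-regularized normal equations
    # (H^T H + ridge * I) beta = H^T y for the output weights beta.
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: regress y = sin(x) from noisy one-dimensional samples.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b, beta = fit_random_network(X, y)
print(predict(np.array([[1.0]]), W, b, beta))  # close to sin(1.0), about 0.84

The single linear solve is exactly the non-iterative output-weight adjustment the session description refers to; recurrent variants such as Echo State Networks follow the same recipe, with the random hidden layer replaced by a fixed random recurrent reservoir.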

This session calls for contributions in the area of random-weights Neural Networks from all perspectives, from seminal works on breakthrough ideas to applications of consolidated learning methodologies. Topics of interest for this session include, but are not limited to, the following:

- Neural Networks with random weights
- Randomized algorithms for Neural Networks
- Non-iterative methods for learning
- Random Vector Functional Link, Extreme Learning Machines, No-prop, and Stochastic Configuration Networks
- Reservoir Computing, Echo State Networks, and Liquid State Machines
- Deep Neural Networks with random weights (e.g., Deep Extreme Learning Machines and Deep Echo State Networks)
- Theoretical analysis of the advantages and downsides of randomized Neural Networks
- Comparisons with fully trained Neural Networks
- Real-world Applications
