SNL 2017 : First International Workshop on Symbolic-Neural Learning
Link: http://www.ttic.edu/SNL2017/

Call For Papers
Symbolic-neural learning involves deep learning methods in combination with symbolic structures. A "deep learning method" is taken to be a learning process based on gradient descent on real-valued model parameters. A "symbolic structure" is a data structure involving symbols drawn from a large vocabulary; for example, sentences of natural language, parse trees over such sentences, databases (with entities viewed as symbols), and the symbolic expressions of mathematical logic or computer programs. Natural applications of symbolic-neural learning include, but are not limited to, the following areas:
- Image caption generation and visual question answering
- Speech and natural language interactions in robotics
- Machine translation
- General knowledge question answering
- Reading comprehension
- Textual entailment
- Dialogue systems

Various architectural ideas are shared by deep learning systems across these areas, including word and phrase embeddings, recurrent neural networks (LSTMs and GRUs), and various attention and memory mechanisms. Certain linguistic and semantic resources may also be relevant across these applications, for example dictionaries, thesauri, WordNet, FrameNet, FreeBase, DBPedia, parsers, named entity recognizers, coreference systems, knowledge graphs, and encyclopedias.

Deep learning approaches to the above application areas, with architectures and tools subjected to quantitative evaluation, loosely define the focus of the workshop. We invite submissions of high-quality, original papers within the workshop's focus. The workshop will consist of a half day of invited talks and a full day of presentations of accepted papers.