
NIPS DISCML 2017 : NIPS 2017 Workshop on Discrete Structures in Machine Learning (DISCML)


Link: http://www.discml.cc
 
When: Dec 8, 2017
Where: Long Beach, CA, USA
Submission Deadline: Oct 30, 2017
Categories: machine learning, discrete optimization
 

Call For Papers

============================================================

Call for Papers
DISCML -- 7th Workshop on Discrete Structures in Machine Learning at NIPS 2017 (Long Beach)

Dec 8, 2017
www.discml.cc

============================================================


Discrete optimization problems and combinatorial structures are ubiquitous in machine learning. They arise when learning over discrete labels with complex dependencies, in structured estimators, in learning with graphs, partitions, and permutations, and when selecting informative subsets of data or features.
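
As a small concrete example of the last point, here is a minimal sketch of greedy subset selection under a facility-location utility, a standard monotone submodular objective; the data, similarity choice, and function names below are illustrative assumptions, not part of the call.

# A minimal, illustrative sketch: greedy selection of an informative subset
# under a facility-location utility (monotone submodular for nonnegative sim).
import numpy as np

def facility_location(S, sim):
    """Coverage utility of subset S: each point is credited with its best
    representative in S, under a nonnegative similarity matrix `sim`."""
    if not S:
        return 0.0
    return float(np.max(sim[:, sorted(S)], axis=1).sum())

def greedy_subset(sim, k):
    """Pick k items greedily by marginal gain."""
    n = sim.shape[0]
    S = set()
    for _ in range(k):
        gains = {j: facility_location(S | {j}, sim) - facility_location(S, sim)
                 for j in range(n) if j not in S}
        S.add(max(gains, key=gains.get))
    return S

# Toy usage: pick 3 representatives from random points under an RBF similarity.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
print(greedy_subset(np.exp(-d2), k=3))

For monotone submodular utilities, the greedy set is guaranteed to achieve at least a (1 - 1/e) fraction of the optimal value for the same budget (Nemhauser, Wolsey, and Fisher, 1978).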

What are efficient algorithms for handling such problems? Can we robustly solve them in the presence of noise? What about streaming or distributed settings? Which models are computationally tractable and rich enough for applications? What theoretical worst-case bounds can we show? What explains good performance in practice?

Such questions are the theme of the DISCML workshop. It aims to bring together theorists and practitioners to explore new applications, models and algorithms, and mathematical properties and concepts that can support learning with complex interactions and discrete structures.

We invite high-quality submissions that present recent results related to discrete and combinatorial problems in machine learning, as well as submissions that discuss open problems or controversial questions and observations, e.g., missing theory that would explain why algorithms work well in certain instances but not in general, or illuminating worst-case examples. We also welcome descriptions of well-tested software and benchmarks.

Areas of interest include, but are not restricted to:
* discrete optimization in the context of deep learning
* bridging discrete and continuous optimization methods (see the sketch after this list)
* graph algorithms
* continuous relaxations
* learning and inference in discrete probabilistic models
* algorithms for large data (streaming, sketching, distributed)
* online learning
* new applications
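
As a small concrete example of bridging discrete and continuous optimization, here is a minimal sketch of the Lovász extension, which extends a set function on {0,1}^n to [0,1]^n and is convex exactly when the set function is submodular; the cut function and small graph below are illustrative assumptions, not part of the call.

# A minimal, illustrative sketch: evaluating the Lovász extension of a set
# function F at a point x in [0,1]^n.
import numpy as np

def lovasz_extension(F, x):
    """Sort coordinates in decreasing order and sum x_i-weighted marginal
    gains F(S_i) - F(S_{i-1}) along the resulting chain of subsets."""
    order = np.argsort(-x)
    value, prev, S = 0.0, F(frozenset()), set()
    for i in order:
        S.add(int(i))
        cur = F(frozenset(S))
        value += float(x[i]) * (cur - prev)
        prev = cur
    return value

# Toy usage with the (submodular) cut function of a small undirected graph.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
cut = lambda S: sum(1 for u, v in edges if (u in S) != (v in S))

x = np.array([0.9, 0.2, 0.7, 0.1])
print(lovasz_extension(cut, x))   # continuous surrogate; agrees with cut(S) at indicator vectors

At 0/1-valued x the extension coincides with the set function, so minimizing the (convex) extension over [0,1]^n and thresholding yields a minimizer of the original submodular function.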


Submissions:

Please send submissions in NIPS 2017 format (max. 6 pages, non-anonymous) to submit@discml.cc.

Submission deadline: October 30, 2017.


Organizers:
Jeff A. Bilmes (University of Washington, Seattle),
Stefanie Jegelka (MIT),
Amin Karbasi (Yale University),
Andreas Krause (ETH Zurich, Switzerland),
Yaron Singer (Harvard University)
