AODL 2012 : NIPS Workshop on Analysis Operator Learning vs. Dictionary Learning: Fraternal Twins in Sparse Modeling

Link: https://sites.google.com/site/dlaoplnips2012/
 
When Dec 7, 2012 - Dec 7, 2012
Where Lake Tahoe
Submission Deadline Sep 30, 2012
Notification Due Oct 7, 2012
Final Version Due Nov 15, 2012
Categories    machine learning   signal processing
 

Call For Papers

Exploiting structure in data is crucial for the success of many techniques in neuroscience, machine learning, signal processing, and statistics. In this context, the fact that data of interest can be modeled via sparsity has proven extremely valuable. As a consequence, numerous algorithms, either aiming at learning sparse representations of data or exploiting sparse representations in applications, have been proposed within the machine learning and signal processing communities over the last few years.

The most common way to model sparsity in data is via the so-called synthesis model, also known as sparse coding. Therein, the underlying assumption is that the data can be decomposed into a linear combination of very few atoms of some dictionary. Various previous workshops and special sessions at machine learning conferences have focused on this model and its applications, as well as on algorithms for learning suitable dictionaries.
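As a compact illustration (the notation below is a sketch and not taken from the workshop page), a signal x in the synthesis model is written as

    x \approx D\alpha, \qquad \|\alpha\|_0 \ll m,

where the dictionary D \in \mathbb{R}^{n \times m} collects the atoms as columns and \alpha holds the few nonzero coefficients; a standard convex relaxation of the sparse coding step is

    \min_{\alpha} \; \tfrac{1}{2}\|x - D\alpha\|_2^2 + \lambda\|\alpha\|_1.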

In contrast, considerably less attention has so far been paid to an interesting alternative, the so-called analysis model. Here, the data is mapped to a higher-dimensional space by an analysis operator, and the image of this mapping is assumed to be sparse. One of the most prominent examples of analysis sparsity is the total variation model in image processing.
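For contrast, a minimal sketch of the analysis model (again with illustrative notation): an analysis operator \Omega \in \mathbb{R}^{k \times n} with k \geq n is assumed to produce a sparse image,

    \|\Omega x\|_0 \ll k,

and a typical analysis-regularized inverse problem, e.g. denoising a measurement y, reads

    \min_{x} \; \tfrac{1}{2}\|y - x\|_2^2 + \lambda\|\Omega x\|_1,

which recovers total variation denoising in the special case where \Omega is a finite-difference operator.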

Both analysis operators and dictionaries can either be defined analytically or learned from training samples drawn from the data under consideration. Learning sparse models is important because learned models outperform analytically defined ones in terms of how sparsely they represent the data, and because they provide sparse representations for classes of data for which no analytical model is available. For the learning task itself, unsupervised techniques are of major interest, as they require no labeled ground-truth data and are independent of a specific task. For the synthesis model, there are theoretical results that mathematically justify constraints on the structure of dictionaries and thus help in designing learning algorithms. Nevertheless, many theoretical questions associated with learning sparse models remain open, in particular for the analysis case, which is far from fully understood.
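As a purely illustrative sketch of such unsupervised learning in the synthesis case, the following Python snippet learns an overcomplete dictionary from unlabeled patches with scikit-learn; the library, the synthetic data, and all parameter choices are assumptions made for demonstration and are not part of the workshop material.

import numpy as np
from sklearn.decomposition import DictionaryLearning

# Unlabeled training data: 300 flattened 8x8 patches (synthetic here for brevity).
rng = np.random.RandomState(0)
X = rng.randn(300, 64)
X -= X.mean(axis=1, keepdims=True)  # remove the per-patch DC component

# Overcomplete dictionary: 128 atoms for 64-dimensional data, l1-penalized codes.
learner = DictionaryLearning(n_components=128, alpha=1.0, max_iter=10, random_state=0)
codes = learner.fit(X).transform(X)   # sparse coefficients, shape (300, 128)
D = learner.components_               # learned dictionary, shape (128, 64)

# Each patch is approximated by a sparse combination of atoms: X is roughly codes @ D.
print("fraction of nonzero coefficients:", np.mean(codes != 0))

No labels are used anywhere above, which is exactly the task-independent, unsupervised setting the call refers to.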

Clearly, synthesis modeling has a big impact on machine learning problems such as detection, classification, and recognition tasks, and it has strongly influenced areas such as Deep Learning and Multimodal Learning. Although the analysis model has proven advantageous over the synthesis model for regularizing inverse problems, its applicability to the aforementioned data analysis tasks has been investigated far less.
Topics of Interest

Topics of interest are all aspects of sparse modeling in machine learning, including but not limited to:
* Dictionary Learning
* Analysis Operator Learning
* Joint Learning and Classification/Recognition
* Task Oriented Learning of Sparse Representations
* Theory of Analysis Operators and Dictionaries
* Sparse Models for Data Completion
* Multimodal Dictionary/Analysis Operator Learning
* Optimization for Learning Dictionaries and Analysis Operators
