
XSA 2021 : Explainable Deep Learning for Sentiment Analysis


Link: https://www.mdpi.com/journal/electronics/special_issues/SA_electronics
 
When: N/A
Where: N/A
Submission Deadline: Dec 31, 2021
Categories: sentiment analysis, deep learning, explainability, embeddings
 

Call For Papers

CfP: Special Issue on Deep Learning and Explainability for Sentiment Analysis - Electronics (IF: 2.397)

***********************************

Webpage: https://www.mdpi.com/journal/electronics/special_issues/SA_electronics
Deadline: Dec 31, 2021

************************************

Summary:

People use online social platforms to express opinions about products and services in a wide range of domains, influencing the views and behavior of their peers. Understanding individuals’ satisfaction is a key element for businesses, policy makers, organizations, and social institutions when making decisions. This has generated growing interest within the scientific community and, as a result, a host of new challenges that need to be solved. Sentiment analysis has been investigated by researchers in the past to provide methodologies and resources to these stakeholders. In the field of machine learning, deep learning models that combine several neural networks have emerged as the state of the art in various domains for a variety of natural language processing tasks. The most prominent deep learning solutions are combined with word embeddings. However, how to include sentiment information in word-embedding representations to boost the performance of deep learning models, and how to explain what deep learning models (often employed as black boxes) learn, are questions that remain open and need further research and development.
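As an illustration of the first open question, the following is a minimal sketch (not prescribed by this Special Issue) of one way sentiment information could be injected into word-embedding representations: each token's pretrained vector is concatenated with a polarity score taken from a sentiment lexicon before being fed to a downstream model. The embeddings and lexicon below are toy placeholders, not real resources.

import numpy as np

EMB_DIM = 4
toy_embeddings = {                              # placeholder for pretrained word vectors
    "great": np.array([0.2, 0.7, 0.1, 0.5]),
    "terrible": np.array([0.3, 0.1, 0.8, 0.2]),
    "battery": np.array([0.6, 0.4, 0.3, 0.1]),
}
toy_lexicon = {"great": 1.0, "terrible": -1.0}  # placeholder for lexicon polarity scores

def sentiment_enriched_vector(token):
    """Concatenate the word embedding with its lexicon polarity (0.0 if unknown)."""
    emb = toy_embeddings.get(token, np.zeros(EMB_DIM))
    polarity = toy_lexicon.get(token, 0.0)
    return np.concatenate([emb, [polarity]])

def sentence_representation(tokens):
    """Average the enriched token vectors into a fixed-size input for a classifier."""
    return np.mean([sentiment_enriched_vector(t) for t in tokens], axis=0)

print(sentence_representation(["great", "battery"]))  # last dimension carries the lexicon signal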

Investigating these key points will answer why and how design choices for creating embedding representations and for designing deep learning models should be made. This goes in the direction of Explainable Deep Learning (XDL), whose aim is to make clear how deep learning systems reach their decisions. This Special Issue aims to foster discussion on the design, development, and use of deep learning models and embedding representations that improve state-of-the-art results and, at the same time, enable interpreting and explaining the effectiveness of deep learning for sentiment analysis. We invite theoretical works, implementations, and practical use cases that demonstrate the benefits of deep learning with a strong focus on explainability across various domains.
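As an example of the kind of explainability in scope, here is a minimal, model-agnostic sketch (purely illustrative) that attributes a sentiment prediction to individual tokens via occlusion, i.e., leave-one-out score differences. The function toy_sentiment_score is a toy stand-in for the positive-class score of a trained deep learning model.

from typing import Dict, List

def toy_sentiment_score(tokens: List[str]) -> float:
    """Stand-in for a black-box model's sentiment score (placeholder cue-word count)."""
    cues = {"love": 1.0, "great": 0.8, "awful": -1.0, "boring": -0.6}
    return sum(cues.get(t.lower(), 0.0) for t in tokens)

def occlusion_explanation(tokens: List[str]) -> Dict[str, float]:
    """Attribute the prediction to tokens via leave-one-out (occlusion) score differences."""
    full_score = toy_sentiment_score(tokens)
    contributions = {}
    for i, tok in enumerate(tokens):
        reduced = tokens[:i] + tokens[i + 1:]
        contributions[tok] = full_score - toy_sentiment_score(reduced)
    return contributions

print(occlusion_explanation("I love this phone but the battery is awful".split()))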

The Special Issue focuses on, but is not limited to, the following topics:

Deep learning topics:
- Aspect-based DL and XDL models;
- Bias detection within DL and XDL for sentiment analysis;
- DL and XDL for toxicity and hate speech detection;
- Multilingual DL and XDL for sentiment analysis;
- DL and XDL for emotion detection;
- Weakly supervised DL and XDL for sentiment analysis;
- XDL design methodologies for sentiment analysis;
- Analysis of DL models for sentiment analysis.
Data representation topics:
- Word embeddings for sentiment analysis;
- Knowledge graph and knowledge graph embeddings for sentiment analysis;
- Use of external knowledge (e.g., knowledge graphs) to feed DL for sentiment analysis;
- Combination of existing sentiment analysis resources (e.g., SenticNet) with embedding representations;
- Analysis of the performance of data representations for sentiment analysis tasks;
- Lexicon-based explainability for sentiment analysis.
Case studies:
- Educational environments;
- Healthcare systems;
- Scholarly discussions (e.g., peer review process discussions, mailing lists, etc.);
- News platforms;
- Mental health systems;
- Social networks.

************************************

Important dates:
Deadline for paper submission: Dec 31, 2021.

Papers submitted before the deadline will be reviewed upon receipt and published in the journal on a continuous basis as soon as they are accepted.

************************************

Submission information:
Please use the LaTeX template available at https://www.mdpi.com/authors/latex
or the Microsoft Word template at https://www.mdpi.com/files/word-templates/electronics-template.dot

************************************

Guest Editors:

Prof. Dr. Diego Reforgiato Recupero, University of Cagliari (Italy)
Prof. Dr. Harald Sack, FIZ Karlsruhe - Leibniz Institute for Information Infrastructure & Karlsruhe Institute of Technology (Germany)
Dr. Danilo Dessì, FIZ Karlsruhe - Leibniz Institute for Information Infrastructure & Karlsruhe Institute of Technology (Germany)

************************************
