posted by user: bapl

Adapt-NLP 2021 : The Second Workshop on Domain Adaptation for NLP


Link: https://adapt-nlp.github.io/Adapt-NLP-2021/
 
When Apr 19, 2021 - Apr 20, 2021
Where virtual
Submission Deadline Jan 18, 2021
Notification Due Feb 18, 2021
 

Call For Papers

Overview

The growth in computational power and the rise of Deep Neural Networks
(DNNs) have revolutionized the field of Natural Language Processing
(NLP). The ability to collect massive datasets, combined with the
capacity to train large models on powerful GPUs, has yielded NLP-based
technology that was beyond imagination only a few years ago.

Unfortunately, this technology is still limited to a handful of
resource-rich languages and domains, because most NLP algorithms rely
on the fundamental assumption that the training and test sets are
drawn from the same underlying distribution. When the training and
test distributions do not match, a phenomenon known as domain shift,
such models are likely to suffer performance drops.
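The effect of domain shift can be sketched with a toy example on synthetic, purely illustrative data (the classifier, feature values, and shift here are invented for illustration, not drawn from any workshop material): a simple 1-D threshold classifier fit on a "source" domain degrades when the "target" domain's feature distribution is shifted.

```python
# Toy illustration of domain shift (hypothetical data): a 1-D
# threshold classifier trained on a source domain fails when the
# target domain's features are shifted (covariate shift).

def fit_threshold(xs, ys):
    # Decision threshold: midpoint between the two class means.
    mean0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    mean1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return (mean0 + mean1) / 2

def accuracy(threshold, xs, ys):
    preds = [1 if x > threshold else 0 for x in xs]
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

# Source domain: class 0 clustered near 0.0, class 1 near 1.0.
src_x = [0.0, 0.1, 0.2, 0.9, 1.0, 1.1]
src_y = [0, 0, 0, 1, 1, 1]

# Target domain: same labeling rule, but every feature shifted by +1.0.
tgt_x = [x + 1.0 for x in src_x]
tgt_y = src_y

t = fit_threshold(src_x, src_y)
print(accuracy(t, src_x, src_y))  # 1.0 on the source domain
print(accuracy(t, tgt_x, tgt_y))  # 0.5 on the target: every point now exceeds t
```

Because all shifted target points lie above the learned threshold, the classifier predicts class 1 everywhere and falls to chance-level accuracy, which is exactly the failure mode domain adaptation methods aim to correct.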

Despite the growing availability of heterogeneous data, many NLP
domains still lack the amounts of labeled data required to feed
data-hungry neural models, and in some domains and languages even
unlabeled data is scarce. As a result, the problem of domain
adaptation (training an algorithm on annotated data from one or more
source domains and applying it to other target domains) is a
fundamental challenge that must be solved in order to make NLP
technology available for most world languages and textual domains.

Domain Adaptation (DA) is hence the focus of this workshop.
Particularly, the topics of the workshop include, but are not
restricted to:

- Novel DA algorithms addressing existing and new assumptions (e.g.
assuming or not assuming unlabeled data from the source and target
domains, making certain assumptions on the differences between the
source and target domain distributions, etc.).
- Introducing and exploring novel or under-explored DA setups, aiming
towards realistic and applicable ones (e.g., one-to-many DA,
many-to-many DA, DA when the target domain is unknown when training on
the source domain, and source-free DA where just a source model is
available but there is no access to source data).
- Extending DA research to new domains and tasks through both novel
datasets and algorithmic approaches.
- Proposing novel zero-shot and few-shot algorithms and discussing their
relevance for DA.
- Exploring the similarities and differences between algorithmic
approaches to DA, cross-lingual, and cross-task learning.
- A conceptual discussion of the definitions of fundamental concepts
such as domain and transfer, as well as zero-shot and few-shot learning.
- Novel approaches to evaluation of DA methods under different
assumptions on data availability (e.g. evaluation without access to
target domain labeled data and even with small amounts of target
domain unlabeled data).
- Thorough empirical comparisons of existing DA methods on existing and
novel tasks, datasets, and setups.

Related Resources

[WWW 2021] FinSBD-3 Shared Task 2021   Structure Boundary Detection, an extension of Sentence Boundary Detection in PDF Noisy Text in the Financial Domain
[WWW 2021] FinSIM-2 Shared Task 2021   Learning Semantic Similarities for the Financial Domain
ACL-IJCNLP 2021   59th Annual Meeting of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
NLP 2021   Call for Papers - “Information Extraction and NLP”
CoNeCo 2021   13th International Conference on Computer Networks & Communications
TSD 2021   The twenty-fourth International Conference on Text, Speech and Dialogue (TSD 2021).
Multilingual 2021   Multilingual Representations for NLP
DTMN 2021   7th International Conference on Data Mining
MLNLP 2021   2nd International Conference on Machine Learning Techniques and NLP
AS-RLPMTM 2021   Applied Sciences special issue Rich Linguistic Processing for Multilingual Text Mining