
FL@FM-NeurIPS 2024 : International Workshop on Federated Foundation Models in Conjunction with NeurIPS 2024


Link: https://federated-learning.org/fl@fm-neurips-2024/
 
When Dec 14, 2024 - Dec 15, 2024
Where Vancouver, BC, Canada
Submission Deadline Sep 5, 2024
Notification Due Sep 30, 2024
Final Version Due Oct 15, 2024
Categories    artificial intelligence   machine learning   federated learning   foundation models
 

Call For Papers

[Introduction]
Foundation models (FMs), typified by large language models (LLMs) such as ChatGPT, are characterized by their scale and broad applicability. While these models provide transformative capabilities, they also introduce significant challenges, particularly concerning distributed model management and the related issues of data privacy, efficiency, and scalability. Training foundation models is data- and resource-intensive, and conventional methods are typically centralized; this creates significant challenges in real-world use cases, including training data scattered across silos, the computational resources needed to manage distributed data repositories, and the development of, and alignment with, regulatory guidelines (e.g., GDPR) that restrict the sharing of sensitive data.

Federated learning (FL) is an emerging paradigm that can mitigate these challenges by training a global yet decentralized model on distributed data. As machine learning is increasingly applied to analyze and draw insight from real-world, distributed, and sensitive data, familiarity with and adoption of FL becomes essential for the broader scientific community. Because FL allows self-interested data owners to collaboratively train models, end-users can become co-creators of AI solutions. By adopting federated learning approaches, we can leverage the distributed data and computing power available across different sources while respecting user privacy.
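
For readers new to the paradigm, here is a minimal FedAvg-style sketch in plain NumPy (a linear model for brevity; all helper names are illustrative, not from any FL library): each client updates the model on its private data, and the server aggregates the updates weighted by local dataset size, so raw data never leaves the client.

from typing import Dict, List

import numpy as np


def local_update(weights: Dict[str, np.ndarray], X: np.ndarray,
                 y: np.ndarray, lr: float = 0.1) -> Dict[str, np.ndarray]:
    # One gradient step of least-squares regression on a client's private data.
    err = X @ weights["w"] + weights["b"] - y
    return {"w": weights["w"] - lr * (X.T @ err) / len(y),
            "b": weights["b"] - lr * err.mean()}


def fed_avg(models: List[Dict[str, np.ndarray]],
            sizes: List[int]) -> Dict[str, np.ndarray]:
    # Server-side aggregation: average each parameter, weighted by dataset size.
    total = sum(sizes)
    return {k: sum((n / total) * m[k] for m, n in zip(models, sizes))
            for k in models[0]}


# One illustrative round with two clients holding private datasets.
rng = np.random.default_rng(0)
global_model = {"w": np.zeros(3), "b": 0.0}
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)),
           (rng.normal(size=(80, 3)), rng.normal(size=80))]
local_models = [local_update(global_model, X, y) for X, y in clients]
global_model = fed_avg(local_models, [len(y) for _, y in clients])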

The rise of FMs amplifies the importance and relevance of FL as a crucial research direction. With FMs becoming the norm in machine learning development, the focus shifts from model architecture design to tackling the issues surrounding privacy-preserving and distributed learning. Advancements in FL methods have the potential to unlock the use of FMs, enabling efficient and scalable training while safeguarding sensitive data.

FMs such as GPT-4, endowed with vast knowledge and powerful emergent abilities, have achieved remarkable success in various natural language processing and computer vision tasks. Grounding FMs, by adapting them to domain-specific tasks or augmenting them with domain-specific knowledge, enables us to exploit their full potential. However, grounding FMs faces several challenges, stemming primarily from constrained computing resources, data privacy, model heterogeneity, and model ownership. Federated transfer learning (FTL), the combination of FL and transfer learning, provides promising solutions to these challenges. In recent years, the need to ground FMs by leveraging FTL, coined FTL-FM, has arisen strongly in both academia and industry.
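
One simplified illustration of how FTL-FM can work around constrained resources and model ownership: each client fine-tunes only small low-rank (LoRA-style) adapters on a frozen FM backbone and shares just those adapters for server-side averaging, so neither the private data nor the full model crosses the network. The sketch below is an assumption-laden example rather than a prescribed method, and all names are illustrative.

from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class AdapterState:
    # LoRA-style low-rank update: delta_W = A @ B, with the FM backbone frozen.
    A: np.ndarray  # shape (d, r), with rank r much smaller than d
    B: np.ndarray  # shape (r, d)


def average_adapters(states: List[AdapterState]) -> AdapterState:
    # Server-side step: only the small A/B factors cross the network;
    # the large backbone and the clients' private data stay local.
    return AdapterState(A=np.mean([s.A for s in states], axis=0),
                        B=np.mean([s.B for s in states], axis=0))

Note that averaging the factors separately is not identical to averaging the full products A @ B; designing aggregation rules that account for such effects is the kind of open question this workshop invites work on.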

With this in mind, we invite original research contributions, position papers, and work-in-progress reports on various aspects of federated learning in the era of foundation models. Since the emergence of foundation models has been a relatively recent phenomenon, their full impact on federated learning has not yet been well explored or understood. We hope to provide a platform to facilitate interaction among students, scholars, and industry professionals from around the world to discuss the latest advancements, share insights, and identify future directions in this exciting field.

This workshop aims to bring together academic researchers and industry practitioners to address open issues in this interdisciplinary research area. For industry participants, we intend to create a forum to communicate problems that are practically relevant. For academic participants, we hope to make it easier to become productive in this area. The workshop will focus on the theme of combining FL with FMs to open up opportunities for addressing new challenges. The workshop topics include but are not limited to:

Theory and algorithmic foundations:
-Federated in-context learning
-Federated neuro-symbolic learning
-Impact of heterogeneity in FL of large models
-Multi-stage model training (e.g., base model + fine-tuning)
-Optimization advances in FL (e.g., beyond first-order and local methods)
-Prompt tuning and design in federated settings
-Self-supervised learning in federated settings

Leveraging foundation models to improve federated learning:
-Adaptive aggregation strategies for FL in heterogeneous environments
-Foundation model enhanced FL knowledge distillation
-Overcoming data interoperability challenges using foundation models
-Personalization of FL with foundation models

Federated learning for training and tuning foundation models:
-Fairness, bias, and interpretability challenges in FL with foundation models
-Federated transfer learning with foundation models
-FL-empowered multi-agent foundation model systems
-FL techniques for training large-scale foundation models
-Hardware for FL with foundation models
-Optimization algorithms for federated training of foundation models
-Privacy-preserving mechanisms in FL with foundation models
-Resource-efficient FL with foundation models
-Security and robustness considerations in FL with foundation models
-Systems and infrastructure for FL with foundation models
-Vertical federated learning with foundation models
-Vulnerabilities of FL with foundation models

[Submission Instructions]
The main text of a submitted paper may be between 4 and 9 content pages, including all figures and tables, following the NeurIPS'24 template (https://media.neurips.cc/Conferences/NeurIPS2024/Styles.zip). Additional pages containing references do not count as content pages. An optional appendix of any length is allowed and should be placed at the end of the paper (after the references). Reviewing is double-blind (author identities shall not be revealed to the reviewers), so the submitted PDF file must not include any identifying author information.

Submissions are collected on OpenReview at the following link: https://openreview.net/group?id=NeurIPS.cc/2024/Workshop/Federated_Learning.
Accepted papers and their review comments will be posted publicly on OpenReview. Due to the short timeline, there will be no rebuttal period, but authors are encouraged to interact and discuss with reviewers on OpenReview after the acceptance notifications are sent out. Rejected papers and their reviews will remain private and will not be posted publicly.

For questions, please contact: han[dot]yu[at]ntu[dot]edu[dot]sg
