Intelligent Computing — Special Issue (2024): Foundation Models: Theory, Technology, and Applications
Link: https://spj.science.org/page/icomputing/si/foundation-models
Call For Papers
Scope
------------------

Foundation models, encompassing large language models (LLMs) and large multimodal models (LMMs), have garnered significant attention in both academia and industry. Prominent examples such as GPT-4, GPT-4o, Gemini, LLaMa, Qwen, and Sora have demonstrated remarkable capabilities in areas such as text generation, human-machine interaction, and video synthesis. This special issue focuses on the latest research advances and industrial practices in foundation models. We welcome original research on innovative algorithms, architectures, and applications. We particularly encourage submissions that address the challenges of computing-efficient models, as well as submissions on application domains that foundation models stand to disruptively change.

Topics of Interest
------------------

This special issue of Intelligent Computing accepts both Research Articles and Review Articles. Topics of interest include, but are not limited to:

- Breakthrough backbone architectures for foundation models, such as Mamba and diffusion models
- New training algorithms for large foundation models, such as large-scale pre-training, supervised fine-tuning, and alignment
- Efficient inference algorithms for large foundation models, such as decoding and reasoning algorithms
- Computing-efficient large foundation models, such as compression, quantization, and memory-efficient methods
- LLM-based agent technologies, such as retrieval-augmented generation (RAG), tool learning, and long-term planning
- Deployment and adaptation methods for large models in embedded systems, mobile devices, robots, and communication and networking systems
- Applications of foundation models to professional fields such as science, education, and finance
- Open-source models, datasets, and evaluation benchmarks for foundation models
Guest Editors
------------------

Haofen Wang, Tongji University, China
Huanhuan Chen, University of Science and Technology of China, China
Lu Chen, Shanghai Jiao Tong University, China
Jinpeng Chen, Beijing University of Posts and Telecommunications, China
Yuefeng Li, Queensland University of Technology, Australia

Submission Instructions
------------------

Please indicate in your cover letter that your submission is intended for inclusion in the special issue.

Submission Deadline: December 20, 2024