GENZERO 2024: Revolutionizing Autonomous Systems with Generative AI and Large Language Models for Zero Trust Architecture
Link: https://genzero.tii.ae
Call For Papers | |||||||||||||||
We are thrilled to announce our groundbreaking GENZERO workshop, hosted by the Secure System Research Center (SSRC) at the Technology Innovation Institute (TII) in Abu Dhabi. This prestigious event brings together the world's leading experts, researchers, and practitioners to explore cutting-edge technologies and their potential to enhance the performance and trustworthiness of autonomous and autonomic systems managing edge devices.
About TII and SSRC: The Technology Innovation Institute (TII) is a pioneering research center dedicated to pushing the boundaries of knowledge and innovation across a range of domains, including secure systems. Our Secure System Research Center (SSRC) focuses on creating end-to-end (E2E) secure, resilient, and safe autonomous and autonomic systems. We are committed to fostering global collaborations and driving groundbreaking research to address critical challenges in this field.

Workshop Highlights: The GENZERO workshop provides a platform for thought-provoking discussions and knowledge exchange on a wide range of topics, including:
- Generative AI and machine learning techniques for predictive maintenance, anomaly detection, and self-healing in autonomous systems and edge robots (UxVs)
- Large language models for natural language understanding and context-aware secure decision-making in the command and control of autonomous systems
- Swarm intelligence and collective decision-making for security, resilience, and safety in multi-robot systems using generative AI
- Secure edge computing, distributed architectures, and resource optimization for deploying generative AI models on resource-constrained robots
- Real-world applications and case studies (e.g., autonomous transportation, delivery systems) of generative AI and LLMs with zero trust architecture across various domains
- Zero trust platform (SoC), system, and communication architectures supporting generative AI and LLMs across the cloud-to-edge continuum
Esteemed Partners and Advisory Board: We are privileged to have the support and collaboration of renowned institutions worldwide, including Purdue University, Caltech, Georgia Tech, Khalifa University, Graz University, SUPSI, Tampere University, Imperial College London, Turku University, Unimore, University of Waterloo, Universidade do Minho, McMaster University, University of Bologna, UNSW Sydney, Technion - Israel Institute of Technology, The University of Texas at Dallas, New York University Abu Dhabi, and the University of Manitoba. Our advisory board (link) comprises world-renowned experts in secure and resilient autonomous systems, ensuring the highest quality and relevance of the workshop.

Call for Posters: We invite researchers, practitioners, and industry experts to submit poster abstracts showcasing innovative approaches, experimental results, or lessons learned in addressing the challenges and opportunities related to the security, resilience, and safety of autonomous systems and edge robots. Poster submissions should align with one or more of the key topics listed above. Authors of accepted posters will be invited to register for the workshop.

Live Demonstrations: To enhance the interactive nature of the workshop, we encourage participants to complement their poster presentations with live demonstrations where applicable. These demos add a dynamic element to the poster sessions, allowing attendees to experience the practical application of research and concepts firsthand.

Awards for Excellence: The Secure Systems Research Centre (SSRC) sponsors the "Best Paper/Demo Award," which recognizes the top five presentations that stand out in innovation, potential for impact, technical excellence, and relevance to current SSRC research. The winners will receive an exceptional opportunity to further their research projects: they will be invited to submit detailed proposals for their projects.
If these proposals meet the necessary criteria, SSRC will offer substantial financial support, with funding of up to 1.5 million US dollars over three years. This funding opportunity is exclusive to universities and organizations and requires adherence to TII's legal and procedural requirements.

Poster Presentation and Travel Support: Up to two people may present each accepted poster at the workshop. Our organization will sponsor accommodation for one presenter and provide travel support of up to $1,500 for students.

Poster Abstract Guidelines:
- Length limit: 3 pages
- IEEE format: Word (US letter) or LaTeX template (https://www.overleaf.com/gallery/tagged/ieee-official)
- Title of the poster
- Author names and affiliations
- Abstract summarizing the research problem, methodology, key findings, and significance
- Keywords (up to five)

Submission Guidelines: Please submit your poster abstracts via https://easychair.org/conferences/?conf=genzero24. The deadline for poster submission is July 8th, 2024. Authors of accepted poster abstracts will be notified by August 26th, 2024, and presenters will be required to prepare a digital poster for display during the workshop.

Important Dates:
- Submission deadline: July 8th, 2024
- Acceptance notification: August 26th, 2024
- Camera-ready: October 14th, 2024
- Registration deadline: September 12th, 2024
- Workshop dates: November 12-13, 2024

Venue: Hilton Yas Abu Dhabi, a stunning location that offers a perfect blend of modern amenities and breathtaking views of the Yas Marina Circuit and the Arabian Gulf.

Registration: Registration, courtesy of TII SSRC, is complimentary for individuals with accepted poster presentations and for our university affiliates. Access to registration will be provided directly to these groups. Please note that the registration deadline is September 12, 2024.
For more information about the workshop, including the detailed program, keynote speakers, and travel information, please visit our website at https://genzero.tii.ae/.