HotCloudPerf 2021: The 4th Workshop on Hot Topics in Cloud Computing Performance
Link: https://hotcloudperf.spec.org
Call For Papers | |||||||||||||||||
=== The Fourth Workshop on Hot Topics in Cloud Computing Performance ===
(HotCloudPerf-2021)
Focus Theme: “Benchmarking in the Cloud”
https://hotcloudperf.spec.org

VENUE: Held in conjunction with ICPE, April 19 or 20, 2021, Rennes, France.
Contact: hotcloudperf2021[at]easychair.org

IMPORTANT DATES (Anywhere on Earth)
January 20, 2021      Abstract due (informative; extended deadline)
January 25, 2021      Papers due (extended deadline)
February 11, 2021     Author notification
February 22, 2021     Camera-ready deadline
April 19 or 20, 2021  Workshop day

WORKSHOP THEME AND BACKGROUND

Cloud computing is emerging as one of the most profound changes in the way we build and use IT. The use of global services in public clouds is increasing, and the lucrative and rapidly growing global cloud market already supports over 1 million IT-related jobs. However, it is currently challenging to make the IT services offered by public and private clouds performant (in an extended sense) and efficient. Emerging architectures, techniques, and real-world systems include hybrid deployment, serverless operation, everything as a service, complex workflows, auto-scaling and -tiering, and more. It is unclear to what extent traditional performance engineering, software engineering, and system design and analysis tools can help with understanding and engineering these emerging technologies. The community also needs practical tools and powerful methods to address hot topics in cloud computing performance.

Responding to this need, the HotCloudPerf workshop offers a meeting venue for academics and practitioners, from experts to trainees, in the field of cloud computing performance. The workshop aims to engage this community and to lead to the development of new methodological aspects for gaining a deeper understanding not only of cloud performance, but also of cloud operation and behavior, through diverse quantitative evaluation tools, including benchmarks, metrics, and workload generators. The workshop focuses on novel cloud properties such as elasticity, performance isolation, dependability, and other non-functional system properties, in addition to classical performance-related metrics such as response time, throughput, scalability, and efficiency.

Each year, the workshop chooses a focus theme to explore; for 2021, the theme is “benchmarking in the cloud.” Articles focusing on this topic are particularly encouraged for HotCloudPerf-2021.

The HotCloudPerf workshop is technically sponsored by the Standard Performance Evaluation Corporation (SPEC)’s Research Group (RG) and is organized annually by the RG Cloud Group. HotCloudPerf has emerged from the series of yearly meetings organized by the RG Cloud Group since 2013. The RG Cloud Group takes a broad approach, relevant for both academia and industry, to cloud benchmarking, quantitative evaluation, and experimental analysis.

WORKSHOP SCOPE AND TOPICS

The focus theme for 2021 is “Benchmarking in the Cloud”; articles on this topic are particularly encouraged for HotCloudPerf-2021. Long-running topics of the HotCloudPerf workshop include, but are not limited to:
1. Empirical performance studies in cloud computing environments, applications, and systems, including observation, measurement, and surveys.
2. Comparative performance studies and benchmarking of cloud environments, applications, and systems.
3. Performance analysis using modeling and queueing theory for cloud environments, applications, and systems.
4. Simulation-based studies for all aspects of cloud computing performance.
5. Tuning and auto-tuning of systems operating in cloud environments, e.g., auto-scaling of resources, auto-tiering of data, and optimized resource deployment.
6. Software patterns and architectures for engineering cloud performance, e.g., serverless.
7. Experience with and analysis of the performance of cloud deployment models, including IaaS/PaaS/SaaS/FaaS.
8. End-to-end performance engineering for pipelines and workflows in cloud environments, or of applications with non-trivial SLAs.
9. Tools for monitoring and studying cloud computing performance.
10. General and specific methods and methodologies for understanding and engineering cloud performance.
11. Serverless computing platforms and microservices in cloud datacenters.

ARTICLE SUBMISSION GUIDELINES (UPDATE: One extra page for references is now allowed)

We solicit the following types of contributions:
* Talk only: extended abstract limited to 1-2 pages (without formatting restrictions)
* Full paper limited to 6 pages plus 1 page for references (double column, ACM conference format)
* Short paper limited to 3 pages plus 1 page for references (double column, ACM conference format)

Contributions in the first category (talk only) may have already been (partially) presented at other events or in publications and are not included in the conference proceedings. Contributions in the second and third categories (technical papers) must represent original and unpublished work that is not currently under review. Full papers may report on original research, lessons learned from realizing an approach, or experiences in transferring a research prototype into practice. Short papers may report on work in progress, a tool/demo, or present a vision or position motivating the community to address new challenges.

Articles and talk-only contributions must be submitted via the EasyChair system of HotCloudPerf-2021: https://easychair.org/conferences/?conf=hotcloudperf2021

Articles must use the ACM conference format. Each valid submission will receive at least three (3) peer reviews. Presented papers will be published by ACM and included in the ACM Digital Library. Adhering to ACM guidelines for conferences, ICPE requires that at least one author of each accepted paper attends the workshop (in person or remotely) and presents the paper.

COVID-19 TRAVEL RESTRICTIONS: At this moment, it is likely that HotCloudPerf 2021 will be fully virtual. Our experience with a virtual ICPE (and HotCloudPerf) 2020 was excellent, and the participants rated the experience and format very highly. For more information, please contact us: hotcloudperf2021[at]easychair.org

ORGANIZING COMMITTEE
Cristina L. Abad (ESPOL, Ecuador)
Nikolas Herbst (U. Würzburg, Germany)
Alexandru Uta (Leiden University, the Netherlands)
Alexandru Iosup (VU Amsterdam, the Netherlands)

PROGRAM COMMITTEE
Cristina L. Abad, ESPOL, Ecuador
Ahmed Ali-Eldin, Chalmers | University of Gothenburg, Sweden
Marta Beltran, Universidad Rey Juan Carlos, Spain
Andre Bondi, Software Performance and Scalability Consulting LLC, USA
Marc Brooker, Amazon Web Services, USA
Lucy Cherkasova, ARM Research, USA
Dmitry Duplyakin, University of Utah, USA
Bogdan Ghit, Databricks, The Netherlands
Wilhelm Hasselbring, Kiel University, Germany
Nikolas Herbst, U. Würzburg, Germany
Alexandru Iosup, VU Amsterdam, The Netherlands
Alessandro Papadopoulos, Mälardalen University, Sweden
Joel Scheuner, Chalmers | University of Gothenburg, Sweden
Petr Tůma, Charles University, Czech Republic
Alexandru Uta, Leiden University, The Netherlands
Erwin van Eyk, VU Amsterdam, The Netherlands
André van Hoorn, University of Stuttgart, Germany
Chen Wang, IBM, USA