XAI-TS 2023: Explainable AI for Time Series: Advances and Applications
Link: https://lamarr-institute.github.io/xai-ts-workshop/

Call For Papers
Time series data is omnipresent, as time is an integral component of all observable phenomena. With the increasing instrumentation of our world, sensors and systems continuously generate vast quantities of time series data. Hence, time series analysis is integrated into principled decision-making and planning in a wide variety of applications across industries. This has led to an increasing demand for accurate and interpretable time series models. Explainable AI methods can help users understand the analysis and predictions made by these models and build trust in them. The integration of expert knowledge into time series learning pipelines plays a major role in this context. This workshop will bring together experts and researchers to discuss recent developments and applications of Explainable AI for time series data.
XAI-TS welcomes papers that cover, but are not limited to, one or several of the following topics:
- Explainable AI methods for Time Series modelling
- Interpretable machine learning algorithms for Time Series
- Explainability metrics and evaluation, including benchmark datasets
- Case studies and applications of Explainable AI for Time Series
- Integration of domain knowledge in Time Series modelling
- Explainability methods for multivariate Time Series
- Explainable concept drift detection in Time Series
- Visual explanations for long Time Series data
- Explainable Time Series feature engineering
- Explainable deep learning for Time Series modelling
- Explainable pattern discovery and recognition in Time Series
- Explainable anomaly detection in Time Series
- Explainable aggregation of Time Series
- Causality
- Stochastic process modelling

Submission and Dates
We welcome submissions of full papers (8-16 pages) and short papers (4 pages) reporting on original research. Submissions must follow the LNCS formatting style and will undergo double-blind review by at least two program committee members. We also welcome position papers (2 pages) presenting novel ideas, perspectives, or challenges in explainable AI for Time Series, which will be reviewed by the organizers. Submissions are handled via Microsoft CMT. At least one author of each accepted paper must have a full registration and present the paper in person in Turin. Papers without a full registration or in-person presentation will not be included in the post-workshop Springer proceedings.

Paper Submission: Monday, June 12, 2023
Author Notification: Wednesday, July 12, 2023
Proceedings Online: Wednesday, September 12, 2023
Workshop: Monday, September 18, 2023

The workshop will comprise paper presentations, discussions, and invited talks. In case of a large number of submissions, a poster session may also be included.

Website: https://lamarr-institute.github.io/xai-ts-workshop/