xAI ST Actionable XAI 2025 : xAI World Conference Special Track on Actionable Explainable AI
Link: https://xaiworldconference.com/2025/actionable-explainable-ai/

Call For Papers
Call for Papers: xAI 2025, Special Track: Actionable Explainable AI
****
Abstract Due Date: February 10, 2025
Submission Due Date: February 15, 2025
Conference Dates: July 09-11, 2025
Conference Location: Istanbul, Turkey
Website: https://xaiworldconference.com/2025/actionable-explainable-ai/
****

Following the success of Explainable AI in generating faithful and understandable explanations of complex ML models, increasing attention has been paid to how the outcomes of Explainable AI can be used systematically to enable meaningful actions. These considerations are studied within the subfield of Actionable XAI. Research questions relevant to this subfield include: (1) what types of explanations are most helpful in enabling human experts to make more efficient and accurate decisions; (2) how to systematically improve the robustness and generalization ability of ML models, or align them with human decision making and norms, based on human feedback on explanations; (3) how to enable meaningful actioning of real-world systems via interpretable ML-based digital twins; and (4) how to evaluate and improve the quality of actions derived from XAI in an objective and reproducible manner.

This special track will address both the technical and practical aspects of Actionable XAI. This includes the question of how to build highly informative explanations that form the basis for actionability, aiming for solutions that are interoperable with existing explanation techniques (such as Shapley values, LRP, or counterfactuals) and with existing ML models. The special track will also cover the exploration of real-world use cases where these actions lead to improved outcomes.

Topics include, but are not limited to:

- Structured explanation techniques (e.g. higher-order, hierarchical) designed for actionability
- Multifaceted explanation techniques (e.g. disentangled or concept-based) designed for actionability
- Explanation techniques based on optimization in input space (e.g. counterfactuals or prototypes) designed for actionability
- Hybrid methods combining multiple explanation paradigms to further improve actionability
- Attribution- or attention-based techniques for helping users take meaningful actions in data-rich environments
- Shapley-, LRP-, or attention-based XAI techniques for retrieving relevant features from gigapixel images
- Explanation-guided dimensionality reduction to facilitate taking action under high-throughput data or real-time constraints
- XAI-based techniques for aligning model decision making with ground truth provided by human annotators
- XAI methods, such as CAM and LRP, to support semantic segmentation from limited annotations
- Techniques that leverage user explanatory feedback to produce an improved ML model
- Explanation-driven pruning or retraining to robustify an ML model against spurious correlations and dataset shifts
- Counterfactual and attribution methods combined with digital twins to identify effective actions in real-world systems
- Counterfactual and attribution methods combined with reinforcement learning to produce effective real-world control policies
- Design of environments (e.g. simulated environments) for end-to-end evaluation of XAI actionability
- Utility-based metrics (e.g. added value in a deployed setting) for end-to-end evaluation of XAI actionability
- Indirect metrics (e.g. explanation informativeness, action-response prediction accuracy) for component-wise evaluation of XAI actionability
- Datasets (with simulated environments) for evaluating actions derived from XAI explanations in a reproducible manner
- Application of actionable XAI in biomedicine, e.g. for acting on molecular pathways
- XAI in clinical practice, e.g. for proposing targeted therapies
- Application of actionable XAI in industry, e.g. for calibration in manufacturing processes

Questions regarding the special track should be submitted to Grégoire Montavon (gregoire.montavon[at]fu-berlin.de).