
The BIFOLD Lunch Talk series gives BIFOLD members and external partners the opportunity to engage in dialogue about their research in Machine Learning and Big Data.
Each Lunch Talk offers BIFOLD members, fellows, and colleagues from other research institutes the chance to present their work and to network with one another.
By illuminating the behavior of complex machine learning (ML) models, Explainable Artificial Intelligence (XAI) has become a vital tool for uncovering non-linear relationships in large data sets. However, insights derived from XAI remain subject to the inherent randomness introduced by both the chosen XAI method and the ML training process. In light of the reproducibility crisis in data-driven science, combining deterministic XAI methods with convex distance-based predictors such as support vector machines (SVMs) offers a promising path toward robust and reproducible insights.
While powerful model-agnostic explanations such as Shapley value approximations exist, they introduce estimation noise during explanation. Conversely, common deterministic methods such as sensitivity analysis often fail to capture complex relationships between data features. To bridge this gap, we propose a novel, deterministic explanation method tailored to convex distance-based classifiers such as SVMs, ensuring both accuracy and reproducibility. Our approach extracts a hierarchical network structure from distance-based models and applies layer-wise relevance propagation to assign importance scores to the input features that explain the prediction. The accuracy of our approach is verified in a quantitative evaluation, and its general applicability is demonstrated in two practical use cases.
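To give a flavor of deterministic feature attribution for an SVM, the sketch below covers only the simplest special case: a linear-kernel SVM, where the decision score decomposes exactly into per-feature contributions w_i * x_i. This is an illustrative simplification, not the method presented in the talk, which handles general distance-based classifiers via an extracted network structure and layer-wise relevance propagation; the dataset and variable names are placeholders.

```python
# Minimal sketch: deterministic attribution for a *linear* SVM.
# Assumption: for f(x) = w.x + b, the per-feature relevances w_i * x_i
# plus the bias sum exactly to the decision score, so the attribution
# is deterministic and reproducible (no sampling noise, unlike
# Shapley value approximations).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic placeholder data.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = SVC(kernel="linear", random_state=0).fit(X, y)

w = clf.coef_[0]        # weight vector of the linear decision function
b = clf.intercept_[0]   # bias term
x = X[0]                # example to explain

relevance = w * x       # per-feature relevance scores

# Conservation check: relevances plus bias reproduce the decision score.
assert np.isclose(relevance.sum() + b, clf.decision_function([x])[0])
```

For non-linear kernels the decision function is a sum over support vectors and no longer decomposes this directly, which is precisely the setting the talk's layer-wise relevance propagation approach addresses.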

Florian Bley from the Explaining Deep Neural Networks group at BIFOLD will talk about "Combining Explainability and Reproducibility: Feature Attribution in Distance-Based Classifiers," shedding light on the importance of explainability in research on deep neural networks.
The Lunch Talk takes place at TU Berlin. For further information on the Lunch Talks and registration, contact Dr. Laura Wollenweber via email.