Evaluation of CNN Performance in Semantically Relevant Latent Spaces

Jeroen van Doorenmalen, Vlado Menkovski

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)


We examine deep neural network (DNN) performance and behavior using contrasting explanations generated from a semantically relevant latent space. We develop a semantically relevant latent space by training a variational autoencoder (VAE) augmented by a metric learning loss on the latent space. The properties of the VAE provide for a smooth latent space supported by a simple density and the metric learning term organizes the space in a semantically relevant way with respect to the target classes. In this space we can both linearly separate the classes and generate meaningful interpolation of contrasting data points across decision boundaries. This allows us to examine the DNN model beyond its performance on a test set for potential biases and its sensitivity to perturbations of individual factors disentangled in the latent space.
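The training objective described in the abstract — a standard VAE objective augmented with a metric-learning term on the latent codes — can be sketched as follows. The triplet-margin form of the metric term, the weighting factor `alpha`, and all function and variable names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def vae_metric_loss(x, x_recon, mu, log_var,
                    anchor, positive, negative,
                    margin=1.0, alpha=1.0):
    """Combined objective: VAE reconstruction + KL terms plus a
    triplet metric-learning term on latent codes (hypothetical sketch)."""
    # Reconstruction term (mean squared error between input and decoder output).
    recon = np.mean((x - x_recon) ** 2)
    # KL divergence between q(z|x) = N(mu, diag(sigma^2)) and the prior N(0, I).
    kl = -0.5 * np.mean(1.0 + log_var - mu ** 2 - np.exp(log_var))
    # Triplet margin term: pull same-class latent codes together and push
    # different-class codes at least `margin` apart, organizing the latent
    # space semantically with respect to the target classes.
    d_pos = np.linalg.norm(anchor - positive, axis=-1)
    d_neg = np.linalg.norm(anchor - negative, axis=-1)
    triplet = np.mean(np.maximum(d_pos - d_neg + margin, 0.0))
    return recon + kl + alpha * triplet
```

With the latent space organized this way, contrasting explanations can be generated by linearly interpolating between latent codes on opposite sides of a decision boundary and decoding the intermediate points.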

Original language: English
Title of host publication: Advances in Intelligent Data Analysis XVIII - 18th International Symposium on Intelligent Data Analysis, IDA 2020, Proceedings
Editors: Michael R. Berthold, Ad Feelders, Georg Krempl
Number of pages: 13
ISBN (Print): 9783030445836
Publication status: Published - 2020
Event: 18th International Conference on Intelligent Data Analysis, IDA 2020 - Konstanz, Germany
Duration: 27 Apr 2020 - 29 Apr 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12080 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 18th International Conference on Intelligent Data Analysis, IDA 2020


  • Deep learning
  • Explanation
  • Interpretability
  • Metric learning
  • VAE


