Appears in Collections:Computing Science and Mathematics Conference Papers and Proceedings
Author(s): Crescimanna, Vincenzo
Graham, Bruce
Title: The Variational InfoMax AutoEncoder
Citation: Crescimanna V & Graham B (2020) The Variational InfoMax AutoEncoder. In: 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19.07.2020-24.07.2020. Piscataway, NJ: IEEE.
Issue Date: 2020
Date Deposited: 3-May-2021
Series/Report no.: IEEE International Joint Conference on Neural Networks (IJCNN)
Conference Name: IJCNN 2020 - International Joint Conference on Neural Networks
Conference Dates: 2020-07-19 - 2020-07-24
Conference Location: Glasgow, UK
Abstract: The Variational AutoEncoder (VAE) learns an inference and a generative model simultaneously, but only one of the two can be learned at optimum. This behaviour stems from the ELBO learning objective, which is optimised by a non-informative generator. To address this issue, we propose a learning objective, the Variational InfoMax (VIM), that learns a maximally informative generator while keeping the network capacity bounded. The contribution of the VIM derivation is twofold: an objective learning both an optimal inference and an optimal generative model, and an explicit definition of the network capacity, an estimate of the network's robustness.
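For context, the ELBO objective that the abstract critiques is the standard VAE loss: a reconstruction log-likelihood term minus a KL regulariser on the latent posterior. A minimal NumPy sketch of that standard objective (illustrative only; the paper's VIM objective itself is not reproduced here):

```python
import numpy as np

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims."""
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar), axis=-1)

def elbo(x, x_recon, mu, logvar):
    """Standard VAE ELBO: reconstruction log-likelihood minus KL term.
    The reconstruction term here is a unit-variance Gaussian log-likelihood
    up to an additive constant."""
    recon_ll = -0.5 * np.sum((x - x_recon) ** 2, axis=-1)
    return recon_ll - gaussian_kl(mu, logvar)

# When the posterior collapses to the prior (mu = 0, logvar = 0), the KL
# term vanishes: the generator can then ignore the latent code entirely,
# which is the non-informative-generator failure mode the abstract refers to.
mu = np.zeros(8)
logvar = np.zeros(8)
print(gaussian_kl(mu, logvar))  # 0.0
```

The sketch only fixes terminology: the VIM objective instead bounds the network capacity while maximising the information carried by the generator, as described in the full paper.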
Status: AM - Accepted Manuscript
Rights: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Files in This Item:
File: Crescimanna-Graham-IEEE-2020.pdf
Description: Fulltext - Accepted Version
Size: 2.63 MB
Format: Adobe PDF

This item is protected by original copyright

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

The metadata of the records in the Repository are available under the CC0 public domain dedication: No Rights Reserved

If you believe that any material held in STORRE infringes copyright, please contact us, providing details, and we will remove the Work from public display in STORRE and investigate your claim.