Normalizing Flows with Multi-scale Autoregressive Priors
conference contribution
posted on 2023-11-29, 18:12, authored by Apratim Bhattacharyya, Shweta Mahajan, Mario Fritz, Bernt Schiele, Stefan Roth
Flow-based generative models are an important class of exact inference models that admit efficient inference and sampling for image synthesis.
Owing to the efficiency constraints on the design of the flow layers, e.g., split coupling flow layers in which approximately half the pixels do not undergo further transformations, they have limited expressiveness for modeling long-range data dependencies compared to autoregressive models that rely on conditional pixel-wise generation.
In this work, we improve the representational power of flow-based models by introducing channel-wise dependencies in their latent space through multi-scale autoregressive priors (mAR).
Our mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data.
The resulting model achieves state-of-the-art density estimation results on MNIST, CIFAR-10, and ImageNet.
Furthermore, we show that mAR-SCF allows for improved image generation quality, with improvements in FID and Inception scores compared to state-of-the-art flow-based models.
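To make the idea concrete, here is a minimal, self-contained PyTorch sketch of a channel-wise autoregressive Gaussian prior over latents that a split coupling flow factors out. It illustrates the concept only and is not the mAR-SCF architecture from the paper (which uses multi-scale convolutional-recurrent priors); the class name ChannelwiseARPrior, its layer sizes, and the usage shapes are hypothetical.

```python
import math
import torch
import torch.nn as nn


class ChannelwiseARPrior(nn.Module):
    """Toy channel-wise autoregressive Gaussian prior.

    Channel z_c is modeled by a conditional Gaussian whose mean and scale
    are predicted from the previously generated channels z_{<c}. This only
    illustrates the idea of an autoregressive prior over split-off latents;
    the actual mAR-SCF prior is a multi-scale convolutional-recurrent model.
    """

    def __init__(self, num_channels: int, hidden: int = 32):
        super().__init__()
        self.num_channels = num_channels
        # One small conditional network per channel; channel 0 conditions on
        # an all-zero placeholder, i.e. it is effectively unconditional.
        self.nets = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(max(c, 1), hidden, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(hidden, 2, kernel_size=3, padding=1),  # mean, log-scale
            )
            for c in range(num_channels)
        ])

    def _params(self, z, c):
        context = z[:, :c] if c > 0 else torch.zeros_like(z[:, :1])
        mean, log_scale = self.nets[c](context).chunk(2, dim=1)
        return mean, log_scale

    def log_prob(self, z):
        """Sum of per-channel conditional Gaussian log-densities, shape (B,)."""
        logp = z.new_zeros(z.shape[0])
        for c in range(self.num_channels):
            mean, log_scale = self._params(z, c)
            x = z[:, c:c + 1]
            logp = logp + (
                -0.5 * ((x - mean) * torch.exp(-log_scale)) ** 2
                - log_scale
                - 0.5 * math.log(2 * math.pi)
            ).sum(dim=(1, 2, 3))
        return logp

    @torch.no_grad()
    def sample(self, batch, height, width):
        """Channels are generated sequentially, each conditioned on z_{<c}."""
        z = torch.zeros(batch, self.num_channels, height, width)
        for c in range(self.num_channels):
            mean, log_scale = self._params(z, c)
            z[:, c:c + 1] = mean + torch.exp(log_scale) * torch.randn_like(mean)
        return z


# Usage: score latents that a split coupling layer has factored out,
# instead of evaluating them under a fixed N(0, I) prior.
prior = ChannelwiseARPrior(num_channels=2)
z_split = torch.randn(4, 2, 8, 8)          # latents split off at one scale
log_likelihood = prior.log_prob(z_split)   # shape: (4,)
samples = prior.sample(batch=4, height=8, width=8)
```

In a split coupling flow, such a prior would replace the fixed standard-normal density on the channels split off at each scale, so those dimensions can still depend on one another even though they undergo no further flow transformations.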
History
Preferred Citation
Apratim Bhattacharyya, Shweta Mahajan, Mario Fritz, Bernt Schiele and Stefan Roth. Normalizing Flows with Multi-scale Autoregressive Priors. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2020.
Primary Research Area
Trustworthy Information Processing
Name of Conference
IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Legacy Posted Date
2020-03-26
Open Access Type
Gold
BibTeX
@inproceedings{cispa_all_3055,
  title     = "Normalizing Flows with Multi-scale Autoregressive Priors",
  author    = "Bhattacharyya, Apratim and Mahajan, Shweta and Fritz, Mario and Schiele, Bernt and Roth, Stefan",
  booktitle = "{IEEE Conference on Computer Vision and Pattern Recognition (CVPR)}",
  year      = "2020",
}