Deep Neural Networks as Iterated Function Systems and a Generalization Bound

A stochastic IFS viewpoint on deep architectures and generative modeling.


    This preprint connects modern deep architectures to stochastic iterated function systems. The idea is to treat depth as a random dynamical system: once this is made explicit for architectures such as ResNets, Transformers, or mixture-of-experts layers, questions of stability become questions of contractivity, and generalization in generative modeling can be controlled in Wasserstein distance.
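    To make the "depth as a random dynamical system" picture concrete, here is a toy sketch (my own illustration, not the paper's construction): each "layer" applies one of a few contractive affine maps chosen at random. Because every map has Lipschitz constant below 1, two trajectories driven by the same random layer sequence contract toward each other, which is the stability property that contractivity buys.

```python
# Toy stochastic IFS as a stand-in for depth dynamics (illustrative only).
# Each layer applies f_i(x) = A_i x + b_i with Lip(f_i) < 1, i drawn at random.
import numpy as np

rng = np.random.default_rng(0)

# Two contractive affine maps in R^2 (Lipschitz constants 0.5 and 0.4).
maps = [
    (0.5 * np.eye(2), np.array([1.0, 0.0])),
    (0.4 * np.eye(2), np.array([0.0, 1.0])),
]

def run_depth(x, layer_choices):
    """Apply the randomly chosen 'layers' in sequence."""
    for i in layer_choices:
        A, b = maps[i]
        x = A @ x + b
    return x

choices = rng.integers(0, 2, size=30)          # one random map per layer
x = run_depth(np.array([10.0, -10.0]), choices)
y = run_depth(np.array([-5.0, 7.0]), choices)  # very different start

# The gap shrinks by at most the product of the Lipschitz constants (<= 0.5**30),
# so the network has essentially forgotten its input.
print(np.linalg.norm(x - y))  # essentially 0
```

    The same contraction argument, pushed from points to distributions, is what makes the invariant measures and attractors in the paper's first contribution well defined.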

    1. Vacher, J. Deep Neural Networks as Iterated Function Systems and a Generalization Bound. arXiv preprint arXiv:2601.19958 (2026).

    Overview

    The paper has three main pieces. First, it identifies conditions under which depth dynamics admit invariant measures and attractors. Second, it derives a generalization bound for generative modeling based on the gap between the data distribution and its image under the learned transfer operator. Third, it turns this bound into a collage-style training objective and tests it on a controlled 2D example together with latent-image experiments on MNIST, CelebA, and CIFAR-10.
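    The logic behind a collage-style objective can be checked numerically in one dimension (a sketch under my own assumptions, not the paper's algorithm). If the learned transfer operator T is a c-contraction in Wasserstein-1 distance and mu* is its invariant measure, then the triangle inequality gives W1(mu, mu*) <= W1(mu, T#mu) / (1 - c): minimizing the gap between the data and its image under T controls the gap to the model's stationary distribution. For an increasing affine map the bound is in fact an equality, which the snippet verifies on samples.

```python
# Collage-style bound in 1D: W1(mu, mu*) <= W1(mu, T#mu) / (1 - c).
# Here T(x) = c*x + b is a deterministic contraction, so mu* is a point
# mass at its fixed point b / (1 - c). All names below are illustrative.
import numpy as np

rng = np.random.default_rng(1)
c, b = 0.6, 2.0                        # contraction factor and offset
T = lambda x: c * x + b
fixed_point = b / (1 - c)              # invariant measure: delta at 5.0

def w1(u, v):
    """Exact 1D Wasserstein-1 between equal-size empirical measures."""
    return np.mean(np.abs(np.sort(u) - np.sort(v)))

data = rng.normal(loc=4.0, scale=1.0, size=10_000)    # samples of mu
collage = w1(data, T(data))                           # W1(mu, T#mu), the "collage loss"
true_gap = w1(data, np.full_like(data, fixed_point))  # W1(mu, mu*)

print(true_gap <= collage / (1 - c) + 1e-9)  # True: the bound holds
```

    For an increasing affine T the two sides agree sample by sample, since |x - T(x)| = (1 - c)|x - fixed_point|; for the stochastic, learned operators in the paper the bound is generally slack, which is exactly what makes the collage loss a trainable surrogate.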

