Flow_models 1: Distribution mapping
- The examples in this series are all computed with the scripts in my https://github.com/aganse/flow_models repo on GitHub.
- Links to articles in the series (also in nav menu above): 0.) Overview / Intro, 1.) Distribution mapping, 2.) Generative image modeling / anomaly detection, 3.) Generative classification (a), 4.) Generative classification (b), 5.) Ill-conditioned parameter estimation, 6.) Ill-posed inverse problem ignoring noise, 7.) Ill-posed inverse problem with noise in the data


As described in the overview/intro, normalizing flow models (i.e. invertible neural networks, or INNs) give us a.) a one-to-one mapping between input points and points in the latent space, where an actual probability can be evaluated for each one, b.) a way to compute the likelihood of the data exactly, and c.) efficient computation that keeps training times practical even for complex models, because the Jacobian determinants of the affine coupling layers are cheap to evaluate and do not depend on the arbitrarily complex networks inside those layers.
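To make point c.) concrete, here is a minimal NumPy sketch of a single affine coupling layer (my own toy illustration, not the repo's code linked above). The internal conditioner networks are stood in for by fixed random linear maps so it runs without any training; the point is that the layer is exactly invertible and its Jacobian log-determinant is just the sum of the predicted log-scales, no matter how complex those internal networks are.

```python
import numpy as np

def toy_net(x_cond, n_out, seed):
    """Stand-in for an arbitrarily complex neural network inside the
    coupling layer; here just a fixed random linear map for demonstration."""
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.normal(size=(x_cond.shape[1], n_out))
    return x_cond @ W

def coupling_forward(x):
    """One affine coupling layer: pass the first half of x through unchanged,
    and scale/shift the second half conditioned on the first half."""
    d = x.shape[1] // 2
    x_a, x_b = x[:, :d], x[:, d:]
    log_s = toy_net(x_a, x.shape[1] - d, seed=0)   # per-element log-scales
    t = toy_net(x_a, x.shape[1] - d, seed=1)       # per-element shifts
    z_b = x_b * np.exp(log_s) + t
    z = np.concatenate([x_a, z_b], axis=1)
    # Triangular Jacobian -> log|det J| is just the sum of the log-scales,
    # regardless of what toy_net does internally:
    log_det_jac = log_s.sum(axis=1)
    return z, log_det_jac

def coupling_inverse(z):
    """Exact inverse of coupling_forward, reusing the same conditioner nets."""
    d = z.shape[1] // 2
    z_a, z_b = z[:, :d], z[:, d:]
    log_s = toy_net(z_a, z.shape[1] - d, seed=0)
    t = toy_net(z_a, z.shape[1] - d, seed=1)
    x_b = (z_b - t) * np.exp(-log_s)
    return np.concatenate([z_a, x_b], axis=1)

x = np.random.default_rng(42).normal(size=(5, 4))
z, log_det_jac = coupling_forward(x)
print(np.allclose(x, coupling_inverse(z)))   # True: exactly invertible
```

In a real flow model many such layers are stacked (with the roles of the two halves alternating between layers), and the per-layer log-determinants simply add up.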

So let's run a few quick initial examples here to explore those statements for ourselves. The first example will be the moon-shaped point clouds seen in the RealNVP paper (on which the model I'm using is based), which is what is shown in the figure below. The second example will use a distribution that's more complex than a Gaussian, but still tractable enough that I can directly compute the probabilities of the sample points and the overall data likelihood. Then we'll recompute those probabilities and that data likelihood via the INN and see how they compare. In other words, referencing the technical description in the overview/intro, we'll compute the likelihood \(L\) both directly via \(p_X(x_i)\) and via \(p_Z(z_i)\), and see how close they are.
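As a rough sketch of what those two example setups look like in code (the dataset sizes, noise levels, and mixture parameters below are illustrative choices of mine, not necessarily what the repo scripts use), the second one is exactly the kind of "tractable but non-Gaussian" distribution where we can evaluate \(p_X(x_i)\) and the total log-likelihood directly, giving a ground truth to compare the INN against:

```python
import numpy as np
from sklearn.datasets import make_moons
from scipy.stats import multivariate_normal

# Example 1: the two-moons point cloud, as in the RealNVP paper.
x_moons, _ = make_moons(n_samples=2000, noise=0.05)

# Example 2: more complex than a Gaussian but still tractable -- here a
# two-component 2D Gaussian mixture, so p_X(x_i) can be evaluated exactly.
weights = np.array([0.6, 0.4])
comp1 = multivariate_normal(mean=[-1.5, 0.0], cov=[[0.3, 0.1], [0.1, 0.2]])
comp2 = multivariate_normal(mean=[1.5, 0.5], cov=[[0.2, -0.05], [-0.05, 0.4]])

n = 2000
rng = np.random.default_rng(0)
which = rng.choice(2, size=n, p=weights)            # component label per sample
x_mix = np.where(which[:, None] == 0,
                 comp1.rvs(size=n, random_state=1),
                 comp2.rvs(size=n, random_state=2))

# Direct per-point probabilities and total log-likelihood log L = sum_i log p_X(x_i):
p_x = weights[0] * comp1.pdf(x_mix) + weights[1] * comp2.pdf(x_mix)
log_L_direct = np.sum(np.log(p_x))
print(x_moons.shape, x_mix.shape, log_L_direct)
```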

This diagram shows how the N-dimensional data inputs \(x_i\) are mapped through the flow model to points \(z_i\) in an N-dimensional multivariate Gaussian latent space, and how those points can each be mapped back through the model to data points as well. The latent distribution is a standard normal (N-dimensional), so by construction it has zero mean and identity covariance.

[Figure INNfig1: data points \(x_i\) mapped through the flow model to latent-space points \(z_i\) in the multivariate Gaussian, and back.]
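Writing \(z_i = f(x_i)\) for the forward mapping of a data point through the flow, the relation we'll lean on is the one recapped from the overview/intro (my direction convention here is data-to-latent, which may be written slightly differently there). The latent density is just the N-dimensional standard normal,

\[
p_Z(z) = (2\pi)^{-N/2} \exp\!\left(-\tfrac{1}{2}\lVert z \rVert^2\right),
\]

and the flow's Jacobian converts that latent-side density back into a data-side one, point by point:

\[
\log p_X(x_i) = \log p_Z(z_i) + \log\left|\det \frac{\partial f}{\partial x}(x_i)\right|,
\qquad
\log L = \sum_i \log p_X(x_i).
\]

So the "via \(p_Z(z_i)\)" computation of \(L\) is just the standard-normal log-density of each mapped point plus the flow's accumulated log-det-Jacobian at that point, summed over the dataset.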




FYI, I developed and wrote the Flow_models 2 article and code first, so that one is all there waiting for you; cough, obviously the rest of this article is not yet. Preparing the code for article 2 comprised the bulk of the overall work, since this article 1 uses the same code. The rest of it is coming soon, I promise!