Transforming Probability Distributions using Normalizing Flows




Normal to Bi-Modal distribution using bijective transformations. (Image by Author)

What you can expect to learn/review from this post —

  1. Getting started with the TransformedDistribution class (within the TensorFlow Probability library) to transform a base distribution via a bijection operation.
  2. Writing our own custom bijection function in TensorFlow.
  3. Defining the forward and inverse transformations within a custom bijection function.
  4. Visualizing original and custom-transformed distributions.

If you want to review the basics of bijection and diffeomorphism as a starter for normalizing flows, please check the previous post, where we went through the fundamentals of bijectors, the Jacobian, and transforming probability distributions, and derived a few mathematical formulations for the forward and inverse transformations that will be used heavily in this post. All the code is available in my GitHub; check the references below. Without any delay, let's begin!

TransformedDistribution Class:

In the previous post, we saw how we could define a bijector and simply call its forward or inverse methods to transform a tensor. Within the TensorFlow Probability distributions module (tfp.distributions), the TransformedDistribution class lets us do the same for distributions. Let's see an example: starting from a normal distribution, we know that the exponential of a normally distributed variable follows a log-normal distribution. We will verify this using TransformedDistribution, starting from a Normal distribution and applying the exponential bijector as below:
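A minimal sketch of this setup is shown below (the variable names are mine; the original gist may differ slightly):

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# base distribution: standard normal
normal_dist = tfd.Normal(loc=0., scale=1.)

# bijector: exponential forward transformation
exp_bijector = tfb.Exp()

# transformed distribution: exp(Normal) should be LogNormal
log_normal_td = tfd.TransformedDistribution(normal_dist, exp_bijector)

# reference distribution used for verification
log_normal_ref = tfd.LogNormal(loc=0., scale=1.)

samples_td = log_normal_td.sample(1000)
samples_ref = log_normal_ref.sample(1000)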

Here we start with a normal distribution ( tfd.Normal ) and introduce the exponential bijector ( tfb.Exp() ). To transform from normal to log-normal, we pass both to TransformedDistribution. To confirm that we actually obtained a log-normal distribution from the starting Normal distribution, we compare against tfd.LogNormal and plot samples from both distributions as below:

Fig. 1: Transforming a distribution: starting from a normal distribution we reach the log-normal distribution ϕ(z) and verify it against a log-normal distribution LogNormal(z). [Image by Author]

Defining Custom Bijector:

In the previous example, we used the exponential bijector, which is already available within the bijector module, but what if we want a custom operation that is not available there? We can define a custom bijector, and the approach is similar to making new layers via subclassing in Keras. Let's write a custom bijector class where the forward mapping function is given by f → (ax)³, where a, x ∈ R. For writing the custom bijector, I've followed the structure of the tfp.bijectors.Power class as described in the GitHub source code, where it is also mentioned that powers that are reciprocals of odd integers are not supported:

Powers that are reciprocal of odd integers like 1. / 3 are not supported because of numerical precision issues that make this property difficult to test. In order to simulate this behavior, we recommend using the Invert bijector instead (i.e. instead of tfb.Power(power=1./3) use tfb.Invert(tfb.Power(power=3.))).

The example below is just for understanding the steps of defining a custom bijector class:
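A minimal sketch of such a subclass is given below; the class name CubicAX, the default value of a and the variable names are my own choices for illustration, not necessarily those of the original gist. The forward and inverse mappings and the forward log-det-Jacobian are implemented as private methods, which the tfb.Bijector base class exposes as forward, inverse and forward_log_det_jacobian:

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

class CubicAX(tfb.Bijector):
    """Custom bijector implementing the forward map f(x) = (a*x)^3."""

    def __init__(self, a=1.0, validate_args=False, name="cubic_ax"):
        super().__init__(
            validate_args=validate_args,
            forward_min_event_ndims=0,
            name=name)
        self.a = tf.cast(a, tf.float32)

    def _forward(self, x):
        # y = (a x)^3
        return (self.a * x) ** 3

    def _inverse(self, y):
        # x = y^(1/3) / a, written as a sign-preserving cube root
        return tf.math.sign(y) * tf.math.pow(tf.math.abs(y), 1. / 3.) / self.a

    def _forward_log_det_jacobian(self, x):
        # dy/dx = 3 a^3 x^2, hence log|dy/dx| = log(3) + 3 log|a| + 2 log|x|
        return (tf.math.log(3.) + 3. * tf.math.log(tf.math.abs(self.a))
                + 2. * tf.math.log(tf.math.abs(x)))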

Example of bijector subclassing where the forward method implements f → (ax)³.

Given this simple forward operation f → (ax)³, one can easily calculate the forward log-det-Jacobian, and it is written as a comment in the code block above. The inverse log-det-Jacobian is implemented automatically once the forward log-det-Jacobian function is given. In case you want to brush up on the definitions again, please check the previous post. Once the class is defined, we are ready to use it, and we can plot the results of the forward and inverse transformations in Python as below:
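A sketch of how the class can be used, assuming the CubicAX bijector defined above (the grid and the value of a are arbitrary choices):

import numpy as np
import matplotlib.pyplot as plt

cubic = CubicAX(a=1.0)

x = np.linspace(-2., 2., 200).astype(np.float32)

y_fwd = cubic.forward(x)   # (a x)^3
y_inv = cubic.inverse(x)   # sign(x) |x|^(1/3) / a

plt.plot(x, y_fwd, label="forward")
plt.plot(x, y_inv, label="inverse")
plt.legend()
plt.show()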

Fig. 2: Given a function f → (ax)³, we plot the results of forward and inverse transformation. [Image by Author]

Similarly, one can plot the forward and inverse log-det-Jacobians, which look as below:
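Continuing from the previous snippet, the log-det-Jacobians can be evaluated in a similar way; the grid below avoids x = 0, where log|x| diverges:

x_pos = np.linspace(0.05, 2., 200).astype(np.float32)

# event_ndims=0 because the bijector acts on scalar events
ldj_fwd = cubic.forward_log_det_jacobian(x_pos, event_ndims=0)
ldj_inv = cubic.inverse_log_det_jacobian(x_pos, event_ndims=0)

plt.plot(x_pos, ldj_fwd, label="forward log-det-Jacobian")
plt.plot(x_pos, ldj_inv, label="inverse log-det-Jacobian")
plt.legend()
plt.show()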

Fig. 3: Given a function f → (ax)³, we plot the results of the forward and inverse log of the determinant of the Jacobian. [Image by Author]

Normal to Bi-Modal Distribution:

Apart from the Normal to Log-Normal transformation, another basic transformation that will be important later on is Normal to Bi-Modal. As the name suggests, bi-modal refers to a distribution that has two peaks. Starting from a Normal distribution, we can apply the Soft-Sign bijection to reach a bi-modal distribution, where the soft-sign function is defined as below:

def softsign(x):
    return x / (1 + abs(x))

However, this bijector is already available in TensorFlow Probability, so we can apply TransformedDistribution as before; let's see:

tf.random.set_seed(1234)

# starting base distribution
normal_dist = tfd.Normal(loc=0., scale=1.0)

sample_s = 500
normal_dist_sample = normal_dist.sample(sample_s, seed=10)

# bijector
softsign = tfb.Softsign()
# tf.random.set_seed(1234)
bi_modal = tfd.TransformedDistribution(normal_dist, softsign)
bi_modal_sample = bi_modal.sample(sample_s, seed=10)

We can plot the samples from the transformed distribution as below:
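A minimal plotting sketch, assuming matplotlib and the samples generated above (the number of bins is an arbitrary choice):

import matplotlib.pyplot as plt

plt.hist(normal_dist_sample.numpy(), bins=50, density=True, alpha=0.5,
         label="Normal (base)")
plt.hist(bi_modal_sample.numpy(), bins=50, density=True, alpha=0.5,
         label="Bi-modal (transformed)")
plt.legend()
plt.show()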

Fig. 4: Starting from Normal distribution, we apply soft-sign bijection to reach Bi-Modal distribution. [Image by Author]

We can also visualize the resulting transformed distribution and original distribution as contour plots as below, where the left panel shows the Normal distribution and the right panel shows the transformed distribution.
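One possible way to produce such contour plots is sketched below; here I assume a two-dimensional base distribution built from two independent standard normals (the post does not show the exact construction), transformed element-wise by the same Softsign bijector:

import numpy as np
import matplotlib.pyplot as plt

# two-dimensional base: two independent standard normals
base_2d = tfd.Independent(tfd.Normal(loc=[0., 0.], scale=[1., 1.]),
                          reinterpreted_batch_ndims=1)
bi_modal_2d = tfd.TransformedDistribution(base_2d, tfb.Softsign())

# grids for density evaluation; Softsign maps R onto (-1, 1)
grid_base = np.linspace(-3., 3., 200).astype(np.float32)
B1, B2 = np.meshgrid(grid_base, grid_base)
grid_tr = np.linspace(-0.99, 0.99, 200).astype(np.float32)
T1, T2 = np.meshgrid(grid_tr, grid_tr)

fig, ax = plt.subplots(1, 2, figsize=(10, 4))
ax[0].contour(B1, B2, base_2d.prob(np.stack([B1, B2], axis=-1)).numpy())
ax[1].contour(T1, T2, bi_modal_2d.prob(np.stack([T1, T2], axis=-1)).numpy())
plt.show()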

Fig. 5: Contour plots of base (Normal) and transformed distributions (bi-modal) are shown. [Image by Author]

Instead of 2D contour plots, we can also visualize them as 3D plots, as below:

Fig. 6: Same as fig. 5, but instead of 2D plots, we can also visualize them as 3D plots. [Image by Author]
