“Supporting Ammortized Inference for Normalizing Flows” (Pyro Github issue #1911) #102
DylanMannKrzisnik asked this question in Q&A (unanswered, 0 replies).
Hello!
I am currently using Pyro to implement normalizing flows as a way to enrich the approximate posterior of a variational autoencoder (for example, Rezende & Mohamed, 2015, or Kingma et al., 2016). I reached out for guidance on amortizing flow parameters in a post on the Pyro forum, where I was invited to ask the FlowTorch community for solutions, so here I am!
Here is part of what I had posted in the Pyro forum:
"In the literature of flow-based VAE posteriors and/or priors, the parameters of flow transformations are amortized from the outputs of the encoder network.
As such, what are the best approaches for achieving amortization of flow parameters in Pyro, in the context of VAEs? The following GitHub issue is quite relevant: Supporting Ammortized Inference for Normalizing Flows · Issue #1911 · pyro-ppl/pyro · GitHub
For IAF-style amortization via a context variable, perhaps the use of conditional flows could be a solution. In that case, one could add a linear layer that maps the encoder outputs to such a context variable, and use that variable to condition the flow transformations. However, in other amortization strategies, such as the one used in the original Sylvester flows paper (van den Berg et al., 2018), the flow parameters are themselves emitted directly by the encoder network, bypassing the need for a context variable. One could implement this behaviour manually, but before doing so, I was wondering whether a similar mechanism was already implemented in Pyro?"
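To make the "direct amortization" strategy concrete, here is a minimal pure-Python sketch (no Pyro or PyTorch, so it is not the actual Pyro API): a toy `encoder` emits the parameters of a single affine flow step per latent dimension, and the flow applies them while tracking the log-absolute-determinant of the Jacobian. The function names and the linear "encoder" are hypothetical stand-ins for a real neural network; the point is only that the encoder output itself parameterizes the transformation, with no separate context variable.

```python
import math

def encoder(x):
    # Hypothetical toy "encoder": maps an input to (mu, log_sigma) pairs
    # that parameterize one affine flow step per latent dimension.
    # In a real VAE these would be produced by a neural network.
    mu = [0.1 * xi for xi in x]
    log_sigma = [0.05 * xi for xi in x]
    return mu, log_sigma

def affine_flow(z, mu, log_sigma):
    # One amortized affine transformation: z' = mu + exp(log_sigma) * z.
    # Its Jacobian is diagonal with entries exp(log_sigma), so the
    # log-abs-determinant is simply sum(log_sigma).
    z_new = [m + math.exp(ls) * zi for m, ls, zi in zip(mu, log_sigma, z)]
    log_det = sum(log_sigma)
    return z_new, log_det

# Direct amortization (Sylvester-flow style): the encoder itself emits
# the flow parameters for this datapoint.
x = [1.0, -2.0, 0.5]   # observed input
z = [0.3, 0.7, -0.1]   # base sample from the initial posterior
mu, log_sigma = encoder(x)
z_k, log_det = affine_flow(z, mu, log_sigma)
```

The IAF-style alternative discussed above would instead have the encoder emit a fixed-size context vector that is fed as a conditioning input to the flow's own networks; in Pyro terms that corresponds to conditional transforms, whereas the sketch here hard-wires the encoder output into the transformation parameters.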
Although I am very much looking forward to getting accustomed to FlowTorch, I'd prefer to stick with Pyro for the time being. That said, it was explained to me in the Pyro forum that FlowTorch can be used within Pyro, so perhaps now is a good time to jump in.
Thank you, looking forward to further developments with FlowTorch!