
# poolio/Adversarial variational bayes toy example.ipynb

Last active Oct 5, 2020

### artsobolev commented Jan 28, 2017

What does making `p(x)` "close" to `p(z)` even mean? Those are distributions over different spaces; how can one compare them?

### poolio commented Jan 29, 2017

That's a typo; it should be `p(x)` close to `q(x)`.

### JulienSiems commented Feb 24, 2017

Thanks for the great notebook! I have two questions though:

• I don't understand why you make the output of the generative network a stochastic tensor. In Figure 1 of the paper, the noise eps just gets added/multiplied. Shouldn't a plain tensor output be sufficient?
• Shouldn't the weights of q and p be updated separately? The paper states two different loss terms for them in the algorithm. Or is that possible because of proposition 2?

### poolio commented Mar 7, 2017

• The `StochasticTensor` in the generative model is used to keep track of the sample x ~ p(x|z) and the density p(x|z) so we can evaluate it when computing the log probability of the data given the sampled latent state, z.

• In practice we don't have access to the optimal discriminator `T*` and use the current discriminator `T` as a replacement. `T` does not directly depend on the parameters of p, so d/dp -T(x, z) is 0, and the gradients are identical to the gradients from the separate losses in the paper.
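The role of tracking both the sample and its density can be sketched without TensorFlow. A minimal numpy/scipy version, assuming a hypothetical Gaussian decoder `p(x|z)` with a fixed linear mean (names and parameter values are illustrative, not from the notebook):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy decoder: p(x|z) = N(x; mu(z), sigma^2), with a hypothetical linear mean.
def decoder_mean(z, w=2.0, b=0.5):
    return w * z + b

sigma = 0.1
z = rng.standard_normal()      # latent sample z ~ p(z)
mu = decoder_mean(z)
x = rng.normal(mu, sigma)      # sample x ~ p(x|z)

# Keeping the sample x together with its density lets us evaluate
# log p(x|z) for the reconstruction term of the ELBO, which is what
# the StochasticTensor bookkeeping provides in the notebook.
log_px_given_z = norm.logpdf(x, loc=mu, scale=sigma)
```

This mirrors the point above: the sample alone is not enough; the training objective also needs the log-density of that sample under the decoder.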

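The gradient argument above can be checked numerically in a toy 1-D setting. This is a sketch with hypothetical functions (a made-up fixed discriminator `T` and a Gaussian decoder with a single parameter `theta`), not the notebook's actual model:

```python
import numpy as np

# Hypothetical fixed discriminator T(x, z): no dependence on the decoder
# parameter theta, which is the key point.
def T(x, z):
    return x * z - 0.5 * z ** 2

# Gaussian decoder log-density log p(x|z) with mean theta * z.
def log_p(x, z, theta, sigma=1.0):
    return -0.5 * ((x - theta * z) / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma ** 2)

# Combined loss (negative ELBO estimate): T(x, z) - log p(x|z).
def combined_loss(x, z, theta):
    return T(x, z) - log_p(x, z, theta)

# Separate decoder loss from the paper's algorithm: -log p(x|z) only.
def decoder_loss(x, z, theta):
    return -log_p(x, z, theta)

# Central finite differences w.r.t. theta: the two gradients coincide
# because d/dtheta T(x, z) = 0.
x, z, theta, eps = 0.7, -0.3, 1.2, 1e-6
g_combined = (combined_loss(x, z, theta + eps) - combined_loss(x, z, theta - eps)) / (2 * eps)
g_decoder = (decoder_loss(x, z, theta + eps) - decoder_loss(x, z, theta - eps)) / (2 * eps)
```

Since `T` contributes a constant with respect to `theta`, optimizing the combined loss and optimizing the separate decoder loss give the same update for the generative parameters.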
### JulienSiems commented Mar 8, 2017

Makes sense! Thank you for taking the time to reply!

### clarken92 commented Dec 5, 2018

Thank you for your interesting post. Please forgive me if I ask a dumb question. How can I find the "stochastic_tensor" module in TensorFlow? I'm using version 1.8 installed with pip, but it says there is no such module. Thank you.