@ericjang
Created March 2, 2018 23:43
Numerically stable implementation of log-variance, in TensorFlow.
from tensorflow.python.ops import math_ops


def _reduce_logmeanexp(x, axis, epsilon):
  """Numerically-stable (?) implementation of log-mean-exp.

  Args:
    x: The tensor to reduce. Should have numeric type.
    axis: The dimensions to reduce. If `None` (the default),
      reduces all dimensions. Must be in the range
      `[-rank(input_tensor), rank(input_tensor))`.
    epsilon: Floating point scalar to avoid log-underflow.

  Returns:
    log_mean_exp: A `Tensor` representing `log(Avg{exp(x): x})`.
  """
  x_max = math_ops.reduce_max(x, axis=axis, keepdims=True)
  return math_ops.log(
      math_ops.reduce_mean(
          math_ops.exp(x - x_max), axis=axis, keepdims=True) + epsilon) + x_max
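To see why subtracting the max matters, here is a hypothetical NumPy sketch (not part of the gist) comparing a naive log-mean-exp against the shifted version on large inputs:

```python
import numpy as np

def reduce_logmeanexp(x, axis=None, epsilon=0.0):
    # Shift by the max so exp() only sees values <= 0 and cannot overflow;
    # the max is added back outside the log.
    x_max = np.max(x, axis=axis, keepdims=True)
    return np.log(
        np.mean(np.exp(x - x_max), axis=axis, keepdims=True) + epsilon) + x_max

x = np.array([1000.0, 1001.0, 1002.0])
with np.errstate(over="ignore"):
    naive = np.log(np.mean(np.exp(x)))   # exp(1000) overflows to inf
stable = float(reduce_logmeanexp(x, axis=0))
print(naive)   # inf
print(stable)  # ~1001.309
```

The naive version overflows because `exp(1000)` is not representable in float64; the shifted version exponentiates only `[-2, -1, 0]` and recovers the exact answer.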
def _log_variance(x, axis, epsilon):
  """Numerically-stable (?) implementation of log-variance.

  Derived from the following algebra:

  ```
  Var(y)     = reduce_mean((y - mean)^2)
  log Var(y) = log reduce_mean((y - mean)^2)
             = reduce_logmeanexp(log (y - mean)^2)
             = reduce_logmeanexp(2 log abs(y - mean))
  ```

  The `epsilon` passed through to `_reduce_logmeanexp` guards against
  log-underflow when the mean of the exponentiated terms is close to zero.

  Args:
    x: The tensor to reduce. Should have numeric type.
    axis: The dimensions to reduce. If `None` (the default),
      reduces all dimensions. Must be in the range
      `[-rank(input_tensor), rank(input_tensor))`.
    epsilon: Floating point scalar to avoid log-underflow.

  Returns:
    The reduced tensor.
  """
  mean = math_ops.reduce_mean(x, axis=axis, keepdims=True)
  log_square_diff = 2 * math_ops.log(math_ops.abs(x - mean))
  log_variance = _reduce_logmeanexp(log_square_diff, axis, epsilon)
  return log_variance
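The same algebra can be checked with a hypothetical NumPy sketch (not part of the gist): on well-scaled data the log-space computation matches `log(var(x))` directly, and when the squared deviations underflow, only the log-space version stays finite:

```python
import numpy as np

def log_variance(x, axis=None, epsilon=0.0):
    # Same derivation as the TF version: reduce 2*log|y - mean|
    # with a max-shifted log-mean-exp.
    mean = np.mean(x, axis=axis, keepdims=True)
    log_sq = 2.0 * np.log(np.abs(x - mean))
    m = np.max(log_sq, axis=axis, keepdims=True)
    return np.log(
        np.mean(np.exp(log_sq - m), axis=axis, keepdims=True) + epsilon) + m

rng = np.random.default_rng(0)
z = rng.normal(size=1000)

# On well-scaled data it agrees with log(var(x)).
assert np.allclose(log_variance(z), np.log(np.var(z)))

# Scale down until every squared deviation (~1e-340) underflows to zero:
# the naive log-variance is -inf, the log-space one stays finite.
tiny = 1e-170 * z
with np.errstate(divide="ignore"):
    naive = np.log(np.var(tiny))
print(naive)                      # -inf
print(float(log_variance(tiny)))  # finite
```

Since `Var(c*z) = c^2 * Var(z)`, the finite result should equal `2*log(c) + log(Var(z))`, which only the log-space path can represent once `c^2 * Var(z)` falls below the smallest positive float64.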