@ragulpr
Created February 27, 2019 00:22
@AnirudhDagar

Hi, how can I do something similar in PyTorch? The input to my BatchNorm1d layer has shape (8, 1630, 50, 171), where 8 is the batch size and 1630 is the dimension along which I have padding. Along that dimension the data looks like [0, 1, 2, ..., 1112, 0, 0, 0, ..., 0], i.e. after index 1112 it is zero-padded out to length 1630, and similarly for every example in the batch.
I also have a corresponding mask.

@ragulpr
Author

ragulpr commented Aug 28, 2019

@AnirudhDagar, unless things have changed in the Torch (and in particular the CUDA) community since I last checked a year ago, BatchNorm does not support masking. Unfortunately there are many numerical gotchas, and BatchNorm has highly optimized low-level implementations, so it doesn't seem very feasible to just write it with the basic PyTorch Python API (I've tried; it was slow).
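For reference, a minimal sketch of what such a pure-Python masked batch norm might look like. This is an assumption-laden illustration, not ragulpr's code: it uses a standard (batch, features, seq_len) layout, a 0/1 padding mask, training-mode statistics only (no running averages), and it will be much slower than the fused `nn.BatchNorm1d` kernels.

```python
import torch
import torch.nn as nn

class MaskedBatchNorm1d(nn.Module):
    """Naive masked batch norm sketch: batch statistics are computed
    only over positions where mask == 1 (padding excluded)."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x, mask):
        # x:    (batch, num_features, seq_len)
        # mask: (batch, 1, seq_len), 1.0 for real steps, 0.0 for padding
        mask = mask.to(x.dtype)
        n = mask.sum()  # total number of valid (unmasked) positions
        # Per-feature mean/variance over valid positions only
        mean = (x * mask).sum(dim=(0, 2), keepdim=True) / n
        var = ((x - mean) ** 2 * mask).sum(dim=(0, 2), keepdim=True) / n
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        # Apply the affine transform and re-zero the padded positions
        return (x_hat * self.weight[None, :, None] + self.bias[None, :, None]) * mask
```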

@AnirudhDagar

Thanks for the explanation :)

@drussellmrichie

This is old, but I found this post via Reddit, and just wanted to note that Keras BatchNormalization does support masking now:

https://keras.io/api/layers/normalization_layers/batch_normalization/
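A rough sketch of how that could look, assuming a recent TensorFlow/Keras version in which BatchNormalization accepts a propagated mask and excludes masked timesteps from its statistics; the shapes below are made up for illustration:

```python
import tensorflow as tf

# Zero-padded sequences: (batch, timesteps, features)
inputs = tf.keras.Input(shape=(1630, 171))

# Masking flags timesteps that are all zeros as padding; downstream
# layers that support masking receive the mask automatically.
x = tf.keras.layers.Masking(mask_value=0.0)(inputs)

# In recent Keras versions BatchNormalization takes the propagated mask
# into account, computing batch statistics over unmasked positions only.
x = tf.keras.layers.BatchNormalization()(x)

model = tf.keras.Model(inputs, x)
model.summary()
```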
