Created February 27, 2019 00:22
@AnirudhDagar, unless things have changed in the Torch (and in particular the CUDA) community since last year when I checked, BatchNorm does not support masking. Unfortunately, there are many numerical gotchas, and BatchNorm has highly optimized low-level implementations, so it doesn't seem feasible to just write it using the basic PyTorch Python API (I've tried; it was slow).
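For reference, the computation being described can be sketched in plain NumPy (a hypothetical sketch of masked batch normalization, not the optimized low-level implementation; the function name and mask convention are my own):

```python
import numpy as np

def masked_batch_norm(x, mask, eps=1e-5):
    """Normalize x using statistics computed only over valid positions.

    x:    (batch, time, features) float array
    mask: (batch, time) boolean array, True = valid (non-padded) timestep
    Padded positions are excluded from the statistics and zeroed out.
    """
    m = mask[..., None].astype(x.dtype)      # (batch, time, 1), broadcastable
    n = m.sum()                              # number of valid positions
    mean = (x * m).sum(axis=(0, 1)) / n      # per-feature mean over valid data
    var = (((x - mean) * m) ** 2).sum(axis=(0, 1)) / n
    return (x - mean) / np.sqrt(var + eps) * m

x = np.random.randn(2, 4, 3)
mask = np.array([[True, True, True, False],
                 [True, True, False, False]])
y = masked_batch_norm(x, mask)  # padded slots come out as exact zeros
```

This is the per-element version of what the optimized kernels would need to do; doing it with elementwise ops like this is exactly the slow path mentioned above.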
Thanks for the explanation :)
This is old, but I found this post from Reddit, and this is just to say that Keras BatchNormalization does support masking now:
https://keras.io/api/layers/normalization_layers/batch_normalization/
Hi, how can I do something similar in PyTorch? I have input to my BatchNorm1d layer of shape (8, 1630, 50, 171), where batch size = 8 and 1630 is the dimension along which I have padding. For example, along that dim the data looks like [0, 1, 2, ..., 1112, 0, 0, 0, ..., 0], i.e. after index 1112 it is padded with zeros to length 1630, and similarly for every element in the batch. I also have a corresponding mask.