@franperezlopez
Created April 12, 2019 23:38
from typing import Optional

import torch.nn as nn

def lin_bn_drop(n_in: int, n_out: int, bn: bool = True, p: float = 0.,
                actn: Optional[nn.Module] = None):
    "Sequence of linear (`n_in`, `n_out`), batchnorm (if `bn`) and dropout (with `p`) layers followed by `actn`."
    layers = [nn.Linear(n_in, n_out)]
    if bn:
        layers.append(nn.BatchNorm1d(n_out))
    if p > 0:
        layers.append(nn.Dropout(p))
    if actn is not None:
        layers.append(actn)
    return layers
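A quick sketch of how the returned layer list might be used: since the helper returns a plain Python list, it can be unpacked into `nn.Sequential` to build a small classifier head. The helper is repeated here so the snippet runs standalone; the layer sizes (128, 64, 10) and batch size are illustrative, not from the original gist.

```python
from typing import Optional

import torch
import torch.nn as nn

def lin_bn_drop(n_in: int, n_out: int, bn: bool = True, p: float = 0.,
                actn: Optional[nn.Module] = None):
    "Linear (`n_in`, `n_out`), then optional batchnorm, dropout and `actn`."
    layers = [nn.Linear(n_in, n_out)]
    if bn:
        layers.append(nn.BatchNorm1d(n_out))
    if p > 0:
        layers.append(nn.Dropout(p))
    if actn is not None:
        layers.append(actn)
    return layers

# Assemble a two-block classifier head; unpack each list into nn.Sequential.
head = nn.Sequential(
    *lin_bn_drop(128, 64, p=0.25, actn=nn.ReLU()),
    *lin_bn_drop(64, 10, bn=False),  # final block: raw logits, no bn/dropout
)

x = torch.randn(32, 128)  # batch of 32 feature vectors
logits = head(x)
print(logits.shape)       # torch.Size([32, 10])
```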