@Lexie88rus
Created June 27, 2019 08:42
SiLU demo
# use SiLU with a model created with nn.Sequential
from collections import OrderedDict

import torch
import torch.nn as nn

# minimal SiLU (swish) implementation: silu(x) = x * sigmoid(x)
class SiLU(nn.Module):
    def forward(self, x):
        return x * torch.sigmoid(x)

# initialize the activation function once; SiLU is stateless,
# so the same instance can safely be reused in several layers
activation_function = SiLU()

# initialize the model using nn.Sequential
model = nn.Sequential(OrderedDict([
    ('fc1', nn.Linear(784, 256)),
    ('activation1', activation_function),  # use SiLU
    ('fc2', nn.Linear(256, 128)),
    ('bn2', nn.BatchNorm1d(num_features=128)),
    ('activation2', activation_function),  # use SiLU
    ('dropout', nn.Dropout(0.3)),
    ('fc3', nn.Linear(128, 64)),
    ('bn3', nn.BatchNorm1d(num_features=64)),
    ('activation3', activation_function),  # use SiLU
    ('logits', nn.Linear(64, 10)),
    ('logsoftmax', nn.LogSoftmax(dim=1))
]))

# run training (train_model is assumed to be defined elsewhere)
train_model(model)
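For reference, SiLU (also known as swish) computes silu(x) = x * sigmoid(x), where sigmoid is the logistic function 1 / (1 + e^(-x)). A minimal pure-Python sketch of the formula, independent of PyTorch, to illustrate its behavior:

```python
import math

def sigmoid(x):
    # logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def silu(x):
    # SiLU / swish: x * sigmoid(x)
    return x * sigmoid(x)

print(silu(0.0))  # → 0.0
```

Unlike ReLU, SiLU is smooth everywhere and non-monotonic: it passes small negative values through (slightly attenuated) rather than zeroing them, approaches the identity for large positive inputs, and tends to 0 for large negative inputs.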