Alessia Marcolini (alessiamarcolini)
# ! pip install slackclient
# get your token: https://api.slack.com/custom-integrations/legacy-tokens
# composing messages: https://api.slack.com/messaging/composing
from slack import WebClient

sc = WebClient(token='<your_token>')  # synchronous client: the call below sends immediately
sc.chat_postMessage(
    channel='#tasks-notifications',
    text='<your_message>',
    icon_emoji=':smile:',
    username='<sender_name>',  # username is the "sender" name shown in Slack
)
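The original snippet passed run_async=True; in slackclient 2.x that makes every API call return an awaitable, so a fire-and-forget call would silently never send. A minimal async sketch under that assumption, reusing the same placeholder token and channel:

import asyncio
from slack import WebClient

async def notify():
    sc = WebClient(token='<your_token>', run_async=True)
    # with run_async=True, chat_postMessage returns an awaitable that must be awaited
    await sc.chat_postMessage(channel='#tasks-notifications', text='<your_message>')

asyncio.run(notify())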

PyCon Italy X Feedback forms

Please check the direct links to the feedback form for each talk!

Rate the talks and give us your feedback to help us run a better conference and to help the speakers improve.

Ravioli vs Pelmeni software architecture (microservices vs services) by Anastasiia Tymoshchuk: https://python.it/feedback-1515

TemPy! An alternative to traditional templating using only Python by Federico Cerchiari:

alessiamarcolini / snapshot-ensembles.py
Created June 7, 2018 16:50
Snapshot Ensembles - Keras
import os

def get_callbacks(self, model_prefix='Model'):
    """
    Creates a list of callbacks that can be used during training to create a
    snapshot ensemble of the model.

    Args:
        model_prefix: prefix for the filename of the weights.

    Returns: list of 3 callbacks [ModelCheckpoint, LearningRateScheduler,
        SnapshotModelCheckpoint] which can be provided to the 'fit' function
    """
    # make sure the snapshot weights directory exists before training starts
    if not os.path.exists('weights/'):
        os.makedirs('weights/')
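Per the docstring, the three callbacks go straight into Keras training. A hypothetical usage sketch, assuming `snapshot` is an instance of the helper class that defines get_callbacks and that `model`, `x_train`, `y_train` already exist:

# hypothetical usage; `snapshot`, `model`, `x_train`, `y_train` are assumed
model.fit(x_train, y_train,
          epochs=200,
          batch_size=128,
          callbacks=snapshot.get_callbacks(model_prefix='MyModel'))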
alessiamarcolini / clr.py
Created June 7, 2018 16:04
Cyclical Learning Rates - Keras
import numpy as np

def clr(self):
    # triangular policy: `cycle` is the index of the current cycle and `x`
    # is the distance from the cycle's peak (x = 0 at max_lr, x = 1 at base_lr)
    cycle = np.floor(1 + self.clr_iterations / (2 * self.step_size))
    x = np.abs(self.clr_iterations / self.step_size - 2 * cycle + 1)
    if self.scale_mode == 'cycle':
        return self.base_lr + (self.max_lr - self.base_lr) * \
            np.maximum(0, (1 - x)) * self.scale_fn(cycle)
    else:
        return self.base_lr + (self.max_lr - self.base_lr) * \
            np.maximum(0, (1 - x)) * self.scale_fn(self.clr_iterations)
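A hedged usage sketch, assuming this method belongs to a CyclicLR Keras callback in the style of bckenstler/CLR, with `model`, `x_train`, `y_train` assumed to exist:

# hypothetical usage of the surrounding CyclicLR callback class
# step_size = iterations per half-cycle; the LR sweeps
# base_lr -> max_lr -> base_lr over 2 * step_size iterations
clr_callback = CyclicLR(base_lr=0.001, max_lr=0.006,
                        step_size=2000., mode='triangular')
model.fit(x_train, y_train, epochs=50, callbacks=[clr_callback])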
alessiamarcolini / drop-based_LR_schedule.py
Created May 7, 2018 21:20
Drop Based Learning Rate Schedule - Keras
import math
from keras.callbacks import LearningRateScheduler

def step_decay(epoch):
    # halve the learning rate every `epochs_drop` epochs
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

# ...
lrate = LearningRateScheduler(step_decay)
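With these constants the rate stays at 0.1 for epochs 0-8, drops to 0.05 at epoch 9, 0.025 at epoch 19, and so on (the `1 + epoch` shifts each drop one epoch earlier). A hypothetical training call, with `model`, `x_train`, `y_train` assumed:

# hypothetical training call; the scheduler callback applies step_decay each epoch
model.fit(x_train, y_train, epochs=50, batch_size=32, callbacks=[lrate])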
alessiamarcolini / time-based_LR_schedule.py
Created May 7, 2018 20:53
Time Based Learning Rate Schedule - Keras
from keras.optimizers import SGD

# Compile model with time-based learning rate decay
epochs = 50
learning_rate = 0.1
decay_rate = learning_rate / epochs
momentum = 0.8
sgd = SGD(lr=learning_rate, momentum=momentum, decay=decay_rate, nesterov=False)
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])
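For reference, Keras 2's SGD applies the decay once per batch update as lr_t = lr / (1 + decay * iterations). A small illustrative sketch of the resulting schedule, reusing the constants above (this mirrors the decay rule; it is not the optimizer itself):

# effective learning rate after a given number of batch updates
def effective_lr(iterations):
    return learning_rate / (1 + decay_rate * iterations)

print(effective_lr(0))    # 0.1 at the start of training
print(effective_lr(500))  # 0.05 after 500 batch updates (decay_rate = 0.002)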