esmitt
🐢 As fast as a Pentium I
@esmitt
esmitt / statistic-list.py
Last active Oct 14, 2020
Prints statistics over an array of numbers using the describe function from scipy and numpy's range-of-values function (ptp)
from scipy.stats import describe
import numpy as np

# arr_values is a numpy array
def print_stats(arr_values: np.ndarray) -> None:
    stats = describe(arr_values)
    print(f'min: {stats.minmax[0]:.5f}, max: {stats.minmax[1]:.5f}')
    print(f'mean: {stats.mean:.5f}')
    print(f'standard deviation: {np.std(arr_values):.5f}')
    print(f'variance: {stats.variance:.5f}')
    print(f'range (ptp): {np.ptp(arr_values):.5f}')
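A minimal usage sketch (the sample values below are illustrative, not part of the gist):
values = np.array([4.2, 7.1, 3.3, 9.8, 5.5])
print_stats(values)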
@esmitt
esmitt / cleaning-csharp.bat
Created Jul 26, 2020
A script to clean my C# projects
@echo off
REM Remove files generated by compiler in this directory and all subdirectories.
REM Essential release files are kept.
echo Removing "*.csproj.user" files...
for /f "delims==" %%i in ('dir /b /on /s "%~dp0*.csproj.user"') do del "%%i" /f /q
echo.
echo Removing "*.exe.config" files...
@esmitt
esmitt / plot-ROC.py
Last active Oct 15, 2020
Plotting the ROC curve using matplotlib
import numpy
import matplotlib.pyplot as plt
from sklearn.metrics import roc_auc_score, roc_curve

def plot_roc(name: str, labels: numpy.ndarray, predictions: numpy.ndarray, **kwargs) -> None:
    # false/true positive rates and area under the ROC curve
    fp, tp, _ = roc_curve(labels, predictions)
    auc_roc = roc_auc_score(labels, predictions)
    plt.plot(100*fp, 100*tp, label=name + " (" + str(round(auc_roc, 3)) + ")",
             linewidth=2, **kwargs)
    plt.xlabel('False positives [%]')
    plt.ylabel('True positives [%]')
    plt.title('ROC curve')
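A possible usage sketch (the names y_test and test_predictions are assumptions for illustration):
plot_roc("Test", y_test, test_predictions)
plt.legend(loc='lower right')
plt.show()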
@esmitt
esmitt / precision-example.py
Created Jul 24, 2020
An example of how to compute the precision metric in TensorFlow 2
from tensorflow.keras.metrics import Precision

# accumulate true/false positives over the labels and predictions, then read the metric
precision = Precision()
precision.update_state(y_train, y_train_pred)
print(precision.result().numpy())
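A sketch of how the inputs might be obtained (assuming a trained Keras binary classifier named model; Precision thresholds probabilities at 0.5 by default):
# hypothetical: probabilities predicted by a trained binary classifier
y_train_pred = model.predict(train_ds.batch(batch_size))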
@esmitt
esmitt / plot-confusion-matrix.py
Last active Oct 15, 2020
Plotting a confusion matrix for binary classification using matplotlib
import numpy
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix

# notice the threshold p applied to the predicted probabilities
def plot_cm(labels: numpy.ndarray, predictions: numpy.ndarray, p: float = 0.5) -> None:
    cm = confusion_matrix(labels, predictions > p)
    # you can normalize the confusion matrix here if needed
    plt.figure(figsize=(5, 5))
    sns.heatmap(cm, annot=True, fmt="d")
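A possible usage sketch (y_test and test_predictions are assumed names, not from the gist):
plot_cm(y_test, test_predictions, p=0.5)
plt.show()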
@esmitt
esmitt / evaluate-model.py
Created Jul 24, 2020
Evaluation of a model in TensorFlow
# Evaluate the model on the test data using `evaluate`
print("Evaluate on test data")
score_test = model.evaluate(test_ds.batch(batch_size))
for name, value in zip(model.metrics_names, score_test):
    print(name, ': ', value)
@esmitt
esmitt / plot-metrics.py
Last active Oct 15, 2020
Plotting the metrics using matplotlib
import matplotlib.pyplot as plt
from tensorflow.keras.callbacks import History

def plot_metrics(history: History) -> None:
    metrics = ['loss', 'precision', 'recall', 'auc', 'tp', 'sensitivity']
    for n, metric in enumerate(metrics):
        name = metric.replace("_", " ").capitalize()
        plt.subplot(3, 2, n + 1)  # adjust the grid according to the number of metrics
        plt.plot(history.epoch, history.history[metric], color=colors[0], label='Train')
        plt.plot(history.epoch, history.history['val_' + metric],
                 color=colors[0], linestyle="--", label='Val')
        plt.xlabel('Epoch')
        plt.ylabel(name)
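A usage sketch, assuming history is the object returned by model.fit (see fit-model.py below) and colors is the palette defined in plot-loss.py:
plot_metrics(history)
plt.tight_layout()
plt.show()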
@esmitt
esmitt / plot-loss.py
Last active Oct 15, 2020
Plotting the loss function on a log scale using matplotlib
import matplotlib.pyplot as plt
from matplotlib import rcParams
from tensorflow.keras.callbacks import History

rcParams['figure.figsize'] = (12, 10)
colors = plt.rcParams['axes.prop_cycle'].by_key()['color']

def plot_log_loss(history: History, title_label: str, n: int) -> None:
    # Use a log scale to show the wide range of values.
    plt.semilogy(history.epoch, history.history['loss'],
                 color=colors[n], label='Train ' + title_label)
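A possible usage sketch comparing two training runs (the history names are assumptions for illustration):
plot_log_loss(history_baseline, 'Baseline', 0)
plot_log_loss(history_larger, 'Larger model', 1)
plt.legend()
plt.show()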
@esmitt
esmitt / fit-model.py
Last active Oct 15, 2020
Fitting a model in TensorFlow 2
batch_size = 64
"""
Training the model for 60 epochs using our dataset.
The batch size (64) is the same for the validation data.
Only 1 callback was used, but it could be more, like TensorBoard, ModelCheckpoint, etc.
"""
history = model.fit(train_ds.batch(batch_size=batch_size),
                    epochs=60,
                    validation_data=validation_ds.batch(batch_size=batch_size),
                    # the single callback mentioned above, assumed to be the
                    # early_callback defined in early-stopping.py below
                    callbacks=[early_callback])
@esmitt
esmitt / early-stopping.py
Last active Oct 15, 2020
Early stopping callback in TensorFlow 2
from tensorflow.keras.callbacks import EarlyStopping

"""
This callback will stop the training when there is no improvement in the
validation AUC (monitor='val_auc') for 10 consecutive epochs (patience=10).
"""
early_callback = EarlyStopping(monitor='val_auc',
                               verbose=1,
                               patience=10,
                               mode='max',
                               restore_best_weights=True)