Michael J Clark wassname

# This is a blocklist to stop Samsung smart TVs from sending metadata home.
# Please help collect domains!
# Blocking these may mean the TV no longer receives updates or that other services stop working. Please report any such incident.
abtauthprd.samsungcloudsolution.com
acr0.samsungcloudsolution.com
ad.samsungadhub.com
ads.samsungads.com
amauthprd.samsungcloudsolution.com
api-hub.samsungyosemite.com
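The list above gives bare domains; a small sketch (an assumption about usage: turning them into hosts-file entries, in the same style as the LG list further down this page) to convert it:

# Sketch only: rewrite the bare domains above into hosts-file style entries.
# 0.0.0.0 is an assumption; 127.0.0.1 works the same way.
domains = [
    "abtauthprd.samsungcloudsolution.com",
    "ads.samsungads.com",
    # ... remaining domains from the list above
]
print("\n".join(f"0.0.0.0 {d}" for d in domains))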
@wassname
wassname / linux_x380.md
Last active February 5, 2024 17:43
xubuntu on a Thinkpad Yoga x380

This is a collection of fixes and tweaks I used to get Xubuntu 18.04 LTS working on a Lenovo ThinkPad X380 Yoga laptop.

@wassname
wassname / jaccard_coef_loss.py
Last active January 30, 2024 15:45
jaccard_coef_loss for keras. This loss is useful when you have unbalanced classes within a sample, such as segmenting each pixel of an image. For example, you are trying to predict whether each pixel is cat, dog, or background. You may have 80% background, 10% dog, and 10% cat. Should a model that predicts 100% background be 80% right, or 30%? Categor…
from keras import backend as K
def jaccard_distance_loss(y_true, y_pred, smooth=100):
"""
Jaccard = (|X & Y|)/ (|X|+ |Y| - |X & Y|)
= sum(|A*B|)/(sum(|A|)+sum(|B|)-sum(|A*B|))
The jaccard distance loss is useful for unbalanced datasets. This has been
shifted so it converges on 0 and is smoothed to avoid an exploding or vanishing
gradient.
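    """
    # The gist preview is cut off above; the body below is only a sketch that
    # follows the formula in the docstring (the gist's exact code may differ).
    intersection = K.sum(K.abs(y_true * y_pred), axis=-1)
    sum_ = K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1)
    jac = (intersection + smooth) / (sum_ - intersection + smooth)
    return (1 - jac) * smooth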
@wassname
wassname / pandas_classification_report.py
Last active January 19, 2024 06:28
Scikit Learn Classification Report in a pandas Dataframe (and confusion)
"""
@url: https://gist.github.com/wassname/f3cbdc14f379ba9ec2acfafe5c1db592
"""
import pandas as pd
import sklearn.metrics
import numpy as np
def classification_report(*args, **kwargs):
"""
@wassname
wassname / twohot.md
Last active January 14, 2024 02:09
two-hot encoding notes

What is two-hot encoding?

Description

Two-hot encoding was introduced in 2017 in Marc G. Bellemare et al., "A distributional perspective on reinforcement learning", but the clearest description is in the "DreamerV3" paper by Danijar Hafner et al., where it is used for reward and value distributions.

Two-hot encoding is a generalization of one-hot encoding to continuous values. It produces a vector of length |B| (the number of bins) where all elements are 0 except for the two entries closest to the encoded continuous number, at positions k and k + 1. These two entries sum to 1, with more weight given to the entry that is closer to the encoded number.

Code samples
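A minimal sketch of the encoding described above (not taken from the gist itself; the function name and the assumption of a sorted 1-D array of bin centres are my own):

import numpy as np

def two_hot(x, bins):
    """Encode a scalar x as a two-hot vector over sorted bin centres `bins`."""
    x = float(np.clip(x, bins[0], bins[-1]))
    k = int(np.searchsorted(bins, x, side="right")) - 1
    k = min(max(k, 0), len(bins) - 2)
    # weight on bin k+1 grows as x approaches it; the two weights sum to 1
    upper = (x - bins[k]) / (bins[k + 1] - bins[k])
    out = np.zeros(len(bins))
    out[k], out[k + 1] = 1 - upper, upper
    return out

# e.g. two_hot(0.3, np.linspace(0, 1, 5)) -> approximately [0, 0.8, 0.2, 0, 0]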

@wassname
wassname / lightning_start.py
Last active January 12, 2024 23:40
This is my cheatsheet of current best practices for using pytorch lightning (`lightning_start.py`). It is verbose so that I can delete what is not needed. I mainly log to CSV to keep things simple.
"""
This is a template for starting with pytorch lightning; it includes many extra things because it's easier to delete than to reinvent.
It is written for these versions:
- lightning==2.0.2
- pytorch-optimizer==2.8.0
"""
import torch
import torch.nn as nn
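# The preview is cut off above; what follows is only a sketch of the kind of
# skeleton such a template contains (module and variable names are my own),
# using the lightning 2.x API and a CSVLogger to match "I mainly log to CSV".
import lightning.pytorch as pl
from lightning.pytorch.loggers import CSVLogger

class LitRegressor(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss, prog_bar=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.hparams.lr)

# trainer = pl.Trainer(max_epochs=10, logger=CSVLogger("logs"))
# trainer.fit(LitRegressor(), train_dataloaders=train_dl)  # train_dl: your DataLoader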
@wassname
wassname / torch_scalar.py
Created December 27, 2023 01:06
wrap sklearn scalers for torch
"""
How to wrap a scikit-learn scaler like RobustScaler for pytorch.
"""
import torch
import numpy as np
from einops import rearrange
from sklearn.preprocessing import StandardScaler, RobustScaler
class TorchRobustScaler(RobustScaler):
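    # The preview is cut off above; this body is only a sketch of the idea
    # (the gist's actual code may differ): accept torch tensors, run the
    # sklearn scaler on numpy, and return a tensor on the original device.
    def transform(self, X):
        if isinstance(X, torch.Tensor):
            out = super().transform(X.detach().cpu().numpy())
            return torch.as_tensor(out, dtype=X.dtype, device=X.device)
        return super().transform(X)

    def inverse_transform(self, X):
        if isinstance(X, torch.Tensor):
            out = super().inverse_transform(X.detach().cpu().numpy())
            return torch.as_tensor(out, dtype=X.dtype, device=X.device)
        return super().inverse_transform(X)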
@wassname
wassname / style_df.py
Created December 23, 2023 22:57
How to style dataframes in vscode
"""
In vscode you cannot just display a styled dataframe; you need to render it to HTML explicitly.
- see also https://pandas.pydata.org/docs/user_guide/style.html#Builtin-Styles
"""
import pandas as pd
from IPython.display import display, HTML
df = pd.DataFrame({
"strings": ["Adam", "Mike"],
"ints": [1, 3],
127.0.0.1 us.rdx2.lgtvsdp.com
127.0.0.1 us.info.lgsmartad.com
127.0.0.1 us.ibs.lgappstv.com
127.0.0.1 us.lgtvsdp.com
127.0.0.1 ad.lgappstv.com
127.0.0.1 smartshare.lgtvsdp.com
127.0.0.1 ibis.lgappstv.com
# added after fork
# from https://www.reddit.com/r/pihole/comments/6qmpv6/blacklists_for_lg_webos_tvs/ and others
@wassname
wassname / keras_weighted_categorical_crossentropy.py
Last active December 19, 2023 18:17
Keras weighted categorical_crossentropy (please read the comments for an updated version)
"""
A weighted version of categorical_crossentropy for keras (2.0.6). This lets you apply a weight to unbalanced classes.
@url: https://gist.github.com/wassname/ce364fddfc8a025bfab4348cf5de852d
@author: wassname
"""
from keras import backend as K
def weighted_categorical_crossentropy(weights):
"""
A weighted version of keras.objectives.categorical_crossentropy
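    Variables:
        weights: numpy array of shape (C,) where C is the number of classes.
    """
    # The preview is cut off above; the body below is only a sketch of the
    # usual implementation (see the gist comments for the updated version the
    # description mentions).
    weights = K.variable(weights)

    def loss(y_true, y_pred):
        # scale predictions so the class probabilities of each sample sum to 1
        y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
        # clip to prevent NaNs and Infs
        y_pred = K.clip(y_pred, K.epsilon(), 1 - K.epsilon())
        return -K.sum(y_true * K.log(y_pred) * weights, axis=-1)

    return loss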