@Miladiouss
Miladiouss / Select_CIFAR10_Classes.py
Last active October 27, 2023 03:41
Create PyTorch datasets and dataset loaders for a subset of CIFAR10 classes.
import torchvision
import torchvision.transforms as transforms
from torchvision.datasets import CIFAR10
from torch.utils.data import Dataset, DataLoader
import numpy as np
# Transformations
RC = transforms.RandomCrop(32, padding=4)
RHF = transforms.RandomHorizontalFlip()
RVF = transforms.RandomVerticalFlip()
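The preview cuts off before the subset selection itself; a minimal sketch of one way to build loaders for a few CIFAR10 classes (the class indices and batch size below are illustrative assumptions, not the gist's values):

import torchvision.transforms as transforms
from torchvision.datasets import CIFAR10
from torch.utils.data import DataLoader, Subset

# Compose the transformations shown above into a training pipeline
transform = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

trainset = CIFAR10(root='./data', train=True, download=True, transform=transform)

# Assumed example subset: cat (3) and dog (5); older torchvision versions
# expose trainset.train_labels instead of trainset.targets
keep_classes = [3, 5]
keep_idx = [i for i, label in enumerate(trainset.targets) if label in keep_classes]
train_loader = DataLoader(Subset(trainset, keep_idx), batch_size=64, shuffle=True)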
@fpreiswerk
fpreiswerk / entropy_comparison.py
Created January 4, 2018 10:10
Code for post on Medium titled "Shannon entropy in the context of machine learning and AI"
import matplotlib.pyplot as plt
import matplotlib.mlab as mlab
import numpy as np
import math
# disable divide by zero warnings
import warnings
warnings.filterwarnings("ignore")
SMALL_SIZE = 14
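The preview stops at the plotting setup; for reference, a small self-contained sketch of the quantity the post is about, the Shannon entropy of a discrete distribution (the example distributions are assumptions, not from the gist):

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i * log2(p_i), with 0 * log(0) treated as 0
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable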
@tomokishii
tomokishii / fashion_mnist_theano.py
Created October 2, 2017 05:11
fashion_mnist_theano.py - another MNIST-style (Fashion-MNIST) classification example using Theano. (Goodbye Theano!)
#
# fashion_mnist_theano.py
# date. 10/2/2017
#
# REM: I read the article announcing that development of "THEANO" is stopping.
# The deep learning framework inspired me to write this code.
# I'd like to say thank you to the Theano support team.
#
import os
@wassname
wassname / jaccard_coef_loss.py
Last active January 30, 2024 15:45
jaccard_coef_loss for keras. This loss is useful when you have unbalanced classes within a sample, such as segmenting each pixel of an image. For example, you are trying to predict whether each pixel is cat, dog, or background. You may have 80% background, 10% dog, and 10% cat. Should a model that predicts 100% background be 80% right, or 30%? Categor…
from keras import backend as K
def jaccard_distance_loss(y_true, y_pred, smooth=100):
"""
Jaccard = (|X & Y|)/ (|X|+ |Y| - |X & Y|)
= sum(|A*B|)/(sum(|A|)+sum(|B|)-sum(|A*B|))
The Jaccard distance loss is useful for unbalanced datasets. It has been
shifted so it converges on 0 and is smoothed to avoid exploding or disappearing
gradients.
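The preview truncates inside the docstring; a plausible completion that follows the formula above (a sketch, not necessarily the gist's exact body):

from keras import backend as K

def jaccard_distance_loss(y_true, y_pred, smooth=100):
    intersection = K.sum(K.abs(y_true * y_pred), axis=-1)
    sum_ = K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1)
    jac = (intersection + smooth) / (sum_ - intersection + smooth)
    # shifted so the loss converges toward 0; smooth keeps gradients bounded
    return (1 - jac) * smooth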
@mjdietzx
mjdietzx / residual_block.py
Last active September 18, 2021 11:21
Clean and simple Keras implementation of the residual block (non-bottleneck) accompanying Deep Residual Learning: https://blog.waya.ai/deep-residual-learning-9610bb62c355.
from keras import layers
def residual_block(y, nb_channels, _strides=(1, 1), _project_shortcut=False):
shortcut = y
# down-sampling is performed with a stride of 2
y = layers.Conv2D(nb_channels, kernel_size=(3, 3), strides=_strides, padding='same')(y)
y = layers.BatchNormalization()(y)
y = layers.LeakyReLU()(y)
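The preview shows only the first convolution; a sketch of the usual full block shape (second conv, projected shortcut when the shape changes, residual add), which may differ in detail from the gist:

from keras import layers

def residual_block_sketch(y, nb_channels, _strides=(1, 1), _project_shortcut=False):
    shortcut = y
    y = layers.Conv2D(nb_channels, kernel_size=(3, 3), strides=_strides, padding='same')(y)
    y = layers.BatchNormalization()(y)
    y = layers.LeakyReLU()(y)
    y = layers.Conv2D(nb_channels, kernel_size=(3, 3), strides=(1, 1), padding='same')(y)
    y = layers.BatchNormalization()(y)
    # project the shortcut with a 1x1 conv when channels or resolution change
    if _project_shortcut or _strides != (1, 1):
        shortcut = layers.Conv2D(nb_channels, kernel_size=(1, 1), strides=_strides, padding='same')(shortcut)
        shortcut = layers.BatchNormalization()(shortcut)
    y = layers.add([shortcut, y])
    y = layers.LeakyReLU()(y)
    return y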
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split  # sklearn.cross_validation was removed in newer scikit-learn
n_feature = 2
n_class = 2
def make_network(n_hidden=100):
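The preview stops at the function signature; a plausible body that initializes a one-hidden-layer network for the two-feature, two-class setup above (a sketch, not the gist's actual code):

def make_network(n_hidden=100):
    # weight matrices and biases for input -> hidden -> output,
    # sized by the n_feature / n_class constants above
    model = dict(
        W1=np.random.randn(n_feature, n_hidden),
        b1=np.zeros((1, n_hidden)),
        W2=np.random.randn(n_hidden, n_class),
        b2=np.zeros((1, n_class)),
    )
    return model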
@christopherlovell
christopherlovell / display.py
Last active November 18, 2023 22:22
display youtube video in jupyter notebook
from IPython.display import HTML
# Youtube
HTML('<iframe width="560" height="315" src="https://www.youtube.com/embed/S_f2qV2_U00?rel=0&amp;controls=0&amp;showinfo=0" frameborder="0" allowfullscreen></iframe>')
# Vimeo
HTML('<iframe src="https://player.vimeo.com/video/26763844?title=0&byline=0&portrait=0" width="700" height="394" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe><p><a href="https://vimeo.com/26763844">BAXTER DURY - CLAIRE (Dir Cut)</a> from <a href="https://vimeo.com/dannysangra">Danny Sangra</a> on <a href="https://vimeo.com">Vimeo</a>.</p>')
@ottokart
ottokart / nn.py
Last active August 27, 2021 05:52
3-layer neural network example with dropout in 2nd layer
# Tiny example of a 3-layer neural network with dropout in the 2nd hidden layer
# Output layer is linear with L2 cost (regression model)
# Hidden layer activation is tanh
import numpy as np
n_epochs = 100
n_samples = 100
n_in = 10
n_hidden = 5
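The preview stops at the hyperparameters; a short sketch of the core idea, continuing from the constants above (the dropout rate and weight scales are assumptions, not the gist's values):

rng = np.random.RandomState(0)
dropout_p = 0.5                          # assumed drop probability

X = rng.randn(n_samples, n_in)
W1 = rng.randn(n_in, n_hidden) * 0.1
W2 = rng.randn(n_hidden, 1) * 0.1

h = np.tanh(X.dot(W1))                   # tanh hidden layer
mask = rng.binomial(1, 1 - dropout_p, size=h.shape)
h = h * mask / (1 - dropout_p)           # inverted dropout: zero and rescale
y_pred = h.dot(W2)                       # linear output layer (regression)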
@curran
curran / README.md
Last active June 10, 2024 12:43
The Iris Dataset

This is the "Iris" dataset. Originally published at UCI Machine Learning Repository: Iris Data Set, this small dataset from 1936 is often used for testing out machine learning algorithms and visualizations (for example, Scatter Plot). Each row of the table represents an iris flower, including its species and dimensions of its botanical parts, sepal and petal, in centimeters.

The HTML page provides the basic code required to load the data and display it on the page (as JSON) using D3.js.

For a more up-to-date code example with React & D3, see [VizHub: Stylized Scatter Plot](https://vizhub.com/curran/3d631093c2334030a6b27fa979bb4a0d?edit=files&file=index.js).

@PurpleBooth
PurpleBooth / README-Template.md
Last active June 25, 2024 09:23
A template to make a good README.md

Project Title

One Paragraph of project description goes here

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.

Prerequisites