Tushar Dhyani Thanatoz-1

@Thanatoz-1
Thanatoz-1 / What are Hyper-Parameter?
Created July 10, 2018 19:13
Deep learning fundamentals : Hyperparameters
Hyperparameters are the 'knobs' you turn to tune a deep learning model.
They are model-specific settings that you fix before training begins and then adjust to achieve the required fit to your data.
A few common hyperparameters are:
1. Learning rate
2. Decay rate
3. Number of hidden layers
4. Dropout
5. Activation function
6. Momentum
7. Batch size, and many others
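As a concrete illustration (a minimal sketch, not part of the gist), the first two knobs above appear directly in a plain-Python SGD-with-momentum update on a toy one-weight objective:

```python
# Hyperparameters are fixed before training and control how learning proceeds.
learning_rate = 0.1   # step size of each weight update
momentum = 0.9        # fraction of the previous update carried forward
epochs = 200          # number of passes over the data

# Toy objective: minimise f(w) = (w - 3)^2 with SGD + momentum.
def grad(w):
    return 2 * (w - 3)

w, velocity = 0.0, 0.0
for _ in range(epochs):
    velocity = momentum * velocity - learning_rate * grad(w)
    w += velocity

# w now sits close to the minimum at w = 3; changing the knobs above
# changes how fast (or whether) it gets there.
```

Turning `learning_rate` up too far makes the iterates diverge, while turning it down slows convergence, which is exactly the tuning trade-off the list describes.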
Thanatoz-1 / Loading_animation.py
Created August 1, 2018 13:19
Code for creating a loading animation in a Python script
import itertools
import threading
import time
import sys

done = False

# here is the animation
def animate():
    for c in itertools.cycle(['|', '/', '-', '\\']):
        if done:
            break
        sys.stdout.write('\rloading ' + c)
        sys.stdout.flush()
        time.sleep(0.1)
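The snippet is cut off here; a typical way to drive such a spinner (an assumption, not shown in the gist) is to run it on a background thread while the main thread does the real work, then flip the flag:

```python
import itertools
import sys
import threading
import time

done = False

def animate():
    # cycle through the spinner frames until the main thread flips `done`
    for c in itertools.cycle(['|', '/', '-', '\\']):
        if done:
            break
        sys.stdout.write('\rloading ' + c)
        sys.stdout.flush()
        time.sleep(0.1)

t = threading.Thread(target=animate, daemon=True)
t.start()
time.sleep(0.5)  # stand-in for the long-running work
done = True
t.join()
```

A bare boolean flag is safe enough here because only the main thread writes it and the worker only reads it.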
Thanatoz-1 / create_model_svg.py
Created September 1, 2018 21:21
Keras creating model SVG
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
from keras.utils import plot_model
from kt_utils import *  # course helpers that define happyModel

# save the architecture diagram to a PNG file
plot_model(happyModel, to_file='HappyModel.png')
# render the same model inline as an SVG
SVG(model_to_dot(happyModel).create(prog='dot', format='svg'))
Thanatoz-1 / Pipeline-guide.md
Created December 3, 2018 10:18 — forked from ecgill/Pipeline-guide.md
Quick tutorial on Sklearn's Pipeline constructor for machine learning

If You've Never Used Sklearn's Pipeline Constructor...You're Doing It Wrong

How To Use sklearn Pipelines, FeatureUnions, and GridSearchCV With Your Own Transformers

By Emily Gill and Amber Rivera

What's a Pipeline and Why Use One?

The Pipeline constructor from sklearn allows you to chain transformers and estimators together into a sequence that functions as one cohesive unit. For example, if your model involves feature selection, standardization, and then regression, those three steps, each as its own class, could be encapsulated together via Pipeline.

Benefits: readability, reusability and easier experimentation.
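A minimal sketch of that three-step chain, using built-in sklearn transformers rather than the custom ones the article goes on to describe:

```python
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import Ridge
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)

# feature selection -> standardization -> regression, chained as one unit
pipe = Pipeline([
    ('select', SelectKBest(f_regression, k=5)),
    ('scale', StandardScaler()),
    ('ridge', Ridge()),
])
pipe.fit(X, y)
```

Because the whole chain is one estimator, `pipe` can be passed straight to cross-validation or GridSearchCV without leaking preprocessing across folds.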
Thanatoz-1 / create_test_db.py
Created December 28, 2018 22:13 — forked from sprin/create_test_db.py
A demo of creating a new database via SQLAlchemy. This module takes the form of a nosetest with three steps: set up the new database, create a table in the new database, and tear down the new database.
"""
A demo of creating a new database via SQL Alchemy.
Under MIT License from sprin (https://gist.github.com/sprin/5846464/)
This module takes the form of a nosetest with three steps:
- Set up the new database.
- Create a table in the new database.
- Teardown the new database.
"""
Thanatoz-1 / script.py
Created January 11, 2019 18:04
Code for running TensorBoard on Google Colab
# !!! RUN THIS CELL ONLY ON GOOGLE COLAB !!!
! wget https://raw.githubusercontent.com/hse-aml/intro-to-dl/master/setup_google_colab.py -O setup_google_colab.py
import setup_google_colab
# run tensorboard in background
import os
os.system("tensorboard --logdir=./logs --host 0.0.0.0 --port 6006 &")
# expose port and show the link
setup_google_colab.expose_port_on_colab(6006)
Thanatoz-1 / Fast.ai install script
Last active January 19, 2019 02:32 — forked from gilrosenthal/Fast.ai install script
Fast.ai Install on Google Colab
!pip install -q "fastai==0.7.0" Pillow==4.1.1 torchtext==0.2.3
!apt-get -qq install -y libsm6 libxext6 && pip install -q -U opencv-python
from wheel.pep425tags import get_abbr_impl, get_impl_ver, get_abi_tag
platform = '{}{}-{}'.format(get_abbr_impl(), get_impl_ver(), get_abi_tag())
# !apt update -q
!apt install -y libsm6 libxext6
from os import path
accelerator = 'cu80' if path.exists('/opt/bin/nvidia-smi') else 'cpu'
Thanatoz-1 / perceptron_dataset.csv
Created May 18, 2020 21:20
Contains sample data for training a perceptron
0.78051 -0.063669 1
0.28774 0.29139 1
0.40714 0.17878 1
0.2923 0.4217 1
0.50922 0.35256 1
0.27785 0.10802 1
0.27527 0.33223 1
0.43999 0.31245 1
0.33557 0.42984 1
0.23448 0.24986 1
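Each row above is x1, x2, and a class label, and all ten shown rows are positive. As a minimal sketch (not part of the gist; the negative point below is hypothetical, added so the update rule has something to correct), the classic perceptron rule on a few of these rows:

```python
# Rows: (x1, x2, label). First three are from the sample above;
# the last is a made-up negative-class point for illustration.
data = [
    (0.78051, -0.063669, 1),
    (0.28774, 0.29139, 1),
    (0.40714, 0.17878, 1),
    (-0.5, -0.5, 0),
]

# Perceptron rule: nudge weights only when a point is misclassified.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(20):
    for x1, x2, label in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0
        err = label - pred
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err
```

Because the four points are linearly separable, the loop stops making updates once the line splits the classes.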
sepal length (cm) sepal width (cm) petal length (cm) petal width (cm) target
5.1 3.5 1.4 0.2 Iris-setosa
4.9 3.0 1.4 0.2 Iris-setosa
4.7 3.2 1.3 0.2 Iris-setosa
4.6 3.1 1.5 0.2 Iris-setosa
5.0 3.6 1.4 0.2 Iris-setosa
5.4 3.9 1.7 0.4 Iris-setosa
4.6 3.4 1.4 0.3 Iris-setosa
5.0 3.4 1.5 0.2 Iris-setosa
4.4 2.9 1.4 0.2 Iris-setosa
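These rows are from the classic Iris dataset; they can be parsed into features and labels with plain Python (a sketch, assuming whitespace-separated values as shown):

```python
rows = """
5.1 3.5 1.4 0.2 Iris-setosa
4.9 3.0 1.4 0.2 Iris-setosa
4.7 3.2 1.3 0.2 Iris-setosa
""".strip().splitlines()

features, labels = [], []
for line in rows:
    parts = line.split()
    features.append([float(v) for v in parts[:4]])  # the four measurements
    labels.append(parts[4])                          # the species name
```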