
Homo Sapiens
Nov05 /
Created Jul 4, 2019 — forked from korakot/
Use selenium in Colab
# install chromium, its driver, and selenium
!apt install chromium-chromedriver
!cp /usr/lib/chromium-browser/chromedriver /usr/bin
!pip install selenium
# set options to be headless; required in the Colab sandbox
from selenium import webdriver
options = webdriver.ChromeOptions()
options.add_argument('--headless')
options.add_argument('--no-sandbox')
options.add_argument('--disable-dev-shm-usage')
driver = webdriver.Chrome(options=options)
2019-08-02 Check Heroku Logs and
2019-10-20
import pandas as pd
import numpy as np

import matplotlib.pyplot as plt
import seaborn as sns
import pandas_profiling
import plotly
import plotly.graph_objects as go
from sklearn.preprocessing import MinMaxScaler
2019-10-20 custom
!pip install colorlover
# Successfully installed colorlover-0.3.0
import pandas as pd
import numpy as np

import matplotlib.pyplot as plt
2019-10-20 random


# baseline score
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.33, random_state=42)
print("train mean velocity:", y_train.mean())
# predict the training-set mean for every validation row
y_val_pred = [y_train.mean()] * len(y_val)
print('baseline error score:', mean_squared_error(y_val, y_val_pred))
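The mean-prediction baseline above can be reproduced end to end on toy data; the array below is a made-up stand-in for the gist's `X_trainval`/`y_trainval`:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# hypothetical target values standing in for y_trainval
y = np.linspace(0.0, 10.0, 100)
X = y.reshape(-1, 1)

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.33, random_state=42)

# predict the training mean for every validation row
y_val_pred = np.full_like(y_val, y_train.mean())
baseline_mse = mean_squared_error(y_val, y_val_pred)
```

Any real model should beat `baseline_mse`; if it doesn't, it has learned nothing beyond the mean.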
Nov05 /
Created Dec 5, 2019 — forked from emredjan/
Yelp Dataset Challenge JSON to CSV conversion
Loads Yelp JSON files and writes out CSV files.
Does not reinvent the wheel: it uses pandas json_normalize.
Somewhat hacky and memory-hungry, but it works, albeit naively.
Tested with the Yelp JSON files from dataset challenge round 12:
import json
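The json_normalize approach can be sketched on two made-up line-delimited records in the style of the Yelp business file (the field names below are illustrative, not the full Yelp schema):

```python
import json
import pandas as pd

# two made-up line-delimited records in the Yelp business-file style
lines = [
    '{"business_id": "b1", "stars": 4.5, "attributes": {"WiFi": "free"}}',
    '{"business_id": "b2", "stars": 3.0, "attributes": {"WiFi": "no"}}',
]
records = [json.loads(line) for line in lines]

# json_normalize flattens nested dicts into dotted columns
df = pd.json_normalize(records)
csv_text = df.to_csv(index=False)
```

Nested objects like `attributes` become columns such as `attributes.WiFi`, which is why no custom flattening code is needed.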
2020-02-29 python code

Find the elements that appear in both lists.

import time
with open('names_1.txt', 'r') as f:
    names_1 = f.read().split("\n")  # list containing 10000 names
with open('names_2.txt', 'r') as f:
    names_2 = f.read().split("\n")  # list containing 10000 names
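The usual fix for this exercise is a set lookup; a minimal sketch with made-up names standing in for the two 10000-name files:

```python
names_1 = ["Alice", "Bob", "Carol", "Dave"]  # stand-in for names_1.txt
names_2 = ["Eve", "Carol", "Bob", "Frank"]   # stand-in for names_2.txt

# build the set once: membership tests drop from O(n*m) to O(n + m)
names_2_set = set(names_2)
duplicates = [name for name in names_1 if name in names_2_set]
# preserves names_1 order: ["Bob", "Carol"]
```

With two 10000-name lists this replaces 100 million comparisons with roughly 20000 hash operations.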
Nov05 /
Last active Mar 11, 2020
2020-03-07 CNN-LSTM image captioning
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class EncoderCNN(nn.Module):
    def __init__(self, embed_size):
        super(EncoderCNN, self).__init__()
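The snippet above is truncated mid-class. A minimal self-contained sketch of the encoder idea, using a toy conv stack in place of the pretrained backbone (the layer sizes below are assumptions, not the gist's code):

```python
import torch
import torch.nn as nn

class EncoderCNN(nn.Module):
    """Toy CNN encoder: image -> fixed-size embedding for the LSTM decoder."""

    def __init__(self, embed_size):
        super().__init__()
        # toy backbone standing in for a pretrained ResNet (assumption)
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)   # collapse spatial dims
        self.embed = nn.Linear(8, embed_size)

    def forward(self, images):
        x = self.pool(torch.relu(self.conv(images)))
        x = x.flatten(1)        # (batch, 8)
        return self.embed(x)    # (batch, embed_size)

encoder = EncoderCNN(embed_size=16)
features = encoder(torch.randn(2, 3, 32, 32))
```

The embedding `features` would then seed the LSTM decoder's first step in the CNN-LSTM captioning setup.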