@karpathy
karpathy / min-char-rnn.py
Last active Sep 14, 2021
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
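The snippet stops at the data statistics; a hedged sketch of the natural next step (index mappings, since a character-level model trains on integer indices rather than raw characters; these exact lines are illustrative, not necessarily the gist's):

print('data has %d characters, %d unique.' % (data_size, vocab_size))
char_to_ix = { ch: i for i, ch in enumerate(chars) }  # character -> integer index
ix_to_char = { i: ch for i, ch in enumerate(chars) }  # integer index -> character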
@jkleint
jkleint / timeseries_cnn.py
Created Jul 29, 2016
Example of using Keras to implement a 1D convolutional neural network (CNN) for timeseries prediction.
#!/usr/bin/env python
"""
Example of using Keras to implement a 1D convolutional neural network (CNN) for timeseries prediction.
"""
from __future__ import print_function, division
import numpy as np
from keras.layers import Convolution1D, Dense, MaxPooling1D, Flatten
from keras.models import Sequential
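The listing cuts off after the imports; below is a minimal sketch, not the gist's exact model, of how those imported layers typically compose into a 1D-CNN regressor over fixed-length windows (the window size and layer sizes are assumptions):

window_size = 50        # assumption: samples per input window
model = Sequential()
model.add(Convolution1D(nb_filter=4, filter_length=5, activation='relu',
                        input_shape=(window_size, 1)))
model.add(MaxPooling1D())                      # halve the temporal resolution
model.add(Convolution1D(nb_filter=4, filter_length=5, activation='relu'))
model.add(MaxPooling1D())
model.add(Flatten())                           # collapse to a flat feature vector
model.add(Dense(1, activation='linear'))       # regress the next timestep
model.compile(loss='mse', optimizer='adam')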
@dabeaz
dabeaz / README.txt
Created Oct 15, 2019
PyCon India 2019, Code from Keynote Presentation by @dabeaz
Code from PyCon India 2019 Keynote Talk
David Beazley (https://www.dabeaz.com)
======================================
This code is presented "as is" and represents what was live-coded
during my closing keynote presentation at PyCon India, Chennai,
October 13, 2019. I have made no changes to the files.
Requires: Python 3.6+, numpy, pygame
@methane
methane / gist:2185380
Created Mar 24, 2012
Tornado Example: Delegating a blocking task to a worker thread pool from an asynchronous request handler
from time import sleep
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, asynchronous, RequestHandler
from multiprocessing.pool import ThreadPool

_workers = ThreadPool(10)

def run_background(func, callback, args=(), kwds={}):
    def _callback(result):
        # completion of the truncated snippet: hand the result back to the
        # IOLoop thread, the only thread that may safely touch the handler
        IOLoop.instance().add_callback(lambda: callback(result))
    _workers.apply_async(func, args, kwds, _callback)
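A hedged usage sketch (the handler name, route, and port are my assumptions, not necessarily the gist's): an @asynchronous handler offloads a blocking call to the pool and finishes the request from the callback:

def blocking_task(seconds):
    sleep(seconds)            # stands in for any blocking work
    return seconds

class Handler(RequestHandler):
    @asynchronous
    def get(self):
        run_background(blocking_task, self.on_complete, (10,))

    def on_complete(self, result):
        self.write('slept %s seconds' % result)
        self.finish()

application = Application([('/', Handler)])
HTTPServer(application).listen(8888)
# IOLoop.instance().start()  # uncomment to actually run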
@mbostock
mbostock / .block
Last active May 27, 2021
Path Tween
license: gpl-3.0
@ndarville
ndarville / webm.md
Last active May 19, 2021
4chan’s guide to converting GIF to WebM - https://boards.4chan.org/g/res/41212767

Grab ffmpeg from https://www.ffmpeg.org/download.html

It's a command-line tool, which means you will have to type things with your keyboard instead of clicking on buttons.

The most trivial operation is converting GIFs:

ffmpeg -i your_gif.gif -c:v libvpx -crf 12 -b:v 500K output.webm
  • -crf values can go from 4 to 63. Lower values mean better quality.
  • -b:v is the maximum allowed bitrate. Higher means better quality.
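Two more standard ffmpeg flags are worth knowing for shrinking the output (this addition is mine, not part of the original guide):

ffmpeg -i your_gif.gif -c:v libvpx -crf 12 -b:v 500K -an -vf scale=640:-1 output.webm

  • -an drops the audio track (GIFs have none anyway, and boards have historically rejected WebM with sound).
  • -vf scale=640:-1 resizes to 640 px wide while keeping the aspect ratio.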
@den-crane
den-crane / gist:f7382cd4f1f859ff6ac46afe7dc9925a
Created Oct 15, 2018
Populate AggregatingMergeTree through null table
-- source data: a billion-row MergeTree table
create table z(a date, b Int64) Engine=MergeTree Partition by toYYYYMM(a) order by a;
insert into z select today(), number from numbers(1000000000);
insert into z select yesterday(), number from numbers(1000);

-- target: an AggregatingMergeTree that stores only the aggregate state per key
create table mv_z_store(a date, max_b AggregateFunction(MAX,Int64)) ENGINE = AggregatingMergeTree Partition by toYYYYMM(a) order by a;

-- a Null-engine table discards inserted rows, but materialized views attached
-- to it still see every inserted block, so the backfill is never stored twice
create table temp(a date, b Int64) Engine=Null;
create MATERIALIZED VIEW mv_z to mv_z_store AS SELECT a, maxState(b) AS max_b FROM temp GROUP BY a;

-- push the historical data through the Null table, then drop the scaffolding;
-- mv_z_store keeps the aggregated states
insert into temp select * from z;
drop table mv_z;
drop table temp;
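To read the result back, merge the stored states with the -Merge combinator (a usage note of mine, not part of the original gist):

select a, maxMerge(max_b) as max_b from mv_z_store group by a;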
@geohot
geohot / ransac_polyfit.py
Last active Mar 17, 2021
RANSAC polyfit. Fit polynomials with RANSAC in Python
import numpy as np

def ransac_polyfit(x, y, order=3, n=20, k=100, t=0.1, d=100, f=0.8):
    # Thanks https://en.wikipedia.org/wiki/Random_sample_consensus
    # n – minimum number of data points required to fit the model
    # k – maximum number of iterations allowed in the algorithm
    # t – threshold value to determine when a data point fits a model
    # d – number of close data points required to assert that a model fits well to data
    # f – fraction of close data points required
    besterr = np.inf
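The listing is cut off after besterr = np.inf; here is a self-contained sketch of how a RANSAC polynomial fit with these parameters can proceed (the loop structure and variable names are my assumptions, not necessarily the gist's exact code):

import numpy as np

def ransac_polyfit_sketch(x, y, order=3, n=20, k=100, t=0.1, d=100, f=0.8):
    # x, y: 1-D numpy arrays. Repeatedly fit on a random minimal sample,
    # count inliers, and keep the refit with the lowest error over its inliers.
    besterr, bestfit = np.inf, None
    for _ in range(k):
        sample = np.random.choice(len(x), size=n, replace=False)
        maybemodel = np.polyfit(x[sample], y[sample], order)
        inliers = np.abs(np.polyval(maybemodel, x) - y) < t
        if inliers.sum() > d and inliers.sum() > f * len(x):
            bettermodel = np.polyfit(x[inliers], y[inliers], order)
            thiserr = np.sum(np.abs(np.polyval(bettermodel, x[inliers]) - y[inliers]))
            if thiserr < besterr:
                besterr, bestfit = thiserr, bettermodel
    return bestfit  # None when no iteration found enough inliers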
WGET_Large_Files_GDrive.md

This document walks you through the steps to prepare a wget-compatible link for a file located in your Google Drive.

Motivation: When working in deep learning, we often train models on GPUs in Google Colab, Kaggle Kernels, or cloud instances, and first have to upload every file needed to get things running. This becomes a real problem with a large dataset that cannot be uploaded or gathered directly (sometimes scp does not work either). If the dataset lives in Google Drive, the usual workaround is to create a wget-compatible link for the file (typically the dataset); this document deals with Google Drive only.

Steps:

  • Right-click on the file (located in Google Drive) and click "Share".
  • In the "Link sharing on" section, change the file's permissions to "Anyone with the link can view" and copy the link.
  • Now, the link should resemble `
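A hedged sketch of where the truncated steps are headed (my illustration: the uc?export=download endpoint is an assumption, and very large files may additionally require a confirmation token): turn the copied share link into a wget-compatible URL.

import re

def gdrive_wget_url(share_link):
    # extract the file id from .../file/d/<FILE_ID>/view?usp=sharing
    file_id = re.search(r'/d/([\w-]+)', share_link).group(1)
    return 'https://drive.google.com/uc?export=download&id=' + file_id

# usage: pass the copied share link, then feed the result to wget
print(gdrive_wget_url('https://drive.google.com/file/d/FILE_ID/view?usp=sharing'))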
@akhenakh
akhenakh / gist:2894704
Created Jun 8, 2012
async callback from threads with Tornado
import functools
import time
import threading
import logging
import Queue
import hunspell
import tornado.web
import tornado.websocket
import tornado.locale
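The listing stops at the imports; below is a hedged sketch (not the gist's code) of the pattern they suggest: a worker thread drains a Queue of spell-check jobs and posts each result back to the IOLoop thread, the only thread allowed to touch a websocket. Names, routes, and dictionary paths are my assumptions.

import tornado.ioloop   # continues from the imports above; this one is also needed

jobs = Queue.Queue()

def worker(speller):
    while True:
        handler, word = jobs.get()
        suggestions = speller.suggest(word)   # blocking hunspell call
        tornado.ioloop.IOLoop.instance().add_callback(
            functools.partial(handler.write_message, u', '.join(suggestions)))

class SpellSocket(tornado.websocket.WebSocketHandler):
    def on_message(self, message):
        jobs.put((self, message))             # hand off to the worker thread

# dictionary paths are assumptions; adjust for your system
speller = hunspell.HunSpell('/usr/share/hunspell/en_US.dic',
                            '/usr/share/hunspell/en_US.aff')
t = threading.Thread(target=worker, args=(speller,))
t.daemon = True
t.start()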