Patrik Reizinger (rpatrik96)

@shahpoojan
shahpoojan / usb.cpp
Created July 6, 2011 12:57
Code to send data over the USB port on Windows
#include <iostream>
#include <string>
#include <Windows.h>
#include <tchar.h>
#include <stdio.h>
#include <Winbase.h>

HANDLE hCom;  // serial/USB port handle, assumed to be opened elsewhere with CreateFile

// Assumed completion: the gist preview cuts off after this signature.
DWORD sendData(const char* data, DWORD size)
{
    DWORD bytesWritten = 0;
    WriteFile(hCom, data, size, &bytesWritten, NULL);
    return bytesWritten;
}
@jrivero
jrivero / csv_splitter.py
Created July 15, 2011 20:33 — forked from palewire/csv_splitter.py
A Python CSV splitter
import os

def split(filehandler, delimiter=',', row_limit=10000,
          output_name_template='output_%s.csv', output_path='.', keep_headers=True):
    """
    Splits a CSV file into multiple pieces.
    A quick bastardization of the Python CSV library.
    Arguments:
        filehandler, delimiter, row_limit, output_name_template, output_path,
        keep_headers (the rest of the docstring and the body are truncated in this preview).
    """
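
A hypothetical call, assuming the full implementation behind this truncated preview: split a large CSV into 5,000-row pieces, writing the header row into each piece. The file and template names are placeholders.

# Hypothetical usage; 'big_input.csv' and 'piece_%s.csv' are placeholder names.
with open('big_input.csv') as f:
    split(f, row_limit=5000, output_name_template='piece_%s.csv')
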
@jxson
jxson / README.md
Created February 10, 2012 00:18
README.md template

Synopsis

At the top of the file there should be a short introduction and/or overview that explains what the project is. This description should match descriptions added for package managers (Gemspec, package.json, etc.)

Code Example

Show what the library does as concisely as possible; developers should be able to figure out how your project solves their problem by looking at the code example. Make sure the API you are showing off is obvious, and that your code is short and concise.

Motivation

tmux cheatsheet

As configured in my dotfiles.

start new:

tmux

start new with session name:

@benbalter
benbalter / gist.md
Last active April 21, 2024 15:50
Example of how to embed a Gist on GitHub Pages using Jekyll.

Here's an example of how to embed a Gist on GitHub Pages:

{% gist 5555251 %}

All you need to do is copy and paste the Gist's ID from the URL (here 5555251), and add it to a gist tag surrounded by {% and %}.

@ashwin
ashwin / subfigure-example.tex
Created July 9, 2014 10:28
Example of using subfigure package in LaTeX
% Arrange three figures like this:
% Fig 1
% Fig 2 Fig 3
\begin{figure}
\centering
\subfigure[]
{
    \includegraphics[scale=.2]{foo1.pdf}
}
\\
% Assumed continuation of the truncated snippet; foo2.pdf and foo3.pdf are placeholders:
\subfigure[]{\includegraphics[scale=.2]{foo2.pdf}}
\subfigure[]{\includegraphics[scale=.2]{foo3.pdf}}
\end{figure}
@msrose
msrose / combining-git-repositories.md
Last active June 4, 2024 13:35
How to combine two git repositories.

Combining two git repositories

Use case: You have repository A with remote location rA, and repository B (which may or may not have remote location rB). You want to do one of two things:

  • preserve all commits of both repositories, but replace everything from A with the contents of B, and use rA as your remote location
  • actually combine the two repositories, as if they are two branches that you want to merge, using rA as the remote location

NB: Check out git subtree/git submodule and this Stack Overflow question before going through the steps below. This gist is just a record of how I solved this problem on my own one day.

Before starting, make sure your local and remote repositories are up-to-date with all changes you need. The following steps use the general idea of changing the remote origin and renaming the local master branch of one of the repos in order to combine the two master branches.
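
A minimal Python sketch of this idea, assuming git is available on the PATH. The paths, URL, remote name repoB, and the use of --allow-unrelated-histories are my own choices, not necessarily the exact steps recorded in this gist.

# Sketch: add B as a second remote of A, fetch it, and merge its master into
# A's master as if it were just another branch. All names below are placeholders.
import subprocess

def git(*args, cwd):
    subprocess.run(["git", *args], cwd=cwd, check=True)

repo_a = "/path/to/local/clone-of-A"           # local clone whose remote is rA
repo_b_url = "https://example.com/user/B.git"  # wherever repository B lives

git("remote", "add", "repoB", repo_b_url, cwd=repo_a)
git("fetch", "repoB", cwd=repo_a)
# Recent versions of git refuse to merge unrelated histories without this flag.
git("merge", "--allow-unrelated-histories", "repoB/master", cwd=repo_a)
git("push", "origin", "master", cwd=repo_a)    # rA remains the remote location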

@jimblom
jimblom / LSM9DS1_Simple.ino
Created April 3, 2015 17:36
LSM9DS1 Simple - Spark Core
/*****************************************************************
LSM9DS1_Simple.ino
SFE_LSM9DS1 Library Simple Example Code
Jim Lindblom @ SparkFun Electronics
Original Creation Date: February 27, 2015
https://github.com/sparkfun/LSM9DS1_Breakout
The LSM9DS1 is a versatile 9DOF sensor. It has a built-in
accelerometer, gyroscope, and magnetometer. Very cool! Plus it
functions over either SPI or I2C.
@shagunsodhani
shagunsodhani / Batch Normalization.md
Last active July 25, 2023 18:07
Notes for "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" paper

The Batch Normalization paper describes a method for addressing several issues that arise when training Deep Neural Networks. It makes normalization a part of the architecture itself and reports significant improvements in the number of iterations required to train the network.

Issues With Training Deep Neural Networks

Internal Covariate Shift

Covariate shift refers to a change in the input distribution to a learning system. In a deep network, the input to each layer is affected by the parameters of all the preceding layers, so even small changes to those parameters get amplified as they propagate through the network. The resulting change in the input distribution of the network's internal layers is known as internal covariate shift.

It is well established that networks converge faster if their inputs are whitened (i.e., zero mean and unit variance) and uncorrelated; internal covariate shift leads to just the opposite.
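
To make the transform concrete, here is a minimal NumPy sketch of the batch-norm operation at training time: normalize each feature with the mini-batch mean and variance, then scale and shift with learnable parameters. The names gamma, beta, and eps are mine, not from the notes.

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch, features) activations; gamma, beta: learnable scale and shift
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta            # scale and shift restore expressiveness

x = 5.0 + 3.0 * np.random.randn(32, 10)    # a shifted, scaled mini-batch
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))  # approximately 0 and 1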

@kevinzakka
kevinzakka / data_loader.py
Last active April 19, 2024 23:42
Train, Validation and Test Split for torchvision Datasets
"""
Create train, valid, test iterators for CIFAR-10 [1].
Easily extended to MNIST, CIFAR-100 and Imagenet.
[1]: https://discuss.pytorch.org/t/feedback-on-pytorch-for-kaggle-competitions/2252/4
"""
import torch
import numpy as np
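
The preview stops at the imports; below is a short sketch of the underlying idea rather than the full gist: hold out part of the CIFAR-10 training set as a validation split using SubsetRandomSampler. The 10% split, batch size of 64, and ./data directory are placeholder choices.

import numpy as np
from torch.utils.data import DataLoader
from torch.utils.data.sampler import SubsetRandomSampler
from torchvision import datasets, transforms

transform = transforms.ToTensor()
train_set = datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)
test_set = datasets.CIFAR10(root='./data', train=False, download=True, transform=transform)

indices = np.random.permutation(len(train_set)).tolist()  # shuffle once, then split
split = int(0.1 * len(train_set))                         # 10% of training images held out
valid_idx, train_idx = indices[:split], indices[split:]

train_loader = DataLoader(train_set, batch_size=64, sampler=SubsetRandomSampler(train_idx))
valid_loader = DataLoader(train_set, batch_size=64, sampler=SubsetRandomSampler(valid_idx))
test_loader = DataLoader(test_set, batch_size=64, shuffle=False)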