Dan Søndergaard (dansondergaard)
afrendeiro / divideAndSlurm.py (last active March 13, 2017)
Class to perform map-reduce-style operations split into jobs across a high-performance computing cluster
class DivideAndSlurm(object):
    """
    DivideAndSlurm is a class to handle map-reduce style submission of jobs to a Slurm cluster.
    Add a particular task to the object (through a specific function) and it will divide the input
    data into pools, which are submitted (via the submit() method) in parallel to the cluster.
    Tasks can also further process their input in parallel, taking advantage of all processors.
    """
    def __init__(self, tmpDir="/fhgfs/scratch/users/user/", logDir="/home/user/logs", queue="shortq", userMail=""):
        super(DivideAndSlurm, self).__init__()
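The "divide" step that the docstring describes (splitting the input data into pools before submission) can be sketched as follows; divide is a hypothetical helper shown for illustration only, since the gist is truncated here:

```python
import math

def divide(data, n_pools):
    """Split a list into at most n_pools contiguous chunks of roughly equal size."""
    # Round the chunk size up so no more than n_pools chunks are produced.
    size = math.ceil(len(data) / n_pools)
    return [data[i:i + size] for i in range(0, len(data), size)]

divide(list(range(10)), 3)  # → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Each chunk would then be written out and submitted as its own Slurm job, so the pools run in parallel across the cluster.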
kennethreitz / pr.md (created September 12, 2012, forked from piscisaureus/pr.md)
Check out GitHub pull requests locally

Locate the section for your GitHub remote in the .git/config file. It looks like this:

[remote "origin"]
	fetch = +refs/heads/*:refs/remotes/origin/*
	url = git@github.com:joyent/node.git

Now add the line fetch = +refs/pull/*/head:refs/remotes/origin/pr/* to this section. Obviously, change the GitHub URL to match your project's URL. It ends up looking like this:

[remote "origin"]
	fetch = +refs/heads/*:refs/remotes/origin/*
	url = git@github.com:joyent/node.git
	fetch = +refs/pull/*/head:refs/remotes/origin/pr/*

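The same refspec can be added from the command line instead of editing .git/config by hand; the PR number 999 below is a placeholder, not a reference to a real pull request:

```shell
# Add the pull-request refspec to the origin remote
# (equivalent to the .git/config edit described above).
git config --add remote.origin.fetch '+refs/pull/*/head:refs/remotes/origin/pr/*'

# Fetch all pull requests, then check one out (999 is a placeholder number).
git fetch origin
git checkout pr/999
```

After the fetch, every open pull request is available locally as a read-only ref under refs/remotes/origin/pr/.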
MohamedAlaa / tmux-cheatsheet.markdown (last active July 23, 2024)
tmux shortcuts & cheatsheet

start new:

tmux

start new with session name:

tmux new -s myname