Bruno Vieira (bmpvieira) · Lifebit
// pixel-art sketch: prints an HTML table of 10px-high cells built from the
// bitmap strings below (presumably '1' cells are filled red and blanks left white)
var red = '#c00'
var white = '#fff'
console.log('<table border=0 cellspacing=0 cellpadding=0 style="line-height:10px">\n' + ([
'111111111111111111',
'1 1 1 1',
'1 1 1 1 1 1 1 1',
'1 1 1 1 1 1 1 1',
'1 1 1 1 1 1 1',
'111111 1111111111',
@max-mapper
max-mapper / package.json
Last active August 29, 2015 14:04
fcc-ecfs to dat importer
{
  "name": "fcc-ecfs",
  "version": "1.0.0",
  "gasket": {
    "main": [
      "gasket run import -- http://www.fcc.gov/files/ecfs/14-28/14-28-RAW-Solr-1.xml",
      null,
      "gasket run import -- http://www.fcc.gov/files/ecfs/14-28/14-28-RAW-Solr-2.xml",
      null,
      "gasket run import -- http://www.fcc.gov/files/ecfs/14-28/14-28-RAW-Solr-3a.xml",
@hubgit
hubgit / README.md
Last active August 29, 2015 14:05
Stream filtered tweets into dat
  1. sudo aptitude install git php5-cli nodejs
  2. npm install -g npm
  3. npm install -g maxogden/dat
  4. dat init
  5. dat listen
  6. Create a Twitter application and generate OAuth keys for your user, then add the app and user credentials to stream.php.
  7. composer init && composer require fennb/phirehose:dev-master && composer install
  8. php stream.php | dat import -json (the expected line-by-line JSON output is sketched after these steps)
  9. Install and start dat-editor, then open it in the browser to browse the table.
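As a reference for step 8, here is a minimal Node sketch (hypothetical, not part of the original gist) of the shape of output stream.php should produce: one JSON object per line on stdout, assuming dat's -json import reads newline-delimited objects as the 2014-era alpha did. The field names are illustrative, not the full Twitter status schema.
#!/usr/bin/env node
// ndjson-sketch.js: print one JSON object per line for `dat import -json`
var tweets = [
  { id_str: '1', text: 'hello dat', user: { screen_name: 'example' } },
  { id_str: '2', text: 'streaming into dat', user: { screen_name: 'example' } }
]
tweets.forEach(function (tweet) {
  process.stdout.write(JSON.stringify(tweet) + '\n')
})
// usage: node ndjson-sketch.js | dat import -json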
var once = require('once')
var through = require('through2') // assumed: through2 provides the (data, enc, cb) transform signature used below
var concurrent = 10 // maximum writes allowed in flight at once
var pending = 0
through(function (data, enc, cb) {
  cb = once(cb) // make sure the callback only ever fires once
  pending++
  // ack immediately while under the concurrency limit, otherwise wait for the write to finish
  if (pending < concurrent) cb()
  rr().write(data, function (err) { // rr() returns a writable stream (defined elsewhere in the gist)
    if (err) return cb(err)
    pending-- // this write finished, free up a slot
    cb() // a no-op if cb already fired above, thanks to once()
  })
})
# A quick function to save a PBM (http://en.wikipedia.org/wiki/Netpbm_format) image,
# to visualize *a lot* of missing data pretty quickly (outside of R).
writeMissingPBM <- function(x, file) {
  dims <- dim(x)
  x[] <- as.integer(is.na(x))  # 1 = missing, 0 = present
  con <- file(file, open="wt")
  writeLines(sprintf("P1\n%d %d", ncol(x), nrow(x)), con)  # PBM header: magic number, width, height
  write.table(x, file=con, sep=" ", col.names=FALSE, row.names=FALSE, quote=FALSE)
  close(con)
}
@ctb
ctb / quote.txt
Last active August 29, 2015 14:13
Michael Barton quote @Bioinformatics during #BaltiandBioinformatics
https://www.youtube.com/watch?x-yt-cl=84359240&x-yt-ts=1421782837&feature=player_embedded&v=ZACVcJt0oJA#t=7303
Kai Blin: If we containerize all these things, won’t it just encourage
worse software development practices? Right now developers still need
to consider someone other than themselves installing the software.
Michael Barton:
“It’s a good point. Ultimately, though, if I can get a container, and
it works, and I know it will work, do you care how well it was
@audy
audy / fastash
Created January 29, 2015 19:45
#!/usr/bin/env node
var fs = require('fs');
var fasta = require('bionode-fasta');
var crypto = require('crypto');
// stream each record in proteins.fasta as an object and normalise its sequence
fasta.obj('proteins.fasta').on('data', function(x) {
var seq = x.seq
.replace(/\*/g, '') // remove asterisks (stop codons)
.toUpperCase(); // enforce case
var atomicQueue = require(...)
var queue = atomicQueue(db, function (data, cb) {
console.log('got some work', data)
// do stuff ...
cb(null, result) // if no error this work will be removed from the leveldb
})
// write will add this 'work' to the leveldb
queue.write({some: 'world'})
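For context, a minimal sketch of where the db instance above might come from, assuming the queue module follows the usual level ecosystem convention of accepting a levelup-style database (the './queue-db' path is illustrative):
var level = require('level') // levelup + leveldown in one package
var db = level('./queue-db') // jobs persist here until the worker acks them with cb(null, ...)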
@vsbuffalo
vsbuffalo / unix-pipes.R
Created March 18, 2015 03:17
Yo dawg, I heard you like pipes, so I put a pipe in your pipe, so you can pipe streams while you pipe streams
library(magrittr)
write.table(iris, file="iris.txt", sep="\t", row.names=FALSE, quote=FALSE)
plines <- function(cmd) readLines(pipe(cmd))   # run a shell command, read raw lines back
ptabs <- function(cmd) read.delim(pipe(cmd))   # run a shell command, read a delimited table back
# just as a trivial example, process some shuffled lines using GNU shuf, sort,
# etc. (we could do this in R, but this is to simulate command line operations)
# run_fastqc: given a FastQC output zip $(1), writes a $(1).ok stamp once the report inside
# shows the Basic Statistics, Per base sequence quality and Per base N content modules passing.
define run_fastqc
$$(addsuffix .ok,$(1)) : $(1)
mkdir -p $$(dir $$@) && \
unzip -p $$< $$(addsuffix .tmp_fastqc/fastqc_data.txt,$$(notdir $$<)) |\
grep ">>" | grep -v ">>END_MODULE" | cut -c 3- |\
awk -F ' ' 'BEGIN{N=0;} {if(($$$$1=="Basic Statistics" || $$$$1=="Per base sequence quality" || $$$$1=="Per base N content") && $$$$2=="pass" ) N++;} END { if(N>2) {print "__PASS__"}}' |\
grep __PASS__ && echo "FASTQC ok for $$<" > $$@