meg-codes
@meg-codes
meg-codes / fix_latlon.py
Created April 17, 2017 15:59
Code to take a spreadsheet with a variety of lat/long formats, convert them to decimal values, and write out a fixed CSV
# -*- coding: utf-8 -*-
import pandas as pd
import LatLon as latlon
import re
from LatLon import string2latlon
# Requires: pandas, LatLon, py2.7
def makefloatlat(data):
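A minimal Python 3 sketch of the kind of conversion this gist performs, using a regex-based degrees/minutes/seconds parser in place of the py2-only LatLon package (the function name and regex here are illustrative, not from the gist):

```python
import re

def dms_to_decimal(value):
    """Convert a coordinate like 40°26'46.3"N to decimal degrees.

    Values that are already decimal pass straight through float().
    """
    try:
        return float(value)
    except ValueError:
        pass
    match = re.match(
        r"""\s*(?P<deg>\d+)[°\s]+(?P<min>\d+)['\s]+(?P<sec>[\d.]+)["\s]*(?P<hemi>[NSEW])""",
        value)
    if match is None:
        raise ValueError('unrecognized coordinate: %r' % value)
    decimal = (float(match.group('deg'))
               + float(match.group('min')) / 60
               + float(match.group('sec')) / 3600)
    # south and west hemispheres are negative in decimal notation
    if match.group('hemi') in 'SW':
        decimal = -decimal
    return decimal
```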
#!/usr/bin/env python
"""
Python script to list all MySQL databases and allow them to be recreated.
This assumes you have server information and credentials stored in a ~/.my.cnf
file (ideally chmodded to 0600)
[client]
host=localhost
user=root
password=secretpasswordgoeshere
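Reading those credentials can be sketched with the standard library's configparser (the function name is illustrative; the real script presumably hands the values to a MySQL client):

```python
import configparser
import os

def read_mysql_credentials(path=os.path.expanduser('~/.my.cnf')):
    """Read host/user/password from the [client] section of a my.cnf file."""
    # allow_no_value handles bare my.cnf directives like skip-networking
    config = configparser.ConfigParser(allow_no_value=True)
    if not config.read(path):
        raise FileNotFoundError('no config found at %s' % path)
    client = config['client']
    return {
        'host': client.get('host', 'localhost'),
        'user': client['user'],
        'password': client['password'],
    }
```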
@meg-codes
meg-codes / RQuantLib-Brew.md
Last active March 12, 2018 15:13
Instructions for RQuantLib and Brew with OpenMP

Installing QuantLib

Follow the instructions here to install QuantLib.

Make especially sure that you use the --with-intraday flag for QuantLib.
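Built from source, that is the usual autotools sequence; a sketch, with the prefix left at its default and the job count an assumption:

```shell
# sketch: build QuantLib from source with the flag noted above
./configure --with-intraday
make -j4
sudo make install
```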

Install an up-to-date clang with OpenMP support

The default macOS clang++ on High Sierra does not include OpenMP support, so you'll need to install a version that does and then configure R to use it when compiling packages.
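R is typically pointed at a replacement compiler via ~/.R/Makevars; a sketch assuming clang was installed with Homebrew's llvm formula into /usr/local/opt/llvm (paths are assumptions and vary by Homebrew prefix):

```make
# ~/.R/Makevars (paths assume Homebrew's llvm keg)
CC=/usr/local/opt/llvm/bin/clang
CXX=/usr/local/opt/llvm/bin/clang++
LDFLAGS=-L/usr/local/opt/llvm/lib
CPPFLAGS=-I/usr/local/opt/llvm/include
```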

On the Adroit headnode

This requests an allocation on one of Adroit's compute nodes for 15 minutes. Once the allocation is granted, you can see that I've been moved to the compute node adroit-14. I also ask for two tasks (adjust this based on how many cores you will actually use).

[bhicks@adroit4 ~]$ salloc -N 1 --ntasks-per-node=2 -t 15:00
salloc: Granted job allocation 497989
[bhicks@adroit-14 ~]$ export XDG_RUNTIME_DIR=""
meg-codes / cdh-scrape.py
Last active January 26, 2019 09:22
A basic web-scrape script designed to look for bad links on a particular site
#!/usr/bin/env python
# Script to scrape all links from a site, compile counts of each link and the
# status codes returned when accessing them, and output the results as a CSV
#
# There's no strong reason this couldn't be refactored into an OOP design,
# but I left it as plain functions because that can be easier for multiprocessing.
#
# Requirements:
# requests, bs4
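The gist itself uses requests and bs4; the link-extraction step alone can be sketched with only the standard library's html.parser (the class and function names here are illustrative):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags, counting repeat occurrences."""

    def __init__(self):
        super().__init__()
        self.links = {}

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value:
                    self.links[value] = self.links.get(value, 0) + 1

def extract_links(html):
    """Return a dict mapping each href found in html to its count."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```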
meg-codes / mnist_classify.py
Created December 18, 2018 19:18
Slurm adapted version of MNIST Tensorflow/Keras example
import pickle
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
# reload pickled data
meg-codes / mnist_data.py
Created December 18, 2018 19:19
MNIST data for Slurm cluster without internet connectivity on processing nodes
import pickle
from tensorflow import keras
# Compute nodes don't have internet, so download in advance and pickle the
# image training data
# adapted from https://www.tensorflow.org/tutorials/keras/basic_classification
# see the notes there for details on how the example works
fashion_mnist = keras.datasets.fashion_mnist
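The pickling itself is plain stdlib; a sketch of the dump/reload round trip shared by mnist_data.py and mnist_classify.py (the function names and file name are assumptions; the real gist pickles the tuples returned by fashion_mnist.load_data()):

```python
import pickle

def save_dataset(data, path='mnist_data.pickle'):
    """Pickle the downloaded dataset so offline compute nodes can load it."""
    with open(path, 'wb') as fh:
        pickle.dump(data, fh, protocol=pickle.HIGHEST_PROTOCOL)

def load_dataset(path='mnist_data.pickle'):
    """Reload a dataset previously saved with save_dataset()."""
    with open(path, 'rb') as fh:
        return pickle.load(fh)
```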
meg-codes / mnist.cmd
Last active December 18, 2018 19:22
slurm script to run mnist_example
#!/bin/bash
#SBATCH --gres=gpu:1
# one GPU, asking for only one CPU core since we're offloading to the GPU
#SBATCH -t 00:05:00
#SBATCH --mail-type=begin
#SBATCH --mail-type=end
#SBATCH --mail-user=yourNETID@princeton.edu
module load anaconda3
# assumes you have a conda environment named tf with tensorflow and matplotlib, i.e.
const webpackConf = require('./webpack.config.js')
const merge = require('webpack-merge')
module.exports = function (config) {
config.set({
browsers: [ 'ChromeHeadless' ], // run in headless Chrome
browserNoActivityTimeout: 60000,
frameworks: [ 'mocha', 'sinon-chai' ], // use the mocha test framework
files: [
//'src/**/*.{jsx,js}',
// path and webpack
const path = require('path')
const webpack = require('webpack')
// webpack plugins
const BundleTracker = require('webpack-bundle-tracker')
const VueLoaderPlugin = require('vue-loader/lib/plugin')
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
// configure dev mode because of absurdly misleading webpack documentation