meg-codes

@meg-codes
meg-codes / install_pyenv_mint.sh
Created November 29, 2020 01:11
Install pyenv on Linux Mint (or Debian) -- run as root or with sudo
#!/bin/bash
apt update
apt install -y make build-essential libssl-dev zlib1g-dev libbz2-dev \
    libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev \
    xz-utils tk-dev libffi-dev liblzma-dev python-openssl git
# note: on newer Debian/Mint releases the python-openssl package is named python3-openssl
curl https://pyenv.run | bash
# Load pyenv automatically by adding
@meg-codes
meg-codes / audit_identifiers.py
Last active May 6, 2019 21:05
Audit cardholder list produced by mep_cardholders.py
with open('/tmp/sylviabeach-card-images.txt') as found_images:
    urls = found_images.read().split('\n')

# map the trailing three path segments of each URL to an identifier
identifiers = {}
for u in urls:
    identifier = '/'.join(u.split('/')[-3:])
    identifiers[identifier] = ''

with open('/tmp/pudl0123-825298-noboxes-sorted.txt') as image_list:
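The gist preview cuts off before the comparison step. A minimal sketch of the audit logic it implies, assuming the goal is to flag identifiers from the second list that never appeared among the found image URLs (the function name and matching rule are assumptions, not from the gist):

```python
def audit(found_urls, all_identifiers):
    """Return identifiers from all_identifiers with no matching found URL."""
    # identifiers are the trailing three path segments of each URL
    found = {'/'.join(u.split('/')[-3:]) for u in found_urls if u}
    return [i for i in all_identifiers if i not in found]
```

Given `audit(['http://host/a/b/c'], ['a/b/c', 'd/e/f'])`, only `'d/e/f'` comes back as missing.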
@meg-codes
meg-codes / mep_cardholders.py
Last active May 6, 2019 20:46
Updated version of rlskoeser's script: export people data for Shakespeare & Co. lending card holders (fields needed for Finding Aid enhancement and image import into Figgy)
# use env to set django settings module
# env DJANGO_SETTINGS_MODULE=mep.settings python mep_cardholders.py
import csv
import codecs
import django
from django.db import models
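The preview stops at the imports. A hedged sketch of the export step the description implies, writing selected person fields to CSV with the stdlib `csv` module; the field names and dict-shaped records here are placeholders, not the real mep Django models:

```python
import csv
import io

def export_people(people, fieldnames):
    """Write one CSV row per person dict, restricted to fieldnames."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for person in people:
        # missing fields become empty cells rather than raising
        writer.writerow({k: person.get(k, '') for k in fieldnames})
    return buf.getvalue()
```

In the actual script the rows would come from a Django queryset (hence the `DJANGO_SETTINGS_MODULE` comment above), but the CSV-writing shape is the same.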
@meg-codes
meg-codes / ppa-upset.R
Created April 19, 2019 18:46
PPA Collection UpsetR plot
library(tidyr)
library(dplyr)
library(UpSetR)
# read in the data
ppa <- read.csv("ppa-digitizedworks-20190419T18_24_42.csv")
# subset to only needed rows and split Collection into multiple rows
new_df <- ppa %>%
  select("Title", "Source.ID", "Collection") %>%
  separate_rows("Collection", sep=";")
# give a truth column value to map on spread
@meg-codes
meg-codes / argentina_politics.csv
Created February 8, 2019 18:42
Argentina Politics and Government, LAE
year count
1972-1974 1
1973 1
1975-1977 1
1980-1990 1
1981 1
1985 1
1985-1990 1
1986 1
1988 3
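Some rows in the year column are single years and some are ranges like `1972-1974`. A small sketch of one way to normalize that mix into per-year counts before plotting; spreading a range's count across every year it covers is an assumption about how the data should be read, not something the gist states:

```python
from collections import Counter

def expand_years(rows):
    """Expand (year_or_range, count) pairs into per-year totals."""
    totals = Counter()
    for year, count in rows:
        if '-' in year:
            start, end = (int(y) for y in year.split('-'))
            for y in range(start, end + 1):
                totals[y] += count
        else:
            totals[int(year)] += count
    return totals
```

With the first rows above, `1973` picks up counts from both the `1972-1974` range and its own row.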
// path and webpack
const path = require('path')
const webpack = require('webpack')
// webpack plugins
const BundleTracker = require('webpack-bundle-tracker')
const VueLoaderPlugin = require('vue-loader/lib/plugin')
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
// configure dev mode because of absurdly misleading webpack documentation
const webpackConf = require('./webpack.config.js')
const merge = require('webpack-merge')

module.exports = function (config) {
  config.set({
    browsers: ['ChromeHeadless'], // run in Chrome
    browserNoActivityTimeout: 60000,
    frameworks: ['mocha', 'sinon-chai'], // use the mocha test framework
    files: [
      //'src/**/*.{jsx,js}',
@meg-codes
meg-codes / mnist.cmd
Last active December 18, 2018 19:22
slurm script to run mnist_example
#!/bin/bash
#SBATCH --gres=gpu:1
#one GPU, asking for only one CPU core since we're offloading to GPU
#SBATCH -t 00:05:00
#SBATCH --mail-type=begin
#SBATCH --mail-type=end
#SBATCH --mail-user=yourNETID@princeton.edu
module load anaconda3
# assumes you have a conda environment named tf with tensorflow and matplotlib, i.e.
@meg-codes
meg-codes / mnist_data.py
Created December 18, 2018 19:19
MNIST data for Slurm cluster without internet connectivity on processing nodes
import pickle
from tensorflow import keras
# Compute nodes don't have internet, so download in advance and pickle the
# image training data
# adapted from https://www.tensorflow.org/tutorials/keras/basic_classification
# see notes there for how to actually work things
fashion_mnist = keras.datasets.fashion_mnist
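The preview ends before the pickling itself. A hedged sketch of the download-then-pickle round trip the gist describes, using only the stdlib so it runs without TensorFlow; plain Python lists stand in for the numpy arrays `keras.datasets` actually returns:

```python
import os
import pickle
import tempfile

def save_dataset(data, path):
    """Pickle a dataset to disk on a node that has internet access."""
    with open(path, 'wb') as fh:
        pickle.dump(data, fh)

def load_dataset(path):
    """Reload the pickled dataset on a compute node without internet."""
    with open(path, 'rb') as fh:
        return pickle.load(fh)

# round-trip a small stand-in dataset
path = os.path.join(tempfile.mkdtemp(), 'fashion_mnist.pickle')
save_dataset(([1, 2], [3, 4]), path)
train, labels = load_dataset(path)
```

On the cluster, the save would run on a login node and the load inside the Slurm job, which is the whole point of the gist.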
@meg-codes
meg-codes / mnist_classify.py
Created December 18, 2018 19:18
Slurm adapted version of MNIST Tensorflow/Keras example
import pickle
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
# reload pickled data
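After reloading, the Keras basic-classification tutorial the gist adapts scales pixel values from 0-255 down to 0-1 before training. A minimal sketch of that step, with plain lists standing in for numpy arrays so it runs without TensorFlow:

```python
def normalize(images):
    """Scale 0-255 pixel values into the 0.0-1.0 range."""
    return [[px / 255.0 for px in img] for img in images]
```

With numpy arrays this is just `images / 255.0`, which is what the original tutorial does.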