Daniel Pett (portableant): public gists
#!/bin/bash
#
# PostgreSQL Backup Script Ver 1.0
# http://autopgsqlbackup.frozenpc.net
# Copyright (c) 2005 Aaron Axelsen <axelseaa@amadmax.com>
#
# This script is based on the AutoMySQLBackup Script Ver 2.2
# It can be found at http://sourceforge.net/projects/automysqlbackup/
#
# The PostgreSQL changes are based on a patch against AutoMySQLBackup 1.9
@portableant
portableant / setting-up-earls-heroku.md
Last active August 29, 2015 14:27 — forked from shawngraham/setting-up-earls-heroku.md
steps for setting up @edsu's EARLS on Heroku

Setting up EARLS

Earls is useful for archiving and counting URLs shared in tweets with a particular hashtag, e.g. conference backchat. Getting it set up on Heroku (a tiered free service) is fairly straightforward. Going live requires a credit card tied to your account, but if you stay on the free tier this shouldn't be an issue. Anyway, with the help of Dan Pett I learned how to set one up. Dan set up an earls instance for #msudai.

getting ready

get a heroku account

install git

install node
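The three steps above can be sketched as shell commands. This is a sketch, not a definitive recipe: the package names are for Debian/Ubuntu, and the earls repository URL is an assumption (adjust to whichever fork you are deploying).

```shell
# Sign up at https://heroku.com, then install the Heroku CLI
# (installers are listed at https://devcenter.heroku.com/articles/heroku-cli)

# Install git and node (Debian/Ubuntu package names shown; use brew on macOS)
sudo apt-get install git nodejs npm

# Grab a copy of earls to deploy (repository URL is an assumption)
git clone https://github.com/edsu/earls.git
cd earls
```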

<!DOCTYPE html>
<html>
<head>
<meta charset=utf-8 />
<title>Load GeoJSON</title>
<meta name='viewport' content='initial-scale=1,maximum-scale=1,user-scalable=no' />
<script src='https://api.mapbox.com/mapbox.js/v2.2.1/mapbox.js'></script>
<link href='https://api.mapbox.com/mapbox.js/v2.2.1/mapbox.css' rel='stylesheet' />
<style>
body { margin:0; padding:0; }
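The preview is truncated at this point. A minimal completion of the page, assuming the standard Mapbox.js v2.2.1 `featureLayer` pattern, with a placeholder access token and a hypothetical `test.geojson` URL:

```html
#map { position:absolute; top:0; bottom:0; width:100%; }
</style>
</head>
<body>
<div id='map'></div>
<script>
// Placeholder: substitute your own Mapbox access token
L.mapbox.accessToken = '<your access token>';
var map = L.mapbox.map('map', 'mapbox.streets');
// Load the GeoJSON (URL is an assumption) and add it to the map
L.mapbox.featureLayer().loadURL('test.geojson').addTo(map);
</script>
</body>
</html>
```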
@portableant
portableant / OAI finds
Created August 21, 2015 02:13
A script in R to OAI harvest data from PAS
# Harvest Dublin Core records from the PAS OAI-PMH endpoint
library('OAIHarvester')
baseurl <- 'https://finds.org.uk/datalabs/oai/request'
# Harvest the 'BM' set as oai_dc records, transforming the XML into a usable structure
data <- oaih_harvest(baseurl, prefix = "oai_dc", from = NULL, until = NULL, set = 'BM', transform = TRUE)
@portableant
portableant / test.geojson
Created August 25, 2015 06:59
A test geojson file
@portableant
portableant / alignTGNtoNomisma.R
Created October 22, 2015 13:36
Method for aligning Nomisma mints with Getty via Ryan B's script
#' ----
#' title: " A script for aligning data from the TGN to Nomisma based on Pleiades IDs"
#' author: "Daniel Pett"
#' date: "10/16/2015"
#' output: csv_document
#' ----
setwd("C:\\rstuff")
nomisma <- read.csv("http://nomisma.org/query?query=PREFIX+nm%3A%09%3Chttp%3A%2F%2Fnomisma.org%2Fid%2F%3E%0D%0A++++PREFIX+nmo%3A%09%3Chttp%3A%2F%2Fnomisma.org%2Fontology%23%3E%0D%0A++++PREFIX+skos%3A%09%3Chttp%3A%2F%2Fwww.w3.org%2F2004%2F02%2Fskos%2Fcore%23%3E%0D%0A++++SELECT+*+WHERE+%7B%0D%0A++++%3Furi+a+nmo%3AMint%3B%0D%0A++++skos%3AprefLabel+%3Flabel%3B%0D%0A++++skos%3AcloseMatch+%3Fmatch%3B%0D%0A++++.FILTER+regex%28str%28%3Fmatch%29%2C+%22pleiades%22%29+.%0D%0A++++FILTER%28langMatches%28lang%28%3Flabel%29%2C+%22EN%22%29%29+.%0D%0A++++%7D+LIMIT+100000&output=csv")
names(nomisma) <- c('uri', 'prefLabel', 'pleiades')
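For readability, the URL-encoded query passed to `read.csv` above decodes to roughly the following SPARQL (lightly reformatted here):

```sparql
PREFIX nm:   <http://nomisma.org/id/>
PREFIX nmo:  <http://nomisma.org/ontology#>
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT * WHERE {
  ?uri a nmo:Mint ;
       skos:prefLabel ?label ;
       skos:closeMatch ?match .
  FILTER regex(str(?match), "pleiades") .
  FILTER(langMatches(lang(?label), "EN")) .
} LIMIT 100000
```

It selects every Nomisma mint with an English preferred label and a `skos:closeMatch` pointing at a Pleiades URI.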
@portableant
portableant / test.geojson
Last active November 9, 2015 12:46
PAS finds geojson
@portableant
portableant / perry.csv
Last active January 6, 2016 16:50
Megan Perry gis stuff
ID GEOM ATTRIBUTE CLASS LATITUDE LONGITUDE
1 POLYGON B.6 tomb 30.33192927 35.44511046
1 POLYGON B.6 tomb 30.33193793 35.4451296
1 POLYGON B.6 tomb 30.33194466 35.44512644
1 POLYGON B.6 tomb 30.33193506 35.44510488
1 POLYGON B.6 tomb 30.33192927 35.44511046
2 POLYGON B.5 tomb 30.33196859 35.44503376
2 POLYGON B.5 tomb 30.33197311 35.44505717
2 POLYGON B.5 tomb 30.33198008 35.44505588
2 POLYGON B.5 tomb 30.3319744 35.44503141
@portableant
portableant / minePAS.R
Created January 11, 2016 16:34
Mine PAS for terminal dates for hoards
# Download data from a PAS search for Roman hoards (JSON API)
library(jsonlite)
url <- 'https://finds.org.uk/database/search/results/q/hoard/broadperiod/ROMAN/format/json'
json <- fromJSON(url)
# Work out how many pages of results the search returns
total <- json$meta$totalResults
results <- json$meta$resultsPerPage
pagination <- ceiling(total/results)
# Keep only the columns needed for terminal dates
keeps <- c("id", "old_findID", "fromdate", "todate")
data <- json$results
data <- data[, (names(data) %in% keeps)]
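The pagination arithmetic in the script (ceiling of total results over results per page) can be sketched as follows. The counts and the `/page/N` URL scheme here are hypothetical stand-ins, not values from the live PAS API:

```python
import math

# Hypothetical counts standing in for json$meta$totalResults
# and json$meta$resultsPerPage
total_results = 1234
results_per_page = 20

# Same calculation as ceiling(total/results) in the R script
pages = math.ceil(total_results / results_per_page)

# Build one URL per page; the '/page/N' suffix is an assumption
base = ("https://finds.org.uk/database/search/results/"
        "q/hoard/broadperiod/ROMAN/format/json")
page_urls = [f"{base}/page/{p}" for p in range(1, pages + 1)]

print(pages)  # 62
```

Looping over `page_urls` with `fromJSON` (or `requests` in Python) and binding the rows would complete the harvest the script begins.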