
View cleanup.js
"use strict";
var AWS = require("aws-sdk");
var ec2 = new AWS.EC2();
var securityGroup = "sg-XXX";
exports.handler = (event, context, callback) => {
ec2.describeSecurityGroups({ GroupIds: [securityGroup] }, function (err, data
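The preview cuts off mid-callback, so the handler's cleanup criterion isn't visible. As a hedged illustration of the response shape that `data` receives, here is a dependency-free Python sketch that walks a DescribeSecurityGroups payload and collects ingress CIDRs per group; the sample response is invented for demonstration.

```python
def ingress_cidrs(describe_response):
    """Map group id -> list of ingress CIDRs from a DescribeSecurityGroups payload."""
    result = {}
    for group in describe_response.get("SecurityGroups", []):
        cidrs = []
        for perm in group.get("IpPermissions", []):
            for ip_range in perm.get("IpRanges", []):
                cidrs.append(ip_range["CidrIp"])
        result[group["GroupId"]] = cidrs
    return result

# Invented sample payload, shaped like a real DescribeSecurityGroups response.
sample = {
    "SecurityGroups": [
        {
            "GroupId": "sg-XXX",
            "IpPermissions": [
                {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
                 "IpRanges": [{"CidrIp": "203.0.113.0/24"}]},
            ],
        }
    ]
}

print(ingress_cidrs(sample))  # {'sg-XXX': ['203.0.113.0/24']}
```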
View worldclim_to_cog.md

http://www.worldclim.org/current

wget http://biogeo.ucdavis.edu/data/climate/worldclim/1_4/grid/cur/tmin_30s_bil.zip
wget http://biogeo.ucdavis.edu/data/climate/worldclim/1_4/grid/cur/tmax_30s_bil.zip
wget http://biogeo.ucdavis.edu/data/climate/worldclim/1_4/grid/cur/tmean_30s_bil.zip
wget http://biogeo.ucdavis.edu/data/climate/worldclim/1_4/grid/cur/prec_30s_bil.zip

unzip tmin_30s_bil.zip
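The four wget commands above follow a single URL template that varies only the climate variable. A small Python sketch that generates those URLs (the base path, variable names, and `30s` resolution are taken directly from the commands above):

```python
# WorldClim 1.4 current-climate download URLs, per the wget commands above.
BASE = "http://biogeo.ucdavis.edu/data/climate/worldclim/1_4/grid/cur"
VARIABLES = ["tmin", "tmax", "tmean", "prec"]

def worldclim_urls(variables, resolution="30s"):
    """Build one download URL per climate variable."""
    return ["{}/{}_{}_bil.zip".format(BASE, var, resolution) for var in variables]

for url in worldclim_urls(VARIABLES):
    print(url)
```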
View nlcd_to_cog.md

NLCD data is published in 3x3 degree chunks, as TIFF files inside zip files. For processing, it's far more convenient if the files are available directly as TIFFs, without having to unzip each archive.

s3cmd get --skip-existing -r s3://prd-tnm/StagedProducts/NLCD/data/2011/landcover/3x3/
for i in *.zip; do unzip "$i" '*.tif'; done
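The shell loop above extracts only the `.tif` members from each archive. An equivalent sketch in Python using the standard `zipfile` module, with a throwaway in-memory zip (the member names are invented) to make the demo self-contained:

```python
import io
import zipfile

def tif_members(zip_bytes):
    """Return the names of .tif members in a zip, mirroring `unzip "$i" '*.tif'`."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return [name for name in zf.namelist() if name.endswith(".tif")]

# Build a throwaway zip in memory to demonstrate (member names are made up).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("NLCD2011_LC_N36W081.tif", b"fake raster bytes")
    zf.writestr("metadata.xml", b"<xml/>")

print(tif_members(buf.getvalue()))  # ['NLCD2011_LC_N36W081.tif']
```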

mkdir tmp
mkdir clouded

for i in *.tif; do
View download_tiles.py
#!/usr/bin/env python3
import logging
from optparse import OptionParser
import os
import mercantile
from urllib.parse import urlparse
import requests
from boto.s3.connection import S3Connection
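The imports suggest this script enumerates map tiles with mercantile and fetches them over HTTP. The preview stops before the logic, so as a dependency-free sketch, here is the standard slippy-map (XYZ) tile formula — the same math `mercantile.tile` implements:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Standard XYZ tile math: which tile contains this lon/lat at this zoom."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Tile containing downtown San Francisco at zoom 12.
print(lonlat_to_tile(-122.42, 37.77, 12))  # (655, 1583)
```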
View Trailhead research.txt
Executive summary:
There are 1261 points with a name ending in "Trailhead". A decent percentage of them have no tags other than name.
There are 544 points tagged highway=trailhead. A number of them are in the Philippines.
There are 821 polygons with a name ending in "Trailhead". They are almost all tagged as parking lots.
There are 64 polygons tagged highway=trailhead. They are all tagged amenity=parking.
osm_rendering=> select name, osm_id, "natural", "amenity", "highway", "tourism", leisure, tags from planet_osm_point where name ilike '% Trailhead';
name | osm_id | natural | amenity | highway | tourism | leisure | tags
----------------------------------------------------------------------+------------+---------+-----------------+-------------------+-------------+----------------+----------------
View test-mapnik-sql.py
#!/usr/bin/env python3
import logging
from optparse import OptionParser
import os
import sys
import mercantile
def _main():
usage = "usage: %prog"
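The preview ends at the usage string. A minimal `OptionParser` skeleton consistent with the imports above; the `--verbose` flag wired to `logging` is a hypothetical example, not taken from the original script:

```python
import logging
from optparse import OptionParser

def build_parser():
    # Skeleton matching the truncated _main() above; the --verbose flag
    # is hypothetical, added only to show the usual optparse pattern.
    usage = "usage: %prog"
    parser = OptionParser(usage=usage)
    parser.add_option("-v", "--verbose", action="store_true", default=False,
                      help="enable debug logging")
    return parser

options, args = build_parser().parse_args(["--verbose"])
logging.basicConfig(level=logging.DEBUG if options.verbose else logging.INFO)
print(options.verbose)  # True
```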
View README.md

Deploys updates to AWS ECS services based on GitHub commits. Posts deploy notifications to Slack.

Assumptions:

  • Task definitions map 1:1 with services, and they have the same name.
  • Code is stored on GitHub.
  • You want to deploy the latest commit in a branch.
  • Docker images are tagged with the commit SHA.
  • Docker images are stored in AWS ECR.
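Under these assumptions (repository named after the service, image tagged with the commit SHA), the ECR image reference for a deploy reduces to simple string construction. A hedged sketch — the account ID, region, service name, and SHA below are placeholder values:

```python
def ecr_image_uri(account_id, region, service, commit_sha):
    """Build the ECR image URI for a deploy, assuming the repository is
    named after the service and images are tagged with the commit SHA."""
    return "{}.dkr.ecr.{}.amazonaws.com/{}:{}".format(
        account_id, region, service, commit_sha)

# Placeholder values for illustration.
print(ecr_image_uri("123456789012", "us-east-1", "api", "abc1234"))
# 123456789012.dkr.ecr.us-east-1.amazonaws.com/api:abc1234
```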
View upload_tiles.py
#!/usr/bin/env python3
import os
from urllib.parse import urlparse
from multiprocessing.pool import ThreadPool
from functools import partial
import gzip
from boto.s3.connection import S3Connection
from boto.s3.key import Key
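The `gzip` import suggests tile bodies are compressed before upload (the usual pattern being to store them gzipped and serve with `Content-Encoding: gzip`). A minimal sketch of that compression step, with an invented tile body, omitting the boto upload itself:

```python
import gzip

def compress_tile(tile_bytes):
    """Gzip a tile body before upload, as the script's gzip import suggests."""
    return gzip.compress(tile_bytes)

# Invented GeoJSON-ish tile body for demonstration.
tile = b'{"type": "FeatureCollection", "features": []}'
blob = compress_tile(tile)
assert gzip.decompress(blob) == tile  # round-trips losslessly
print(len(tile), "->", len(blob), "bytes")
```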
View valhalla diary.md

Ultimate Goal: Valhalla running on AWS ECS, behind a load balancer, in multiple regions, with auto scaling, and with data updating once a week.

Diary

First goal: Build with docker, then cut tiles from a small extract and get server to run locally.

jesse:projects/ $ mkdir ~/valhalla-data
jesse:projects/ $ cd ~/valhalla-data
jesse:projects/ $ wget http://download.geofabrik.de/north-america/us/california-latest.osm.pbf
[output abbreviated]
jesse:projects/ $ cd ~/projects
View split.py
#!/usr/bin/env python
import logging
from optparse import OptionParser
import os
import gpxpy
import gpxpy.gpx
def split_gpx(source, dest_dir, max_segment_points=500):
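The preview ends at the signature, which suggests `split_gpx` caps each output segment at `max_segment_points`. A dependency-free sketch of that chunking logic, leaving out the gpxpy file parsing and writing:

```python
def chunk_points(points, max_segment_points=500):
    """Split a list of track points into chunks of at most max_segment_points
    each — the core of what split_gpx's signature suggests, minus gpxpy I/O."""
    return [points[i:i + max_segment_points]
            for i in range(0, len(points), max_segment_points)]

# 1203 points split into segments of at most 500.
chunks = chunk_points(list(range(1203)), 500)
print([len(c) for c in chunks])  # [500, 500, 203]
```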