@robinkraft
robinkraft / projected_area.py
Created December 12, 2014 00:37
get the area in square meters of a polygon using shapely and pyproj
import pyproj
from shapely.geometry import shape
from shapely.ops import transform
geom = {'type': 'Polygon',
        'coordinates': [[[-122., 37.], [-125., 37.],
                         [-125., 38.], [-122., 38.],
                         [-122., 37.]]]}
s = shape(geom)
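The preview ends before the projection step, so here is a minimal sketch of how the area calculation can be finished with the pyproj 1.x-style API that matches the gist's 2014 vintage: reproject to an Albers equal-area CRS fitted to the polygon's bounds, then read off the projected area. The projection choice and parameters are assumptions, not taken from the truncated code.

# Hedged completion (not shown in the preview): project to an equal-area
# CRS and read the area off the projected geometry, in square meters.
from functools import partial

project = partial(
    pyproj.transform,
    pyproj.Proj(init='epsg:4326'),                       # source: lon/lat
    pyproj.Proj(proj='aea',                              # Albers equal-area
                lat_1=s.bounds[1], lat_2=s.bounds[3]))   # fit to the polygon

s_projected = transform(project, s)
print(s_projected.area)  # square meters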
@robinkraft
robinkraft / osgeo_compile.sh
Created June 24, 2014 01:41
compile and install GEOS, PROJ.4 and GDAL from source on Ubuntu 12.04
# Compile/install GEOS. Taken from:
# http://grasswiki.osgeo.org/wiki/Compile_and_Install_Ubuntu#GEOS_2
cd /tmp
wget http://download.osgeo.org/geos/geos-3.4.2.tar.bz2
bunzip2 geos-3.4.2.tar.bz2
tar xvf geos-3.4.2.tar
cd geos-3.4.2
minikube stop; minikube delete &&
docker stop $(docker ps -aq) &&
rm -rf ~/.kube ~/.minikube &&
sudo rm -rf /usr/local/bin/localkube /usr/local/bin/minikube &&
launchctl stop '*kubelet*.mount' &&
launchctl stop localkube.service &&
launchctl disable localkube.service &&
sudo rm -rf /etc/kubernetes/ &&
docker system prune -af --volumes
@robinkraft
robinkraft / s3bucketsize.py
Last active October 24, 2021 11:43
Simple python script to calculate size of S3 buckets
import sys
import boto
# based on http://www.quora.com/Amazon-S3/What-is-the-fastest-way-to-measure-the-total-size-of-an-S3-bucket
# assumes you've already configured your access id & secret key
s3 = boto.connect_s3()
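The preview stops at the connection. A minimal sketch of how the total might be computed with the boto 2 API, assuming the bucket name arrives as the first command-line argument (that argument handling is an assumption, not taken from the gist):

# Hedged sketch, not the gist's exact code: sum the size of every key in
# the named bucket (iterating a boto 2 Bucket yields its keys).
bucket_name = sys.argv[1]          # assumed CLI argument
bucket = s3.lookup(bucket_name)

total_bytes = 0
for key in bucket:
    total_bytes += key.size

print("%s: %.2f GB" % (bucket_name, total_bytes / 1024.0 ** 3))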
@robinkraft
robinkraft / gist:4d1807fb8f9c246b2d21
Last active January 16, 2021 17:42
installing opencv 2.4.10
# install dependencies
sudo apt-get update
sudo apt-get install -y build-essential
sudo apt-get install -y cmake
sudo apt-get install -y libgtk2.0-dev
sudo apt-get install -y pkg-config
sudo apt-get install -y python-numpy python-dev
sudo apt-get install -y libavcodec-dev libavformat-dev libswscale-dev
sudo apt-get install -y libjpeg-dev libpng-dev libtiff-dev libjasper-dev
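The preview covers only the build dependencies; the rest of the gist presumably downloads and compiles OpenCV itself. Once that build and install finish, a quick sanity check of the Python bindings might look like this (the check is an addition, not part of the gist):

# Sanity check (not from the gist): confirm the cv2 bindings import and
# report the OpenCV version they were built from.
import cv2
print(cv2.__version__)  # expected to read 2.4.10 after the build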
@robinkraft
robinkraft / gist:1413347
Created December 1, 2011 03:41
the easy way to install GDAL 1.8.0 on Ubuntu
sudo add-apt-repository ppa:ubuntugis/ppa
sudo apt-get update
sudo apt-get install gdal-bin
sudo apt-get -y install python-gdal
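A quick way to confirm the python-gdal package from the PPA is importable and reports the expected version (a sanity check added here, not part of the original gist):

# Sanity check (not from the gist): import the GDAL bindings installed by
# python-gdal and print the library's release name.
from osgeo import gdal
print(gdal.VersionInfo("RELEASE_NAME"))  # e.g. "1.8.0"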
@robinkraft
robinkraft / download.py
Last active April 20, 2020 19:41
Unofficial client for the Planet.com satellite image catalog and API, originally part of Planet's quickstart guides. https://developers.planet.com/docs/quickstart/
#!/usr/bin/env python
import argparse
import os
import requests
import json
import sys
import logging
import datetime
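Only the imports survive in this preview. As a rough illustration of the kind of call such a client makes, here is a hedged sketch of a search request against what I assume is the Planet Data API quick-search endpoint, authenticated with an API key as the basic-auth username; the endpoint, item type, and payload shape are assumptions, not taken from download.py.

# Hedged sketch, not the gist's code: endpoint, item type and filter shape
# are assumptions about the Planet Data API rather than copied from the gist.
API_KEY = os.environ.get('PL_API_KEY')
SEARCH_URL = 'https://api.planet.com/data/v1/quick-search'   # assumed endpoint

payload = {
    "item_types": ["PSScene4Band"],                # assumed item type
    "filter": {"type": "DateRangeFilter",
               "field_name": "acquired",
               "config": {"gte": "2017-01-01T00:00:00Z"}},
}

resp = requests.post(SEARCH_URL, auth=(API_KEY, ''), json=payload)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    logging.info("found item %s", feature["id"])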
@robinkraft
robinkraft / load_hdf.py
Created February 10, 2012 23:48
Load hdf file with GDAL and Python, get NDVI
from osgeo import gdal
import os
layer_dict = {"reli":"reliability", "qual":"Quality", "ndvi":"NDVI", "evi":"EVI"}
def print_data(layer, data):
    print data
    print "layer:", layer
    print "type:", data.dtype
    print "mean:", data.mean()
@robinkraft
robinkraft / domino_event_log_5c777df946e0fb0008ada92c_condensed.json
Created February 28, 2019 21:17
cat ~/Downloads/domino_event_log_5c777df946e0fb0008ada92c.json | jq ".[].description"
"Request to deploy app with id 5c777df946e0fb0008ada92c"
"Deploying app with id 5c777df946e0fb0008ada92c"
"Deployment created: run-5c777df946e0fb0008ada92c"
"Pod created: run-5c777df946e0fb0008ada92c-6794d486b8-sxsp4"
"Scaled up replica set run-5c777df946e0fb0008ada92c-6794d486b8 to 1"
"Successfully assigned run-5c777df946e0fb0008ada92c-6794d486b8-sxsp4 to ip-10-0-175-133.us-west-2.compute.internal"
"Pod updated: run-5c777df946e0fb0008ada92c-6794d486b8-sxsp4"
"Deployment updated: run-5c777df946e0fb0008ada92c"
"Deployment updated: run-5c777df946e0fb0008ada92c"
"Pod updated: run-5c777df946e0fb0008ada92c-6794d486b8-sxsp4"
@robinkraft
robinkraft / instructions.md
Last active January 16, 2019 17:25
Burn scar algorithm for Google Earth Engine, derived from Elvidge and Baugh (2014).

Set up for update

  1. Go to fires download page - https://firms.modaps.eosdis.nasa.gov/download/
  2. Zoom to Sumatra. It doesn't have to be perfect. We do some screening in EE to make sure no fires in Malaysia are used.
  3. Submit the download request (2013-03-30 to present) - csv is easiest - then wait for it to complete (usually < 30 minutes)
  4. Upload the CSV file to Fusion Tables: open Fusion Tables, click "new table" under the File menu, and follow the instructions.
  5. Get the docid from the URL. For example, from docid=1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh, grab 1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh.
  6. Update dates in this script when you run it on EE - the second date in the POST variable may need to be extended.
  7. Update the FIRES variable so it equals "ft:" followed by the docid from step 5 (see the sketch after this list). You'll see how it's set in the code.
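A minimal sketch of what step 7 amounts to in the Earth Engine Python API, assuming the script referenced the Fusion Table through the historical "ft:" + docid form; the exact variable usage in the EE script is an assumption, and Fusion Tables have since been retired, so this is illustrative only.

# Hedged sketch of step 7 (not the actual EE script): point FIRES at the
# uploaded Fusion Table via its docid. Fusion Tables are retired, so this
# only records the historical pattern.
import ee

ee.Initialize()

docid = "1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh"   # from step 5
FIRES = ee.FeatureCollection("ft:" + docid)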

Run + export