@robinkraft
robinkraft / check-data.clj
Last active August 29, 2015 14:21
debug MODIS time series in FORMA processing
(use 'forma.hadoop.jobs.forma)
(in-ns 'forma.hadoop.jobs.forma)
(use 'forma.hadoop.pail)

(let [pail-path "s3n://pailbucket/all-master"
      out-path "s3n://pailbucket/output/run-2015-05-14/date-counts"
      s-res "500"
      t-res "16"
      pail-src (split-chunk-tap pail-path ["ndvi" (format "%s-%s" s-res t-res)])
      sink (hfs-textline out-path :sinkmode :replace)]
import os
import sys

import requests

basefname = 'Hansen_GFC2013_lossyear_%d%s_%03d%s.tif'
baseurl = 'http://commondatastorage.googleapis.com/earthenginepartners-hansen/GFC2013/%s'

def download(url, output):
    # stream the GeoTIFF to disk in 1 MB chunks rather than buffering it in memory
    r = requests.get(url, stream=True)
    r.raise_for_status()
    with open(output, 'wb') as f:
        for chunk in r.iter_content(chunk_size=1024 * 1024):
            f.write(chunk)
@robinkraft
robinkraft / gist:40f0d9dc13db2b5eb458
Created January 29, 2015 00:19
CLI commands to convert FORMA geotiff to text, with thresholding
gdal_translate -of XYZ ~/Downloads/forma-30104_2005-04-23-0000000000-0000000000.tif /tmp/forma.txt
grep -v "nan" /tmp/forma.txt | awk '$3 > 0.5' > /tmp/forma_hits.txt
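The same filtering can be done in Python, e.g. if `gdal_translate`'s XYZ output is already on disk. A minimal sketch mirroring the grep/awk pipeline above (`threshold_xyz` is a hypothetical helper name, and the paths are the same illustrative `/tmp` paths):

```python
def threshold_xyz(in_path, out_path, cutoff=0.5):
    """Keep XYZ rows whose value column exceeds cutoff, dropping 'nan' rows.

    Mirrors: grep -v "nan" in_path | awk '$3 > cutoff' > out_path
    """
    with open(in_path) as src, open(out_path, 'w') as dst:
        for line in src:
            if 'nan' in line:
                continue
            x, y, value = line.split()
            if float(value) > cutoff:
                dst.write(line)
```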
@robinkraft
robinkraft / projected_area.py
Created December 12, 2014 00:37
get the area in square meters of a polygon using shapely and pyproj
import pyproj
from shapely.geometry import shape
from shapely.ops import transform

geom = {'type': 'Polygon',
        'coordinates': [[[-122., 37.], [-125., 37.],
                         [-125., 38.], [-122., 38.],
                         [-122., 37.]]]}

s = shape(geom)

# project lon/lat (WGS84) to an equal-area CRS before measuring,
# otherwise .area is in meaningless squared degrees
project = pyproj.Transformer.from_crs('EPSG:4326', 'EPSG:6933',
                                      always_xy=True).transform
print(transform(project, s).area)  # area in square meters
# A mill with less than 5% UMD loss from 2005-2012 is low risk, 5-28% is medium risk, and over 28% is high risk.
# One within concessions and one within entire radius
# Use the area of the entire radius or concession as the denominator (effectively assume 100% forested)
import json

import requests  # sudo pip install requests

APIURL = "http://wri-01.cartodb.com/api/v2/sql"

def parse_concession_json(j):
    # the CartoDB SQL API returns result records under the 'rows' key
    return j['rows']
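The risk thresholds in the comments above can be sketched as a small classifier. `risk_category` is a hypothetical helper name; the percentages (<5% low, 5-28% medium, >28% high) come straight from the comments:

```python
def risk_category(loss_area, total_area):
    """Classify a mill's UMD loss 2005-2012 as low/medium/high risk.

    total_area is the entire radius or concession area, used as the
    denominator (effectively assuming 100% forested).
    """
    pct = 100.0 * loss_area / total_area
    if pct < 5:
        return 'low'
    elif pct <= 28:
        return 'medium'
    return 'high'
```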
@robinkraft
robinkraft / gist:03c28924c8203b6ce6dc
Last active August 29, 2015 14:07
Rainforest Connection looking for talent

SOFTWARE DEVELOPMENT

Rainforest Connection is a San Francisco-based nonprofit building open-source solutions to illegal logging and poaching. Our flagship project is the world’s first real-time forest monitoring system, built from recycled Android smartphones.

We have a number of pilot deployments beginning shortly. We’re looking to build an engineering team to join us and take our platform to the next level, across the stack.

Let us know what you can do and your requirements. We ourselves have been working on this project for 1.5 years without being paid, so we’re not afraid to ask others to dive in. If we’re successful, money is unlikely to be a problem. Our basic philosophy is that technology and innovation should be used to solve real world problems, rather than creating “shiny objects” or harnessing the best minds in the world to say ... democratize cab service across town.

We appreciate fast learners with a sense of humor.

@robinkraft
robinkraft / gist:05205edbd978de84873c
Last active August 29, 2015 14:06
revised GFW queries for UMD loss/gain data
# global sum of loss and gain by year
# note that the gain is the same each year (it is total gain / 12), so if you want total gain
# you need to add up all the gain values, or multiply one of them by 12
SELECT year,
       SUM(loss) AS loss,
       SUM(gain) AS gain
FROM umd_nat
WHERE thresh = 10
  AND year > 2000
GROUP BY year
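The gain arithmetic in the comment above can be sketched in Python. `totals` is a hypothetical helper and the row values are illustrative, not real UMD numbers:

```python
def totals(rows):
    """Aggregate yearly rows of {'year', 'loss', 'gain'} dicts.

    Loss is a true per-year value, so it sums directly. Gain is reported
    as total gain / 12 in every yearly row, so summing the 12 yearly
    values (or multiplying any one of them by 12) recovers total gain.
    """
    total_loss = sum(r['loss'] for r in rows)
    total_gain = rows[0]['gain'] * 12
    return total_loss, total_gain
```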
@robinkraft
robinkraft / cacher.py
Created August 5, 2014 17:14
precache images for crowdsourcing app
import sys
import requests
import multiprocessing

URL = 'http://forma-cs-validation.elasticbeanstalk.com/api/alerts/next/cache'

def f(a):
    try:
        r = requests.get(URL)
        return r.status_code
    except requests.RequestException:
        return None

if __name__ == '__main__':
    # hit the cache endpoint n times in parallel; pool size is illustrative
    multiprocessing.Pool(20).map(f, range(int(sys.argv[1])))
@robinkraft
robinkraft / instructions.md
Last active January 16, 2019 17:25
Burn scar algorithm for Google Earth Engine, derived from Elvidge and Baugh (2014).

Set up for update

  1. Go to the fires download page - https://firms.modaps.eosdis.nasa.gov/download/
  2. Zoom to Sumatra. It doesn't have to be perfect; we do some screening in EE to make sure no fires in Malaysia are used.
  3. Submit the download request (2013-03-30 to present) - csv is easiest - then wait for it to complete (usually < 30 minutes)
  4. Upload the CSV file to Fusion Tables: click "New table" under the File menu and follow the instructions.
  5. Get the docid from the url. For example, for docid=1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh just grab 1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh
  6. Update the dates in this script when you run it on EE - the second date in the POST variable may need to be extended.
  7. Set the FIRES variable to "ft:" followed by the docid. You'll see how it is in the code.
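
Steps 5 and 7 amount to pointing the script at the uploaded Fusion Table. A minimal sketch, using the example docid from step 5:

```python
docid = '1SzJl73nj5IPVEOGqhGc8uv5Vkwb504uqK_YTnVGh'  # from the Fusion Tables URL (step 5)
FIRES = 'ft:' + docid  # Fusion Table reference in Earth Engine's 'ft:' form (step 7)
```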

Run + export

@robinkraft
robinkraft / ee_poly_export.py
Last active August 29, 2015 14:03
Extracting polygons from GEE
import ee

# based on https://ee-api.appspot.com/cd2e21cd508327c5eb92ce57809f1360

def main():
    # speciesData = ee.FeatureCollection(
    #     'ft:14QQnwJVRPYXdhb4sRqaDblVRclbMxmew1rQCYig6')
    # collection = ee.ImageCollection(
    #     'MODIS/MOD13Q1').filterDate('2000-02-24', '2014-07-01')