import zipfile
import arcpy
import os
import glob
import subprocess
import shutil

arcpy.env.overwriteOutput = True

def unzipfiles(myzipfile):
    print("unzipping file")
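The preview above cuts off before the body of `unzipfiles`. A minimal standard-library sketch of what such a function typically does (the output directory name and return value are assumptions, not the original gist's code):

```python
import os
import zipfile

def unzipfiles(myzipfile, out_dir="unzipped"):
    """Extract every member of a zip archive into out_dir."""
    print("unzipping file")
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(myzipfile) as zf:
        zf.extractall(out_dir)
    # return the extracted file names for easy inspection
    return sorted(os.listdir(out_dir))
```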
@sgibbes
sgibbes / fails.geojson
Created August 21, 2018 13:00
Failed GLAD analysis geostores
@sgibbes
sgibbes / clip_landcover_tiles.py
Created August 20, 2018 20:28
Clip Indonesia and Southeast Asia landcover tiles to 10x10 deg size and unset nodata
from osgeo import gdal  # modern GDAL bindings live under the osgeo package
import subprocess

def clip_using_input_coords(original):
    file_to_clip = 'sea_land_cover_classified.tif'
    # get the coords of the file to mock
    src = gdal.Open(original)
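The clip step reads the reference raster's extent and converts it to a pixel window. A sketch of that coordinate arithmetic using GDAL's geotransform convention `(origin_x, pixel_width, 0, origin_y, 0, -pixel_height)` — no GDAL install needed, and the tile origin/resolution values below are illustrative assumptions:

```python
def coords_to_pixel_window(geotransform, ulx, uly, lrx, lry):
    """Convert a geographic bounding box to (xoff, yoff, xsize, ysize) pixels."""
    origin_x, px_w, _, origin_y, _, px_h = geotransform
    xoff = int((ulx - origin_x) / px_w)
    yoff = int((uly - origin_y) / px_h)   # px_h is negative (north-up raster)
    xsize = int((lrx - ulx) / px_w)
    ysize = int((lry - uly) / px_h)
    return xoff, yoff, xsize, ysize

# hypothetical 0.00025-degree raster with its origin at (95 E, 10 N),
# clipped to a 10x10 degree tile spanning 100-110 E, 5 N to 5 S
gt = (95.0, 0.00025, 0.0, 10.0, 0.0, -0.00025)
print(coords_to_pixel_window(gt, 100.0, 5.0, 110.0, -5.0))
# → (20000, 20000, 40000, 40000)
```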
@sgibbes
sgibbes / Querying_Country_Pages_Statistics_from_GFW_API.ipynb
Last active April 17, 2018 16:17
This is an example of how to return annual forest loss statistics within Indonesia by querying the GFW API. I'll show all possible boundaries to query by, plus an example of looking at forest loss within Indonesia's primary forest.
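The notebook itself isn't rendered here, but the general shape of such a query is a base URL plus country/threshold parameters. A sketch of assembling one; the endpoint and parameter names below are illustrative assumptions, not the documented GFW API contract:

```python
from urllib.parse import urlencode

def build_loss_query(base_url, iso, thresh=30):
    # NOTE: "iso" and "thresh" are assumed parameter names for illustration;
    # consult the GFW API docs for the real query contract
    params = {"iso": iso, "thresh": thresh}
    return base_url + "?" + urlencode(params)

# hypothetical endpoint path
url = build_loss_query("https://example-api/v1/umd-loss-gain", "IDN")
print(url)
# → https://example-api/v1/umd-loss-gain?iso=IDN&thresh=30
```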
from utilities import util, tile_geom_util
import json
import sys

from shapely.geometry import shape

# big aoi (note: immediately overwritten by the larger aoi below)
aoi = {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[25.576171875,7.841615185204699],[15.864257812499998,-1.098565496040652],[18.45703125,-9.44906182688142],[31.6845703125,-9.535748998133615],[35.859375,-0.4394488164139641],[25.576171875,7.841615185204699]]]}}]}
aoi = {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[14.501953124999998,5.703447982149503],[2.4609375,1.142502403706165],[17.841796875,-2.3723687086440504],[21.708984375,-12.297068292853805],[27.773437499999996,-0.4394488164139641],[38.056640625,-1.4939713066293112],[32.16796875,6.489983332670651],[35.947265625,13.068776734357694],[24.78515625,12.64033830684679],[20.126953125,18.562947442888312],[14.501953124999998,5.703447982149503]]]}}]}
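The snippet imports `shapely.geometry.shape`, presumably to work with the AOI's geometry and bounds before tiling it. The bounding-box step can be sketched in pure Python (the function name is mine, not from the gist):

```python
def feature_collection_bounds(fc):
    """Return (minx, miny, maxx, maxy) over all polygon rings in a
    GeoJSON FeatureCollection - the equivalent of shapely's .bounds."""
    xs, ys = [], []
    for feat in fc["features"]:
        for ring in feat["geometry"]["coordinates"]:
            for x, y in ring:
                xs.append(x)
                ys.append(y)
    return min(xs), min(ys), max(xs), max(ys)

# a simple 10x10 degree square AOI for illustration
fc = {"type": "FeatureCollection", "features": [{"type": "Feature",
      "properties": {}, "geometry": {"type": "Polygon",
      "coordinates": [[[0, 0], [10, 0], [10, 10], [0, 10], [0, 0]]]}}]}
print(feature_collection_bounds(fc))
# → (0, 0, 10, 10)
```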
@sgibbes
sgibbes / list_overlapping_tsvs.py
Created October 12, 2017 17:08
Find common tiles of TSV'd WDPA and IFL data, and download them to a local dir
from boto.s3.connection import S3Connection
import boto3
import os

# connect to the s3 bucket (legacy boto for listing, boto3 for downloads)
conn = S3Connection(host="s3.amazonaws.com")
bucket = conn.get_bucket('gfw2-data')
s3 = boto3.resource('s3')

# loop through file names in the bucket
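The "common tiles" part of the loop boils down to a set intersection on tile IDs derived from the object keys. A sketch of that core step, assuming (my assumption, not stated in the preview) that the tile ID is the filename without its extension:

```python
import os

def common_tiles(wdpa_keys, ifl_keys):
    """Return sorted tile IDs present in both S3 key listings."""
    def tile_ids(keys):
        return {os.path.splitext(os.path.basename(k))[0] for k in keys}
    return sorted(tile_ids(wdpa_keys) & tile_ids(ifl_keys))

wdpa = ["wdpa/10N_010E.tsv", "wdpa/10N_020E.tsv", "wdpa/00N_010E.tsv"]
ifl = ["ifl/10N_010E.tsv", "ifl/00N_010E.tsv", "ifl/20S_030E.tsv"]
print(common_tiles(wdpa, ifl))
# → ['00N_010E', '10N_010E']
```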
@sgibbes
sgibbes / combine.sh
Created April 5, 2017 14:32 — forked from mappingvermont/combine.sh
Combine all shapefiles in directory, selecting only features where GRID_ID (or DN) is = 1
#!/bin/bash
echo "here"

# seed an empty merge target in EPSG:4326, then walk every shapefile found
DATA=$(find . -name '*.shp')
ogr2ogr -a_srs EPSG:4326 merge.shp

for i in $DATA
do
    SHP=${i:2:100}        # strip the leading ./
    FINAL="${SHP/.shp/}"  # layer name without the .shp extension
    echo "$FINAL"
    echo "$i"
done
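The filename handling in that loop (find every `.shp`, strip the path prefix and extension to get a layer name) can be sketched in Python, the language used by the other gists on this page (the function name is mine):

```python
import glob
import os

def layer_names(directory):
    """Mirror of the bash loop: find *.shp files and return their
    base names with the .shp extension stripped."""
    names = []
    for shp in sorted(glob.glob(os.path.join(directory, "*.shp"))):
        names.append(os.path.splitext(os.path.basename(shp))[0])
    return names
```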