flibbertigibbet / google_viz.py
Last active December 17, 2015 16:19
Helper functions to use the Google Visualization API in Django. The module stores the queries to use and creates a JSON representation of a Google Visualization DataTable for each by inspecting the query's model. Looks up any choices defined for the columns.
import gviz_api
from django.db.models.query import ValuesQuerySet
# Mapping of model field types to Google Visualization column types (any field
# type not in this map is treated as 'string'). Valid Google Viz data types are:
# 'string', 'number', 'boolean', 'date', 'datetime', and 'timeofday'.
fieldmap = {'DateField': 'date', 'DateTimeField': 'datetime', 'BooleanField': 'boolean',
            'IntegerField': 'number', 'DecimalField': 'number', 'BigIntegerField': 'number',
            'FloatField': 'number', 'TimeField': 'timeofday', 'NullBooleanField': 'boolean'}
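
The fieldmap above suggests the rest of the module's approach: inspect a query's model, map each field to a Google Visualization column type, and let gviz_api serialize the rows. A minimal sketch of that idea, reusing fieldmap from the snippet above; the helper name and the use of values_list() are assumptions, not the gist's actual code.

def queryset_to_json(queryset):
    # Sketch: turn a Django queryset into a Google Visualization JSON string.
    fields = queryset.model._meta.fields
    # Build (column id, gviz type, label) tuples, defaulting to 'string'.
    description = [(f.name, fieldmap.get(f.get_internal_type(), 'string'), str(f.verbose_name))
                   for f in fields]
    data_table = gviz_api.DataTable(description)
    # values_list() yields value tuples in the same order as the description columns.
    data_table.LoadData(queryset.values_list(*[f.name for f in fields]))
    return data_table.ToJSon()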
flibbertigibbet / query_places.py
Created December 14, 2013 06:03
Finds the centroids of 50km hexagons to tile a bounding box and writes the coordinates to a file. Then, creates a bash script to query Google Places for each point and write the responses to files. (There's some overlap between the queried areas, but it should be minimal.) Here, it's searching for grocery stores in New Jersey.
#!/usr/bin/env python
import csv
from math import cos, pi
# API key for Google Places
api_key = 'YOUR_KEY_GOES_HERE'
outf = open('njpoints.csv', 'w')
w = csv.writer(outf)
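
A sketch of the hexagon-tiling step the description refers to, continuing from the csv writer above. The spacing assumes hexagons with a 50 km circumradius laid out in offset rows, and the New Jersey bounding box values are approximate placeholders.

from math import cos, pi, sqrt

R_KM = 50.0
KM_PER_DEG_LAT = 111.0  # rough conversion from kilometers to degrees of latitude

def hex_centroids(min_lat, min_lon, max_lat, max_lon):
    # Hexagon centers sit on rows spaced 1.5*R apart; every other row is
    # shifted by half of the horizontal spacing (sqrt(3)*R).
    dlat = 1.5 * R_KM / KM_PER_DEG_LAT
    lat = min_lat
    row = 0
    while lat <= max_lat:
        # longitude degrees shrink with latitude, so rescale per row
        km_per_deg_lon = KM_PER_DEG_LAT * cos(lat * pi / 180.0)
        dlon = sqrt(3) * R_KM / km_per_deg_lon
        lon = min_lon + (dlon / 2.0 if row % 2 else 0.0)
        while lon <= max_lon:
            yield (lat, lon)
            lon += dlon
        lat += dlat
        row += 1

# Approximate New Jersey bounding box, for illustration only
for point in hex_centroids(38.9, -75.6, 41.4, -73.9):
    w.writerow(point)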
flibbertigibbet / FeedFetcher.py
Last active June 14, 2017 02:25
Checks for new GTFS feeds, then downloads and validates them. Fetches feeds for SEPTA, NYC MTA, NJ TRANSIT, CTTRANSIT, Delaware First State, NJ/NY PATH, and PATCO. Dependencies: Python requests, BeautifulSoup, git, and Google's transit feed validator. NJ TRANSIT developer login credentials required to download from that agency. Cannot check for …
#!/usr/bin/python
import requests, os, pickle, datetime, zipfile, subprocess, csv
from bs4 import BeautifulSoup
class FeedFetcher():
def __init__(self, ddir=os.getcwd(), get_nj=True, nj_username='', nj_pass=''):
self.ddir = ddir
self.get_nj = get_nj # whether to fetch from NJ TRANSIT or not
self.tc = {} # time checks for GTFS fetches
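
A sketch of the per-feed flow the description implies: download a feed, confirm it is a readable zip, then run Google's transit feed validator on it. The function name, URL handling, and validator invocation are assumptions; the exact feedvalidator flags may differ.

import requests, zipfile, subprocess

def fetch_and_validate(url, out_path):
    r = requests.get(url, stream=True)
    r.raise_for_status()
    with open(out_path, 'wb') as f:
        for chunk in r.iter_content(chunk_size=8192):
            f.write(chunk)
    if not zipfile.is_zipfile(out_path):
        return False  # download is not a usable GTFS zip
    # feedvalidator.py ships with Google's transitfeed package
    result = subprocess.call(['feedvalidator.py', out_path])
    return result == 0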
flibbertigibbet / ned_fetcher.py
Last active January 1, 2016 04:29
Downloads GFloat National Elevation Dataset data from USGS, converts to GTiff, then merges to a single tif. Expects a file 'ned_links.html' in the same directory that has the HTML contents of the email from USGS containing the download links for the GFloat zip files. Depends on BeautifulSoup and gdal; expects to be on a unix-y system.
#!/usr/bin/env python
from bs4 import BeautifulSoup
import requests, os, subprocess
from time import sleep
download_dir = os.getcwd()
soup = BeautifulSoup(open('ned_links.html','rb'))
of = open('just_ned_links.txt', 'wb')
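
A sketch of the convert-and-merge step named in the description, reusing download_dir from the snippet above: convert each unzipped GridFloat (.flt) tile to GeoTIFF with gdal_translate, then combine the tiles with gdal_merge.py. File names and output path are placeholders.

import glob
import subprocess

tifs = []
for flt in glob.glob(os.path.join(download_dir, '*.flt')):
    tif = flt.replace('.flt', '.tif')
    subprocess.check_call(['gdal_translate', '-of', 'GTiff', flt, tif])
    tifs.append(tif)

# merge all converted tiles into a single GeoTIFF
subprocess.check_call(['gdal_merge.py', '-o', os.path.join(download_dir, 'ned_merged.tif')] + tifs)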
flibbertigibbet / census_fetcher.py
Created December 23, 2013 06:20
Download TIGER/Line data from the Census FTP server based on state FIPS code. Here, it pulls EDGES data for four states.
#!/usr/bin/env python
from os import walk
from ftplib import FTP
from time import sleep
# 10 - DE
# 34 - NJ
# 36 - NY
# 42 - PA
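
A sketch of the fetch loop for the FIPS codes listed above. The Census FTP host and TIGER directory layout (ftp2.census.gov, geo/tiger/TIGER2013/EDGES) are assumptions; adjust the year and paths as needed.

from ftplib import FTP
from time import sleep

STATE_FIPS = ['10', '34', '36', '42']  # DE, NJ, NY, PA

ftp = FTP('ftp2.census.gov')
ftp.login()  # anonymous login
ftp.cwd('geo/tiger/TIGER2013/EDGES')
for name in ftp.nlst():
    # EDGES files are named tl_2013_<county FIPS>_edges.zip; match on the state prefix
    if any(name.startswith('tl_2013_' + fips) for fips in STATE_FIPS):
        with open(name, 'wb') as f:
            ftp.retrbinary('RETR ' + name, f.write)
        sleep(1)  # be polite to the server
ftp.quit()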
flibbertigibbet / recurse_ftp.py
Last active June 23, 2020 20:22
Recursively fetch files from an FTP server directory. Here, it's downloading all the zip files found in or beneath the parent directory.
#!/usr/bin/env python
from ftplib import FTP
from time import sleep
import os
my_dirs = [] # global
my_files = [] # global
curdir = '' # global
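
A sketch of the recursive walk: try to cwd() into each entry to decide whether it is a directory, download anything ending in .zip, and recurse into subdirectories. The host and start path are placeholders.

from ftplib import FTP, error_perm
import os

def fetch_zips(ftp, remote_dir, local_dir):
    ftp.cwd(remote_dir)
    if not os.path.isdir(local_dir):
        os.makedirs(local_dir)
    for name in ftp.nlst():
        try:
            ftp.cwd(name)        # succeeds only for directories
            ftp.cwd('..')
            fetch_zips(ftp, remote_dir + '/' + name, os.path.join(local_dir, name))
            ftp.cwd(remote_dir)  # restore position after recursing
        except error_perm:
            # not a directory; download it if it is a zip file
            if name.lower().endswith('.zip'):
                with open(os.path.join(local_dir, name), 'wb') as f:
                    ftp.retrbinary('RETR ' + name, f.write)

ftp = FTP('ftp.example.com')
ftp.login()
fetch_zips(ftp, '/pub', 'downloads')
ftp.quit()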
#!/bin/sh
#
# update_storyboard_strings.sh - automatically extract translatable strings from storyboards and update strings files
# Based on http://forums.macrumors.com/showpost.php?p=16060008&postcount=4 by mikezang
storyboardExt=".storyboard"
stringsExt=".strings"
newStringsExt=".strings.new"
oldStringsExt=".strings.old"
localeDirExt=".lproj"
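
For reference, a Python sketch of the per-storyboard step the shell script performs: run ibtool to extract translatable strings into a .strings file, then replace the existing file only when the extracted strings changed. The Base.lproj/en.lproj paths are assumptions.

import filecmp
import glob
import os
import subprocess

for storyboard in glob.glob('Base.lproj/*.storyboard'):
    new_strings = os.path.basename(storyboard).replace('.storyboard', '.strings.new')
    target = os.path.join('en.lproj', new_strings.replace('.strings.new', '.strings'))
    # ibtool writes a UTF-16 .strings file with one entry per localizable string
    subprocess.check_call(['ibtool', '--generate-strings-file', new_strings, storyboard])
    if not os.path.exists(target) or not filecmp.cmp(new_strings, target, shallow=False):
        os.rename(new_strings, target)
    else:
        os.remove(new_strings)  # nothing changed; discard the new extraction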
flibbertigibbet / cyclephilly_geojson_generator.py
Created June 12, 2014 23:14
Build GeoJSON of CyclePhilly tracks from MySQL database
#!/usr/bin/env python
import pymysql
import geojson
from geojson import FeatureCollection, Feature, LineString, Point
dt_fmt = '%Y-%m-%d %H:%M'
db = pymysql.connect(host='', user='', passwd='', db='')
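
A sketch of the GeoJSON-building step, continuing from the connection above. The table and column names (coord, trip_id, longitude, latitude, recorded) are assumptions, not the gist's actual schema.

cursor = db.cursor()
cursor.execute(
    "SELECT trip_id, longitude, latitude FROM coord ORDER BY trip_id, recorded")

tracks = {}
for trip_id, lon, lat in cursor.fetchall():
    # GeoJSON positions are (longitude, latitude)
    tracks.setdefault(trip_id, []).append((float(lon), float(lat)))

features = [Feature(geometry=LineString(coords), properties={'trip': trip_id})
            for trip_id, coords in tracks.items() if len(coords) > 1]

with open('cyclephilly_tracks.geojson', 'w') as f:
    f.write(geojson.dumps(FeatureCollection(features)))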
flibbertigibbet / FeedFetcherMA.py
Last active August 29, 2015 14:10
Fetch updated GTFS feeds for Massachusetts
#!/usr/bin/python
import requests, os, pickle, datetime, zipfile, subprocess, csv
class FeedFetcher():
def __init__(self, ddir=os.getcwd()):
self.ddir = ddir
self.tc = {} # time checks for GTFS fetches
self.new_use = [] # list of new feeds successfully downloaded and validated
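
A sketch of how the "time checks" dict might be used: send a conditional request so a feed is only downloaded when the server reports it changed since the last fetch. Here time_checks plays the role of self.tc; the header handling is an assumption about intent, not the gist's code.

import requests

def fetch_if_updated(time_checks, agency, url, out_path):
    headers = {}
    if agency in time_checks:
        headers['If-Modified-Since'] = time_checks[agency]
    r = requests.get(url, headers=headers)
    if r.status_code == 304:
        return False  # feed unchanged since the last check
    r.raise_for_status()
    with open(out_path, 'wb') as f:
        f.write(r.content)
    # remember the server's Last-Modified timestamp for the next run
    if 'Last-Modified' in r.headers:
        time_checks[agency] = r.headers['Last-Modified']
    return True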
#!/usr/bin/env python
""" Parse JSON real-time bus data from SEPTA:
http://www3.septa.org/hackathon/
"""
from datetime import datetime, timedelta
import json
import requests
import sys
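
A sketch of querying one of the real-time endpoints under the hackathon URL above. The TransitView path and the shape of its response (a 'bus' list with 'lat', 'lng', and 'late' fields) are assumptions based on the linked docs, not taken from the gist itself.

def get_route_vehicles(route):
    # fetch the current vehicle positions for one bus route
    url = 'http://www3.septa.org/hackathon/TransitView/' + str(route)
    r = requests.get(url)
    r.raise_for_status()
    return r.json().get('bus', [])

if __name__ == '__main__':
    route = sys.argv[1] if len(sys.argv) > 1 else '21'
    for bus in get_route_vehicles(route):
        print('vehicle at (%s, %s), %s minutes late' % (
            bus.get('lat'), bus.get('lng'), bus.get('late')))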