Sergio Lucero (sergiolucero)
💭 coding the days away
@sergiolucero
sergiolucero / pics_to_ppt.py
Last active August 11, 2021 17:55
PPT compiler
from pptx import Presentation
from pptx.util import Inches
import glob
SLIDE_LAYOUT_TITLE_AND_CONTENT = 1
FILTER = 'fotos/*.jpg' # what to include
prs = Presentation()
slide_layout = prs.slide_layouts[SLIDE_LAYOUT_TITLE_AND_CONTENT]
@sergiolucero
sergiolucero / isa.py
Last active November 24, 2020 03:40
here isolines
from creds import ID,CODE # isolines: driving from my house
import folium
import requests
home =[-33.406654,-70.572701] # 085 LOS MILITARES / ALONSO DE CORDOVA
head = 'https://isoline.route.cit.api.here.com/routing/7.2/calculateisoline.json?'
URL_BASE = '{}app_id={}&app_code={}&mode=shortest;car;traffic:disabled&start=geo!{},{}&range={}&rangetype={}'
def isodata(home, range_m, range_type):  # range in seconds/metres; range_type is 'time' or 'distance'
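The preview cuts off inside the function body. A stdlib-only sketch of how the request URL can be assembled from the template above — the `build_isoline_url` name and the placeholder credentials are assumptions, and note that the legacy `isoline.route.cit.api.here.com` endpoint has since been retired by HERE:

```python
head = 'https://isoline.route.cit.api.here.com/routing/7.2/calculateisoline.json?'
URL_BASE = '{}app_id={}&app_code={}&mode=shortest;car;traffic:disabled&start=geo!{},{}&range={}&rangetype={}'

def build_isoline_url(app_id, app_code, lat, lon, range_value, range_type='time'):
    """Fill the template: range_type is 'time' (seconds) or 'distance' (metres)."""
    return URL_BASE.format(head, app_id, app_code, lat, lon, range_value, range_type)

# url = build_isoline_url(ID, CODE, -33.406654, -70.572701, 600, 'time')
# geojson = requests.get(url).json()
```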
@sergiolucero
sergiolucero / async_scraper.py
Last active October 6, 2019 17:15
async scraping
import aiohttp
import asyncio
import time, pandas as pd
def async_http_get(urls, extractor=None, json_response=True):
    tasks = []
    sem = asyncio.Semaphore(32)

    async def fetch(session, url):
        async with session.get(url) as response:
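The gist is truncated mid-`fetch`, but the concurrency pattern it uses — a semaphore capping simultaneous requests, with `asyncio.gather` collecting the results — can be sketched with the standard library alone. The stub `fake_fetch` below stands in for the real `aiohttp` call:

```python
import asyncio

async def gather_limited(urls, fetch, limit=32):
    """Run fetch(url) for every url, at most `limit` at a time."""
    sem = asyncio.Semaphore(limit)

    async def bounded(url):
        async with sem:
            return await fetch(url)

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(bounded(u) for u in urls))

# Stub standing in for an aiohttp session.get(...) call.
async def fake_fetch(url):
    await asyncio.sleep(0)  # yield to the event loop
    return f'body of {url}'

results = asyncio.run(gather_limited(['a', 'b', 'c'], fake_fetch))
```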
@jflasher
jflasher / details.md
Created February 23, 2018 14:56
Accessing data older than 90 days from OpenAQ

Currently, only data from the last 90 days is available via the OpenAQ API. However, much more data is available from OpenAQ through a variety of access mechanisms. Note also that work is under way to bring back to the API a mechanism for accessing data older than 90 days; details here.

If you're looking to query across all the data, or to export the data (or a subset of it), the easiest way to do that currently is to use a service like Amazon Athena. I'll provide directions below, but at a high level this lets you run any query you'd like (written in SQL) against the entire dataset. I'll also provide some sample queries so you can see what's possible.

On to the directions!

  1. You will need to create an AWS account if you don't currently have one, you can start this process at htt
@sergiolucero
sergiolucero / showtables.py
Last active January 27, 2018 16:55
show sqlite3 tables in python
import sqlite3
import pandas as pd
db = sqlite3.connect('database.db')
tables = pd.read_sql_query("SELECT * FROM sqlite_master WHERE type='table'", db)
print(tables)
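The same `sqlite_master` query works without pandas; a stdlib-only variant against an in-memory database (the `cars` table is just an illustrative example):

```python
import sqlite3

db = sqlite3.connect(':memory:')
db.execute('CREATE TABLE cars (id INTEGER PRIMARY KEY, model TEXT)')

# sqlite_master holds one row per schema object; filter on type='table'
rows = db.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
table_names = [name for (name,) in rows]
print(table_names)  # ['cars']
```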
@nygeog
nygeog / csv-to-shapefile-geopandas.py
Last active November 25, 2023 09:47
Read a CSV with Pandas and set as GeoDataFrame with geopandas and save as Shapefile with fiona
import pandas as pd
from geopandas import GeoDataFrame
from shapely.geometry import Point
import fiona
df = pd.read_csv('data.csv')
geometry = [Point(xy) for xy in zip(df.x, df.y)]
crs = 'EPSG:2263'  # NAD83 / New York Long Island; http://www.spatialreference.org/ref/epsg/2263/
geo_df = GeoDataFrame(df, crs=crs, geometry=geometry)
@sergiolucero
sergiolucero / slideshow.py
Last active January 31, 2019 16:30
bunch of pics to ppt
import glob, os, pptx, scipy.misc
from PIL import Image
size = 800, 600
prs = pptx.Presentation()
for fn in glob.glob('*.jpg'):
    img = Image.open(fn)
    img.thumbnail(size, Image.LANCZOS)  # Image.ANTIALIAS was removed in Pillow 10
    img.save(fn)  # overwrite the original with its thumbnail
@eskerda
eskerda / termbike.py
Last active November 29, 2017 18:38
# -*- coding: utf-8 -*-
"""
❯ termbike.py "36 Maple St, NY"
396 - Lefferts Pl & Franklin Ave
🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 🚲 ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮
365 - Fulton St & Grand Ave
🚲 🚲 🚲 🚲 🚲 ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮
436 - Hancock St & Bedford Ave
🚲 🚲 🚲 🚲 🚲 ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮ ☮
437 - Macon St & Nostrand Ave
@kinnala
kinnala / gist:6376196
Created August 29, 2013 09:50
Soccerway API scraping example
import urllib2
import json
import re
class SoccerwayTeamMatches:
    def __init__(self, teamId):
        self.teamId = str(teamId)
        self.data = {'all': [], 'home': [], 'away': []}
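This gist predates Python 3, where `urllib2` was renamed. A sketch of the same structure on Python 3 using only the standard library — the Soccerway endpoint itself is not shown in the preview, so `fetch` is left as a thin, generic wrapper:

```python
import json
import urllib.request

class SoccerwayTeamMatches:
    def __init__(self, team_id):
        self.team_id = str(team_id)
        self.data = {'all': [], 'home': [], 'away': []}

    def fetch(self, url):
        """Download a URL and decode its JSON body (urllib2 -> urllib.request)."""
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read().decode('utf-8'))
```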