Mark Biggers (biggers), GitHub gist profile
#!/usr/bin/env python
'''Script to pull data out of GitHub and push into Elasticsearch'''
import os
import sys
import requests
import httplib
import json # NOQA
from urlparse import urljoin
from uritemplate import expand
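
The preview above stops at the imports. As a rough sketch of the flow using requests alone (the repository, index name, and Elasticsearch URL below are placeholders, not taken from the gist), pulling issues and indexing them might look like:

import requests

GITHUB_API = "https://api.github.com"
ES_URL = "http://localhost:9200"      # assumed local Elasticsearch
REPO = "owner/repo"                   # hypothetical repository

def fetch_issues(repo):
    """Pull the open issues for a repository from the GitHub REST API."""
    resp = requests.get("{}/repos/{}/issues".format(GITHUB_API, repo))
    resp.raise_for_status()
    return resp.json()

def index_issue(issue, index="github-issues"):
    """Index one issue document into Elasticsearch, keyed by its id."""
    resp = requests.put("{}/{}/_doc/{}".format(ES_URL, index, issue["id"]),
                        json=issue)
    resp.raise_for_status()

if __name__ == "__main__":
    for issue in fetch_issues(REPO):
        index_issue(issue)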
@jianingy
jianingy / iodatagram.py
Last active November 15, 2017 12:11
UDP / raw socket for Python Tornado
#
# This piece of code is written by
# Jianing Yang <jianingy.yang@gmail.com>
# with love and passion!
#
# H A P P Y H A C K I N G !
# _____ ______
# ____==== ]OO|_n_n__][. | |
# [________]_|__|________)< |YANG|
# oo oo 'oo OOOO-| oo\\_ ~o~~o~
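
Only the banner comment appears in this preview. A minimal sketch of the idea in the title, watching a raw (or UDP) socket from the Tornado IOLoop (ICMP is chosen purely for illustration and needs root), could look like:

import socket
from tornado.ioloop import IOLoop

# Raw ICMP socket; requires root. Swap SOCK_RAW/IPPROTO_ICMP for
# SOCK_DGRAM to watch a plain UDP port instead.
sock = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_ICMP)
sock.setblocking(False)

def on_readable(fd, events):
    # Called by the IOLoop whenever the socket has data to read.
    packet, addr = sock.recvfrom(65535)
    print("%d bytes from %s" % (len(packet), addr[0]))

IOLoop.current().add_handler(sock.fileno(), on_readable, IOLoop.READ)
IOLoop.current().start()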
@biggers
biggers / Makefile.docker
Created March 22, 2016 15:46
GNU Makefile for docker-compose work
## -*- makefile -*-
## biggers@utsl.com, Mark Biggers
## GNU Makefile for docker-compose build & run of a Python or other Project
##
## REFs:
## https://docs.docker.com/compose/reference/
## https://github.com/docker/compose/releases
##
@davebshow
davebshow / a_aioclient.py
Last active July 24, 2017 01:28
This gist details some simple profiling that I did to compare a Tornado-based client implementation vs. an aiohttp-based implementation of gremlinclient. To do so, I simply dropped the aiohttp websocket client in place of the Tornado client in gremlinclient, as shown in the following file. Next I did a simple IPython %timeit, followe…
import asyncio
import collections
import json
import uuid
import aiohttp
Message = collections.namedtuple(
    "Message",
@mivade
mivade / coroutinify.py
Created August 3, 2015 17:53
Adapting blocking calls to Tornado coroutines with run_on_executor decorators
import random
import time
from tornado import gen
from tornado.concurrent import run_on_executor, futures
from tornado.ioloop import IOLoop
class TaskRunner(object):
    def __init__(self, loop=None):
        self.executor = futures.ThreadPoolExecutor(4)
        self.loop = loop or IOLoop.instance()
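    # The gist preview stops here; what follows is a minimal sketch of how
    # the class might continue (the sleep-based task is illustrative, not
    # from the gist): wrap a blocking call with run_on_executor so it runs
    # on the thread pool and hands a Future back to the caller.
    @run_on_executor
    def blocking_task(self, seconds):
        time.sleep(seconds)       # blocking work happens off the IO loop
        return seconds

# Hypothetical usage: yield the Future from a coroutine on the IO loop.
@gen.coroutine
def main():
    runner = TaskRunner()
    result = yield runner.blocking_task(random.random())
    print("slept for %s seconds" % result)

if __name__ == "__main__":
    IOLoop.instance().run_sync(main)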
@mivade
mivade / tornadobg.py
Last active March 17, 2020 02:07
Background tasks with tornado and concurrent.futures
"""A simple demonstration of running background tasks with Tornado.
Here I am using a basic TCP server which handles streams and keeps
them open while asynchronously performing a fake task in the
background. In order to test it, simply telnet to localhost port 8080
and start typing things to see that the server receives the messages.
The advantage of running on an executor instead of conventional
threads is that we can more easily shut it down by stopping the
Tornado IO loop.
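
The docstring is cut off above. A minimal sketch of the pattern it describes, a Tornado TCP server whose stream handler pushes a fake blocking task onto an executor (the port and one-second task are my assumptions), might be:

import time
from concurrent.futures import ThreadPoolExecutor
from tornado import gen
from tornado.ioloop import IOLoop
from tornado.iostream import StreamClosedError
from tornado.tcpserver import TCPServer

executor = ThreadPoolExecutor(4)

def fake_task(message):
    # Stand-in for blocking work that should stay off the IO loop.
    time.sleep(1)
    return b"done: " + message

class EchoServer(TCPServer):
    @gen.coroutine
    def handle_stream(self, stream, address):
        # Keep the connection open, handing each line to the executor.
        try:
            while True:
                line = yield stream.read_until(b"\n")
                result = yield IOLoop.current().run_in_executor(
                    executor, fake_task, line.strip())
                yield stream.write(result + b"\n")
        except StreamClosedError:
            pass

if __name__ == "__main__":
    EchoServer().listen(8080)
    IOLoop.current().start()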
@jamiesun
jamiesun / udpserver.py
Created May 25, 2014 10:34
Tornado UDP server
#!/usr/bin/env python
#coding=utf-8
import socket
import os
import errno
from tornado.ioloop import IOLoop
from tornado.platform.auto import set_close_exec
class UDPServer(object):
    def __init__(self, io_loop=None):
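        # The preview cuts off inside __init__; what follows is a sketch of
        # how the class might continue (method names are assumptions, not
        # taken from the gist).
        self.io_loop = io_loop or IOLoop.current()
        self._sockets = []

    def bind(self, port, address=""):
        # Non-blocking UDP socket, registered with the IOLoop for reads.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        set_close_exec(sock.fileno())
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.setblocking(0)
        sock.bind((address, port))
        self._sockets.append(sock)
        self.io_loop.add_handler(sock.fileno(),
                                 lambda fd, events: self._on_read(sock),
                                 IOLoop.READ)

    def _on_read(self, sock):
        try:
            data, addr = sock.recvfrom(4096)
        except socket.error as exc:
            if exc.args[0] in (errno.EWOULDBLOCK, errno.EAGAIN):
                return      # spurious wakeup; nothing to read yet
            raise
        self.on_datagram(data, addr)

    def on_datagram(self, data, addr):
        # Override in a subclass to handle incoming datagrams.
        print("%r from %s:%s" % (data, addr[0], addr[1]))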
@biggers
biggers / linux_rescue_notes.rst
Last active November 15, 2018 02:05
Rescuing a Linux system
@dankrause
dankrause / pagerduty.py
Last active May 30, 2023 00:43
Simple Python client for the PagerDuty integration API
#!/usr/bin/env python
"""pagerduty.py
Usage:
    pagerduty.py trigger [options] <description> [<incident_key>]
    pagerduty.py acknowledge [options] <description> <incident_key>
    pagerduty.py resolve [options] <description> <incident_key>
Options:
    -c --conf=FILE  A path to a config file
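
The preview shows only the docopt usage block. A minimal sketch of the trigger path against PagerDuty's legacy generic-events integration endpoint (the URL and payload keys are my best recollection of that v1 API, so treat them as assumptions) could be:

import requests

# Legacy "generic events" integration endpoint (assumed).
EVENTS_URL = "https://events.pagerduty.com/generic/2010-04-15/create_event.json"

def trigger(service_key, description, incident_key=None):
    """Open (or de-duplicate onto) an incident for the given service key."""
    payload = {
        "service_key": service_key,
        "event_type": "trigger",
        "description": description,
    }
    if incident_key:
        payload["incident_key"] = incident_key
    resp = requests.post(EVENTS_URL, json=payload)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(trigger("YOUR-SERVICE-KEY", "disk full on web-01"))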
@jokull
jokull / gist:5639728
Created May 23, 2013 21:51
Cache a Python package from PyPI on S3
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""pycache -- cache a python package from PyPI on S3.
A simple script to collect a cache of packages locally and sync them up to an S3 bucket, using directories as namespaces so that different projects can have different dependencies.
This is just about the simplest thing that could possibly work.
"""
import warnings
warnings.filterwarnings('ignore')
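
The preview ends at the warnings filter. One way the idea could be sketched today, downloading a package's files locally and syncing them into a namespaced S3 prefix with pip download and boto3 (both tool choices are mine, not necessarily the gist's), is:

import os
import subprocess
import boto3

def cache_package(package, bucket, namespace, cache_dir="cache"):
    """Download a package (with deps) locally and upload the files to S3."""
    target = os.path.join(cache_dir, namespace)
    os.makedirs(target, exist_ok=True)
    subprocess.check_call(["pip", "download", "-d", target, package])

    s3 = boto3.client("s3")
    for name in os.listdir(target):
        # Directories act as namespaces, so projects can pin different deps.
        key = "%s/%s" % (namespace, name)
        s3.upload_file(os.path.join(target, name), bucket, key)
        print("uploaded s3://%s/%s" % (bucket, key))

if __name__ == "__main__":
    cache_package("requests", "my-pycache-bucket", "projectA")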