Nick Walsh nmwalsh
@nmwalsh
nmwalsh / template.html
Created July 20, 2017 09:37
Facebook Instant Article Template
<!doctype html>
<html lang="en" prefix="op: http://media.facebook.com/op#">
<head>
<meta charset="utf-8">
<!-- URL of the web version of this article -->
<!-- TODO: Change the domain to match the domain of your website -->
<link rel="canonical" href="http://example.com/article.html">
<meta property="op:markup_version" content="v1.0">
</head>
<body>
<!-- Article body truncated in this preview -->
</body>
</html>
@nmwalsh
nmwalsh / predict.py
Created December 6, 2017 23:00
Standalone predict function for a saved Python ML model pickled in `model.dat`
# predict.py
# Script that should consist of a single method (predict), passing input data
# in the format the model expects and returning its prediction.
#
# In this example, predict requires data of the following type:
# Pandas DataFrame with features, e.g.
# X_test = [[6.9, 3.2, 5.7, 2.3]]
import os
import pickle
import pandas as pd
import random
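The preview above is cut off after the imports. A minimal sketch of the `predict` function the header comment describes might look like the following. Note the assumptions: a real `model.dat` would hold a pickled scikit-learn estimator called via `model.predict(X)`; here a plain dict stands in for the model so the sketch is self-contained, and the `majority_class` key is purely illustrative.

```python
# predict.py -- hedged sketch of the standalone predict() described above.
import os
import pickle

MODEL_PATH = "model.dat"  # filename from the gist description

def predict(X):
    """Load the pickled model and return one prediction per input row."""
    with open(MODEL_PATH, "rb") as f:
        model = pickle.load(f)
    # A scikit-learn estimator would be called as model.predict(X); the
    # stand-in dict simply returns its stored majority class for every row.
    return [model["majority_class"] for _ in X]

if __name__ == "__main__":
    # Write a throwaway model.dat so the sketch runs end to end.
    with open(MODEL_PATH, "wb") as f:
        pickle.dump({"majority_class": 2}, f)
    print(predict([[6.9, 3.2, 5.7, 2.3]]))  # one label per input row
```

Keeping prediction in a single function with no framework imports is what lets the gateway layer (below) treat the model as a black box.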
@nmwalsh
nmwalsh / falcon_gateway.py
Created December 7, 2017 02:23
Falcon API gateway for a simple machine learning API
# falcon_gateway.py
import falcon
import json
from data_handler import invoke_predict
# Falcon follows the REST architectural style, meaning (among
# other things) that you think in terms of resources and state
# transitions, which map to HTTP verbs.
# data_handler.py
#
# Argument handler that does 4 things.
#
# 1. Decode: deserialize raw input from API POST request received in `falcon_gateway.py`
# 2. Preprocess: convert input data into form required for model, as specified in `predict.py`
# 3. Postprocess: convert prediction from model (from `predict.py`) into form that can be serializable for serving API response
# 4. Encode: serialize postprocessed data into valid JSON-esque format for API response, and pass back to `falcon_gateway.py`
import json
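The four steps listed above can be sketched as a single `invoke_predict` function. This is an illustrative reconstruction, not the gist's actual implementation: `_predict` is a stand-in for the real model call, the species label map is assumed from the Iris dataset, and the two petal fields are assumed from the standard four-feature Iris input (the INPUT listing in the preview is truncated after `sepal_width`).

```python
# data_handler.py -- hedged sketch of the four-step handler described above.
import json

# Assumed Iris label map; the real model's class encoding may differ.
SPECIES = {0: "setosa", 1: "versicolor", 2: "virginica"}

def _predict(features):
    """Stand-in for predict.predict(); a real model would be invoked here."""
    return [2 for _ in features]

def invoke_predict(raw_json):
    # 1. Decode: deserialize the raw input from the API POST request.
    payload = json.loads(raw_json)
    # 2. Preprocess: order the fields into the feature layout the model expects.
    features = [[payload["sepal_length"], payload["sepal_width"],
                 payload["petal_length"], payload["petal_width"]]]
    # 3. Postprocess: map numeric class labels back to species names.
    labels = [SPECIES[c] for c in _predict(features)]
    # 4. Encode: serialize the result as JSON for the API response.
    return json.dumps({"species": labels})
```

Keeping decode/encode here, rather than in the Falcon resources, means the gateway never has to know the model's input layout.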
class InfoResource(object):
    def on_get(self, req, resp):
        """Handles GET requests"""
        resp.status = falcon.HTTP_200  # This is the default status
        resp.body = ('\nThis is an API for a deployed Datmo model, '
                     'where it takes flower lengths as input and returns the predicted Iris species.\n'
                     'To learn more about this model, send a GET request to the /predicts endpoint or visit the repository online at: \n\n'
                     'https://datmo.com/nmwalsh/falcon-api-model\n\n')

class PredictsResource(object):
    def on_get(self, req, resp):
        """Handles GET requests"""
        resp.status = falcon.HTTP_200  # This is the default status
        resp.body = ('\nThis is the PREDICT endpoint. \n'
                     'Both requests and responses are served in JSON. \n'
                     '\n'
                     'INPUT: Flower Lengths (in cm) \n'
                     ' "sepal_length":[num] \n'
                     ' "sepal_width": [num] \n')
# falcon.API instances are callable WSGI apps. Never change this.
app = falcon.API()
# Resources are represented by long-lived class instances. Each Python class becomes a different "URL directory"
info = InfoResource()
predicts = PredictsResource()
# These resource instances will handle all requests to the '/info' and '/predicts' URL paths
app.add_route('/info', info)
app.add_route('/predicts', predicts)
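Since `app` is a standard WSGI application, it can be served with any WSGI server. A hypothetical usage sketch with gunicorn (the port and the POST field names are assumptions; the field names follow the INPUT listing above):

```shell
pip install falcon gunicorn
gunicorn falcon_gateway:app &

# Human-readable description of the service:
curl http://localhost:8000/info

# Request a prediction (JSON in, JSON out):
curl -X POST http://localhost:8000/predicts \
     -H "Content-Type: application/json" \
     -d '{"sepal_length": 6.9, "sepal_width": 3.2}'
```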
if [[ `uname` == 'Linux' ]]; then
echo 'Removing old Torch files from your Linux...'
# Removing folders
sudo rm -rf /usr/local/lib/{luarocks/,lua/,torch/,torchrocks/}
sudo rm -rf /usr/local/share/{torch,cmake/torch/,lua}
sudo rm -rf /usr/local/etc/{luarocks/,torchrocks/}
sudo rm -rf /usr/local/include/{torch,TH,THC,lauxlib.h,lua.h,lua.hpp,luaT.h,luaconf.h,luajit.h,lualib.h,qtlua}
sudo rm -rf ~/.luarocks
sudo rm -rf ~/.cache/luarocks*
# Removing files
@nmwalsh
nmwalsh / select system drivers
Created July 19, 2018 04:05
select system drivers
(1) gpu
(2) cpu
Please select one of the above environment types (e.g. 1 or gpu):
@nmwalsh
nmwalsh / select an environment
Created July 19, 2018 04:13
select an environment
(1) data-analytics : has libraries such as xgboost, lightgbm, sklearn etc.
(2) mxnet : has libraries for mxnet(v1.1.0) along with sklearn, opencv etc.
(3) caffe2 : has libraries for caffe2(v0.8.0) along with sklearn, opencv etc.
(4) keras-tensorflow : has libraries for keras(v2.1.6) and tensorflow(v1.9.0) along with sklearn, opencv etc.
(5) kaggle : has the environment provided by kaggle
(6) pytorch : has libraries for pytorch(v0.4.0) along with sklearn, opencv etc.
(7) python-base : has base python image with no libraries installed
(8) r-base : has base R image with no libraries installed. Use this environment for rstudio workspace
Please select one of the above environments (e.g. 1 or data-analytics):