


Mark Wronkiewicz (wronk)

  • Southern California
@wronk
wronk / epochsToNpy.py
Created October 28, 2015 22:09
Convert epochs to stc and save as array for use in Blender.
"""
epochsToStc
@author: wronk
Convert epochs to stc and save as array for use in Blender.
"""
import mne
import numpy as np
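The preview ends at the imports; a minimal sketch of the conversion the docstring describes, assuming an already-saved epochs file and inverse operator (both file names below are placeholders, not the gist's code), might look like:
from mne.minimum_norm import read_inverse_operator, apply_inverse_epochs

epochs = mne.read_epochs('subject-epo.fif')      # placeholder epochs file
inv = read_inverse_operator('subject-inv.fif')   # placeholder inverse operator
stcs = apply_inverse_epochs(epochs, inv, lambda2=1.0 / 9.0, method='dSPM')

# Average the per-epoch source estimates and save an (n_vertices, n_times) array for Blender
stc_data = np.mean([stc.data for stc in stcs], axis=0)
np.save('stc_data.npy', stc_data)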
@wronk
wronk / vertexColors_v1.py
Created October 28, 2015 22:11
vertexColors_v1.py
"""
vertexColors_v1.py
@author: wronk
Script meant to be run within Blender to load source estimate data and create a series of images
(one per time point) that can be converted into a movie. This requires that the brain surface
first be loaded into Blender.
"""
# Blender specific libraries:
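The preview cuts off at the imports; a rough sketch of the vertex-painting step the docstring describes, using the Blender 2.7x-era bpy API (the object name, data file, and colormap below are assumptions, not the gist's code):
import bpy
import numpy as np

obj = bpy.data.objects['BrainSurface']        # placeholder name of the imported brain mesh
mesh = obj.data
vcol = mesh.vertex_colors.new()               # add a vertex-color layer

stc_frame = np.load('stc_frame_0000.npy')     # placeholder: one value in [0, 1] per vertex

# Vertex colors are stored per loop, so map each loop back to its vertex
for i, loop in enumerate(mesh.loops):
    val = float(stc_frame[loop.vertex_index])
    vcol.data[i].color = (val, 0.0, 1.0 - val)

# Render this time point to a still image
bpy.context.scene.render.filepath = '/tmp/frame_0000.png'
bpy.ops.render.render(write_still=True)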
# Generate an inverse solution via Python
import mne
import os
fwdName = "fwd.fif"
rawName = "raw.fif"
covName = "noiseCov.fif"
fSaveInv = os.path.join(os.getcwd(), "invPython.fif")
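A minimal sketch of how these pieces are typically combined in MNE-Python to build and save the inverse operator (parameter values here are assumptions):
from mne.minimum_norm import make_inverse_operator, write_inverse_operator

raw = mne.io.read_raw_fif(rawName)
fwd = mne.read_forward_solution(fwdName)
noise_cov = mne.read_cov(covName)

inv = make_inverse_operator(raw.info, fwd, noise_cov, loose=0.2, depth=0.8)
write_inverse_operator(fSaveInv, inv)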
@wronk
wronk / postUpdate_TODO_missingEEGReference.html
Created September 10, 2013 22:46
Difference in fif read/write output before and after PR #753
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<!-- saved from url=(0021)file:///tmp/tmpnc7vEt -->
<html><head><meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title></title>
<style type="text/css">
table.diff {font-family:Courier; border:medium;}
.diff_header {background-color:#e0e0e0}
td.diff_header {text-align:right}
.diff_next {background-color:#c0c0c0}
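The table classes above (diff, diff_header, diff_next) match what Python's difflib.HtmlDiff emits, so a file like this is typically generated along these lines (input file names are placeholders):
import difflib

with open('fif_dump_before.txt') as f:
    before = f.readlines()
with open('fif_dump_after.txt') as f:
    after = f.readlines()

html = difflib.HtmlDiff().make_file(before, after, 'before PR #753', 'after PR #753')
with open('fif_readwrite_diff.html', 'w') as f:
    f.write(html)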
@wronk
wronk / Tutorial_EC2_with_Python.ipynb
Last active July 17, 2017 19:26
A basic tutorial on getting an Amazon EC2 instance running with Python for cloud computing
(Notebook preview not available.)
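The notebook itself isn't rendered here; as a hedged illustration of the kind of step it covers, launching an instance from Python with boto3 (all IDs and names below are placeholders, not taken from the tutorial):
import boto3

ec2 = boto3.resource('ec2')                 # assumes AWS credentials are already configured
instances = ec2.create_instances(
    ImageId='ami-0123456789abcdef0',        # placeholder AMI
    InstanceType='t2.micro',
    KeyName='my-key-pair',                  # placeholder key pair for SSH access
    MinCount=1,
    MaxCount=1,
)
print(instances[0].id)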
@wronk
wronk / TF_servingPost_export_estimator.py
Created February 13, 2019 19:33
TF Serving blog post: creating and saving TF estimator object
import tensorflow as tf

# Define a place to save your compiled model
export_dir = '/path/to/my_exported_models/001'
# Define a path to your trained keras model and load it in as a `tf.keras.models.Model`
# If you just trained your model, you may already have it in memory and can skip the below 2 lines
model_save_fpath = '/path/to/my_model.h5'
keras_model = tf.keras.models.load_model(model_save_fpath)
# Create an Estimator object
estimator_save_dir = '/path/to/save/estimator'
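The preview stops before the Estimator is actually built; a rough sketch of the likely continuation with the TF 1.13-era API (the serving input function is an assumption, chosen to accept the image_bytes payloads used later in the post):
# Convert the Keras model to an Estimator
estimator = tf.keras.estimator.model_to_estimator(
    keras_model=keras_model, model_dir=estimator_save_dir)

# Serving input fn that accepts serialized image bytes under the key `image_bytes`
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(
    {'image_bytes': tf.placeholder(dtype=tf.string, shape=[None], name='image_bytes')})

# Export a SavedModel that TF Serving can load
estimator.export_saved_model(export_dir, serving_input_fn)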
@wronk
wronk / TF_servingPost_docker_image.sh
Created February 13, 2019 19:36
TF Serving blog post: Docker image
######################################
# Pseudocode for creating Docker image
# Get the Docker TF Serving image we'll use as a foundation to build our custom image
docker pull tensorflow/serving
# Start up this TF Docker image as a container named `serving_base`
docker run -d --name serving_base tensorflow/serving
# Copy the Estimator from our local folder to the Docker container
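The preview stops at the copy step; the rest of the standard TF Serving Docker recipe (local model path assumed, model name taken from the inference script below) looks roughly like:
docker cp /path/to/my_exported_models serving_base:/models/hv_grid

# Bake the model into a new image and point TF Serving at it via MODEL_NAME
docker commit --change "ENV MODEL_NAME hv_grid" serving_base my_tf_serving_image
docker kill serving_base

# Run the custom image, exposing the REST port used by the inference script (8501)
docker run -d -p 8501:8501 my_tf_serving_image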
@wronk
wronk / TF_servingPost_inference_payload.py
Created February 13, 2019 19:38
TF Serving blog post: sending payloads for inference
import json
import base64
import requests
# Modify the name of your model (`hv_grid` here) to match what you used in Section 2
server_endpoint = 'http://localhost:8501/v1/models/hv_grid:predict'
img_fpaths = ['path/to/my_image_1.png', 'path/to/my_image_2.png']
# Load and Base64 encode images
data_samples = []
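The preview ends before the payload is assembled; the continuation presumably follows the JSON format shown in the next gist, roughly:
for fpath in img_fpaths:
    with open(fpath, 'rb') as f:
        b64_str = base64.b64encode(f.read()).decode('utf-8')
    data_samples.append({'image_bytes': {'b64': b64_str}})

# TF Serving's REST API expects the samples under an `instances` key
payload = json.dumps({'instances': data_samples})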
@wronk
wronk / TF_servingPost_payload_format.json
Created February 13, 2019 19:40
TF Serving blog post: json payload example format
{
  "instances": [
    {
      "image_bytes": {"b64": "iVBO...Oxs6"}
    },
    {
      "image_bytes": {"b64": "0KGg...Pyg8"}
    },
    {
      "image_bytes": {"b64": "AABr...EKA0"}
    }
  ]
}
@wronk
wronk / TF_servingPost_send_inference_payload.py
Created February 13, 2019 19:42
TF Serving blog post: sending payload
# Send prediction request
r = requests.post(server_endpoint, data=payload)
print(json.loads(r.content)['predictions'])