hariby / stream_project.sh
Created September 6, 2018 03:35
See the DeepLens project stream
#!/bin/bash
X=858
Y=480
mplayer -demuxer lavf -lavfdopts format=mjpeg:probesize=32 /tmp/results.mjpeg -x $X -y $Y
import io, boto3
import numpy as np
from PIL import Image
from botocore.exceptions import ClientError
REGION = 'us-east-2'
CollectionId = '<CollectionID>'
def get_name(frame):
    rekognition = boto3.client('rekognition', region_name=REGION)
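The preview cuts off inside get_name. A minimal sketch of how the rest of such a function could continue, assuming the frame arrives as a NumPy image array and that the Rekognition face collection named by CollectionId already exists; the JPEG encoding and search_faces_by_image call below are illustrative, not the gist's actual code:

    # Encode the frame as JPEG bytes so it can be sent to Amazon Rekognition.
    image = Image.fromarray(frame.astype(np.uint8))
    buffer = io.BytesIO()
    image.save(buffer, format='JPEG')
    try:
        response = rekognition.search_faces_by_image(
            CollectionId=CollectionId,
            Image={'Bytes': buffer.getvalue()},
            MaxFaces=1,
            FaceMatchThreshold=80)
        matches = response['FaceMatches']
        if matches:
            # Assumes faces were indexed with the person's name as ExternalImageId.
            return matches[0]['Face']['ExternalImageId']
    except ClientError:
        pass
    return 'Unknown'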
hariby / create-manifest_aft-vbi-pds.py
Last active January 12, 2019 02:45
Create a Manifest File for Amazon SageMaker Ground Truth
# Put the output manifest file in your S3 bucket (in the us-east-1 region).
object = 's3://aft-vbi-pds/bin-images/{:05}.jpg'
with open('aft-vbi-pds-manifest.json', 'w') as f:
    for i in range(1, 100000):
        f.write('{"source-ref":"' + object.format(i) + '"}\n')
hariby / object_detection_tutorial_high_level_api.py
Created April 16, 2019 14:46
Use the high-level SageMaker Python SDK APIs for Pipe-mode augmented manifest training in object_detection_tutorial.ipynb
# object_detection_tutorial.ipynb
od_model = sagemaker.estimator.Estimator(training_image,
                                         role,
                                         train_instance_count=1,
                                         train_instance_type='ml.p3.2xlarge',
                                         train_volume_size=50,
                                         train_max_run=360000,
                                         input_mode='Pipe',
                                         output_path=s3_output_path,
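The Estimator call above is truncated in this preview. For context, the Pipe-mode augmented manifest channels such an estimator is fit on are typically defined as below with SageMaker Python SDK v1; the manifest URIs and attribute names are placeholders, not the notebook's actual values:

# Sketch of augmented manifest input channels (SDK v1); S3 paths and attribute names are placeholders.
train_data = sagemaker.session.s3_input(
    's3://<bucket>/<prefix>/train.manifest',
    distribution='FullyReplicated',
    content_type='application/x-recordio',
    record_wrapping='RecordIO',
    s3_data_type='AugmentedManifestFile',
    attribute_names=['source-ref', 'bounding-box'])

validation_data = sagemaker.session.s3_input(
    's3://<bucket>/<prefix>/validation.manifest',
    distribution='FullyReplicated',
    content_type='application/x-recordio',
    record_wrapping='RecordIO',
    s3_data_type='AugmentedManifestFile',
    attribute_names=['source-ref', 'bounding-box'])

od_model.fit(inputs={'train': train_data, 'validation': validation_data})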
hariby / ml-pipeline-deploy-lambda.py
Last active July 13, 2019 09:18
Deploy an endpoint in a machine learning pipeline that will be integrated with AWS Step Functions
import time
import json
import boto3
def create_endpoint(event, job_name_prefix='machine-learning-pipeline'):
    sagemaker_client = boto3.client('sagemaker')
    # create model
    timestamp = time.strftime('-%Y-%m-%d-%H-%M-%S', time.gmtime())
    model_name = job_name_prefix + timestamp
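The preview stops after the model name is built. A hedged sketch of how the rest of such a function typically continues with the low-level boto3 SageMaker API; the container image, model artifact URL, role, and instance type are placeholders pulled from the incoming event, not the gist's actual values:

    # Create the model from the training output handed over by the previous pipeline step.
    sagemaker_client.create_model(
        ModelName=model_name,
        PrimaryContainer={
            'Image': event['image'],                  # placeholder: inference container image
            'ModelDataUrl': event['model_data_url']   # placeholder: s3://.../model.tar.gz
        },
        ExecutionRoleArn=event['role_arn'])           # placeholder: SageMaker execution role

    # Create an endpoint configuration, then the endpoint itself.
    endpoint_config_name = job_name_prefix + '-config' + timestamp
    sagemaker_client.create_endpoint_config(
        EndpointConfigName=endpoint_config_name,
        ProductionVariants=[{
            'VariantName': 'AllTraffic',
            'ModelName': model_name,
            'InitialInstanceCount': 1,
            'InstanceType': 'ml.m4.xlarge'
        }])
    endpoint_name = job_name_prefix + timestamp
    sagemaker_client.create_endpoint(
        EndpointName=endpoint_name,
        EndpointConfigName=endpoint_config_name)
    return {'EndpointName': endpoint_name}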
hariby / resnet-18.py
Created July 2, 2019 01:56
Simple example for pre-trained model deployment with Amazon SageMaker.
# Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License").
# You may not use this file except in compliance with the License.
# A copy of the License is located at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# or in the "license" file accompanying this file. This file is distributed
# on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
hariby / predict_image.ipynb
Last active July 2, 2019 01:58
Simple example for pre-trained model (https://mxnet.incubator.apache.org/versions/master/tutorials/python/predict_image.html) deployment with Amazon SageMaker.
hariby / create_sagemaker_layer.sh
Last active July 13, 2019 09:18
Creating a SageMaker Python SDK Lambda layer, intended to be merged on top of the AWS-managed AWSLambda-Python37-SciPy1x layer
#!/bin/bash
mkdir python
python3 -m pip install -r sagemaker-sdk-requirements.txt -t ./python/ --no-deps
zip sagemaker-sdk-layer.zip -r ./python/
rm -rf python
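The script only builds the zip; publishing it as a layer is a separate step. A minimal sketch with boto3, where the layer name is a placeholder and python3.7 matches the SciPy layer named in the description:

import boto3

# Publish the freshly built zip as a Lambda layer (layer name is a placeholder).
lambda_client = boto3.client('lambda')
with open('sagemaker-sdk-layer.zip', 'rb') as f:
    lambda_client.publish_layer_version(
        LayerName='sagemaker-python-sdk',
        Description='SageMaker Python SDK (no deps); merge on top of AWSLambda-Python37-SciPy1x',
        Content={'ZipFile': f.read()},
        CompatibleRuntimes=['python3.7'])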
#!/bin/bash
set -e
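# Install git-secrets (https://github.com/awslabs/git-secrets) and register its AWS credential patterns globally for ec2-user.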
sudo -u ec2-user -i <<'EOF'
git clone https://github.com/awslabs/git-secrets.git
cd git-secrets/
sudo make install
git secrets --register-aws --global
hariby / blueqat-qgate-on-start.sh
Last active November 7, 2019 05:08
Install Blueqat and Qgate into the conda_python3 kernel when starting a SageMaker notebook instance.
#!/bin/bash
set -e
# OVERVIEW
# This script installs pip packages in a single SageMaker conda environment.
# Slightly modified from https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-pip-package-single-environment/on-start.sh
sudo -u ec2-user -i <<'EOF'
# PARAMETERS
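# (Continuation sketch, not the gist's actual code: adapted from the aws-samples on-start.sh
#  referenced above, with the package list swapped for Blueqat and Qgate.)
PACKAGES="blueqat qgate"
ENVIRONMENT="python3"

# Activate the conda env backing the conda_python3 kernel, install, then deactivate.
source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"
pip install --upgrade $PACKAGES
source /home/ec2-user/anaconda3/bin/deactivate
EOF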