Bell Eapen (dermatologist)
@badsyntax
badsyntax / android_emulator_cli_ci.md
Last active January 27, 2022 12:03
start an android emulator with screen dimensions (specifically for use in CI)
# Install AVD files
yes | $ANDROID_HOME/tools/bin/sdkmanager --install 'system-images;android-29;default;x86'
yes | $ANDROID_HOME/tools/bin/sdkmanager --licenses

# Create emulator
echo "no" | $ANDROID_HOME/tools/bin/avdmanager create avd -n Pixel_API_29_AOSP -d pixel --package 'system-images;android-29;default;x86' --force

$ANDROID_HOME/emulator/emulator -list-avds
@mugifly
mugifly / image-predict-on-tfjs-node.js
Last active January 7, 2024 16:11
Image Prediction on tfjs-node (with a model made by Teachable Machine Image)
const tf = require('@tensorflow/tfjs-node');
const Jimp = require('jimp');
// Directory path for model files (model.json, metadata.json, weights.bin)
// NOTE: It can be obtained from [Export Model] -> [Tensorflow.js] -> [Download my model]
// on https://teachablemachine.withgoogle.com/train/image
const MODEL_DIR_PATH = `${__dirname}`;
// Path for image file to predict class
const IMAGE_FILE_PATH = `${__dirname}/example.jpg`;
# Having a requirements.txt file as follows
torch==1.5.0
numpy==1.18.1
# Add channels. The last channel added has the highest priority
conda config --add channels pytorch
conda config --add channels conda-forge
conda config --add channels anaconda
# Install pip for fallback
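
The "Install pip for fallback" step usually means: try conda first for each pinned requirement and fall back to pip for anything conda cannot resolve (package names can also differ, e.g. torch on PyPI vs pytorch on conda). A minimal, purely illustrative Python sketch of that pattern, not part of the original snippet:

import subprocess

def install(requirements_file="requirements.txt"):
    """Try conda for each pinned requirement; fall back to pip on failure."""
    with open(requirements_file) as f:
        for line in f:
            package = line.strip()
            if not package or package.startswith("#"):
                continue
            try:
                # conda accepts the same exact-pin syntax, e.g. numpy==1.18.1
                subprocess.run(["conda", "install", "--yes", package], check=True)
            except subprocess.CalledProcessError:
                # e.g. torch==1.5.0 is only available under that name on PyPI
                subprocess.run(["pip", "install", package], check=True)

install()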
@yuhanz
yuhanz / run-bert-tensorflow2.py
Last active March 13, 2023 14:11
To run BERT with TensorFlow 2.0
pip install bert-for-tf2
pip install bert-tokenizer
pip install tensorflow-hub
pip install bert-tensorflow
pip install sentencepiece
import tensorflow_hub as hub
import tensorflow as tf
import bert
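
The preview stops at the imports. As a rough illustration of how these libraries fit together, here is a minimal sketch (not the gist's actual code) that builds a BERT encoder with bert-for-tf2 from a downloaded stock checkpoint; the checkpoint directory and sequence length below are assumptions:

import numpy as np
import tensorflow as tf
import bert

MODEL_DIR = "uncased_L-12_H-768_A-12"   # hypothetical path to a stock BERT checkpoint
MAX_SEQ_LEN = 128

# Tokenize an example sentence with the checkpoint's vocabulary
tokenizer = bert.bert_tokenization.FullTokenizer(
    vocab_file=f"{MODEL_DIR}/vocab.txt", do_lower_case=True)
tokens = ["[CLS]"] + tokenizer.tokenize("To run BERT with TensorFlow 2.0") + ["[SEP]"]
token_ids = tokenizer.convert_tokens_to_ids(tokens)
token_ids = token_ids + [0] * (MAX_SEQ_LEN - len(token_ids))

# Wrap the BERT layer in a Keras model and load the pretrained weights
bert_params = bert.params_from_pretrained_ckpt(MODEL_DIR)
l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")
input_ids = tf.keras.layers.Input(shape=(MAX_SEQ_LEN,), dtype="int32")
model = tf.keras.Model(inputs=input_ids, outputs=l_bert(input_ids))
model.build(input_shape=(None, MAX_SEQ_LEN))
bert.load_stock_weights(l_bert, f"{MODEL_DIR}/bert_model.ckpt")

# Shape: (1, MAX_SEQ_LEN, hidden_size) contextual embeddings
embeddings = model.predict(np.array([token_ids]))
print(embeddings.shape)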
@nraw
nraw / torch_model.py
Last active August 15, 2023 10:19
Kedro PyTorch Model IO
""" Kedro Torch Model IO
Models need to be imported and added to the dictionary
as shown with the ExampleModel
Example of catalog entry:
modo:
  type: kedro_example.io.torch_model.TorchLocalModel
  filepath: modo.pt
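
The preview cuts off before the dataset class itself. A minimal sketch of what such a Kedro dataset could look like, assuming the older kedro.io.AbstractDataSet API and a module-level registry mapping model names to classes (ExampleModel below is a stand-in):

import torch
from kedro.io import AbstractDataSet


class ExampleModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(10, 1)

    def forward(self, x):
        return self.linear(x)


# Models need to be imported and added to this dictionary
MODELS = {"ExampleModel": ExampleModel}


class TorchLocalModel(AbstractDataSet):
    def __init__(self, filepath, model="ExampleModel", **load_args):
        self._filepath = filepath
        self._model = model
        self._load_args = load_args

    def _load(self):
        # Rebuild the module, then restore its weights from disk
        model = MODELS[self._model](**self._load_args)
        model.load_state_dict(torch.load(self._filepath))
        return model

    def _save(self, model):
        # Persist only the state_dict, as is idiomatic for PyTorch
        torch.save(model.state_dict(), self._filepath)

    def _describe(self):
        return dict(filepath=self._filepath, model=self._model)

With a constructor like this, the catalog entry above would also take a model: ExampleModel key; the exact fields depend on the gist's actual implementation.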
@pirate
pirate / docker-compose-backup.sh
Last active March 31, 2024 00:51
Back up a docker-compose project, including all images, named and unnamed volumes, container filesystems, config, logs, and databases.
#!/usr/bin/env bash
### Bash Environment Setup
# http://redsymbol.net/articles/unofficial-bash-strict-mode/
# https://www.gnu.org/software/bash/manual/html_node/The-Set-Builtin.html
# set -o xtrace
set -o errexit
set -o errtrace
set -o nounset
set -o pipefail
@ZachisGit
ZachisGit / ExportModel.py
Last active January 6, 2023 13:51
Make your model TensorFlow Serving compatible and modify it to accept PNG-encoded images as input and return PNG-encoded images as output.
''' ExportModel.py - TF-Serving
# Basically we are wrapping your pretrained model
# in a TensorFlow Serving compatible format.
# It accepts base64-encoded PNG images and uses
# them as input to your model. Then we convert
# your model's output into a PNG-encoded image and
# it gets returned by TensorFlow Serving, base64 encoded.
'''
import tensorflow as tf
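
The preview ends at the import. As a rough TF2-style illustration of the same idea (decode the PNG input, run the model, re-encode the output as PNG), here is a hedged sketch; my_model, the 8-bit RGB assumption and the [0, 1] scaling are placeholders, not the gist's actual code:

import tensorflow as tf

model = tf.keras.models.load_model("my_model")  # hypothetical pretrained image-to-image model

@tf.function(input_signature=[tf.TensorSpec([None], tf.string, name="png_bytes")])
def serve_png(png_bytes):
    def run_one(serialized):
        image = tf.io.decode_png(serialized, channels=3)    # PNG bytes -> uint8 HxWx3
        image = tf.cast(image, tf.float32) / 255.0          # assumed [0, 1] input scaling
        output = model(image[tf.newaxis, ...])[0]           # assumed HxWxC float output
        output = tf.cast(tf.clip_by_value(output, 0.0, 1.0) * 255.0, tf.uint8)
        return tf.io.encode_png(output)                     # back to PNG bytes
    # fn_output_signature requires TF 2.3+
    return {"png_out": tf.map_fn(run_one, png_bytes, fn_output_signature=tf.string)}

# Exported this way, TF Serving's REST API exchanges the string tensors as
# base64-wrapped JSON values ({"b64": "..."}).
tf.saved_model.save(model, "export/1", signatures={"serving_default": serve_png})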
@mrk-han
mrk-han / emulator-install-using-avdmanager.md
Last active April 16, 2024 14:43
Installing and creating Emulators with AVDMANAGER (For Continuous Integration Server or Local Use)

Install and Create Emulators using AVDMANAGER and SDKMANAGER

TL;DR

For an emulator that mimics a Pixel 5 device with Google APIs and an ARM architecture (for an M1/M2 MacBook):

  1. List All System Images Available for Download: sdkmanager --list | grep system-images

  2. Download Image: sdkmanager --install "system-images;android-30;google_atd;arm64-v8a"

# Create a Singularity image from a Docker image that is in Docker Hub,
# where /tmp/ is the folder where the image will be created and ubuntu:14.04
# is the Docker image used to convert to the Singularity image
docker run \
-v /var/run/docker.sock:/var/run/docker.sock \
-v /tmp/:/output \
--privileged -t --rm \
singularityware/docker2singularity \
ubuntu:14.04
@alexellis
alexellis / kvm_minikube.md
Last active July 21, 2023 10:45
Run multiple minikube Kubernetes clusters on Ubuntu Linux with KVM

Ramp up your Kubernetes development, CI-tooling or testing workflow by running multiple Kubernetes clusters on Ubuntu Linux with KVM and minikube.

In this tutorial we will combine the popular minikube tool with Linux's Kernel-based Virtual Machine (KVM) support. It is a great way to re-purpose an old machine that you found on eBay or that has been gathering dust under your desk. An Intel NUC would also make a great host for this tutorial if you want to buy some new hardware. Another popular angle is to use a bare-metal host in the cloud, and I've provided some details on that below.

We'll set up all the tooling so that you can build one or many single-node Kubernetes clusters and then deploy applications to them such as OpenFaaS using familiar tooling like helm. I'll then show you how to access the Kubernetes clusters from a remote machine such as your laptop.

Pre-reqs

  • This tutorial uses Ubuntu 16.04 as a base installation, but other distributions are supported by KVM. You'll need to find out how to install KVM and the related tooling for your distribution.