Egor Beliaev hcl14

@hcl14
hcl14 / infer.py
Last active December 22, 2018 02:44
MobileNet AVA inference - creation of prediction dataset for https://github.com/idealo/image-quality-assessment/issues/18
# Create dataset of labels and predictions for the model from https://github.com/idealo/image-quality-assessment/
# Use tensorflow >= 1.10.0 to load this model; with older versions I got weird errors.
from os import path
import numpy as np
import pandas as pd
import cv2
from keras.preprocessing.image import load_img
from keras.preprocessing.image import img_to_array
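For context, here is a minimal sketch (not the gist code) of scoring a single image with a NIMA-style model: a MobileNet base with a Dense(10, softmax) head whose output distribution over the 1-10 quality bins is reduced to a mean score. The weights path, dropout rate, and input scaling are assumptions.
import numpy as np
from keras.applications.mobilenet import MobileNet
from keras.layers import Dense, Dropout
from keras.models import Model
from keras.preprocessing.image import load_img, img_to_array

def build_nima_mobilenet(weights_path=None):
    # MobileNet feature extractor + 10-way softmax over quality bins
    base = MobileNet((224, 224, 3), alpha=1, include_top=False, pooling='avg', weights=None)
    x = Dropout(0.75)(base.output)
    out = Dense(10, activation='softmax')(x)
    model = Model(base.input, out)
    if weights_path is not None:
        model.load_weights(weights_path)
    return model

def mean_score(distribution):
    # expected value over the 1..10 quality bins
    return float(np.sum(distribution * np.arange(1, 11)))

model = build_nima_mobilenet('weights_mobilenet_aesthetic.hdf5')  # hypothetical weights file
img = img_to_array(load_img('test.jpg', target_size=(224, 224)))
img = img / 127.5 - 1.0                                           # MobileNet-style scaling, assumed
pred = model.predict(np.expand_dims(img, axis=0))[0]
print('predicted mean score:', mean_score(pred))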
@hcl14
hcl14 / app.py
Created October 18, 2018 20:11
Speed up video capture for https://github.com/victordibia/skyfall
## Author: Victor Dibia
## Load hand tracking model, spin up web socket and web application.
from utils import detector_utils as detector_utils
from utils import object_id_utils as id_utils
import cv2
import tensorflow as tf
import multiprocessing
from multiprocessing import Queue, Pool
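The gist itself modifies Victor Dibia's app; below is a minimal generic sketch (assumed names, not the gist code) of the speed-up idea: read frames in a dedicated process and keep only the most recent frame in a size-1 Queue, so the consumer never blocks on cv2.VideoCapture.read().
import cv2
from multiprocessing import Process, Queue

def capture_worker(src, frame_q):
    # runs in its own process: grab frames as fast as the camera allows
    cap = cv2.VideoCapture(src)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_q.full():
            try:
                frame_q.get_nowait()   # drop the stale frame instead of blocking
            except Exception:
                pass
        frame_q.put(frame)

if __name__ == '__main__':
    frame_q = Queue(maxsize=1)          # holds only the most recent frame
    Process(target=capture_worker, args=(0, frame_q), daemon=True).start()
    while True:
        frame = frame_q.get()           # always close to the latest camera frame
        cv2.imshow('preview', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break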
@hcl14
hcl14 / 1_tkinter_separate.py
Last active October 16, 2018 10:20
Running tkinter (package python3-tk on Linux) as a separate process to display images generated in the main program
# My answer here: https://stackoverflow.com/questions/52793096/reload-and-zoom-image/52818151#52818151
from PIL import ImageTk, Image
from scipy.ndimage import rotate
from scipy.misc import imresize
import numpy as np
import time
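Below is a minimal sketch of the same idea with assumed names: run Tkinter in its own process and feed it numpy images through a multiprocessing Queue, so the main program is never blocked by the GUI mainloop.
import time
import numpy as np
import tkinter as tk
from multiprocessing import Process, Queue
from PIL import Image, ImageTk

def display_worker(img_q):
    # runs in its own process: a Tk window that polls the queue for new images
    root = tk.Tk()
    label = tk.Label(root)
    label.pack()

    def poll():
        if not img_q.empty():
            photo = ImageTk.PhotoImage(Image.fromarray(img_q.get()))
            label.configure(image=photo)
            label.image = photo        # keep a reference, or Tk drops the image
        root.after(50, poll)           # check the queue every 50 ms

    poll()
    root.mainloop()

if __name__ == '__main__':
    img_q = Queue()
    p = Process(target=display_worker, args=(img_q,))
    p.start()
    # the main program keeps generating images without being blocked by the GUI
    for i in range(20):
        img_q.put(np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8))
        time.sleep(0.5)
    p.join()                           # window stays open until the user closes it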
@hcl14
hcl14 / flask_app.py
Created October 16, 2018 09:37
Logging into separate files for multiprocessing and tornado/flask in python
# create flask app to be run by tornado process
# process-specific globals
import global_vars
from flask import Flask, request, Response, json, abort, jsonify
# ordinary (non-flask) json if needed
import json as json2
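Below is a minimal sketch (assumed names, not the gist code) of the per-process logging idea: each worker configures its own logger with a FileHandler keyed by its PID, so concurrent tornado/flask processes never write to the same file.
import logging
import os

def get_process_logger(name='app'):
    # one logger + one log file per worker process, keyed by PID
    logger = logging.getLogger('%s-%d' % (name, os.getpid()))
    if not logger.handlers:                      # configure only once per process
        handler = logging.FileHandler('%s_%d.log' % (name, os.getpid()))
        handler.setFormatter(logging.Formatter(
            '%(asctime)s pid=%(process)d %(levelname)s %(message)s'))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

# inside any worker or request handler:
log = get_process_logger()
log.info('worker %d started', os.getpid())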
@hcl14
hcl14 / qlearn_simple.py
Last active October 16, 2018 08:53
Q-learning behavior
# Simple example of Q-learning's inability to go in loops.
# Looping is strictly forbidden by the code (line 101),
# but you can comment out that logic and see that the algorithm just becomes less stable.
# The reason is that a loop is impossible in this setup,
# as only a single Q-value exists for each position on the map.
import numpy as np
np.random.seed(0)
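For reference, a minimal generic sketch of the tabular Q-learning update behind this kind of experiment (the toy environment and constants are assumptions, not the gist's setup): one row of Q-values per position, one column per action, updated toward r + gamma * max Q(s', a').
import numpy as np
np.random.seed(0)

n_states, n_actions = 16, 4              # e.g. a 4x4 grid, actions up/down/left/right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1

def step(state, action):
    # hypothetical toy environment: wrap-around moves, reward only in the last cell
    next_state = (state + [-4, 4, -1, 1][action]) % n_states
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward

state = 0
for t in range(1000):
    # epsilon-greedy action selection
    action = np.random.randint(n_actions) if np.random.rand() < eps else int(Q[state].argmax())
    next_state, reward = step(state, action)
    # standard tabular update: move Q[s, a] toward r + gamma * max_a' Q[s', a']
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state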
@hcl14
hcl14 / database.txt
Created October 10, 2018 14:32
database.txt
Which color is normally a cat?;Black
How tall was the longest man on earth?;272 cm
Is the earth round?;Yes
Which color is normally a cat?;Black
How tall was the longest man on earth?;272 cm
Is the earth round?;Yes
Which color is normally a cat?;Black
How tall was the longest man on earth?;272 cm
Is the earth round?;Yes
Which color is normally a cat?;Black
# Layer-wise training of a neural network with second-order optimization (Newton's method).
# A new layer is added on each iteration and optimized with Newton's method.
# Also an example of Tensorflow eager execution:
# combines gradients, the Hessian, and a call to the Optimizer.
# Might contain logical errors, so review the logic yourself when adapting this code.
# Newton's method in Tensorflow
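Below is a minimal sketch (assumed toy loss and shapes, not the gist code) of one Newton step under TF 1.x eager execution: nested GradientTapes produce the gradient and the Hessian of a scalar loss with respect to a flat parameter vector, and the update solves H dw = g instead of inverting H explicitly.
import tensorflow as tf
tf.enable_eager_execution()

w = tf.Variable(tf.random_normal([5]))            # flat parameter vector

def loss_fn(w):
    return tf.reduce_sum(tf.square(w - 1.0))      # hypothetical convex toy loss

def newton_step(w, damping=1e-3):
    with tf.GradientTape(persistent=True) as outer:
        with tf.GradientTape() as inner:
            loss = loss_fn(w)
        grad = inner.gradient(loss, w)            # first derivatives, shape (n,)
        grad_parts = tf.unstack(grad)
    # one Hessian row per gradient component
    hess = tf.stack([outer.gradient(g, w) for g in grad_parts])
    hess += damping * tf.eye(int(w.shape[0]))     # damping keeps H invertible
    step = tf.linalg.solve(hess, tf.reshape(grad, [-1, 1]))
    w.assign_sub(tf.reshape(step, [-1]))
    return loss

for i in range(5):
    print(float(newton_step(w)))                  # converges in one step on a quadratic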
@hcl14
hcl14 / newton_tensorflow_sklearn_digits.py
Last active September 20, 2018 11:56
Simple example of Newton's method for second-order optimization in Tensorflow on the sklearn digits dataset
# Newton's method in Tensorflow
# WARNING! This code is memory- and compute-intensive; better to run it on a GPU.
# Higher dimensionality increases computation time significantly.
# The original dataset is manageable on a GTX 1050 GPU, but if you run into time/memory problems, uncomment the PCA compression.
# Also, you can probably remove line 159 (Hessian fixing) if you use PCA.
# 'Vanilla' Newton's method is intended to work when the loss function being optimized is convex.
# A one-layer linear network without activation is convex.
# If the activation function is monotonic, the error surface associated with a single-layer model is convex.
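Below is a minimal graph-mode sketch (assumed shapes and damping, not the gist itself) of a Newton step for a single linear softmax layer on the digits data: all weights are flattened into one vector so tf.hessians returns a single (n, n) matrix, and the update solves H dw = g. The small diagonal damping stands in for the gist's Hessian fixing.
import numpy as np
import tensorflow as tf
from sklearn.datasets import load_digits

X, y = load_digits(return_X_y=True)
X = (X / 16.0).astype(np.float32)                 # 64 features in [0, 1]
Y = np.eye(10, dtype=np.float32)[y]               # one-hot labels

n_in, n_out = 64, 10
n_params = (n_in + 1) * n_out                     # weights + biases, flattened

x_ph = tf.placeholder(tf.float32, [None, n_in])
y_ph = tf.placeholder(tf.float32, [None, n_out])
theta = tf.Variable(tf.zeros([n_params]))

W = tf.reshape(theta[:n_in * n_out], [n_in, n_out])
b = theta[n_in * n_out:]
logits = tf.matmul(x_ph, W) + b
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_ph, logits=logits))

grad = tf.gradients(loss, theta)[0]               # shape (n_params,)
hess = tf.hessians(loss, theta)[0]                # shape (n_params, n_params)
hess += 1e-3 * tf.eye(n_params)                   # diagonal damping in place of Hessian fixing
step = tf.squeeze(tf.linalg.solve(hess, tf.expand_dims(grad, 1)))
newton_update = tf.assign_sub(theta, step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        _, l = sess.run([newton_update, loss], {x_ph: X, y_ph: Y})
        print('iteration %d, loss %.4f' % (i, l))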
@hcl14
hcl14 / newton_tensorflow_iris.py
Created September 20, 2018 10:50
Simple example of second-order optimization via Newton's method in Tensorflow on Iris dataset
# Newton's method in Tensorflow
# 'Vanilla' Newton's method is intended to work when the loss function being optimized is convex.
# A one-layer linear network without activation is convex.
# If the activation function is monotonic, the error surface associated with a single-layer model is convex.
# Otherwise, the Hessian will have negative eigenvalues at saddle points and in other non-convex regions of the surface.
# To fix that, you can try different methods. One approach is to eigendecompose H and replace the negative eigenvalues with their absolute values,
# making H "push out" in those directions, as described in the paper "Identifying and attacking the saddle point problem in high-dimensional non-convex optimization" (https://papers.nips.cc/paper/5486-identifying-and-attacking-the-saddle-point-problem-in-high-dimensional-non-convex-optimization.pdf).
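Below is a minimal numpy sketch of that eigenvalue fix (the saddle-free Newton idea from the paper above): eigendecompose the symmetric Hessian, replace each eigenvalue by its absolute value so the quadratic model pushes out along negative-curvature directions, and solve with |H| instead of H.
import numpy as np

def saddle_free_step(hessian, grad, damping=1e-3):
    eigvals, eigvecs = np.linalg.eigh(hessian)        # H is symmetric
    abs_vals = np.abs(eigvals) + damping              # |lambda|, damped for stability
    # |H|^{-1} g = V diag(1 / |lambda|) V^T g
    return eigvecs @ ((eigvecs.T @ grad) / abs_vals)

# toy example: an indefinite Hessian, i.e. a saddle point
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([0.5, 0.5])
print(saddle_free_step(H, g))   # subtracting this step moves against the gradient in both coordinates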