Christoph Heindl (cheind)

@cheind
cheind / optional_property.h
Created December 13, 2011 19:29
C++ Policy Based Property Implementation
/**
* C++ property implementation
* Christoph Heindl 2011
* christoph.heindl@gmail.com
*/
#pragma once
#include <cheind/properties/property.h>
#include <cheind/properties/policy_optional_value.h>
@cheind
cheind / kalman.py
Created April 11, 2017 12:10
Milk and Butter price inference
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
from matplotlib.patches import Ellipse
import math
# Kalman
def lkf_predict(x, P, A, B, u, Q):
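The preview cuts off at the predict step. For reference, a minimal linear Kalman filter predict/correct pair consistent with the signature shown above could look as follows; the function bodies and the lkf_correct name are assumptions for illustration, not the gist's actual code.

import numpy as np

def lkf_predict(x, P, A, B, u, Q):
    # Propagate state and covariance through the linear motion model.
    x = A @ x + B @ u
    P = A @ P @ A.T + Q
    return x, P

def lkf_correct(x, P, H, z, R):
    # Fuse measurement z with measurement model H and measurement noise R.
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P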
@cheind
cheind / homography.py
Created June 5, 2017 13:21
1D homography between lines using DLT transform
import numpy as np
import matplotlib.pyplot as plt
o3d = np.array([-100, -100, 50])
d3d = np.array([1, 1, 5.])
d3d /= np.linalg.norm(d3d)
t3d = np.arange(0, 100, 1.)
p3d = o3d + d3d * t3d[:, None]
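For context, the DLT step the gist title refers to can be sketched like this: each pair of corresponding 1D coordinates on the two lines contributes one linear constraint on the four entries of the 2x2 homography H, and the null vector of the stacked system gives H up to scale. The function name estimate_1d_homography and the usage lines below are illustrative assumptions, not the gist's code.

import numpy as np

def estimate_1d_homography(x, xp):
    # Each correspondence gives: h11*x + h12 - xp*(h21*x + h22) = 0.
    A = np.stack([x, np.ones_like(x), -xp * x, -xp], axis=1)
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(2, 2)   # null vector = right singular vector of smallest singular value

# Illustrative usage: x and xp are corresponding inhomogeneous 1D coordinates.
# H = estimate_1d_homography(x, xp)
# xp_est = (H[0, 0] * x + H[0, 1]) / (H[1, 0] * x + H[1, 1])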
@cheind
cheind / svd.py
Created November 5, 2017 15:13
Nice properties about SVD
import numpy as np
import matplotlib.pyplot as plt
# Relation of SVD to PCA and eigen-problems
# A = USV'
# A'A = VSU'USV' = VS^2V'
# A'AV = VS^2V'V
# A'AV = VS^2
# which is an eigenvector problem. This means the columns of V are the eigenvectors of A'A.
# A similar argument shows that the columns of U are the eigenvectors of AA'.
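A quick numerical check of the claim in the comments above: V from the SVD solves the eigen-problem of A'A, with the squared singular values as eigenvalues.

import numpy as np

A = np.random.randn(5, 3)
U, S, Vt = np.linalg.svd(A, full_matrices=False)
w, _ = np.linalg.eigh(A.T @ A)
print(np.allclose(np.sort(w), np.sort(S**2)))    # eigenvalues of A'A equal the squared singular values
print(np.allclose(A.T @ A @ Vt.T, Vt.T * S**2))  # A'A V = V S^2, column by column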
@cheind
cheind / hmm_train_tf.py
Last active December 5, 2022 07:20
HMM training based on gradient descent (Tensorflow version)
__author__ = 'Christoph Heindl'
__copyright__ = 'Copyright 2017'
__license__ = 'BSD'
"""Trains a HMM based on gradient descent optimization.
The parameters (theta) of the model are transition and
emission probabilities, as well as the initial state probabilities.
Given a start solution, the negative log likelihood of data given the
@cheind
cheind / hmm_train_mxnet_imp.py
Created December 17, 2017 10:15
HMM training based on gradient descent (MXNet imperative version)
__author__ = 'Christoph Heindl'
__copyright__ = 'Copyright 2017'
__license__ = 'BSD'
"""Trains a HMM based on gradient descent optimization.
The parameters (theta) of the model are transition and
emission probabilities, as well as the initial state probabilities.
Given a start solution, the negative log likelihood of data given the
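Both the TensorFlow and MXNet gists minimize the negative log likelihood of the observations under the HMM. As a framework-free point of reference, that objective can be computed with the scaled forward algorithm as sketched below; the parameter names are chosen for illustration, and gradient-descent implementations typically reparameterize the probabilities (e.g. via a softmax) so they stay valid during optimization.

import numpy as np

def hmm_neg_log_likelihood(pi, A, B, obs):
    # pi: (S,) initial state probabilities, A: (S, S) transition probabilities,
    # B: (S, O) emission probabilities, obs: (T,) integer observation symbols.
    logp = 0.0
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        c = alpha.sum()                 # rescale to avoid numerical underflow
        logp += np.log(c)
        alpha = (alpha / c) @ A * B[:, o]
    logp += np.log(alpha.sum())
    return -logp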
@cheind
cheind / show_annotations.py
Created February 25, 2018 11:37
BeaverDam dense annotation viewer
import cv2
import json
import pandas as pd
import numpy as np
def convert_to_pandas(content):
events = []
for obj in content:
for f in obj['frames']:
events.append({
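The preview is cut off mid-append. A hedged sketch of how such a flattening into a pandas DataFrame could continue is given below; the field names ('type', 'frame', 'x', 'y', 'w', 'h') are assumptions for illustration and depend on the actual BeaverDam export schema.

import json
import pandas as pd

def convert_to_pandas(content):
    # One row per (object, frame) annotation event.
    events = []
    for obj in content:
        for f in obj['frames']:
            events.append({
                'label': obj.get('type'),   # assumed key names, see note above
                'frame': f.get('frame'),
                'x': f.get('x'),
                'y': f.get('y'),
                'w': f.get('w'),
                'h': f.get('h'),
            })
    return pd.DataFrame(events)

# Illustrative usage:
# with open('annotations.json') as fp:
#     df = convert_to_pandas(json.load(fp))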
@cheind
cheind / matrix_transactions.py
Last active January 11, 2021 07:44
Transactional-like (undoable) matrix operations (row delete, column permute) implemented based on permutation matrices
import numpy as np
from itertools import count
def perm_matrix(perm_indices):
'''Returns the permutation matrix corresponding to the given permutation indices.
Here `perm_indices` defines the permutation order in the following sense:
value `j` at index `i` will move row/column `j` of the original matrix to
row/column `i` in the permuted matrix P*M / M*P^T.
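For reference, one way such a permutation matrix can be built and used, matching the docstring's convention; this is an illustrative re-implementation, not necessarily the gist's exact code.

import numpy as np

def perm_matrix(perm_indices):
    # Row i of P is the unit vector e_{perm_indices[i]}, so
    # (P @ M)[i] == M[perm_indices[i]]: row j moves to row i, as documented above.
    return np.eye(len(perm_indices))[perm_indices]

M = np.arange(9).reshape(3, 3)
P = perm_matrix([2, 0, 1])
print(P @ M)     # rows of M reordered to (2, 0, 1)
print(M @ P.T)   # columns of M reordered the same way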
@cheind
cheind / optcost.py
Last active April 12, 2021 10:01
Linear Programming for Optimizing Funding Costs. See https://doi.org/10.5281/zenodo.4607219 for documentation.
from scipy.optimize import linprog
import numpy as np
import pandas as pd
def print_metrics(df):
print('Total staff costs', df.to_numpy().sum())
print('Management cost ratio')
print(df.MgtStaffCosts / df.to_numpy().sum())
print('Partner cost ratio')
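The actual funding-cost model is documented under the DOI above. As a generic illustration of the linprog call it builds on, here is a toy problem with made-up numbers: minimize the total cost of 100 work units split across two partners, with partner 1 covering at least 40% of the work.

from scipy.optimize import linprog
import numpy as np

c = np.array([1.0, 1.2])            # hypothetical cost per work unit for partners 1 and 2
A_ub = np.array([[-0.6, 0.4]])      # -0.6*x1 + 0.4*x2 <= 0  <=>  x1 >= 0.4*(x1 + x2)
b_ub = np.array([0.0])
A_eq = np.array([[1.0, 1.0]])       # x1 + x2 == 100 work units in total
b_eq = np.array([100.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 2)
print(res.x, res.fun)               # optimal split and minimal total cost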