Jean-Christophe Loiseau (loiseaujc)
@loiseaujc
loiseaujc / Rosenblatt.py
Last active December 11, 2021 20:23
Implementation of Rosenblatt's perceptron using Python.
# --> Import standard Python libraries.
import numpy as np
# --> Import sklearn utility functions to create derived-class objects.
from sklearn.base import BaseEstimator, ClassifierMixin
# --> Redefine the Heaviside function.
H = lambda x: np.heaviside(x, 1).astype(int)
class Rosenblatt(BaseEstimator, ClassifierMixin):
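The class body is cut off by the page. As an illustration of Rosenblatt's learning rule, here is a minimal stand-alone sketch in plain NumPy (the function name and defaults are hypothetical, not the gist's sklearn-style class):

```python
import numpy as np

def perceptron_fit(X, y, lr=1.0, epochs=100):
    """Rosenblatt's perceptron: update the weights only on misclassified samples."""
    # Append a constant column so the bias is learned as one extra weight.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            # Heaviside activation: predict 1 whenever w @ xi >= 0.
            pred = 1 if w @ xi >= 0 else 0
            # Error-driven update: no change if the sample is classified correctly.
            w += lr * (yi - pred) * xi
            errors += int(pred != yi)
        if errors == 0:   # every sample classified correctly: stop early.
            break
    return w
```

For linearly separable data the loop is guaranteed to terminate in a finite number of updates (perceptron convergence theorem).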
@loiseaujc
loiseaujc / spod.py
Created December 6, 2019 09:03
Python implementation of the spectral proper orthogonal decomposition.
"""
Spectral Proper Orthogonal Decomposition
-----------------------------------------
This module implements the Spectral Proper Orthogonal Decomposition class. The
present implementation corresponds to the batch algorithm originally proposed
in [1]. Note that a streaming algorithm has also been proposed in [2].
References
----------
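The module body is truncated after the docstring. As an illustration of the batch algorithm it describes, here is a compact, hypothetical sketch (Welch-style blocking, per-block DFT, then an eigendecomposition of the cross-spectral ensemble at each frequency; all names and defaults are mine, not the gist's):

```python
import numpy as np

def spod_sketch(Q, nfft=64, noverlap=32):
    """Batch SPOD of a snapshot matrix Q of shape (n_space, n_time)."""
    n, m = Q.shape
    step = nfft - noverlap
    nblocks = (m - noverlap) // step
    window = np.hanning(nfft)
    # --> DFT of each windowed block: shape (nblocks, n_space, nfft).
    Qhat = np.stack([
        np.fft.fft(Q[:, i * step:i * step + nfft] * window, axis=1)
        for i in range(nblocks)
    ])
    nfreq = nfft // 2 + 1
    eigvals = np.zeros((nfreq, nblocks))
    modes = np.zeros((nfreq, n, nblocks), dtype=complex)
    for f in range(nfreq):
        # --> Ensemble of Fourier realizations at this frequency.
        Qf = Qhat[:, :, f].T / np.sqrt(nblocks)
        # --> Eigendecomposition of the small block-correlation matrix
        #     (method of snapshots), sorted by decreasing energy.
        lam, psi = np.linalg.eigh(Qf.conj().T @ Qf)
        idx = np.argsort(lam)[::-1]
        lam, psi = lam[idx], psi[:, idx]
        eigvals[f] = lam
        modes[f] = Qf @ psi / np.sqrt(np.maximum(lam, 1e-30))
    return eigvals, modes
```

At each frequency the eigenvalues rank the modes by energy, and the modes form an orthonormal basis for that frequency's fluctuations.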
@loiseaujc
loiseaujc / Adaline.py
Created May 4, 2020 11:07
Implementation of Adaline (Adaptive Linear Neurons) in Python.
# --> Import standard Python libraries.
import numpy as np
# --> Import sklearn utility functions to create derived-class objects.
from sklearn.base import BaseEstimator, ClassifierMixin
# --> Redefine the Heaviside function.
def H(x): return np.heaviside(x-0.5, 1).astype(int)
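As with the perceptron gist, the class body is cut off here. A minimal sketch of the Widrow-Hoff delta rule that Adaline is built on (batch gradient descent on the squared error of the *linear* output; names and defaults are hypothetical):

```python
import numpy as np

def adaline_fit(X, y, lr=0.1, epochs=500):
    """Adaline: minimize the squared error of the linear output (delta rule)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # bias as an extra weight
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        output = Xb @ w                          # linear activation while training
        w += lr * Xb.T @ (y - output) / len(y)   # batch gradient step on the MSE
    return w

def adaline_predict(X, w):
    # Threshold the linear output at 0.5, matching H(x - 0.5) above.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ w >= 0.5).astype(int)
```

Unlike the perceptron, the weights are updated from the linear output on every pass, so training converges toward the least-squares solution even when the data are not perfectly separable.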
# --> Import standard Python libraries.
import numpy as np
from scipy.special import expit
from scipy.linalg import norm
# --> Import sklearn utility functions.
from sklearn.base import BaseEstimator, ClassifierMixin
class LogisticRegression_GD(BaseEstimator, ClassifierMixin):
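The class body is truncated. A hypothetical stand-alone sketch of the same idea, logistic regression fitted by fixed-step gradient descent on the log-loss, using the modules this fragment imports:

```python
import numpy as np
from scipy.special import expit
from scipy.linalg import norm

def logreg_gd(X, y, lr=0.1, maxiter=5000, tol=1e-6):
    """Logistic regression fitted by plain gradient descent."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(maxiter):
        p = expit(Xb @ w)                  # sigmoid of the logits
        grad = Xb.T @ (p - y) / len(y)     # gradient of the mean log-loss
        w -= lr * grad
        if norm(grad) < tol:               # stop when (nearly) stationary
            break
    return w
```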
# --> Import standard Python libraries.
import numpy as np
from scipy.special import expit
from scipy.linalg import norm
# --> Import sklearn utility functions.
from sklearn.base import BaseEstimator, ClassifierMixin
class LogisticRegression_Newton(BaseEstimator, ClassifierMixin):
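Again truncated. The Newton variant replaces the fixed learning rate with second-order curvature information; a hypothetical sketch (the tiny ridge on the Hessian is my addition, for numerical safety on separable data):

```python
import numpy as np
from scipy.special import expit
from scipy.linalg import solve

def logreg_newton(X, y, maxiter=25, tol=1e-8):
    """Logistic regression fitted with Newton's method (a.k.a. IRLS)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(maxiter):
        p = expit(Xb @ w)                      # predicted probabilities
        grad = Xb.T @ (p - y)                  # gradient of the log-loss
        # Hessian X^T diag(p(1-p)) X, plus a tiny ridge for numerical safety.
        H = Xb.T @ (Xb * (p * (1 - p))[:, None]) + 1e-8 * np.eye(Xb.shape[1])
        step = solve(H, grad, assume_a="pos")
        w -= step
        if np.linalg.norm(step) < tol:
            break
    return w
```

Because the log-loss is convex with an explicit Hessian, Newton's method typically converges in a handful of iterations where gradient descent needs thousands.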
# Author : Jean-Christophe Loiseau <jean-christophe.loiseau@ensam.eu>
# Date : July 2020
# --> Standard python libraries.
import numpy as np
from scipy.linalg import pinv, eigh, eig
def dmd_analysis(x, y=None, rank=2):
# --> Partition data matrix into X and Y.
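The fragment stops right after the partitioning comment. A hypothetical end-to-end sketch of exact DMD consistent with that signature (rank-truncated SVD of X, reduced propagator, eigendecomposition; the function name is mine):

```python
import numpy as np
from scipy.linalg import eig

def dmd_sketch(x, rank=2):
    """Exact DMD of a snapshot matrix x of shape (n_space, n_time)."""
    # --> Partition data matrix into time-shifted pairs X and Y (Y ~ A X).
    X, Y = x[:, :-1], x[:, 1:]
    # --> Rank-truncated SVD of X.
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # --> Reduced-order propagator: Atilde = U* Y V inv(S).
    Atilde = U.conj().T @ Y @ Vh.conj().T / s
    # --> Its eigenvalues are the DMD eigenvalues; W maps to exact DMD modes.
    lam, W = eig(Atilde)
    modes = Y @ (Vh.conj().T / s) @ W
    return lam, modes
```

On data generated by a linear map, the DMD eigenvalues recover the eigenvalues of that map exactly (up to the truncation rank).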
@loiseaujc
loiseaujc / SoftMax_regression.py
Last active February 15, 2021 11:36
Simple implementation of SoftMax regression using gradient descent with quasi-optimal adaptive learning rate.
# --> Import standard Python libraries.
import numpy as np
from scipy.special import softmax
from scipy.linalg import norm
from scipy.optimize import line_search, minimize_scalar
# --> Import sklearn utility functions.
from sklearn.base import BaseEstimator, ClassifierMixin
def SoftMax(x):
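The fragment ends at the function header. A hypothetical sketch of the approach the description states, gradient descent where the "quasi-optimal adaptive learning rate" comes from a one-dimensional line search (here via `minimize_scalar`, which the gist imports; names and bounds are mine):

```python
import numpy as np
from scipy.special import softmax
from scipy.optimize import minimize_scalar

def softmax_gd(X, y, n_classes, maxiter=100):
    """SoftMax regression: gradient descent with a 1-D line search per step."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    Y = np.eye(n_classes)[y]                 # one-hot encoded targets
    W = np.zeros((Xb.shape[1], n_classes))

    def loss(W):
        P = softmax(Xb @ W, axis=1)
        return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

    for _ in range(maxiter):
        P = softmax(Xb @ W, axis=1)
        G = Xb.T @ (P - Y) / len(y)          # cross-entropy gradient
        # --> Quasi-optimal step: minimize the loss along -G.
        res = minimize_scalar(lambda a: loss(W - a * G),
                              bounds=(0.0, 10.0), method="bounded")
        W = W - res.x * G
    return W
```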
using LinearAlgebra
function sensor_placement(Ψ)
# --> Compute the QR w/ column pivoting decomposition of Ψ.
_, _, p = qr(transpose(Ψ), ColumnNorm())
return p[1:size(Ψ, 2)]
end
from scipy.linalg import qr
def sensor_placement(Psi):
# --> Perform QR w/ column pivoting.
_, _, p = qr(Psi.T, pivoting=True, mode="economic")
return p[:Psi.shape[1]]
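A usage sketch of the Python version (the function is restated so the snippet runs standalone; the random orthonormal basis is a hypothetical stand-in for a POD basis):

```python
import numpy as np
from scipy.linalg import qr

def sensor_placement(Psi):
    # --> QR with column pivoting on the transposed mode basis; the first
    #     r pivots are quasi-optimal sensor locations (QDEIM-type selection).
    _, _, p = qr(Psi.T, pivoting=True, mode="economic")
    return p[:Psi.shape[1]]

# Usage sketch: r = 3 orthonormal "modes" on a 50-point grid.
rng = np.random.default_rng(0)
Psi, _ = np.linalg.qr(rng.standard_normal((50, 3)))
sensors = sensor_placement(Psi)
```

The returned indices point at rows of Psi (grid points); the pivoting guarantees the submatrix `Psi[sensors]` is well conditioned, so measurements at those points determine the modal amplitudes.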
using LinearAlgebra
using Convex, SCS
# --> Direct resolution of the measurement equation.
lstsq(Θ, y) = Θ \ y
# --> Constrained least-squares formulation.
function cstrnd_lstsq(Θ, y, Σ)
# --> Optimization variable.
a = Convex.Variable(length(Σ))
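The Convex.jl fragment is truncated here. For the unconstrained branch, a NumPy analogue of the Julia one-liner `lstsq(Θ, y) = Θ \ y` (the sizes in the usage sketch are hypothetical):

```python
import numpy as np

# --> Direct resolution of the measurement equation y = Theta @ a.
def lstsq(Theta, y):
    a, *_ = np.linalg.lstsq(Theta, y, rcond=None)
    return a

# Usage sketch with a hypothetical overdetermined measurement matrix.
rng = np.random.default_rng(1)
Theta = rng.standard_normal((20, 5))
a_true = rng.standard_normal(5)
y = Theta @ a_true
a_hat = lstsq(Theta, y)
```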