Jean-Christophe Loiseau (loiseaujc)

import numpy as np
import matplotlib.pyplot as plt
from scipy.linalg import solve_continuous_lyapunov as clyap
from scipy.linalg import svd
# --> Utility function.
vec2array = lambda x, ny, nz : x.reshape(ny, nz)
array2vec = lambda x : x.flatten()
# --> Differential operators.
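
The preview cuts off at the differential operators. Purely as an assumption about what follows, here is a sketch of a second-order centered finite-difference Laplacian on a uniform ny-by-nz grid with Dirichlet boundaries, ordered consistently with the x.reshape(ny, nz) convention of vec2array above (it reuses the numpy import from the preview).

def laplacian_2d(ny, nz, dy=1.0, dz=1.0):
    # 1-D second-derivative matrix with homogeneous Dirichlet boundaries.
    def d2(n, h):
        D = np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
        return D / h**2
    # y varies along the rows and z along the columns of the reshaped array.
    return np.kron(d2(ny, dy), np.eye(nz)) + np.kron(np.eye(ny), d2(nz, dz))

# A Lyapunov solve with the clyap import above could then look like (again, an assumption):
# W = clyap(laplacian_2d(ny, nz), -np.eye(ny * nz))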
loiseaujc / LQE.ipynb
Last active January 31, 2024 09:07
Notebook for the control class on the Kalman filter
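Since the notebook preview does not render here, the following is only a minimal Python sketch of the steady-state Kalman (LQE) gain computation such a class typically covers, using scipy's Riccati solver; the matrices A, C, Q, R are placeholders and this is not the notebook's actual code.

import numpy as np
from scipy.linalg import solve_continuous_are

def kalman_gain(A, C, Q, R):
    # Steady-state error covariance from the dual continuous-time Riccati equation.
    P = solve_continuous_are(A.T, C.T, Q, R)
    # Kalman gain L for the estimator dxhat/dt = A xhat + L (y - C xhat).
    return P @ C.T @ np.linalg.inv(R)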
loiseaujc / LQR.ipynb
Last active January 30, 2024 23:27
Notebook for the control class on Linear Quadratic Regulator
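The LQR notebook does not render either; here is a minimal Python sketch of the standard infinite-horizon LQR gain computation with scipy (placeholder matrices, not the notebook's code).

import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    # Solve the continuous-time algebraic Riccati equation.
    P = solve_continuous_are(A, B, Q, R)
    # Optimal state-feedback gain, u = -K x.
    return np.linalg.inv(R) @ B.T @ P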
using StructuredOptimization
"""
Simple implementation of basis pursuit denoising using StructuredOptimization.jl
INPUT
-----
C : The measurement matrix.
Ψ : Basis in which x is assumed to be sparse.
y : Pixel measurements.
λ : (Optional) Sparsity knob.
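
For reference, a rough Python analog of the basis pursuit denoising problem described by this docstring, written with cvxpy rather than the gist's StructuredOptimization.jl (the function name and the l1-penalized formulation are my assumptions).

import cvxpy as cp

def bpdn(C, Psi, y, lam=1.0):
    # Coefficients of the image in the sparsifying basis Psi.
    a = cp.Variable(Psi.shape[1])
    # l1-penalized least squares: fit the pixel measurements while keeping a sparse.
    objective = cp.Minimize(cp.sum_squares(C @ Psi @ a - y) + lam * cp.norm1(a))
    cp.Problem(objective).solve()
    # Return the reconstructed image as a vector.
    return Psi @ a.value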
loiseaujc / optimized_bpdn.jl
Created February 17, 2021 13:51
Implementation of basis pursuit denoising with StructuredOptimization.jl
using StructuredOptimization
"""
Simple implementation of basis pursuit denoising using StructuredOptimization.jl
INPUT
-----
m, n : Size of the image in both directions.
idx : Linear indices of the measured pixels.
y : Pixel measurements.
using LinearAlgebra
using Convex, SCS
# --> Direct resolution of the measurement equation.
lstsq(Θ, y) = Θ \ y
# --> Constrained least-squares formulation.
function cstrnd_lstsq(Θ, y, Σ)
    # --> Optimization variable.
    a = Convex.Variable(length(Σ))
from scipy.linalg import qr
def sensor_placement(Psi):
    # --> Perform QR w/ column pivoting.
    _, _, p = qr(Psi.T, pivoting=True, mode="economic")
    return p[:Psi.shape[1]]
using LinearAlgebra
function sensor_placement(Ψ)
    # --> Compute the QR w/ column pivoting decomposition of Ψ.
    _, _, p = qr(transpose(Ψ), Val(true))
    return p[1:size(Ψ, 2)]
end
loiseaujc / SoftMax_regression.py
Last active February 15, 2021 11:36
Simple implementation of SoftMax regression using gradient descent with quasi-optimal adaptive learning rate.
# --> Import standard Python libraries.
import numpy as np
from scipy.special import softmax
from scipy.linalg import norm
from scipy.optimize import line_search, minimize_scalar
# --> Import sklearn utility functions.
from sklearn.base import BaseEstimator, ClassifierMixin
def SoftMax(x):
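
The preview stops at the SoftMax helper. As a self-contained sketch of the approach described above, and not the gist's actual implementation, one can fit the weights by gradient descent and choose each step size with a one-dimensional line search, which plays the role of the quasi-optimal adaptive learning rate.

import numpy as np
from scipy.special import softmax
from scipy.optimize import minimize_scalar

def softmax_regression(X, y, n_classes, n_iter=100):
    # One-hot encode the integer class labels.
    Y = np.eye(n_classes)[y]
    n_samples, n_features = X.shape
    W = np.zeros((n_features, n_classes))

    # Mean cross-entropy loss of a candidate weight matrix.
    def loss(W):
        P = softmax(X @ W, axis=1)
        return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

    for _ in range(n_iter):
        # Gradient of the mean cross-entropy w.r.t. the weights.
        P = softmax(X @ W, axis=1)
        G = X.T @ (P - Y) / n_samples
        # Quasi-optimal step: minimize the loss along the descent direction.
        alpha = minimize_scalar(lambda a: loss(W - a * G), bounds=(0, 10), method="bounded").x
        W = W - alpha * G
    return W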
# Author : Jean-Christophe Loiseau <jean-christophe.loiseau@ensam.eu>
# Date : July 2020
# --> Standard python libraries.
import numpy as np
from scipy.linalg import pinv, eigh, eig
def dmd_analysis(x, y=None, rank=2):
    # --> Partition data matrix into X and Y.
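
The preview ends at the partitioning step. A minimal self-contained sketch of exact DMD along these lines, as an illustration rather than a reproduction of the gist beyond this point:

import numpy as np
from scipy.linalg import svd, eig

def dmd_sketch(x, y=None, rank=2):
    # Time-shifted snapshot matrices X and Y (columns are snapshots).
    if y is None:
        x, y = x[:, :-1], x[:, 1:]
    # Rank-truncated SVD of X.
    U, s, Vh = svd(x, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Low-rank approximation of the linear propagator.
    Atilde = U.conj().T @ y @ Vh.conj().T @ np.diag(1.0 / s)
    # DMD eigenvalues and exact DMD modes.
    eigvals, W = eig(Atilde)
    modes = y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes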