Fabian Pedregosa (fabianp)
@fabianp
fabianp / group_lasso.py
Created December 2, 2011 14:17
group lasso
import numpy as np
from scipy import linalg, optimize
MAX_ITER = 100
def group_lasso(X, y, alpha, groups, max_iter=MAX_ITER, rtol=1e-6,
                verbose=False):
    """
    Linear least-squares solver with l2/l1 (group lasso) regularization.
    """
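The preview stops at the docstring, so the gist's actual algorithm isn't visible here. As a hedged illustration of the technique the docstring names (not fabianp's implementation), a minimal proximal-gradient (ISTA) solver with group soft-thresholding could look like this; the function name, step-size choice, and stopping rule are my own:

```python
import numpy as np
from scipy import linalg

def group_lasso_ista(X, y, alpha, groups, max_iter=100, rtol=1e-6):
    """Minimize 0.5 * ||X w - y||^2 + alpha * sum_g ||w_g||_2 by ISTA.

    ``groups`` is a sequence of index arrays, one per disjoint group.
    """
    w = np.zeros(X.shape[1])
    step = 1.0 / linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(max_iter):
        w_old = w.copy()
        w = w - step * X.T.dot(X.dot(w) - y)  # gradient step on the quadratic
        for g in groups:
            norm_g = linalg.norm(w[g])
            # group soft-thresholding: the proximal operator of the l2/l1 penalty
            if norm_g <= step * alpha:
                w[g] = 0.0
            else:
                w[g] *= 1.0 - step * alpha / norm_g
        if linalg.norm(w - w_old) <= rtol * max(linalg.norm(w), 1.0):
            break
    return w
```

With `alpha = 0` this reduces to plain gradient descent on least squares; a large `alpha` drives whole groups of coefficients exactly to zero at once, which is the point of the penalty.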
@agramfort
agramfort / lowess.py
Last active August 16, 2023 06:19
LOWESS : Locally weighted regression
"""
This module implements the Lowess function for nonparametric regression.
Functions:
lowess Fit a smooth nonparametric regression curve to a scatterplot.
For more information, see
William S. Cleveland: "Robust locally weighted regression and smoothing
scatterplots", Journal of the American Statistical Association, December 1979,
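The lowess.py file itself is not shown here. As a rough sketch of Cleveland's idea only (not agramfort's code), a bare-bones LOWESS fits, at each point, a linear regression weighted by the tricube kernel over the nearest fraction `f` of the data; the robustifying iterations of the full algorithm are omitted, and the function name is my own:

```python
import numpy as np

def lowess_sketch(x, y, f=0.5):
    """Crude LOWESS: for each x[i], fit a tricube-weighted linear regression
    over the fraction ``f`` of nearest points. Assumes distinct x values."""
    n = len(x)
    r = max(int(np.ceil(f * n)), 2)          # neighbourhood size
    A = np.stack([np.ones(n), x], axis=1)    # design matrix: intercept + slope
    y_smooth = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        h = np.sort(d)[r - 1]                # bandwidth: r-th nearest distance
        w = (1.0 - np.clip(d / h, 0.0, 1.0) ** 3) ** 3  # tricube weights
        WA = w[:, None] * A
        beta = np.linalg.solve(A.T @ WA, WA.T @ y)      # weighted least squares
        y_smooth[i] = beta[0] + beta[1] * x[i]
    return y_smooth
```

On exactly linear data the local fits recover the line; on noisy data the output is a smooth curve whose wiggliness is controlled by `f`.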
@BenoitDamota
BenoitDamota / light_pca.py
Last active December 10, 2015 00:39
LightPCA : PCA with little memory footprint.
"""
LightPCA : PCA with little memory footprint, but can only fit_transform()
the data (no transform(), no inverse_transform()).
Example on the targeted data sizes (be sure to have at least 5-6 GB of free memory):
>>> import numpy as np
>>> from light_pca import LightPCA
>>> X = np.random.randn(1301, 500000) # ~5GB
>>> pca = LightPCA(copy=False, n_components=1300)
>>> X = pca.fit_transform(X)
"""
@pprett
pprett / bench_yahoo_ltrc.py
Created March 13, 2012 06:29
Sklearn Yahoo LTRC 2010 Benchmark script
import numpy as np
import svmlight_loader
from sklearn.ensemble import GradientBoostingRegressor
from time import time
ROOT_DIR = '/home/pprett/corpora/yahoo-ltrc-2010/data'
X_train, y_train = svmlight_loader.load_svmlight_file(
    ROOT_DIR + '/set1.train.txt', n_features=700)
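The corpus above lives on pprett's local disk, so the script can't be run as-is. As a runnable stand-in showing the same timing pattern (synthetic data with invented sizes, not the Yahoo LTRC set):

```python
import numpy as np
from time import time
from sklearn.ensemble import GradientBoostingRegressor

# synthetic stand-in for the local Yahoo LTRC files used in the gist
rng = np.random.RandomState(0)
X_train = rng.randn(2000, 50)
y_train = X_train[:, 0] + 0.1 * rng.randn(2000)

t0 = time()
model = GradientBoostingRegressor(n_estimators=100, max_depth=3,
                                  learning_rate=0.1)
model.fit(X_train, y_train)
print("train time: %.2fs" % (time() - t0))
```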
@fabianp
fabianp / gist:1076792
Created July 11, 2011 21:04
Ridge regression
import numpy as np
from scipy import linalg
def ridge(A, b, alphas):
    """Return coefficients for regularized least squares

    min_x ||A x - b||^2 + alpha ||x||^2, one row of coefficients per
    value in ``alphas``, computed from a single SVD of A.
    """
    alphas = np.asarray(alphas)
    U, s, Vt = linalg.svd(A, full_matrices=False)
    # s / (s**2 + alpha) applied to U.T b, written as U.T b / (s + alpha / s)
    d = np.dot(U.T, b) / (s + alphas[:, np.newaxis] / s)
    return np.dot(d, Vt)
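A usage sketch: the function is restated so the example runs on its own, and the test problem is invented (noiseless data, so a tiny alpha should recover the true coefficients while a large alpha shrinks them):

```python
import numpy as np
from scipy import linalg

def ridge(A, b, alphas):
    """Ridge coefficients via SVD: min_x ||A x - b||^2 + alpha ||x||^2."""
    alphas = np.asarray(alphas)
    U, s, Vt = linalg.svd(A, full_matrices=False)
    d = np.dot(U.T, b) / (s + alphas[:, np.newaxis] / s)
    return np.dot(d, Vt)

# invented test problem: b lies exactly in the range of A
rng = np.random.RandomState(0)
A = rng.randn(50, 10)
x_true = rng.randn(10)
b = A.dot(x_true)

coefs = ridge(A, b, np.array([1e-8, 1.0, 100.0]))
print(coefs.shape)  # one row of coefficients per alpha: (3, 10)
```

The single SVD amortizes nicely when sweeping a regularization path, since only the cheap rescaling by `s / (s**2 + alpha)` changes per alpha.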