% Test file for PR
@Inbook{Bengio2006,
author="Bengio, Yoshua
and Delalleau, Olivier
and Le Roux, Nicolas
and Paiement, Jean-Fran{\c{c}}ois
and Vincent, Pascal
and Ouimet, Marie",
editor="Guyon, Isabelle
and Nikravesh, Masoud
and Gunn, Steve
and Zadeh, Lotfi A.",
title="Spectral Dimensionality Reduction",
bookTitle="Feature Extraction: Foundations and Applications",
year="2006",
publisher="Springer Berlin Heidelberg",
address="Berlin, Heidelberg",
pages="519--550",
abstract="In this chapter, we study and put under a common framework a number of non-linear dimensionality reduction methods, such as Locally Linear Embedding, Isomap, Laplacian eigenmaps and kernel PCA, which are based on performing an eigen-decomposition (hence the name ``spectral''). That framework also includes classical methods such as PCA and metric multidimensional scaling (MDS). It also includes the data transformation step used in spectral clustering. We show that in all of these cases the learning algorithm estimates the principal eigenfunctions of an operator that depends on the unknown data density and on a kernel that is not necessarily positive semi-definite. This helps generalize some of these algorithms so as to predict an embedding for out-of-sample examples without having to retrain the model. It also makes it more transparent what these algorithms are minimizing on the empirical data and gives a corresponding notion of generalization error.",
isbn="978-3-540-35488-8",
doi="10.1007/978-3-540-35488-8_28",
url="https://doi.org/10.1007/978-3-540-35488-8_28"
}