@goerz
Last active September 7, 2020
PDF bookmarks for "James, Witten, Hastie, Tibshirani - An Introduction to Statistical Learning" (LaTeX)

This gist contains out.tex, a LaTeX file that adds a PDF outline ("bookmarks") to the freely available PDF file of the book

An Introduction to Statistical Learning with Applications in R, by Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani

http://www-bcf.usc.edu/~gareth/ISL/index.html

The bookmarks make it possible to navigate the book's contents while reading it on a screen.

Usage

  • download http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Seventh%20Printing.pdf as in.pdf
  • create cover.pdf from any image of the cover you can find on the web (print it to PDF), use the file cover.pdf attached to this gist, or compile the minimal sketch after this list
  • download out.tex into the same folder as in.pdf and cover.pdf
  • compile with pdflatex out.tex
  • rename the resulting output file out.pdf to e.g. James, Witten, Hastie, Tibshirani - An Introduction to Statistical Learning.pdf
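For the cover step, the following is a minimal sketch of a cover.tex that wraps a downloaded cover image into cover.pdf; the filename cover.jpg and the 155mm width are assumptions, so adjust them to the image you actually saved:

% cover.tex -- wrap a cover image into a one-page PDF;
% running 'pdflatex cover.tex' produces the cover.pdf used above
\documentclass{standalone}
\usepackage{graphicx}
\begin{document}
% 155mm matches the book's original page width (an assumption; adjust freely)
\includegraphics[width=155mm]{cover.jpg}
\end{document}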
% Usage:
% * download http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Seventh%20Printing.pdf
% as 'in.pdf'
% * create 'cover.pdf' from any image of the cover you can find on the web
% (print to pdf)
% * store this file as 'out.tex', and compile as 'pdflatex out.tex'
% * rename output file to e.g.
% 'James, Witten, Hastie, Tibshirani - An Introduction to Statistical Learning.pdf'
\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage{geometry}
%\geometry{papersize={155mm,235mm}}
% You may uncomment the above line to create the book in the original size.
% Otherwise, the output page size will be the default letter or A4, which
% I prefer (the extra margins leave room for notes)
\usepackage{pdfpages}
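% pdfpages provides the \includepdf command used below to pull the pages
% of cover.pdf and in.pdf into this document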
\usepackage[
pdfpagelabels=true,
pdftitle={An introduction to statistical learning: with applications in R},
pdfauthor={Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani},
pdfsubject={Mathematical statistics, R},
pdfkeywords={},
unicode=true,
]{hyperref}
\usepackage{bookmark}
\begin{document}
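% With pdfpagelabels=true (set in the hyperref options above), the
% \pagenumbering and \setcounter calls below become PDF page labels, so a
% viewer displays the same page numbers as the printed book: Roman for the
% cover pages, roman for the front matter, arabic for the body.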
\pagenumbering{Roman}
\setcounter{page}{1}
\includepdf[pages={1, {}}]{cover.pdf} % with extra blank page
\pagenumbering{roman}
\setcounter{page}{1}
\includepdf[pages=1-14]{in.pdf}
\pagenumbering{arabic}
\setcounter{page}{1}
\includepdf[pages=15-]{in.pdf}
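% Each \bookmark below targets an absolute page of the assembled PDF, not a
% printed page number of the book. For the arabic-numbered body the two
% differ by 16 (2 cover pages + 14 front-matter pages); e.g. printed page 1
% of Chapter 1 is PDF page 17.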
\bookmark[page=1,level=0]{Cover}
\bookmark[page=5,level=0]{Title Page}
\bookmark[page=9,level=0]{Preface}
\bookmark[page=11,level=0]{Contents}
\bookmark[page=17,level=0]{1.: Introduction}
\bookmark[page=31,level=0]{2.: Statistical Learning}
\bookmark[page=31,level=1]{2.1.: What Is Statistical Learning?}
\bookmark[page=33,level=2]{2.1.1.: Why Estimate $f$?}
\bookmark[page=37,level=2]{2.1.2.: How Do We Estimate $f$?}
\bookmark[page=40,level=2]{2.1.3.: The Trade-Off Between Prediction Accuracy and Model Interpretability}
\bookmark[page=42,level=2]{2.1.4.: Supervised Versus Unsupervised Learning}
\bookmark[page=44,level=2]{2.1.5.: Regression Versus Classification Problems}
\bookmark[page=45,level=1]{2.2.: Assessing Model Accuracy}
\bookmark[page=45,level=2]{2.2.1.: Measuring the Quality of Fit}
\bookmark[page=49,level=2]{2.2.2.: The Bias-Variance Trade-Off}
\bookmark[page=53,level=2]{2.2.3.: The Classification Setting}
\bookmark[page=58,level=1]{2.3.: Lab: Introduction to R}
\bookmark[page=58,level=2]{2.3.1.: Basic Commands}
\bookmark[page=61,level=2]{2.3.2.: Graphics}
\bookmark[page=63,level=2]{2.3.3.: Indexing Data}
\bookmark[page=64,level=2]{2.3.4.: Loading Data}
\bookmark[page=65,level=2]{2.3.5.: Additional Graphical and Numerical Summaries}
\bookmark[page=68,level=1]{2.4.: Exercises}
\bookmark[page=75,level=0]{3.: Linear Regression}
\bookmark[page=77,level=1]{3.1.: Simple Linear Regression}
\bookmark[page=77,level=2]{3.1.1.: Estimating the Coefficients}
\bookmark[page=79,level=2]{3.1.2.: Assessing the Accuracy of the Coefficient Estimates}
\bookmark[page=84,level=2]{3.1.3.: Assessing the Accuracy of the Model}
\bookmark[page=87,level=1]{3.2.: Multiple Linear Regression}
\bookmark[page=88,level=2]{3.2.1.: Estimating the Regression Coefficients}
\bookmark[page=91,level=2]{3.2.2.: Some Important Questions}
\bookmark[page=98,level=1]{3.3.: Other Considerations in the Regression Model}
\bookmark[page=98,level=2]{3.3.1.: Qualitative Predictors}
\bookmark[page=102,level=2]{3.3.2.: Extensions of the Linear Model}
\bookmark[page=108,level=2]{3.3.3.: Potential Problems}
\bookmark[page=118,level=1]{3.4.: The Marketing Plan}
\bookmark[page=120,level=1]{3.5.: Comparison of Linear Regression with K-Nearest Neighbors}
\bookmark[page=125,level=1]{3.6.: Lab: Linear Regression}
\bookmark[page=125,level=2]{3.6.1.: Libraries}
\bookmark[page=126,level=2]{3.6.2.: Simple Linear Regression}
\bookmark[page=129,level=2]{3.6.3.: Multiple Linear Regression}
\bookmark[page=131,level=2]{3.6.4.: Interaction Terms}
\bookmark[page=131,level=2]{3.6.5.: Non-linear Transformations of the Predictors}
\bookmark[page=133,level=2]{3.6.6.: Qualitative Predictors}
\bookmark[page=135,level=2]{3.6.7.: Writing Functions}
\bookmark[page=136,level=1]{3.7.: Exercises}
\bookmark[page=143,level=0]{4.: Classification}
\bookmark[page=144,level=1]{4.1.: An Overview of Classification}
\bookmark[page=145,level=1]{4.2.: Why Not Linear Regression?}
\bookmark[page=146,level=1]{4.3.: Logistic Regression}
\bookmark[page=147,level=2]{4.3.1.: The Logistic Model}
\bookmark[page=149,level=2]{4.3.2.: Estimating the Regression Coefficients}
\bookmark[page=150,level=2]{4.3.3.: Making Predictions}
\bookmark[page=151,level=2]{4.3.4.: Multiple Logistic Regression}
\bookmark[page=153,level=2]{4.3.5.: Logistic Regression for $>2$ Response Classes}
\bookmark[page=154,level=1]{4.4.: Linear Discriminant Analysis}
\bookmark[page=154,level=2]{4.4.1.: Using Bayes' Theorem for Classification}
\bookmark[page=155,level=2]{4.4.2.: Linear Discriminant Analysis for $p=1$}
\bookmark[page=158,level=2]{4.4.3.: Linear Discriminant Analysis for $p>1$}
\bookmark[page=165,level=2]{4.4.4.: Quadratic Discriminant Analysis}
\bookmark[page=167,level=1]{4.5.: A Comparison of Classification Methods}
\bookmark[page=170,level=1]{4.6.: Lab: Logistic Regression, LDA, QDA, and KNN}
\bookmark[page=170,level=2]{4.6.1.: The Stock Market Data}
\bookmark[page=172,level=2]{4.6.2.: Logistic Regression}
\bookmark[page=177,level=2]{4.6.3.: Linear Discriminant Analysis}
\bookmark[page=178,level=2]{4.6.4.: Quadratic Discriminant Analysis}
\bookmark[page=179,level=2]{4.6.5.: K-Nearest Neighbors}
\bookmark[page=180,level=2]{4.6.6.: An Application to Caravan Insurance Data}
\bookmark[page=184,level=1]{4.7.: Exercises}
\bookmark[page=191,level=0]{5.: Resampling Methods}
\bookmark[page=192,level=1]{5.1.: Cross-Validation}
\bookmark[page=192,level=2]{5.1.1.: The Validation Set Approach}
\bookmark[page=194,level=2]{5.1.2.: Leave-One-Out Cross-Validation}
\bookmark[page=197,level=2]{5.1.3.: k-Fold Cross-Validation}
\bookmark[page=199,level=2]{5.1.4.: Bias-Variance Trade-Off for k-Fold Cross-Validation}
\bookmark[page=200,level=2]{5.1.5.: Cross-Validation on Classification Problems}
\bookmark[page=203,level=1]{5.2.: The Bootstrap}
\bookmark[page=206,level=1]{5.3.: Lab: Cross-Validation and the Bootstrap}
\bookmark[page=207,level=2]{5.3.1.: The Validation Set Approach}
\bookmark[page=208,level=2]{5.3.2.: Leave-One-Out Cross-Validation}
\bookmark[page=209,level=2]{5.3.3.: k-Fold Cross-Validation}
\bookmark[page=210,level=2]{5.3.4.: The Bootstrap}
\bookmark[page=213,level=1]{5.4.: Exercises}
\bookmark[page=219,level=0]{6.: Linear Model Selection and Regularization}
\bookmark[page=221,level=1]{6.1.: Subset Selection}
\bookmark[page=221,level=2]{6.1.1.: Best Subset Selection}
\bookmark[page=223,level=2]{6.1.2.: Stepwise Selection}
\bookmark[page=226,level=2]{6.1.3.: Choosing the Optimal Model}
\bookmark[page=230,level=1]{6.2.: Shrinkage Methods}
\bookmark[page=231,level=2]{6.2.1.: Ridge Regression}
\bookmark[page=235,level=2]{6.2.2.: The Lasso}
\bookmark[page=243,level=2]{6.2.3.: Selecting the Tuning Parameter}
\bookmark[page=244,level=1]{6.3.: Dimension Reduction Methods}
\bookmark[page=246,level=2]{6.3.1.: Principal Components Regression}
\bookmark[page=253,level=2]{6.3.2.: Partial Least Squares}
\bookmark[page=254,level=1]{6.4.: Considerations in High Dimensions}
\bookmark[page=254,level=2]{6.4.1.: High-Dimensional Data}
\bookmark[page=255,level=2]{6.4.2.: What Goes Wrong in High Dimensions?}
\bookmark[page=257,level=2]{6.4.3.: Regression in High Dimensions}
\bookmark[page=259,level=2]{6.4.4.: Interpreting Results in High Dimensions}
\bookmark[page=260,level=1]{6.5.: Lab 1: Subset Selection Methods}
\bookmark[page=260,level=2]{6.5.1.: Best Subset Selection}
\bookmark[page=263,level=2]{6.5.2.: Forward and Backward Stepwise Selection}
\bookmark[page=264,level=2]{6.5.3.: Choosing Among Models Using the Validation Set Approach and Cross-Validation}
\bookmark[page=267,level=1]{6.6.: Lab 2: Ridge Regression and the Lasso}
\bookmark[page=267,level=2]{6.6.1.: Ridge Regression}
\bookmark[page=271,level=2]{6.6.2.: The Lasso}
\bookmark[page=272,level=1]{6.7.: Lab 3: PCR and PLS Regression}
\bookmark[page=272,level=2]{6.7.1.: Principal Components Regression}
\bookmark[page=274,level=2]{6.7.2.: Partial Least Squares}
\bookmark[page=275,level=1]{6.8.: Exercises}
\bookmark[page=281,level=0]{7.: Moving Beyond Linearity}
\bookmark[page=282,level=1]{7.1.: Polynomial Regression}
\bookmark[page=284,level=1]{7.2.: Step Functions}
\bookmark[page=286,level=1]{7.3.: Basis Functions}
\bookmark[page=287,level=1]{7.4.: Regression Splines}
\bookmark[page=287,level=2]{7.4.1.: Piecewise Polynomials}
\bookmark[page=287,level=2]{7.4.2.: Constraints and Splines}
\bookmark[page=289,level=2]{7.4.3.: The Spline Basis Representation}
\bookmark[page=290,level=2]{7.4.4.: Choosing the Number and Locations of the Knots}
\bookmark[page=292,level=2]{7.4.5.: Comparison to Polynomial Regression}
\bookmark[page=293,level=1]{7.5.: Smoothing Splines}
\bookmark[page=293,level=2]{7.5.1.: An Overview of Smoothing Splines}
\bookmark[page=294,level=2]{7.5.2.: Choosing the Smoothing Parameter Lambda}
\bookmark[page=296,level=1]{7.6.: Local Regression}
\bookmark[page=298,level=1]{7.7.: Generalized Additive Models}
\bookmark[page=299,level=2]{7.7.1.: GAMs for Regression Problems}
\bookmark[page=302,level=2]{7.7.2.: GAMs for Classification Problems}
\bookmark[page=303,level=1]{7.8.: Lab: Non-linear Modeling}
\bookmark[page=304,level=2]{7.8.1.: Polynomial Regression and Step Functions}
\bookmark[page=309,level=2]{7.8.2.: Splines}
\bookmark[page=310,level=2]{7.8.3.: GAMs}
\bookmark[page=313,level=1]{7.9.: Exercises}
\bookmark[page=319,level=0]{8.: Tree-Based Methods}
\bookmark[page=319,level=1]{8.1.: The Basics of Decision Trees}
\bookmark[page=320,level=2]{8.1.1.: Regression Trees}
\bookmark[page=327,level=2]{8.1.2.: Classification Trees}
\bookmark[page=330,level=2]{8.1.3.: Trees Versus Linear Models}
\bookmark[page=331,level=2]{8.1.4.: Advantages and Disadvantages of Trees}
\bookmark[page=332,level=1]{8.2.: Bagging, Random Forests, Boosting}
\bookmark[page=332,level=2]{8.2.1.: Bagging}
\bookmark[page=336,level=2]{8.2.2.: Random Forests}
\bookmark[page=337,level=2]{8.2.3.: Boosting}
\bookmark[page=340,level=1]{8.3.: Lab: Decision Trees}
\bookmark[page=340,level=2]{8.3.1.: Fitting Classification Trees}
\bookmark[page=343,level=2]{8.3.2.: Fitting Regression Trees}
\bookmark[page=344,level=2]{8.3.3.: Bagging and Random Forests}
\bookmark[page=346,level=2]{8.3.4.: Boosting}
\bookmark[page=348,level=1]{8.4.: Exercises}
\bookmark[page=353,level=0]{9.: Support Vector Machines}
\bookmark[page=354,level=1]{9.1.: Maximal Margin Classifier}
\bookmark[page=354,level=2]{9.1.1.: What Is a Hyperplane?}
\bookmark[page=355,level=2]{9.1.2.: Classification Using a Separating Hyperplane}
\bookmark[page=357,level=2]{9.1.3.: The Maximal Margin Classifier}
\bookmark[page=358,level=2]{9.1.4.: Construction of the Maximal Margin Classifier}
\bookmark[page=359,level=2]{9.1.5.: The Non-separable Case}
\bookmark[page=360,level=1]{9.2.: Support Vector Classifiers}
\bookmark[page=360,level=2]{9.2.1.: Overview of the Support Vector Classifier}
\bookmark[page=361,level=2]{9.2.2.: Details of the Support Vector Classifier}
\bookmark[page=365,level=1]{9.3.: Support Vector Machines}
\bookmark[page=365,level=2]{9.3.1.: Classification with Non-linear Decision Boundaries}
\bookmark[page=366,level=2]{9.3.2.: The Support Vector Machine}
\bookmark[page=370,level=2]{9.3.3.: An Application to the Heart Disease Data}
\bookmark[page=371,level=1]{9.4.: SVMs with More than Two Classes}
\bookmark[page=371,level=2]{9.4.1.: One-Versus-One Classification}
\bookmark[page=372,level=2]{9.4.2.: One-Versus-All Classification}
\bookmark[page=372,level=1]{9.5.: Relationship to Logistic Regression}
\bookmark[page=375,level=1]{9.6.: Lab: Support Vector Machines}
\bookmark[page=375,level=2]{9.6.1.: Support Vector Classifier}
\bookmark[page=379,level=2]{9.6.2.: Support Vector Machine}
\bookmark[page=381,level=2]{9.6.3.: ROC Curves}
\bookmark[page=382,level=2]{9.6.4.: SVM with Multiple Classes}
\bookmark[page=382,level=2]{9.6.5.: Application to Gene Expression Data}
\bookmark[page=384,level=1]{9.7.: Exercises}
\bookmark[page=389,level=0]{10.: Unsupervised Learning}
\bookmark[page=389,level=1]{10.1.: The Challenge of Unsupervised Learning}
\bookmark[page=390,level=1]{10.2.: Principal Components Analysis}
\bookmark[page=391,level=2]{10.2.1.: What Are Principal Components?}
\bookmark[page=395,level=2]{10.2.2.: Another Interpretation of Principal Components}
\bookmark[page=396,level=2]{10.2.3.: More on PCA}
\bookmark[page=401,level=2]{10.2.4.: Other Uses for Principal Components}
\bookmark[page=401,level=1]{10.3.: Clustering Methods}
\bookmark[page=402,level=2]{10.3.1.: K-Means Clustering}
\bookmark[page=406,level=2]{10.3.2.: Hierarchical Clustering}
\bookmark[page=415,level=2]{10.3.3.: Practical Issues in Clustering}
\bookmark[page=417,level=1]{10.4.: Lab 1: Principal Components Analysis}
\bookmark[page=420,level=1]{10.5.: Lab 2: Clustering}
\bookmark[page=420,level=2]{10.5.1.: K-Means Clustering}
\bookmark[page=422,level=2]{10.5.2.: Hierarchical Clustering}
\bookmark[page=423,level=1]{10.6.: Lab 3: NCI60 Data Example}
\bookmark[page=424,level=2]{10.6.1.: PCA on the NCI60 Data}
\bookmark[page=426,level=2]{10.6.2.: Clustering the Observations of the NCI60 Data}
\bookmark[page=429,level=1]{10.7.: Exercises}
\bookmark[page=435,level=0]{Index}
\end{document}
@goerz (Author) commented Dec 30, 2017

A table of contents for the book "The Elements of Statistical Learning" by some of the same authors is available at
https://gist.github.com/goerz/4c863a2fde1d3357113b95643d0ace16

@knoahlr commented May 20, 2018

Thank you for this.

@tzabal commented Jan 8, 2019

Thank you very much for sharing this.

@MohammadAliAfsahi commented

Thanks

@faisalnawazmir commented

I did not find in.pdf. Will you help me?

@MohammadAliAfsahi commented

You should download the book from the link above and then save it as in.pdf.
