
@dioptx
Forked from ntakouris/notes_deep_learn.txt
Created August 9, 2019 16:14
- [x] inverted dropout
- [x] goldilocks ideal rate // bullshit
- [x] Shannon's entropy measure // good one
- [x] harmonic mean
- [x] beta distribution
- [x] gamma function
- [x] bias correction of exponentially weighted average
- [x] covariate shift in computer vision
- [x] why use strided convolution
- [x] selu
- [x] exponential linear units
- [x] lecun normal // bullshit — weight init to prevent overfit
- [x] alpha dropout // gaussian noise
- [x] curve shift // im stupid
- [x] smote resampler // smart but garbage in garbage out?
- [x] sparse cross entropy loss // treat as int
- [x] dropout 0.5 effect
- [x] same convolution
- [x] same maxpooling
- [x] pruning
- [x] quantisation
- [x] intel distiller
- [ ] pca color augmentation
- [ ] convolutional implementation of sliding windows
- [x] non max suppression // intersection over union
- [x] cross-validation for test and train sets // do you generalize well?
- [ ] Image resizing interpolations in OpenCV
- [ ] Ornstein-Uhlenbeck process
- [ ] univariate plots
- [ ] Feature trend uniformity debugging
- [ ] Affine transform
- [ ] visualizing hidden units in conv networks
- [ ] PPO Algorithm // Reinforcement learning
- [ ] exploration and entropy
- [ ] huber loss
- [ ] encoder-decoder approaches
- [ ] RNN backwards diff
- [ ] GRU backwards diff
- [ ] LSTM backwards diff
- [ ] GRU details
- [x] t-SNE
- [ ] fc7 response
- [ ] Hierarchical softmax
- [ ] Moving average crossover
- [ ] gradient boosted trees
- [ ] Bayesian confidence intervals
- [ ] UMAP Uniform Manifold Approximation and Projection for Dimension Reduction
- [ ] [https://colah.github.io/posts/2014-03-NN-Manifolds-Topology/](https://colah.github.io/posts/2014-03-NN-Manifolds-Topology/)
- [ ] low discrepancy
- [ ] He et al. initialization
- [ ] SqueezeNet
- [ ] adaptive dropout
- [ ] rare event classification autoencoder
- [ ] multiple softmax outputs
- [ ] spearman correlation
- [ ] glorot uniform initialization
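A minimal sketch of inverted dropout from the checked items above (NumPy; function and variable names are my own):

```python
import numpy as np

def inverted_dropout(a, keep_prob, rng):
    """Zero out each unit with probability 1 - keep_prob, then scale the
    survivors by 1 / keep_prob so the layer's expected activation is
    unchanged and no extra rescaling is needed at test time."""
    mask = rng.random(a.shape) < keep_prob
    return (a * mask) / keep_prob

rng = np.random.default_rng(0)
a = np.ones((1000, 100))
out = inverted_dropout(a, keep_prob=0.8, rng=rng)
# Roughly 20% of the units are zeroed; the surviving ones become 1.25,
# so the overall mean stays near 1.0.
```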
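Shannon's entropy measure, also on the list, fits in a few lines (plain Python, my own naming):

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a discrete distribution p (probabilities summing to 1).
    Terms with zero probability contribute nothing, by convention 0 * log 0 = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty,
# a certain outcome carries 0 bits, and a uniform
# distribution over 4 outcomes carries 2 bits.
fair_coin = shannon_entropy([0.5, 0.5])
certain = shannon_entropy([1.0])
uniform4 = shannon_entropy([0.25] * 4)
```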
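Bias correction of the exponentially weighted average (another checked item) is just a rescale by 1 - beta^t; a sketch, with names of my own choosing:

```python
def ewa_bias_corrected(xs, beta=0.9):
    """Exponentially weighted average with bias correction. Because v
    starts at 0, the raw estimate is pulled toward zero for small t;
    dividing by 1 - beta**t removes that startup bias."""
    v = 0.0
    corrected = []
    for t, x in enumerate(xs, start=1):
        v = beta * v + (1 - beta) * x
        corrected.append(v / (1 - beta ** t))
    return corrected

# On a constant series the corrected average recovers the constant
# immediately, whereas the raw v would start near zero.
avg = ewa_bias_corrected([10.0] * 5)
```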
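Non-max suppression with intersection over union, as noted above, can be sketched greedily (pure Python, box format (x1, y1, x2, y2) is my assumption):

```python
def iou(a, b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-max suppression: keep the highest-scoring box, drop any
    remaining box whose IoU with it exceeds the threshold, then repeat."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(boxes[i], boxes[j]) <= iou_threshold]
    return keep

# Two heavily overlapping detections and one far away: the weaker
# overlapping box is suppressed, the distant one survives.
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
kept = nms(boxes, scores=[0.9, 0.8, 0.7], iou_threshold=0.5)
```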
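The Huber loss on the unchecked list is also a two-liner once written out; a sketch with my own signature:

```python
def huber(error, delta=1.0):
    """Huber loss: quadratic for |error| <= delta, linear beyond it, so
    outliers contribute a bounded gradient instead of the ever-growing
    gradient of squared error."""
    if abs(error) <= delta:
        return 0.5 * error ** 2
    return delta * (abs(error) - 0.5 * delta)

# Small residuals behave like MSE (0.5 * e^2); large ones grow linearly.
small = huber(0.5)   # quadratic branch
large = huber(2.0)   # linear branch
```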