zeyademam / Troubleshoot-dcnn.md
Last active January 22, 2024 05:54

Troubleshooting Convolutional Neural Networks

Intro

This is a list of hacks gathered primarily from prior experience, as well as from online sources (most notably Stanford's CS231n course notes), on how to troubleshoot the performance of a convolutional neural network. We will focus mainly on supervised learning with deep neural networks. While this guide assumes the reader is coding in Python 3.6 using TensorFlow (TF), it should still be helpful as a language-agnostic guide.

Suppose we are given a convolutional neural network to train and evaluate, and assume the evaluation results are worse than expected. The following are steps to troubleshoot and potentially improve performance. The first section covers must-dos and generally good practices before you start troubleshooting. Every subsequent section header names a problem, and the section under it is devoted to solving that problem. The sections are ordered to reflect "more common" issues first, and under each header the "most-eas…
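One sanity check commonly recommended in the CS231n notes the guide cites (the excerpt above is truncated before its own list of steps, so this is an illustrative sketch, not the author's code) is to verify that the initial loss of an untrained softmax classifier matches the value expected under uniform random predictions:

```python
import math

# With random initialization, a softmax classifier assigns roughly uniform
# probability 1/num_classes to each class, so the initial cross-entropy loss
# should be close to -log(1/num_classes). A very different value suggests a
# bug in the loss computation or initialization.
num_classes = 10
expected_initial_loss = -math.log(1.0 / num_classes)
print(round(expected_initial_loss, 3))  # ≈ 2.303 for 10 classes
```

If the observed initial loss is far from this value, check the loss function and weight initialization before tuning anything else.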

cscalfani / MonoidsInHaskellAnIntroductions.md
Last active July 10, 2024 13:57

Monoids in Haskell, an Introduction

Why should programmers care about Monoids? Because Monoids are a common pattern that shows up over and over in programming. And when patterns show up, we can abstract them and leverage work we've done in the past. This allows us to quickly develop solutions on top of proven, stable code.

Add the Commutative Property to a Monoid (giving a Commutative Monoid) and you have something that can be executed in parallel. With the end of Moore's Law, parallelism is our only hope for increasing processing speeds.
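The article's examples are in Haskell; as a language-neutral sketch (the function name `mconcat_with` is illustrative, not from the original), the parallelism claim can be demonstrated in Python: because the operation is associative with an identity element, a collection can be split into chunks, each chunk folded independently, and the partial results combined.

```python
from functools import reduce

# A monoid is an associative binary operation with an identity element.
# (int, +, 0) is also commutative, so chunks may be combined in any order.
def mconcat_with(op, identity, xs):
    """Fold a list with a monoid's operation, starting from its identity."""
    return reduce(op, xs, identity)

add = lambda a, b: a + b
data = list(range(10))

# Fold each chunk independently (conceptually in parallel)...
chunks = [data[0:5], data[5:10]]
partials = [mconcat_with(add, 0, chunk) for chunk in chunks]

# ...then combine the partial results with the same operation.
total = mconcat_with(add, 0, partials)
print(total)  # 45, the same as folding the whole list at once
```

Associativity guarantees the chunked fold equals the sequential fold; commutativity additionally lets the partial results arrive in any order, which is what makes the pattern safe for parallel execution.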

What follows is what I've learned after studying Monoids. It is hardly complete, but hopefully it will prove helpful as an introduction for others.

Monoid Lineage

# -*- coding: utf-8 -*-
"""
Improving approximate nearest neighbour search with k-nearest neighbours.
Using sklearn's KDTree here just for demonstration. You can plug in much
faster nearest neighbour search implementations (FLANN and Annoy, to name
a few) for better results. For benchmarks, check out:
1) Radim Řehůřek (author of gensim) -
http://rare-technologies.com/performance-shootout-of-nearest-neighbours-intro
2) Erik Bernhardsson (author of annoy) -
https://github.com/erikbern/ann-benchmarks
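The gist's implementation is cut off after its docstring. As a hedged, standard-library-only sketch of the underlying idea (the function names `exact_knn` and `rerank` are illustrative, not from the original file): an exact brute-force k-NN pass can serve as ground truth for evaluating, or re-ranking the candidate set returned by, an approximate index such as Annoy or FLANN.

```python
import heapq
import math

def exact_knn(query, points, k):
    """Return the k points nearest to `query` by exact Euclidean distance."""
    return heapq.nsmallest(k, points, key=lambda p: math.dist(query, p))

def rerank(query, candidates, k):
    """Re-rank an approximate index's candidate set with exact distances.

    A common pattern: ask the ANN index for more candidates than needed
    (say 10*k), then keep the k with the smallest true distances.
    """
    return exact_knn(query, candidates, k)

points = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (0.5, 0.4)]
print(exact_knn((0.0, 0.0), points, 2))  # the two nearest points
```

Brute force is O(n) per query, so it only suits small candidate sets or offline recall measurement; the KDTree and ANN libraries named in the docstring handle the large-scale search itself.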