@dfeng
dfeng / NDN.md
Created December 20, 2021 10:55

Non-deep Networks

One piece of common wisdom in the era of deep learning is that "depth trumps width"; after all, we are not in the era of wide learning. Holding the number of parameters fixed, deeper networks seem to allow for more expressivity, given the compositional nature of layers. From a theoretical perspective, we know that depth-2 networks are universal approximators, but the size of the network can be exponential in the input dimension. I'm pretty sure there has been work showing that this dependence is no longer exponential in $n$ for deeper networks, though it's unclear under what stringent conditions that holds.
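For concreteness, the depth-2 statement concerns one-hidden-layer networks of the form (standard notation, my gloss rather than any particular paper's)

$$f(x) = \sum_{i=1}^{k} a_i \, \sigma\left(w_i^\top x + b_i\right),$$

which can approximate any continuous function on a compact subset of $\mathbb{R}^n$ arbitrarily well, but possibly only with a width $k$ exponential in the input dimension $n$.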

More recently, the number of layers has grown into the hundreds, if not thousands. This feels like an altogether different regime, one I would call really-deep learning. It also gets closer to the question of scaling.

Also, note that a lot of the theoret

@dfeng
dfeng / clock.html
Last active March 9, 2021 22:57 — forked from sam0737/clock.html
OBS Studio: an HTML page for showing the current date and time in the video
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>A simple clock</title>
</head>
<body translate="no">
<!-- body truncated in the source; a minimal sketch: show the current date and time, refreshed every second -->
<div id="clock"></div>
<script>
setInterval(function () { document.getElementById('clock').textContent = new Date().toLocaleString(); }, 1000);
</script>
</body>
</html>
  • tags: #paper
  • results:
    • we have this new ratio regularization penalty, $$\ell_1 / \ell_2$$ (or $$\ell_{1,2}$$)
  • outline (draft 1):
    • basically, a follow-up to Arora et al.
      • what they show, in the context of matrix completion, is an interesting theory (sort of verified in applied settings) that depth, under gradient descent, makes the singular values more extreme: analyzing the gradient flow, a factor of $$N$$ appears, corresponding to the depth (see the sketch after this list)
      • they show that this cannot be explained by nuclear norm minimization/regularization alone
    • we make a few remarks
      • firstly, the comparison with actual nuclear norm minimization (via a convex program) is not particularly fair
        • due to non-convexity, the two are not equivalent
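To make the depth effect concrete, here is a minimal sketch in R (my own toy illustration under made-up parameters, not Arora et al.'s code): factor the end-to-end matrix as $$W = W_N \cdots W_1$$, run gradient descent on the observed entries only, and inspect the spectrum. The last line reports the $$\ell_1 / \ell_2$$ ratio of the singular values, which is my assumed reading of the ratio penalty (nuclear norm over Frobenius norm).

```r
## Toy deep matrix completion: W = W_N ... W_1, gradient descent on observed
## entries only. Hypothetical illustration; all parameters are made up.
set.seed(1)
d <- 20; r <- 3; N <- 3; lr <- 0.05; steps <- 2000

target <- matrix(rnorm(d * r), d, r) %*% matrix(rnorm(r * d), r, d)  # rank-r target
mask <- matrix(runif(d * d) < 0.5, d, d)                             # observed entries

## product of a (possibly empty) list of matrices, left to right
prod_mats <- function(mats) Reduce(`%*%`, mats, diag(d))

## near-identity init; end-to-end matrix is W = W_N ... W_1
Ws <- lapply(seq_len(N), function(i) diag(d) + 0.01 * matrix(rnorm(d * d), d, d))
for (s in seq_len(steps)) {
  W <- prod_mats(rev(Ws))
  G <- mask * (W - target)  # gradient of 0.5 * ||mask * (W - target)||_F^2 w.r.t. W
  grads <- lapply(seq_len(N), function(i) {
    left  <- prod_mats(rev(Ws[seq_len(N) > i]))  # W_N ... W_{i+1}
    right <- prod_mats(rev(Ws[seq_len(N) < i]))  # W_{i-1} ... W_1
    t(left) %*% G %*% t(right)
  })
  for (i in seq_len(N)) Ws[[i]] <- Ws[[i]] - lr * grads[[i]]
}

sv <- svd(prod_mats(rev(Ws)))$d
print(round(sv[1:6], 3))                   # top singular values of the end-to-end W
print(round(sum(sv) / sqrt(sum(sv^2)), 3)) # l1/l2 ratio of the spectrum
```

If the depth effect shows up on this toy, rerunning with N = 1 should give a flatter spectrum, i.e. a larger $$\ell_1 / \ell_2$$ value, than N = 3.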
@dfeng
dfeng / mac-setup.md
Last active June 8, 2019 00:36 — forked from orlando/mac-setup.md
Fresh Mac OS Setup

1. Run Software Update

Make sure everything is up to date in the App Store

2. Install Homebrew and Homebrew-Cask

  1. Open a terminal window and execute the Homebrew install script:

         /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
@dfeng
dfeng / build.R
Created March 1, 2016 19:10
build.R
local({
  # derive the site's base URL from the Jekyll config
  baseurl = paste0(servr:::jekyll_config('.', 'baseurl'), "/")
  knitr::opts_knit$set(base.url = baseurl)
  # fall back on 'kramdown' if the markdown engine is not specified
  markdown = servr:::jekyll_config('.', 'markdown', 'kramdown')
  # see if we need to use the Jekyll renderer in knitr
  if (markdown == 'kramdown') {
    knitr::render_jekyll()
  } else knitr::render_markdown()
})
@dfeng
dfeng / move.R
Last active January 5, 2016 16:29
#  ----------------------------
# | Masters Application Script |
#  \__________________________/
#         \   ^__^
#          \  (oo)\_______
#             (__)\       )\/\
#                 ||----w |
#                 ||     ||
# Here's my folder structure:
#