
marcrasi / dcm.md
Created May 30, 2019 20:01
Differentiating class methods

Differentiating Class Methods

marcrasi@

Last updated: 5/21/19

Problem

When c has a class type with a method named f, Swift dispatches c.f() at runtime by looking up the method's concrete implementation in a "vtable" referenced by c.
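To see the dynamic dispatch concretely, here is a minimal sketch (the class and method names are illustrative):

class Base {
    func f() -> Int { return 0 }
}

class Derived: Base {
    override func f() -> Int { return 1 }
}

let c: Base = Derived()
// c.f() is dispatched at runtime through the vtable referenced by c,
// so the Derived implementation runs even though the static type is Base.
print(c.f()) // 1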

dan-zheng / AdamOptimizer.swift
Last active November 17, 2018 23:22
Adam optimizer example
import TensorFlow

struct MNISTParameters: ParameterAggregate {
    var w1 = Tensor<Float>(randomNormal: [784, 30])
    var w2 = Tensor<Float>(randomNormal: [30, 10])
    // Compiler-synthesized:
    //   static var allKeyPaths: [WritableKeyPath<MNISTParameters, Tensor<Float>>] {
    //       return [\MNISTParameters.w1, \MNISTParameters.w2]
    //   }
    // Learn more about key paths here:
    // https://github.com/apple/swift-evolution/blob/master/proposals/0161-key-paths.md
}
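For context, here is a sketch of how an Adam update could drive every parameter through the synthesized allKeyPaths. The Adam struct and its member names are illustrative, not part of the gist, and the tensor operations assume the early Swift for TensorFlow API:

struct Adam {
    var learningRate: Float = 1e-3
    var beta1: Float = 0.9, beta2: Float = 0.999, epsilon: Float = 1e-8
    // Running beta powers for bias correction, and per-parameter moment estimates.
    var beta1Power: Float = 1, beta2Power: Float = 1
    var m = MNISTParameters(w1: Tensor(zeros: [784, 30]), w2: Tensor(zeros: [30, 10]))
    var v = MNISTParameters(w1: Tensor(zeros: [784, 30]), w2: Tensor(zeros: [30, 10]))

    mutating func update(_ parameters: inout MNISTParameters, gradients: MNISTParameters) {
        beta1Power *= beta1
        beta2Power *= beta2
        // The key paths let us visit every stored parameter without naming w1 and w2.
        for kp in MNISTParameters.allKeyPaths {
            let g = gradients[keyPath: kp]
            m[keyPath: kp] = beta1 * m[keyPath: kp] + (1 - beta1) * g
            v[keyPath: kp] = beta2 * v[keyPath: kp] + (1 - beta2) * g * g
            let mHat = m[keyPath: kp] / (1 - beta1Power)
            let vHat = v[keyPath: kp] / (1 - beta2Power)
            parameters[keyPath: kp] -= learningRate * mHat / (sqrt(vHat) + epsilon)
        }
    }
}

Because the loop iterates over allKeyPaths, the same update works unchanged if MNISTParameters gains more parameters.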
lattner / TaskConcurrencyManifesto.md
Last active April 21, 2024 09:43
Swift Concurrency Manifesto

Myia (Theano 2.0)

Automatic differentiation vs. symbolic

In the literature, the term automatic differentiation (AD) is reserved for a specific technique in which the gradient of a program is calculated by creating an adjoint program, which performs the gradient calculation for the given program. Note that this adjoint program includes all control flow statements. There are two approaches to implementing AD: source code transformation (SCT) and operator overloading (OO). With source code transformation we generate the adjoint program in the host language, e.g. given a Python function, we manipulate the abstract syntax tree (AST) directly to create a new Python function that performs the gradient computation. Operator overloading, on the other hand, overloads each operator so that it adds an entry to a tape (a Wengert list). Once the function exits, the gradient is calculated by walking the tape in reverse order and applying the gradient operators.
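To make the operator-overloading approach concrete, here is a minimal tape-based reverse-mode sketch in Swift; the mechanism is the same as the Python description above, and all names are illustrative:

final class Tape {
    var count = 0                               // variables allocated so far
    var ops: [(inout [Double]) -> Void] = []    // the Wengert list: one adjoint rule per op
}

struct Var {
    let value: Double
    let index: Int
    let tape: Tape

    init(_ value: Double, on tape: Tape) {
        self.value = value
        self.index = tape.count
        tape.count += 1
        self.tape = tape
    }
}

func * (l: Var, r: Var) -> Var {
    let out = Var(l.value * r.value, on: l.tape)
    // Adjoint rules for multiplication: dl += r * dout, dr += l * dout.
    l.tape.ops.append { adj in
        adj[l.index] += r.value * adj[out.index]
        adj[r.index] += l.value * adj[out.index]
    }
    return out
}

func + (l: Var, r: Var) -> Var {
    let out = Var(l.value + r.value, on: l.tape)
    // Adjoint rule for addition: both inputs receive dout.
    l.tape.ops.append { adj in
        adj[l.index] += adj[out.index]
        adj[r.index] += adj[out.index]
    }
    return out
}

// Walk the tape in reverse, accumulating d(output)/d(variable) for every variable.
func gradients(of output: Var) -> [Double] {
    var adj = [Double](repeating: 0, count: output.tape.count)
    adj[output.index] = 1
    for op in output.tape.ops.reversed() { op(&adj) }
    return adj
}

let tape = Tape()
let x = Var(3, on: tape), y = Var(4, on: tape)
let f = x * y + x                        // f(x, y) = x*y + x
let grads = gradients(of: f)
print(grads[x.index], grads[y.index])    // 5.0 3.0  (df/dx = y + 1, df/dy = x)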

Theano does not employ AD but "[a highly optimized form of

bartvm / dl-frameworks.rst
Last active December 7, 2020 18:18
A comparison of deep learning frameworks

A comparison of Theano with other deep learning frameworks, highlighting a series of low-level design choices in no particular order.

Overview

Differentiation

Symbolic: Theano, CGT
Automatic: Torch, MXNet

Symbolic and automatic differentiation are often confused or used interchangeably, although their implementations are significantly different.
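One way to see the difference: symbolic differentiation rewrites an expression into a new expression before any values flow, whereas AD (as in the tape sketch earlier) records and replays the operations the program actually executes. A toy sketch of the symbolic side, with illustrative names:

indirect enum Expr {
    case constant(Double)
    case variable                      // a single variable "x", for simplicity
    case sum(Expr, Expr)
    case product(Expr, Expr)
}

// d/dx as a source-to-source rewrite, in the spirit of Theano's symbolic gradients.
func derivative(_ e: Expr) -> Expr {
    switch e {
    case .constant:
        return .constant(0)
    case .variable:
        return .constant(1)
    case let .sum(a, b):
        return .sum(derivative(a), derivative(b))
    case let .product(a, b):
        // Product rule: (ab)' = a'b + ab'.
        return .sum(.product(derivative(a), b), .product(a, derivative(b)))
    }
}

// derivative(.product(.variable, .variable)) yields the *expression* x*1 + 1*x,
// which a framework like Theano would then simplify and compile.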

baopham / Monaco for Powerline.otf
Last active April 16, 2023 03:57
Patched font Monaco for OSX Vim-Powerline
gruber / Liberal Regex Pattern for All URLs
Last active March 20, 2024 20:28
Liberal, Accurate Regex Pattern for Matching All URLs
The regex patterns in this gist are intended to match any URLs,
including "mailto:foo@example.com", "x-whatever://foo", etc. For a
pattern that attempts only to match web URLs (http, https), see:
https://gist.github.com/gruber/8891611
# Single-line version of pattern:
(?i)\b((?:[a-z][\w-]+:(?:/{1,3}|[a-z0-9%])|www\d{0,3}[.]|[a-z0-9.\-]+[.][a-z]{2,4}/)(?:[^\s()<>]+|\(([^\s()<>]+|(\([^\s()<>]+\)))*\))+(?:\(([^\s()<>]+|(\([^\s()<>]+\)))*\)|[^\s`!()\[\]{};:'".,<>?«»“”‘’]))
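For reference, one way to try the pattern from Swift using Foundation's NSRegularExpression (a usage sketch, not part of the gist); the raw string literal keeps the backslashes unescaped, and ICU's regex engine accepts the inline (?i) flag:

import Foundation

// The single-line pattern above, wrapped in a raw string so \ and " need no escaping.
let pattern = #"(?i)\b((?:[a-z][\w-]+:(?:/{1,3}|[a-z0-9%])|www\d{0,3}[.]|[a-z0-9.\-]+[.][a-z]{2,4}/)(?:[^\s()<>]+|\(([^\s()<>]+|(\([^\s()<>]+\)))*\))+(?:\(([^\s()<>]+|(\([^\s()<>]+\)))*\)|[^\s`!()\[\]{};:'".,<>?«»“”‘’]))"#

let regex = try! NSRegularExpression(pattern: pattern)
let text = "See https://gist.github.com/gruber/8891611 and www.example.com/foo."
let range = NSRange(text.startIndex..., in: text)
for match in regex.matches(in: text, range: range) {
    if let r = Range(match.range, in: text) {
        print(text[r])   // prints each matched URL
    }
}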