marcrasi (Google)
marcrasi / aunifying.md (last active Feb 25, 2021)

Unifying the differentiation modes

Today, Swift AD has two separate transforms that produce forward-mode derivatives (JVPs) and reverse-mode derivatives (VJPs).

We can generalize this to a single transform that produces a single "derivative function" that can evaluate derivatives in both modes.

The "derivative function" is generic over a "derivative type" that determines the derivative mode.
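A minimal sketch of the idea in plain Swift, shown for forward mode only. All names here (`DerivativeMode`, `Tangent`, `mulDerivative`) are illustrative, not the actual Swift AD API; reverse mode would be another conforming type that carries a pullback instead of a tangent.

```swift
// Sketch: one "derivative mode" protocol; the mode is chosen by the
// conforming type rather than by separate JVP/VJP transforms.
// All names are illustrative, not the real Swift AD API.
protocol DerivativeMode {
    // Propagate a derivative through multiplication of two Floats,
    // given the primal values and the derivatives flowing alongside them.
    static func mulDerivative(_ x: Float, _ dx: Self, _ y: Float, _ dy: Self) -> Self
}

// Forward mode: carry a tangent along with the computation.
struct Tangent: DerivativeMode {
    var value: Float
    static func mulDerivative(_ x: Float, _ dx: Tangent, _ y: Float, _ dy: Tangent) -> Tangent {
        Tangent(value: dx.value * y + x * dy.value) // product rule
    }
}

// The single transformed function is generic over the mode.
func squareDerivative<D: DerivativeMode>(_ x: Float, _ dx: D) -> D {
    D.mulDerivative(x, dx, x, dx)
}

// d/dx x^2 at x = 3 is 6.
let d = squareDerivative(3, Tangent(value: 1))
print(d.value) // 6.0
```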

The advantages over the existing system are:

gist:59c540994fd1644300c6def0b8453a08

import TensorFlow

// MARK: - Protocols

protocol Vector {
    func scaled(by factor: Float) -> Self
    func adding(_ other: Self) -> Self
    static var zero: Self { get }
}
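For reference, a toy conformance to a `Vector` protocol of this shape might look like the following. `Point2` is an invented example, not part of the gist, and the protocol is restated so the snippet runs standalone:

```swift
// Restated from the gist so this snippet is self-contained.
protocol Vector {
    func scaled(by factor: Float) -> Self
    func adding(_ other: Self) -> Self
    static var zero: Self { get }
}

// Invented example type conforming to Vector.
struct Point2: Vector {
    var x: Float
    var y: Float
    func scaled(by factor: Float) -> Point2 { Point2(x: x * factor, y: y * factor) }
    func adding(_ other: Point2) -> Point2 { Point2(x: x + other.x, y: y + other.y) }
    static var zero: Point2 { Point2(x: 0, y: 0) }
}

// zero.adding(p).scaled(by: 2) doubles p.
let p = Point2.zero.adding(Point2(x: 1, y: -2)).scaled(by: 2)
print(p.x, p.y) // 2.0 -4.0
```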
gist:51844c78bbac58d4afc23651fc33d512

import TensorFlow

// MARK: Example function and its transformed version.

func cube(_ x: Float) -> Float {
    return x.product(x).product(x)
}

func cubeT<A: WrapsFloat>(_ x: A) -> A {
    return x.product(x).product(x)
}
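The preview cuts off before `WrapsFloat` is defined. A plausible reading is a protocol abstracting over `Float`'s `product`, so that a dual-number type can flow through the same generic body and compute a derivative. The sketch below guesses at that shape; the protocol body and the `Dual` type are assumptions, not the gist's actual definitions, and `cubeT` is restated so the snippet runs standalone:

```swift
// Guessed shape of WrapsFloat: anything supporting `product`.
protocol WrapsFloat {
    func product(_ other: Self) -> Self
}

extension Float: WrapsFloat {
    func product(_ other: Float) -> Float { self * other }
}

// A dual number carries a derivative through the same generic code.
struct Dual: WrapsFloat {
    var value: Float
    var derivative: Float
    func product(_ other: Dual) -> Dual {
        Dual(value: value * other.value,
             derivative: derivative * other.value + value * other.derivative) // product rule
    }
}

// Restated from the gist so this snippet is self-contained.
func cubeT<A: WrapsFloat>(_ x: A) -> A {
    return x.product(x).product(x)
}

// d/dx x^3 at x = 2 is 3 * 2^2 = 12.
let r = cubeT(Dual(value: 2, derivative: 1))
print(r.value, r.derivative) // 8.0 12.0
```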
flamegraph.svg (preview not available)
inoutsubscript2
Don't use this. ArrayReader here is better: https://groups.google.com/a/tensorflow.org/g/swift/c/Xo5YmLIt12s/m/OM8n6J4TAQAJ
tif_loader.ipynb (preview not available)
reproducer.ipynb (preview not available)
swiftpm_install.ipynb (preview not available)
main.swift

import TensorFlow

// Gradient of the layer's output with respect to the layer's parameters.
let g = gradient(at: Dense()) { m in m.callForward(Tensor<Float>(0)) }
print(g)
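One way to sanity-check such gradients without the Swift for TensorFlow toolchain is a central finite-difference approximation. This is plain Swift and does not use `Dense` or `gradient(at:)`; it only approximates what the analytic gradient should report for a scalar function:

```swift
// Central finite-difference approximation of df/dx,
// approximating what `gradient(at:)` computes analytically.
func finiteDifferenceGradient(of f: (Float) -> Float, at x: Float,
                              epsilon: Float = 1e-3) -> Float {
    (f(x + epsilon) - f(x - epsilon)) / (2 * epsilon)
}

// d/dx x^3 at x = 2 is 12; the approximation lands close to it.
let approx = finiteDifferenceGradient(of: { $0 * $0 * $0 }, at: 2)
print(approx)
```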
layer.swift

import TensorFlow

protocol MyLayer: Differentiable {
    associatedtype Input: Differentiable

    @differentiable
    func forward(_ x: Input) -> Tensor<Float>
}

extension MyLayer {