
@rxwei
rxwei / differentiable-reduction.ipynb
Last active May 6, 2019 18:53
Differentiable reduction
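The notebook itself can't be previewed here. As a rough illustration of the topic (hypothetical, not recovered from the notebook): the pullback of a sum reduction broadcasts the incoming scalar cotangent back to every element, since d(Σxᵢ)/dxⱼ = 1.

// Hypothetical sketch: a sum reduction together with a hand-written pullback.
func vjpSum(_ x: [Float]) -> (value: Float, pullback: (Float) -> [Float]) {
    let value = x.reduce(0, +)
    // d(sum)/dx_j = 1 for every j, so the pullback broadcasts the scalar
    // cotangent `v` across all of x's positions.
    return (value, { v in Array(repeating: v, count: x.count) })
}

let (y, pullback) = vjpSum([1, 2, 3])
print(y)            // 6.0
print(pullback(1))  // [1.0, 1.0, 1.0]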
@rxwei
rxwei / autograd.md
Last active November 26, 2022 10:48
Autograd in Swift

Make Swift AD support all NumPy functions via Autograd/JAX

Import the Python module:

// `jax.grad` is a function, not a module: import `jax`, then take the attribute.
let autograd = Python.import("jax").grad

extension PythonObject {
  @differentiable(wrt: args, vjp: _vjpDynamicallyCall)
  @discardableResult
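The preview cuts off mid-declaration. Below is a minimal sketch of the same idea, assuming the 2019-era Swift for TensorFlow toolchain (which supported `@differentiable(vjp:)`); the wrapper `callJax` and its VJP helper are illustrative names, not from the gist.

import Python

let jax = Python.import("jax")

// Hypothetical wrapper: a differentiable scalar call into a Python function.
@differentiable(wrt: x, vjp: _vjpCallJax)
func callJax(_ fn: PythonObject, _ x: Float) -> Float {
    return Float(fn(x))!
}

// The VJP asks jax.grad for the derivative of the Python function and
// scales it by the incoming cotangent to form the pullback.
func _vjpCallJax(_ fn: PythonObject, _ x: Float) -> (Float, (Float) -> Float) {
    let gradFn = jax.grad(fn)
    return (Float(fn(x))!, { v in v * Float(gradFn(x))! })
}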
@rxwei
rxwei / gist:0fb313833831491e87092b91110c59ae
Created April 21, 2019 00:15
Error building retro on macOS
➜ retro git:(c-api) ✗ make retro-c
[ 70%] Built target zip
[ 70%] Built target pce-submodule
[ 70%] Built target pce
[ 70%] Built target gba-submodule
[ 70%] Built target gba
[ 70%] Built target nes-submodule
[ 75%] Built target nes
[ 75%] Built target gb-submodule
[ 75%] Generating retro/cores/gambatte_libretro.dylib
sil hidden @AD__$s11leak_simple6apply2_2to10TensorFlow0E0VySfGx_AGtAD5LayerRzAG5InputRtzAG6OutputRtzlF__adjoint_src_0_wrt_1 : $@convention(thin) <τ_0_0 where τ_0_0 : Layer, τ_0_0.Input == Tensor<Float>, τ_0_0.Output == Tensor<Float>> (@guaranteed Tensor<Float>, @guaranteed _AD__$s11leak_simple6apply2_2to10TensorFlow0E0VySfGx_AGtAD5LayerRzAG5InputRtzAG6OutputRtzlF__Type__src_0_wrt_1<τ_0_0>) -> @owned Tensor<Float> {
// %0 // users: %13, %46, %2
// %1 // user: %23
bb0(%0 : $Tensor<Float>, %1 : $_AD__$s11leak_simple6apply2_2to10TensorFlow0E0VySfGx_AGtAD5LayerRzAG5InputRtzAG6OutputRtzlF__Type__src_0_wrt_1<τ_0_0>):
retain_value %0 : $Tensor<Float> // id: %2
%3 = alloc_stack $Tensor<Float> // users: %25, %13, %47, %45, %4
%4 = begin_access [init] [static] [no_nested_conflict] %3 : $*Tensor<Float> // users: %8, %10
%5 = metatype $@thin Tensor<Float>.Type // user: %7
// function_ref
@rxwei
rxwei / callable.md
Last active March 20, 2019 06:06
Callable

Hi all, @dan-zheng and I wrote a proposal to introduce static callables to Swift. This proposal is also available as a gist. We'd love to hear your feedback.

Introduce callables

  • Proposal: SE-NNNN
  • Authors: Richard Wei, Dan Zheng
  • Review Manager: TBD
  • Status: Implementation in progress

Introduction
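The preview ends at the introduction. A minimal sketch of what the proposal enables, written with the `callAsFunction` spelling that Swift ultimately accepted as SE-0253 (the draft above spelled the declaration differently):

struct Polynomial {
    let coefficients: [Double]

    // Declaring callAsFunction makes Polynomial values callable like
    // functions; this evaluates the polynomial at x via Horner's method.
    func callAsFunction(_ x: Double) -> Double {
        var result = 0.0
        for c in coefficients.reversed() {
            result = result * x + c
        }
        return result
    }
}

let p = Polynomial(coefficients: [1, 2, 3])  // 1 + 2x + 3x²
print(p(2))  // 17.0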

@rxwei
rxwei / rnn.ipynb
Created March 11, 2019 10:40
RNN.ipynb
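This notebook can't be previewed either. As a rough, hypothetical illustration of the topic, one step of a vanilla RNN cell, h′ = tanh(Wx + Uh + b), in plain Swift:

import Foundation

// Dense matrix-vector product over plain arrays.
func matVec(_ m: [[Float]], _ v: [Float]) -> [Float] {
    m.map { row in zip(row, v).reduce(0) { $0 + $1.0 * $1.1 } }
}

// One recurrence step: the new hidden state mixes the input x and the
// previous hidden state h through weights W, U and bias b.
func rnnStep(x: [Float], h: [Float],
             W: [[Float]], U: [[Float]], b: [Float]) -> [Float] {
    let wx = matVec(W, x)
    let uh = matVec(U, h)
    return zip(zip(wx, uh), b).map { tanh($0.0 + $0.1 + $1) }
}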
@rxwei
rxwei / differentiable_currying.swift
Last active February 21, 2019 09:44
Currying differentiable functions (with pullback transpose).
// Function-as-a-differentiable-type rule:
// Tangent space: ((T...) -> U...)' = AnyDerivative
// Cotangent space: ((T...) -> U...)'* = AnyDerivative
// Why? Because when a function value is varying, what's varying is its context.
// In general cases, we need this to be a constrained existential with an
// `AdditiveArithmetic` conformance for its `.zero` and `+`, and `Differentiable`
// for being able to transpose between a differential and a pullback.
// New associated function type calculation rules:
// original: (T...) -> (U...)
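In other words, differentiating through a function value means differentiating through whatever it captured. A hypothetical example, assuming a 2019-era Swift for TensorFlow toolchain with @differentiable function types:

// The returned closure captures `offset`; when the closure varies, it is
// this captured context that varies, which is why the tangent/cotangent
// space of a function type must be a type-erased derivative value.
func makeAdder(_ offset: Float) -> @differentiable (Float) -> Float {
    return { x in x + offset }
}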
@rxwei
rxwei / gist:be724f6634b9536518e4f9d378035be5
Last active February 24, 2019 00:33
Currying differentiable functions in Swift. Full version with transpose here: https://gist.github.com/rxwei/1cfb98027f656adb1ebfa8af56826c97.
// Function-as-a-differentiable-type rule:
// Tangent space: ((T...) -> U...)' = Any
// Cotangent space: ((T...) -> U...)'* = Any
// Why? Because when a function value is varying, what's varying is its context.
// In general cases, we need this to be a constrained existential with an
// `AdditiveArithmetic` conformance for its `.zero` and `+`, and `Differentiable`
// for being able to transpose between a differential and a pullback.
// New associated function type calculation rules:
// original: (T...) -> (U...)
@rxwei
rxwei / .swift
Created February 18, 2019 13:37
AnyDerivative implemented using a class. It's not possible because a class's static computed properties cannot return `Self`. :(
// Type-erased box.
fileprivate class AnyDerivativeBox : Differentiable & AdditiveArithmetic {
  public typealias TangentVector = AnyDerivativeBox
  public typealias CotangentVector = AnyDerivativeBox
  public typealias AllDifferentiableVariables = AnyDerivativeBox
  public static func == (lhs: AnyDerivativeBox, rhs: AnyDerivativeBox) -> Bool {
    fatalError("Must override")
  }
  public static var zero: Self {
    // This is the blocker described above: a class's static computed
    // property cannot return `Self`, so this requirement can't be met.
    fatalError("Must override")
  }
}
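A sketch (names hypothetical, not from the gist) of the direction the final AnyDerivative design took: wrap a type-erased box in a struct, so the `Self`-returning requirements land on the struct. Here a class hierarchy supplies the dynamic dispatch; `zero` is elided, since it needs extra machinery in the real design.

// Abstract base class: dispatches dynamically, never returns `Self`.
private class _AnyBoxBase {
    func _add(_ other: _AnyBoxBase) -> _AnyBoxBase { fatalError("abstract") }
    func _isEqual(to other: _AnyBoxBase) -> Bool { fatalError("abstract") }
}

// Concrete box holding a value of some AdditiveArithmetic type T.
// Mixing two different underlying types traps on the force-cast.
private final class _AnyBox<T: AdditiveArithmetic>: _AnyBoxBase {
    let value: T
    init(_ value: T) { self.value = value }
    override func _add(_ other: _AnyBoxBase) -> _AnyBoxBase {
        _AnyBox(value + (other as! _AnyBox<T>).value)
    }
    override func _isEqual(to other: _AnyBoxBase) -> Bool {
        value == (other as! _AnyBox<T>).value
    }
}

// The struct wrapper: because it is a struct, requirements that return
// `Self` pose no problem.
struct AnyAdditive: Equatable {
    private let box: _AnyBoxBase
    private init(box: _AnyBoxBase) { self.box = box }
    init<T: AdditiveArithmetic>(_ value: T) { self.box = _AnyBox(value) }
    static func + (lhs: AnyAdditive, rhs: AnyAdditive) -> AnyAdditive {
        AnyAdditive(box: lhs.box._add(rhs.box))
    }
    static func == (lhs: AnyAdditive, rhs: AnyAdditive) -> Bool {
        lhs.box._isEqual(to: rhs.box)
    }
}

print(AnyAdditive(3) + AnyAdditive(4) == AnyAdditive(7))  // true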
// AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__adjoint_src_0_wrt_0_1
sil hidden @AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__adjoint_src_0_wrt_0_1 : $@convention(method) (@guaranteed Tensor<Float>, @guaranteed _AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__Type__src_0_wrt_0_1) -> (@owned MNISTClassifier.AllDifferentiableVariables, @owned Tensor<Float>) {
// %0 // users: %41, %4, %2
// %1 // users: %29, %25, %21, %17, %13, %11, %7, %3
bb0(%0 : $Tensor<Float>, %1 : $_AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__Type__src_0_wrt_0_1):
retain_value %0 : $Tensor<Float> // id: %2
%3 = struct_extract %1 : $_AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__Type__src_0_wrt_0_1, #_AD__$s4test15MNISTClassifierV7applied2to10TensorFlow0E0VySfGAI_tF__Type__src_0_wrt_0_1.pullback_7 // user: %4
%4 = apply %3(%0) :