Emre Şafak (esafak)

  • Archipelago AI
  • Silicon Valley

On virtual spaces (for scientific conferences)

The title is a bit broad: what I am actually going to write about is gather.town. I have complained quite a bit about how recent conferences used the gather platform. Here I try to be more constructive, and explain why I think things were bad and how I think they can be improved (substantially).

I think gather.town is a fantastic interface, and I think it was misused, or badly used, in some recent xACL conferences (EACL 2021, EMNLP 2020). This is really disappointing, as there is so much potential that was not only left unfulfilled but in some cases made the experience worse than not having gather at all. This post will try to explain what I think was bad, and how I think things can be improved.

@saravanabalagi
saravanabalagi / tf2_tracing_estimators.md
Created January 16, 2020 11:20
Profiling a TensorFlow Estimator in TF2

Based on the Summary Trace API:

import os
import tensorflow as tf

# args.exp_dir and params come from the surrounding training script
device_name = tf.test.gpu_device_name()
if not tf.test.is_gpu_available():
    raise SystemError('GPU device not found')
print('Found GPU at: {}'.format(device_name))
# TensorBoard looks for profiler output under <logdir>/plugins/profile
os.makedirs(os.path.join(args.exp_dir, 'plugins/profile'), exist_ok=True)
# start recording graph and profiler information
tf.summary.trace_on(graph=True, profiler=True)
tracing_params = params.copy()
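
The excerpt stops after turning tracing on. As a rough sketch of the corresponding export step (not shown in the excerpt; the writer path and trace name below are illustrative, reusing args.exp_dir from the snippet), the trace is written with tf.summary.trace_export inside a summary writer context once the traced computation has run:

writer = tf.summary.create_file_writer(args.exp_dir)
# ... run the Estimator / traced computation here ...
with writer.as_default():
    # writes the graph trace and profiler data for TensorBoard
    tf.summary.trace_export(name='estimator_trace', step=0,
                            profiler_outdir=args.exp_dir)

TensorBoard's profile plugin then picks the data up from the plugins/profile directory created above.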
@androidfred
androidfred / kotlin_arrow.md
Last active December 7, 2023 17:42
Kotlin Arrow

Kotlin Arrow

Many Kotlin features can be traced back to functional programming languages, e.g.

  • Heavy use of immutability by default and functions like map, filter, etc.
  • Type inference
  • Data classes (which enable pattern matching)
  • Null safety and dealing with the absence of values

However, Kotlin is missing many incredibly useful data types that are ubiquitous in functional programming languages, e.g. Either, Try, etc.

A Tour of PyTorch Internals (Part I)

The fundamental unit in PyTorch is the Tensor. This post will serve as an overview of how we implement Tensors in PyTorch, such that the user can interact with them from the Python shell. In particular, we want to answer four main questions:

  1. How does PyTorch extend the Python interpreter to define a Tensor type that can be manipulated from Python code?
  2. How does PyTorch wrap the C libraries that actually define the Tensor's properties and methods?
  3. How does PyTorch cwrap work to generate code for Tensor methods?
  4. How does PyTorch's build system take all of these components to compile and generate a workable application?

Extending the Python Interpreter

PyTorch defines a new package torch. In this post we will consider the ._C module. This module is known as an "extension module" - a Python module written in C. Such modules allow us to define new built-in object types (e.g. the Tensor) and to call C/C++ functions.
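
A quick way to see this from a Python shell (a small illustrative check, assuming a working PyTorch install; exact file names and base classes vary by version):

import torch

# torch._C is compiled, so it is backed by a shared library rather than a .py file
print(torch._C.__file__)     # e.g. .../torch/_C.cpython-<ver>-<platform>.so
# the Python-visible Tensor type is built on top of bases defined in that module
print(torch.Tensor.__mro__)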

@aparrish
aparrish / understanding-word-vectors.ipynb
Last active June 16, 2024 02:26
Understanding word vectors: A tutorial for "Reading and Writing Electronic Text," a class I teach at ITP. (Python 2.7) Code examples released under CC0 https://creativecommons.org/choose/zero/, other text released under CC BY 4.0 https://creativecommons.org/licenses/by/4.0/
@shamatar
shamatar / rwa.py
Last active January 14, 2022 20:17
Keras (keras.is) implementation of Recurrent Weighted Average, as described in https://arxiv.org/abs/1703.01253. Follows original implementation in Tensorflow from https://github.com/jostmey/rwa. Works with fixed batch sizes, requires "batch_shape" parameter in input layer. Outputs proper config, should save and restore properly. You are welcome…
# Note: these imports target the older Keras API; the Recurrent base class
# was deprecated and later removed from Keras.
from keras.layers import Recurrent
import keras.backend as K
from keras import activations
from keras import initializers
from keras import regularizers
from keras import constraints
from keras.engine import Layer
from keras.engine import InputSpec
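
The excerpt above shows only the imports. Below is a hedged usage sketch based on the description (the batch size is fixed via batch_shape on the input layer); the class name RWA and its RWA(units) constructor are assumptions about the rest of the gist, which is not shown here:

from keras.layers import Input, Dense
from keras.models import Model

batch_size, timesteps, features = 32, 20, 8
# batch_shape fixes the batch size, as the gist description requires
inputs = Input(batch_shape=(batch_size, timesteps, features))
x = RWA(64)(inputs)  # assumed: the RWA layer defined later in rwa.py
outputs = Dense(1, activation='sigmoid')(x)
model = Model(inputs, outputs)
model.compile(optimizer='adam', loss='binary_crossentropy')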
@tokestermw
tokestermw / tf_ed_vi_tutorial.py
Last active July 19, 2019 01:18
Variational inference and Bayesian deep learning tutorial (w/ uncertainty intervals) using TensorFlow and Edward.
""" Some description.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import sys
import json
import tqdm
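
The excerpt shows only the imports. As a rough illustration of the Edward workflow the description refers to, here is a minimal Bayesian linear regression sketch following Edward's documented KLqp pattern (Edward targets the TF1-style API); it is not code from the gist, and the names are illustrative:

import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import Normal

# toy data
N, D = 50, 3
X_train = np.random.randn(N, D).astype(np.float32)
y_train = (X_train.dot(np.array([1., -2., 0.5], dtype=np.float32))
           + 0.1 * np.random.randn(N)).astype(np.float32)

# model: y ~ Normal(Xw + b, 1)
X = tf.placeholder(tf.float32, [N, D])
w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))

# mean-field variational approximation
qw = Normal(loc=tf.get_variable("qw/loc", [D]),
            scale=tf.nn.softplus(tf.get_variable("qw/scale", [D])))
qb = Normal(loc=tf.get_variable("qb/loc", [1]),
            scale=tf.nn.softplus(tf.get_variable("qb/scale", [1])))

# minimize KL(q || p) with stochastic variational inference
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_samples=5, n_iter=250)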
@udibr
udibr / gruln.py
Last active November 7, 2020 02:34
Keras GRU with Layer Normalization
import numpy as np
from keras.layers import GRU
from keras import initializations  # Keras 1.x API (renamed to initializers in Keras 2)
from keras import backend as K
from collections import OrderedDict


class GRULN(GRU):
    '''Gated Recurrent Unit with Layer Normalization

    The current implementation only works with consume_less = 'gpu',
    which is already set.

    # Arguments