@branneman
branneman / better-nodejs-require-paths.md
Last active April 27, 2024 04:16
Better local require() paths for Node.js

Problem

When the directory structure of your Node.js application (not library!) has some depth, you end up with a lot of annoying relative paths in your require calls like:

const Article = require('../../../../app/models/article');

Those suck for maintenance and they're ugly.

Possible solutions
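For illustration, one widely used workaround is exposing the application root once and resolving from there (a sketch of the "global helper" variant; other approaches exist):

// app.js, at the project root, executed first
global.__base = __dirname + '/';

// deep inside the tree, instead of '../../../../app/models/article'
const Article = require(__base + 'app/models/article');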

That's one of the real strengths of Docker: the ability to go back to a previous commit. The secret is simply to docker tag the image you want.
Here's an example: I first installed ping and committed, then installed curl and committed that. Then I rolled the image back to contain only ping:
$ docker history imagename
IMAGE           CREATED          CREATED BY                 SIZE
f770fc671f11    12 seconds ago   apt-get install -y curl    21.3 MB
28445c70c2b3    39 seconds ago   apt-get install ping       11.57 MB
8dbd9e392a96    7 months ago                                131.5 MB
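Rolling back is then a single retag of the older layer (IDs taken from the history output above):

$ docker tag 28445c70c2b3 imagename

After this, imagename points at the layer with ping installed but without curl.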
@udibr
udibr / beamsearch.py
Last active October 4, 2021 11:50
beam search for Keras RNN
# variation to https://github.com/ryankiros/skip-thoughts/blob/master/decoding/search.py
from keras.preprocessing import sequence

def keras_rnn_predict(samples, empty=empty, rnn_model=model, maxlen=maxlen):
    """For every sample, calculate the probability of every possible label.
    You need to supply your RNN model and maxlen - the length of sequences it can handle.
    """
    data = sequence.pad_sequences(samples, maxlen=maxlen, value=empty)
    return rnn_model.predict(data, verbose=0)

def beamsearch(predict=keras_rnn_predict,
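For orientation, here is a rough sketch of the expansion step a beam search like this performs, patterned on the linked skip-thoughts search.py (names and shapes are illustrative, not the gist's exact code):

import numpy as np

def expand_beam(live_samples, live_scores, predict, beam_width=3):
    # score every possible next label for every live sample
    probs = predict(live_samples)                      # (n_live, n_labels)
    # accumulate negative log-probability: lower total = better candidate
    cand_scores = live_scores[:, None] - np.log(probs)
    flat = cand_scores.flatten()
    best = flat.argsort()[:beam_width]                 # keep the beam_width best pairs
    n_labels = probs.shape[1]
    new_samples = [live_samples[i // n_labels] + [i % n_labels] for i in best]
    return new_samples, flat[best]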
@mbollmann
mbollmann / attention_lstm.py
Last active June 26, 2023 10:08
My attempt at creating an LSTM with attention in Keras
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism

    This is an LSTM incorporating an attention mechanism into its hidden states.
    Currently, the context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by Xu et al.
    (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
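As a rough illustration of the soft attention mechanism named in the docstring, here is a plain-numpy sketch of Bahdanau-style scoring (an assumption-laden sketch, not the gist's Keras code; all weight names and shapes are illustrative):

import numpy as np

def soft_attention(attended, h, W_a, U_a, v_a):
    # score each attended timestep against the current hidden state h
    e = np.tanh(attended @ W_a + h @ U_a) @ v_a   # (timesteps,)
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()                          # softmax over timesteps
    return alpha @ attended                       # context vector, fed into the LSTM state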
@cbaziotis
cbaziotis / Attention.py
Last active March 28, 2023 11:50
Keras Layer that implements an Attention mechanism for temporal data. Supports Masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras import backend as K, initializers, regularizers, constraints
from keras.engine.topology import Layer

def dot_product(x, kernel):
    """
    Wrapper for dot product operation, in order to be compatible with both
    Theano and Tensorflow
    Args:
@cbaziotis
cbaziotis / AttentionWithContext.py
Last active April 25, 2022 14:37
Keras Layer that implements an Attention mechanism, with a context/query vector, for temporal data. Supports Masking. Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf] "Hierarchical Attention Networks for Document Classification"
def dot_product(x, kernel):
    """
    Wrapper for dot product operation, in order to be compatible with both
    Theano and Tensorflow
    Args:
        x (): input
        kernel (): weights
    Returns:
    """
    if K.backend() == 'tensorflow':
        # TF cannot dot a 3D tensor with a 1D kernel directly:
        # expand the kernel to a column, dot, then squeeze the result
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    else:
        return K.dot(x, kernel)
@nigeljyng
nigeljyng / AttentionWithContext.py
Last active February 10, 2021 14:02 — forked from cbaziotis/AttentionWithContext.py
Keras Layer that implements an Attention mechanism, with a context/query vector, for temporal data. Supports Masking. Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf] "Hierarchical Attention Networks for Document Classification"
class AttentionWithContext(Layer):
    """
    Attention operation, with a context/query vector, for temporal data.
    Supports Masking.
    Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf]
    "Hierarchical Attention Networks for Document Classification"
    by using a context vector to assist the attention

    # Input shape
        3D tensor with shape: `(samples, steps, features)`.
    # Output shape
        2D tensor with shape: `(samples, features)`.
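A typical way such a layer is used, inferred from the input/output shapes above (a hypothetical sketch; model is an assumed Keras Sequential model):

model.add(LSTM(64, return_sequences=True))   # 3D output: (samples, steps, 64)
model.add(AttentionWithContext())            # 2D output: (samples, 64)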
@jarun
jarun / disassemble.md
Last active April 26, 2024 14:18
Guide to disassemble

prerequisites

  • Compile the program with gcc with debug symbols enabled (-g)
  • Do NOT strip the binary
  • To generate assembly code with gcc, use the -S option: gcc -S hello.c (see the example below)
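
For example, a build satisfying these prerequisites might look like (a sketch; hello.c is a placeholder):

$ gcc -g -o hello hello.c   # compile with debug symbols, binary left unstripped
$ gcc -S hello.c            # additionally emit assembly to hello.s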

utilities

objdump
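
A minimal sketch of common objdump invocations, assuming the hello binary built with -g above:

$ objdump -d hello      # disassemble the executable sections
$ objdump -d -S hello   # interleave source lines with the disassembly (needs -g)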

@paul-axe
paul-axe / insomnihack2019teaser_droops_writeup.md
Created January 20, 2019 12:42
insomnihack2019teaser_droops_writeup.md

The challenge was based on Drupal 7 with an obvious unserialize() call added.

While trying to build a chain, the first solution I found was based on the following gadget chain:

./includes/bootstrap.inc

abstract class DrupalCacheArray {
    ...
    public function __destruct() {
        $data = array();
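
For context, a generic sketch of why a reachable unserialize() plus a __destruct() gadget is exploitable (illustrative PHP, not the challenge code; the Gadget class and its property are invented for the example):

<?php
class Gadget {
    public $path;
    public function __destruct() {
        // pretend this mirrors cache-writing logic in the real gadget
        file_put_contents($this->path, 'payload');
    }
}
// attacker-controlled input reaches unserialize() ...
unserialize('O:6:"Gadget":1:{s:4:"path";s:8:"/tmp/pwn";}');
// ... and __destruct() later fires with the attacker's $path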