Marcel Bollmann (@mbollmann)

@mbollmann
mbollmann / toggle-kitty.fish
Created March 29, 2024 11:44
Bind this to a system-wide hotkey to toggle visibility of Kitty, whether it's already started or not.
#!/usr/bin/env fish
if not command -q kitty
    set error_msg "kitty not found."
else if not command -q xdotool
    set error_msg "xdotool not found."
else if not command -q wmctrl
    set error_msg "wmctrl not found."
end
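The script's overall flow (check the required tools, then find the Kitty window and either raise or hide it) can be sketched in Python. The helper names below and the `--class kitty` window match are illustrative assumptions, not part of the gist:

```python
import subprocess

def build_toggle_commands(window_id=None, focused=False):
    """Return the command to run for the current window state:
    no window -> launch Kitty; focused -> minimize; else activate."""
    if window_id is None:
        return ["kitty"]
    if focused:
        return ["xdotool", "windowminimize", window_id]
    return ["xdotool", "windowactivate", window_id]

def find_kitty_window():
    """Ask xdotool for a window whose class matches 'kitty', if any."""
    out = subprocess.run(["xdotool", "search", "--class", "kitty"],
                         capture_output=True, text=True).stdout.split()
    return out[0] if out else None
```

Binding the equivalent shell one-liner to a hotkey then reduces to running whichever command `build_toggle_commands` selects.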
@mbollmann
mbollmann / unicode_scripts.py
Created March 2, 2022 14:12
Access Unicode Script property in Python & find out which script(s) a string contains
#!/usr/bin/env python3
# Unicode characters are neatly categorized into different "scripts", as seen on
# the character code chart <http://www.unicode.org/charts/#scripts> and defined
# in Annex #24 <https://www.unicode.org/reports/tr24/>.
#
# Unfortunately, Python's unicodedata module doesn't provide access to this
# information. However, the fontTools library does:
# <https://github.com/fonttools/fonttools>
#
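When fontTools is not available, a rough stdlib-only approximation takes the first word of each character's Unicode name, which for many characters matches the script (LATIN, CYRILLIC, GREEK, ...). This is a heuristic sketch only, not the real Script property the gist accesses via fontTools:

```python
import unicodedata

def rough_scripts(text):
    """Heuristic: collect the first word of each character's Unicode
    name. Approximates the Script property for many characters but is
    not exact -- e.g. digits report 'DIGIT' rather than 'Common'."""
    scripts = set()
    for char in text:
        name = unicodedata.name(char, "")
        if name:
            scripts.add(name.split()[0])
    return scripts
```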
@mbollmann
mbollmann / best_wordle_guess.py
Last active March 26, 2022 20:32
Finding statistically best guesses for the Wordle game
#!/usr/bin/env python3
#
# MIT License
#
# Copyright (c) 2021 Marcel Bollmann
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
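The preview above shows only the license header, but the "statistically best guess" idea can be sketched as scoring each candidate word by how many common letters it probes. The scoring below (sum of corpus-wide frequencies of a word's distinct letters) is an illustrative simplification, not necessarily the statistic the gist computes:

```python
from collections import Counter

def best_guess(words):
    """Rank words by summing, over each word's distinct letters, how many
    words in the corpus contain that letter; guesses probing common
    letters score highest. A simplified stand-in for the gist's metric."""
    freq = Counter(letter for word in words for letter in set(word))
    return max(words, key=lambda w: sum(freq[c] for c in set(w)))
```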
@mbollmann
mbollmann / conv_checkpoints_to_model.py
#!/usr/bin/env python3
"""Usage: conv_checkpoints_to_model.py MODFILE
Takes a trained model file with multiple saved checkpoints and converts these
checkpoints into standalone models. This allows the different checkpoints to be
used, e.g., as parts of a model ensemble.
This script will:
- Analyze MODFILE to find all saved model components
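The conversion step the docstring describes can be sketched as follows. The on-disk model format is not shown in the preview, so the `{'shared': ..., 'checkpoints': ...}` layout here is an assumed stand-in:

```python
def split_checkpoints(modfile_data):
    """Given loaded model data bundling shared components with several
    checkpoints, produce one standalone model dict per checkpoint, each
    pairing the shared components with that checkpoint's parameters."""
    shared = modfile_data["shared"]
    return {
        name: {"shared": shared, "params": params}
        for name, params in modfile_data["checkpoints"].items()
    }
```

Each resulting dict could then be saved as its own model file and loaded independently, e.g. as one member of an ensemble.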
@mbollmann
mbollmann / theano_cuda.patch
Created July 7, 2017 14:42
Crude patch for Theano 0.9.0 to produce deterministic results with CUDA
--- theano/sandbox/cuda/opt.py 2017-05-31 23:26:09.972668647 +0200
+++ theano/sandbox/cuda/opt_patched.py 2017-06-01 00:49:43.818626738 +0200
@@ -38,10 +38,12 @@
     GpuElemwise, GpuDimShuffle, GpuReshape, GpuCAReduce,
     gpu_flatten,
     GpuSubtensor, GpuAdvancedSubtensor1,
-    GpuAdvancedIncSubtensor1, GpuAdvancedIncSubtensor1_dev20,
+    GpuAdvancedIncSubtensor1,
     GpuIncSubtensor, gpu_alloc, GpuAlloc, gpu_shape, GpuSplit, GpuAllocEmpty)
 from theano.sandbox.cuda.opt_util import pad_dims, unpad_dims
@mbollmann
mbollmann / bibtex_collect_stats.py
Created November 17, 2016 11:10
Collecting stats about paper titles per year in a .bib file
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import argparse
import bibtexparser
from collections import Counter
import matplotlib.pyplot as plt
import seaborn as sns
import sys
@mbollmann
mbollmann / attention_lstm.py
Last active June 26, 2023 10:08
My attempt at creating an LSTM with attention in Keras
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism.

    This is an LSTM incorporating an attention mechanism into its hidden
    states. Currently, the context vector calculated from the attended
    vector is fed into the model's internal states, closely following the
    model by Xu et al. (2016, Sec. 3.1.2), using a soft attention model
    following Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
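The soft attention the docstring cites (Bahdanau et al., 2014) scores each attended state against a query, softmaxes the scores, and returns the weighted context vector. A minimal pure-Python sketch of just these mechanics, with the learned matrices replaced by identities and the scoring vector by a sum (so this illustrates the computation, not a trained model):

```python
import math

def soft_attention(states, query):
    """Additive attention: score_i = sum(tanh(s_i + q)) elementwise,
    softmax the scores, and return (context vector, weights)."""
    scores = [sum(math.tanh(si + qi) for si, qi in zip(s, query))
              for s in states]
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]   # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(states[0])
    context = [sum(w * s[d] for w, s in zip(weights, states))
               for d in range(dim)]
    return context, weights
```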
@mbollmann
mbollmann / hidden_state_lstm.py
Created August 17, 2016 10:02
Keras LSTM that inputs/outputs its internal states, e.g. for hidden state transfer
from keras import backend as K
from keras.layers.recurrent import LSTM

class HiddenStateLSTM(LSTM):
    """LSTM with input/output capabilities for its hidden state.

    This layer behaves just like an LSTM, except that it accepts further
    inputs to be used as its initial states, and returns additional
    outputs, representing the layer's final states.
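The state-in/state-out pattern the docstring describes can be illustrated with a toy recurrent cell (scalar weights, not an LSTM): each call takes an explicit incoming state and the sequence runner returns the final state as an extra output, so it can seed another layer or a continuation of the sequence:

```python
import math

def simple_rnn_step(x, h_prev, w_x=0.5, w_h=0.5):
    """One step of a toy RNN cell; weights are fixed scalars purely
    for illustration."""
    return math.tanh(w_x * x + w_h * h_prev)

def run_sequence(xs, h0=0.0):
    """Feed a sequence through the cell, returning all outputs AND the
    final state -- the 'additional output' that enables state transfer."""
    h, outputs = h0, []
    for x in xs:
        h = simple_rnn_step(x, h)
        outputs.append(h)
    return outputs, h
```

Passing the returned `h` as `h0` of a second `run_sequence` call is the toy analogue of transferring hidden state between layers.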
@mbollmann
mbollmann / state_transfer_lstm.py
Created June 18, 2016 08:59
StateTransferLSTM for Keras 1.x
# Source:
# https://github.com/farizrahman4u/seq2seq/blob/master/seq2seq/layers/state_transfer_lstm.py
from keras import backend as K
from keras.layers.recurrent import LSTM

class StateTransferLSTM(LSTM):
    """LSTM with the ability to transfer its hidden state.

    This layer behaves just like an LSTM, except that it can transfer (or