Bohumír Zámečník bzamecnik

@bzamecnik
bzamecnik / keras_input_reshape.py
Last active June 28, 2018 08:15
Reshaping input data for convolution in Keras
# In Keras the convolution layers require an additional dimension which is used for the various filters.
# When we have e.g. a 2D dataset, the shape is (data_points, rows, cols).
# But Convolution2D requires shape (data_points, rows, cols, 1).
# Otherwise it fails with e.g. "Exception: Input 0 is incompatible with layer convolution2d_5: expected ndim=4, found ndim=3".
#
# Originally I reshaped the data beforehand, but that only complicated things.
#
# An easier and more elegant solution is to add a Reshape layer at the input
# of the network!
#
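# A minimal sketch of that idea (hypothetical model and sizes, assuming
# 28x28 single-channel inputs; Conv2D is the Keras 2 name for Convolution2D):
from keras.layers import Conv2D, Dense, Flatten, Reshape
from keras.models import Sequential

model = Sequential([
    # add the trailing channel dimension inside the model itself
    Reshape((28, 28, 1), input_shape=(28, 28)),
    Conv2D(16, (3, 3), activation='relu'),
    Flatten(),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')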
@bzamecnik
bzamecnik / animate_chroma_polar.py
Created November 1, 2016 13:31
Animated polar chroma plot
"""
What pitch classes are playing?

Video: https://www.youtube.com/watch?v=DOJyjMQHP8U

We computed a chromagram, i.e. a sequence of pitch-class vectors in time,
using the Python tfr library (https://github.com/bzamecnik/tfr) and animated
it with matplotlib and moviepy. The tfr library computes very sharp
spectrograms and allows transforming frequencies to pitches. Pitches are
folded to classes by ignoring the octave, producing a 12-bin pitch-class
vector per frame.
"""
import matplotlib as mpl
mpl.use('Agg')
import matplotlib.pyplot as plt
import moviepy.editor as mpy
from moviepy.video.io.bindings import mplfig_to_npimage
import numpy as np
from scipy.signal import medfilt
import tfr
# --- parameters ---
# video and description: https://youtu.be/GX33y67CN-w
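# A minimal sketch of the animation loop, using the imports above
# (hypothetical parameters; assumes a chromagram of shape (n_frames, 12)
# has already been computed, e.g. via the tfr library):
fps = 25
n_frames = 100
chromagram = np.random.rand(n_frames, 12)  # placeholder pitch-class data

angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
fig = plt.figure(figsize=(4, 4))
ax = fig.add_subplot(111, projection='polar')
bars = ax.bar(angles, chromagram[0], width=2 * np.pi / 12)
ax.set_ylim(0, 1)

def make_frame(t):
    # update the bar heights for the frame at time t
    values = chromagram[min(int(t * fps), n_frames - 1)]
    for bar, value in zip(bars, values):
        bar.set_height(value)
    return mplfig_to_npimage(fig)

clip = mpy.VideoClip(make_frame, duration=n_frames / fps)
clip.write_videofile('chroma_polar.mp4', fps=fps)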
@bzamecnik
bzamecnik / one_hot_lambda_layer_keras.py
Last active October 22, 2020 18:02
One-hot encoding in a Keras Lambda layer
"""
When training ML models on text we usually need to represent words/characters
in one-hot encoding. This can be done in preprocessing, however it makes the
dataset file bigger. Also, when we'd like to use an Embedding layer, it
accepts the original integer indexes instead of one-hot codes.

Can we move the one-hot encoding from preprocessing directly into the model?
If so, we could choose from two options: use one-hot inputs or perform embedding.

A way to do this was suggested in Keras issue [#3680](https://github.com/fchollet/keras/issues/3680).
"""
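# A minimal sketch of the Lambda approach (hypothetical sizes: sequences of
# 20 integer-coded symbols from a vocabulary of 100):
from keras import backend as K
from keras.layers import Dense, Input, Lambda, LSTM
from keras.models import Model

vocab_size = 100
seq_len = 20

ints = Input(shape=(seq_len,), dtype='int32')
# K.one_hot appends a vocab_size axis: (batch, seq_len) -> (batch, seq_len, vocab_size)
one_hot = Lambda(lambda x: K.one_hot(x, vocab_size),
                 output_shape=(seq_len, vocab_size))(ints)
output = Dense(1, activation='sigmoid')(LSTM(32)(one_hot))
model = Model(ints, output)
model.compile(optimizer='adam', loss='binary_crossentropy')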
@bzamecnik
bzamecnik / demultiplex_inputs_in_keras.py
Last active August 29, 2017 17:55
Demultiplexing inputs within Keras layers
"""
In this example we show how to select and separately process multiple input
features within Keras layers.

Let's say we have a model with two categorical features and we want to embed
or one-hot encode each one separately. Normally in the Functional API we
would make two Input layers, one for each feature, then connect an Embedding
to each, merge them and then add some more Dense/LSTM/... layers. In this
case we need to provide model.predict() with a list of input arrays
instead of just one. It becomes a bit cumbersome if you need to index and
split the data manually outside the model.
"""
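# A minimal sketch of in-model demultiplexing (hypothetical sizes: two
# categorical features packed as integer columns of a single input array):
from keras.layers import concatenate, Dense, Embedding, Flatten, Input, Lambda
from keras.models import Model

cardinalities = [10, 20]  # number of distinct values of each feature

both = Input(shape=(2,), dtype='int32')
embedded = []
for i, n in enumerate(cardinalities):
    # select the i-th feature column inside the model via a Lambda slice
    column = Lambda(lambda x, i=i: x[:, i:i + 1])(both)
    embedded.append(Flatten()(Embedding(n, 4)(column)))
output = Dense(1, activation='sigmoid')(concatenate(embedded))
model = Model(both, output)
model.compile(optimizer='adam', loss='binary_crossentropy')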
@bzamecnik
bzamecnik / model_summary.txt
Last active September 30, 2023 03:28
Residual LSTM in Keras
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_1 (InputLayer)             (None, 32, 10)        0
____________________________________________________________________________________________________
lstm_1 (LSTM)                    (None, 32, 10)        840         input_1[0][0]
____________________________________________________________________________________________________
add_1 (Add)                      (None, 32, 10)        0           input_1[0][0]
                                                                   lstm_1[0][0]
____________________________________________________________________________________________________
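A minimal sketch reproducing this summary (assuming Keras 2 and the
functional API; the 840 parameters correspond to an LSTM with 10 units
over 10 input features):

from keras.layers import add, Input, LSTM
from keras.models import Model

x = Input(shape=(32, 10))
h = LSTM(10, return_sequences=True)(x)  # output width matches the input
y = add([x, h])  # residual (skip) connection around the LSTM
model = Model(x, y)
model.summary()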
@bzamecnik
bzamecnik / lstm_with_softmax_keras.py
Created December 24, 2016 07:04
LSTM with softmax activation in Keras
"""
When classifying a sequence we usually stack some LSTM layers returning
sequences, then one LSTM returning a single point, then a Dense layer with
softmax activation.

Is it possible instead to give the last non-sequential LSTM a softmax
activation? The answer is yes.

In this example we have 3 sequence-returning layers and one layer producing
the final result.
"""
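# A minimal sketch (hypothetical shapes: 32-step sequences of 10 features,
# 5 target classes):
from keras.layers import LSTM
from keras.models import Sequential

model = Sequential([
    LSTM(10, return_sequences=True, input_shape=(32, 10)),
    LSTM(10, return_sequences=True),
    LSTM(10, return_sequences=True),
    # the final LSTM returns a single vector and applies softmax directly
    LSTM(5, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')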