endolith/DFT_ANN.py
Last active Mar 22, 2019

Training neural network to implement discrete Fourier transform (DFT/FFT)

My second neural network experiment (the first was an FIR filter). The DFT output is just a linear combination of the inputs, so it should be implementable by a single layer with no bias and no activation function.

Topology of a 4-point complex DFT
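The "linear combination" claim is easy to check directly: the DFT is just a matrix multiply. A minimal sketch (my addition, not part of the gist) building the 4-point DFT matrix and comparing it against NumPy's FFT:

```python
import numpy as np

# The k-th DFT output is sum_n x[n] * exp(-2j*pi*k*n/N), i.e. a matrix multiply
N = 4
n = np.arange(N)
W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # N x N DFT matrix

x = np.random.randn(N) + 1j * np.random.randn(N)
assert np.allclose(W @ x, np.fft.fft(x))  # one linear layer suffices
```

This is why a single dense layer with no activation can represent the transform exactly.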

Animation of weights being trained:

Neural network weights heatmap

Red weights are positive, blue are negative. The black squares (2336 out of 4096) are unused and could be pruned out to save computation time (if I knew how to do that).
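One simple way to prune is magnitude-based masking: zero out any weight below a threshold and verify the transform still works. A hedged sketch (my addition, not from the gist), using the exact real-valued DFT weight matrix as a stand-in for the trained weights:

```python
import numpy as np

# Exact weight matrix the network is trying to learn:
# [x.real, x.imag] @ W == [F.real, F.imag]
N = 32
n = np.arange(N)
C = np.cos(2 * np.pi * np.outer(n, n) / N)
S = -np.sin(2 * np.pi * np.outer(n, n) / N)
W = np.block([[C, S], [-S, C]])

mask = np.abs(W) > 1e-6        # keep only significant weights
W_pruned = W * mask
print(f'Pruned {np.sum(~mask)} of {W.size} weights')

# Pruned layer still matches the FFT
x = np.random.randn(N) + 1j * np.random.randn(N)
out = np.hstack([x.real, x.imag]) @ W_pruned
F = np.fft.fft(x)
assert np.allclose(out, np.hstack([F.real, F.imag]))
```

The threshold here only removes weights that are (numerically) zero; a real pruning pass on trained weights would need a looser threshold and a retraining step.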

Even with pruning, it would be less efficient than an FFT, so if the FFT output is useful, it's probably best to compute it externally and provide it as separate inputs?

This at least demonstrates that neural networks can figure out frequency content on their own, though, if it's useful to the problem.

The loss goes down for a while but then goes up. I don't know why:

loss vs epoch

"""
Train a neural network to implement the discrete Fourier transform
"""
from keras.models import Sequential
from keras.layers import Dense
import numpy as np
import matplotlib.pyplot as plt
N = 32
batch = 10000
# Generate random input data and desired output data
sig = np.random.randn(batch, N) + 1j*np.random.randn(batch, N)
F = np.fft.fft(sig, axis=-1)
# First half of inputs/outputs is real part, second half is imaginary part
X = np.hstack([sig.real, sig.imag])
Y = np.hstack([F.real, F.imag])
# Create model with no hidden layers, same number of outputs as inputs.
# No bias needed. No activation function, since DFT is linear.
model = Sequential([Dense(N*2, input_dim=N*2, use_bias=False)])
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X, Y, epochs=100, batch_size=100)
# Confirm that it works
data = np.arange(N)
def ANN_DFT(x):
    if len(x) != N:
        raise ValueError(f'Input must be length {N}')
    pred = model.predict(np.hstack([x.real, x.imag])[np.newaxis])[0]
    result = pred[:N] + 1j*pred[N:]
    return result
ANN = ANN_DFT(data)
FFT = np.fft.fft(data)
print(f'ANN matches FFT: {np.allclose(ANN, FFT)}')
# Heat map of neuron weights
plt.imshow(model.get_weights()[0], vmin=-1, vmax=1, cmap='coolwarm')
plt.show()
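Since the map being learned is linear, the weight matrix also has a closed-form least-squares solution, which gradient descent should approach. A quick self-contained check (my addition, not part of the gist):

```python
import numpy as np

# Same training data layout as the script above
N = 32
batch = 10000
sig = np.random.randn(batch, N) + 1j * np.random.randn(batch, N)
F = np.fft.fft(sig, axis=-1)
X = np.hstack([sig.real, sig.imag])
Y = np.hstack([F.real, F.imag])

# Solve for the weights directly instead of training
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(X @ W, Y)  # exact (to machine precision) DFT weights
```

This gives the target that the Adam-trained layer converges toward, without any epochs of training.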
aharchaoumehdi commented Aug 18, 2018

Have you considered the 2D DFT? I'm wondering if the same network (with higher capacity) would still work.


endolith commented Aug 18, 2018

@aharchaoumehdi
Yes, that would work fine; it would just be a lot of connections and inefficient compared to an FFT.


endolith commented Aug 19, 2018

I tested all the optimizers. Some loss curves went down and then back up, while others just trended downward and then plateaued. I need to learn more about them to understand why.

Learning DFT (discrete fourier transform) MSE (mean squared error) loss vs epoch for adam, sgd, rmsprop, adagrad, adadelta, adamax, nadam

Learning DFT (discrete fourier transform) MSLE (mean squared log error) loss vs epoch for adam, sgd, rmsprop, adagrad, adadelta, adamax, nadam

CYHSM commented Feb 8, 2019

Thanks for this very nice Gist! The increase in loss after around 100 epochs may come from floating-point error accumulation. However, I am still surprised by the pattern of the weights; any idea why they look like that? I just ran it again for different N and the pattern stays very similar.

different_n


endolith commented Feb 9, 2019

@CYHSM Floating point error accumulation in... what?

The pattern of weights is just what a DFT does: they're sinusoids of different frequencies, sampled at points evenly spaced around the unit circle. If you look at a single row or column, it's a sinusoid.
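This is easy to confirm on the DFT matrix itself; a short check (my addition, not part of the gist) showing that any one row is a sampled complex sinusoid:

```python
import numpy as np

N = 32
n = np.arange(N)
W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # DFT matrix

# Row k is a complex sinusoid at frequency k: cos - j*sin
k = 3
assert np.allclose(W[k].real, np.cos(2 * np.pi * k * n / N))
assert np.allclose(W[k].imag, -np.sin(2 * np.pi * k * n / N))
```

The trained weight heatmap is this same structure, split into real and imaginary blocks, which is why the stripes look sinusoidal for every N.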
