# Help for Step 8
# 1) Use the MSE loss function.
# 2) Inside the training loop, if you shaped your input features into a 2D array,
#    add one dimension before feeding a batch into the LSTM, since the batch must be a 3D array, not a 2D one.
#    The command for doing this is: your_batch.unsqueeze_(-1)
#    It is an in-place operation, so you don't have to assign the result to a new variable.
#    In the same way, you must .squeeze_() the outputs of the LSTM to reshape them back into a 2D array.
# 3) To apply a neural network layer to a sequence, use the provided function: apply_layer_to_timesteps
# 4) The input sequences in the main part of the exercise will not be of the same length. For this reason, we use
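Hints 1 and 2 above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the exercise's actual model: the shapes (batch of 8 sequences of length 10), the hidden size, and the final `nn.Linear` projection are assumptions chosen only to show the `unsqueeze_` / `squeeze_` reshaping around the LSTM.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical 2D batch of features: (batch_size=8, seq_len=10)
batch = torch.randn(8, 10)

# The LSTM expects 3D input (batch, seq_len, n_features), so add one dimension
# in place, as in hint 2 above: (8, 10) -> (8, 10, 1)
batch.unsqueeze_(-1)

lstm = nn.LSTM(input_size=1, hidden_size=4, batch_first=True)
out, _ = lstm(batch)          # out: (8, 10, 4)

# Project each timestep back to a single value, then squeeze the trailing
# dimension in place to get a 2D array again: (8, 10, 1) -> (8, 10)
proj = nn.Linear(4, 1)
preds = proj(out).squeeze_(-1)

# Hint 1: MSE loss against a (dummy) 2D target of the same shape
target = torch.randn(8, 10)
loss = nn.MSELoss()(preds, target)
```

The in-place `unsqueeze_` / `squeeze_` calls mutate the tensor directly, which is why no reassignment is needed.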
import keras.backend as K
from keras.layers import Layer
from keras.legacy import interfaces
from keras.engine import InputSpec
from keras import activations, initializers, regularizers, constraints

# Dense layer whose kernel is tied to the transpose of another Dense layer's
# kernel, e.g. for weight tying between an encoder and a decoder.
class DenseTransposeTied(Layer):
    @interfaces.legacy_dense_support
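The class above is cut off here, but the tied-weights idea it implements can be sketched in plain NumPy, independent of the Keras API. All shapes and variable names below are hypothetical: an "encoder" kernel `W` maps 5 features to 3, and the "decoder" reuses the same kernel transposed instead of learning its own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder kernel and biases (shapes chosen for illustration)
W = rng.normal(size=(5, 3))   # encoder: 5 input features -> 3 hidden units
b_enc = np.zeros(3)
b_dec = np.zeros(5)

x = rng.normal(size=(2, 5))   # a batch of 2 samples

# Encoder: an ordinary dense layer, h = x W + b
h = x @ W + b_enc

# Decoder with tied weights: reuse the SAME kernel, transposed,
# which is what a DenseTransposeTied-style layer does internally
x_rec = h @ W.T + b_dec
```

Tying the decoder kernel to `W.T` halves the number of kernel parameters and is a common regularization choice in autoencoders.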