Trung Tran (ChunML)
✍️ Landed new site at https://trungtran.io
# Combine the context vector and the LSTM output.
# Before combining, both have shape (batch_size, 1, rnn_size),
# so squeeze away axis 1 first.
# After the concat, lstm_out has shape (batch_size, 2 * rnn_size)
lstm_out = tf.concat([tf.squeeze(context, 1), tf.squeeze(lstm_out, 1)], 1)

# self.wc projects the combined vector back down to (batch_size, rnn_size)
lstm_out = self.wc(lstm_out)

# Finally, self.ws converts it back to vocabulary space: (batch_size, vocab_size)
logits = self.ws(lstm_out)
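A quick, self-contained shape check of the combine-and-project step above (`batch_size=2` and `rnn_size=4` are illustrative values, not from the gist):

```python
import tensorflow as tf

batch_size, rnn_size = 2, 4
context = tf.random.normal((batch_size, 1, rnn_size))
lstm_out = tf.random.normal((batch_size, 1, rnn_size))

# Squeeze axis 1 from both tensors, then concatenate along the feature axis
combined = tf.concat([tf.squeeze(context, 1), tf.squeeze(lstm_out, 1)], 1)
print(combined.shape)  # (2, 8) == (batch_size, 2 * rnn_size)

# wc projects back down to rnn_size, as in the decoder above
wc = tf.keras.layers.Dense(rnn_size, activation='tanh')
print(wc(combined).shape)  # (2, 4) == (batch_size, rnn_size)
```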
for e in range(10):
    accuracy = []
    for batch, (text, label) in enumerate(train_data.take(-1)):
        with tf.GradientTape() as tape:
            logits = model(text)
            label = tf.expand_dims(label, 1)
            loss = loss_func(label, logits)
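The tape block above stops before the parameter update; a minimal sketch of the full step, with dummy stand-ins for the model, loss, and data (the Adam optimizer is an assumption, since the gist truncates here):

```python
import tensorflow as tf

# Dummy stand-ins for the gist's model, loss_func, and batch
model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
loss_func = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()  # assumption: optimizer not shown in gist

text = tf.random.normal((4, 3))    # dummy batch of features
label = tf.constant([0, 1, 0, 1])  # dummy integer labels

with tf.GradientTape() as tape:
    logits = model(text)
    loss = loss_func(label, logits)

# Backpropagate and apply the update
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```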
import numpy as np

def positional_embedding(pos, model_size):
    PE = np.zeros((1, model_size))
    for i in range(model_size):
        if i % 2 == 0:
            PE[:, i] = np.sin(pos / 10000 ** (i / model_size))
        else:
            PE[:, i] = np.cos(pos / 10000 ** ((i - 1) / model_size))
    return PE
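A usage sketch of `positional_embedding`, stacking one row per position into a full table (a max length of 10 and `model_size` of 128 are illustrative values):

```python
import numpy as np

def positional_embedding(pos, model_size):
    PE = np.zeros((1, model_size))
    for i in range(model_size):
        if i % 2 == 0:
            PE[:, i] = np.sin(pos / 10000 ** (i / model_size))
        else:
            PE[:, i] = np.cos(pos / 10000 ** ((i - 1) / model_size))
    return PE

# One (1, model_size) row per position, stacked into (max_len, model_size)
pes = np.concatenate([positional_embedding(p, 128) for p in range(10)], axis=0)
print(pes.shape)  # (10, 128)
# Position 0 gives sin(0)=0 on even dims and cos(0)=1 on odd dims
print(pes[0, 0], pes[0, 1])  # 0.0 1.0
```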
max_length = max(len(data_en[0]), len(data_fr_in[0]))
@ChunML
ChunML / muffy_stata.ipynb
Created October 6, 2019 14:30
for e in range(50):
    batches = get_batches(in_text, out_text, flags.batch_size, flags.seq_size)
    state_h, state_c = net.zero_state(flags.batch_size)

    # Transfer data to GPU
    state_h = state_h.to(device)
    state_c = state_c.to(device)

    for x, y in batches:
        iteration += 1
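A minimal sketch of what the inner loop body typically does in this kind of setup, with the network and shapes stubbed out (`net`, `criterion`, and the detach step are assumptions; the gist cuts off after `iteration += 1`):

```python
import torch
import torch.nn as nn

# Stand-ins for the gist's net and flags (sizes are illustrative)
net = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

# Zeroed hidden/cell states, shape (num_layers, batch, hidden)
state_h = torch.zeros(1, 4, 16)
state_c = torch.zeros(1, 4, 16)

x = torch.randn(4, 5, 8)   # dummy (batch, seq, features) batch
y = torch.randn(4, 5, 16)  # dummy targets

optimizer.zero_grad()
out, (state_h, state_c) = net(x, (state_h, state_c))
loss = criterion(out, y)
loss.backward()
# Detach so the next batch does not backprop through this one
state_h = state_h.detach()
state_c = state_c.detach()
optimizer.step()
```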
import * as tf from '@tensorflow/tfjs';
import React from 'react';
import ReactDOM from 'react-dom';

class CharacterTable {
  constructor(chars) {
    // chars must be a list of unique characters
    this.chars = chars;
    this.charIndices = {};
    this.indicesChar = {};
    // Build the forward (char -> index) and reverse (index -> char) maps
    this.chars.forEach((c, i) => {
      this.charIndices[c] = i;
      this.indicesChar[i] = c;
    });
  }
}
What a ridiculous concept !
[[8, 5, 21, 22, 23]]
quel concept ridicule ! <end>
Your idea is not entirely crazy .
[[24, 25, 6, 26, 27, 28, 1]]
votre idee n est pas completement folle . <end>
A man s worth lies in what he is .
[[5, 29, 30, 31, 32, 9, 8, 7, 6, 1]]
la valeur d un homme reside dans ce qu il est . <end>
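The bracketed lists above are tokenizer indices for each input sentence; a hedged sketch of how such sequences are produced with `tf.keras`'s `Tokenizer` (the exact indices depend on the fitted corpus, so they will not match the numbers shown):

```python
import tensorflow as tf

# Illustrative corpus; filters='' keeps '!' and '.' as tokens
sentences = [
    'What a ridiculous concept !',
    'Your idea is not entirely crazy .',
    'A man s worth lies in what he is .',
]
tokenizer = tf.keras.preprocessing.text.Tokenizer(filters='')
tokenizer.fit_on_texts(sentences)

# One index per whitespace-separated token
seq = tokenizer.texts_to_sequences(['What a ridiculous concept !'])
print(seq)  # e.g. a single list of 5 indices
```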
What he did is very wrong .
Epoch 90 Loss 0.0023
Average elapsed time: 3.77s
A man s worth lies in what he is .
[[5, 29, 30, 31, 32, 9, 8, 7, 6, 1]]
la valeur d un homme reside dans ce qu il est . <end>
Epoch 100 Loss 0.0018
Average elapsed time: 3.76s
What a ridiculous concept !
[[8, 5, 21, 22, 23]]
ce concept ridicule ! <end>
Your idea is not entirely crazy .
[[24, 25, 6, 26, 27, 28, 1]]
votre idee n est pas completement pas . . . <end>
A man s worth lies in what he is .
[[5, 29, 30, 31, 32, 9, 8, 7, 6, 1]]
la valeur d un homme reside dans ce qu il est . <end>
What he did is very wrong .
Epoch 50 Loss 0.0141
Average elapsed time: 4.84s
I can t believe you re giving up .
[[16, 11, 12, 95, 2, 96, 14, 97, 1]]
je n arrive pas a croire que vous abandonniez . <end>
Epoch 60 Loss 0.0063
Average elapsed time: 4.66s
What a ridiculous concept !
[[8, 5, 21, 22, 23]]