Created
August 14, 2020 14:02
Make sentiment predictions on fine tuned BERT model
import tensorflow as tf

# Tokenize the input sentences (padded/truncated to 128 tokens) and run them
# through the fine-tuned BERT model. `tokenizer`, `model`, and
# `pred_sentences` are assumed to have been defined in the earlier steps.
tf_batch = tokenizer(pred_sentences, max_length=128, padding=True, truncation=True, return_tensors='tf')
tf_outputs = model(tf_batch)

# Softmax turns the raw logits into class probabilities; argmax picks the
# most likely class index for each sentence.
tf_predictions = tf.nn.softmax(tf_outputs[0], axis=-1)
labels = ['Negative', 'Positive']
label = tf.argmax(tf_predictions, axis=1)
label = label.numpy()
for i in range(len(pred_sentences)):
    print(pred_sentences[i], ": ", labels[label[i]])
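The post-processing step above (softmax over the logits, then argmax to pick a label) can be illustrated without loading the model at all. This is a minimal sketch using NumPy and hypothetical logit values, not output from the actual fine-tuned model:

```python
import numpy as np

# Hypothetical logits for three sentences over two classes
# (column 0 = Negative, column 1 = Positive).
logits = np.array([[2.0, -1.0],
                   [-0.5, 1.5],
                   [0.1, 0.2]])

# Softmax over the class axis converts logits to probabilities that
# sum to 1 per row, mirroring tf.nn.softmax(..., axis=-1).
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# argmax over the class axis mirrors tf.argmax(..., axis=1).
labels = ['Negative', 'Positive']
predictions = [labels[i] for i in probs.argmax(axis=1)]
print(predictions)  # → ['Negative', 'Positive', 'Positive']
```

Note that argmax on the probabilities gives the same result as argmax on the raw logits, since softmax is monotonic; the softmax is only needed if you want the confidence scores themselves.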