@paulhendricks
Created March 15, 2022 21:52
import tensorflow_hub as hub
import tensorflow_text  # registers the custom ops required by the preprocessing model

# Load BERT and its matching preprocessing model from TF Hub.
preprocess = hub.load('https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/1')
encoder = hub.load('https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/3')

# Run BERT on a batch of raw text inputs ("inputs" avoids shadowing the built-in input()).
inputs = preprocess(['Batch of inputs', 'TF Hub makes BERT easy!', 'More text.'])
pooled_output = encoder(inputs)["pooled_output"]
print(pooled_output)
tf.Tensor(
[[-0.8384154 -0.26902363 -0.3839138 ... -0.3949695 -0.58442086 0.8058556 ]
[-0.8223734 -0.2883956 -0.09359277 ... -0.13833837 -0.6251748 0.88950026]
[-0.9045408 -0.37877116 -0.7714909 ... -0.5112085 -0.70791864 0.92950743]],
shape=(3, 768), dtype=float32)
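Each row of pooled_output is a 768-dimensional embedding of one input sentence; the encoder's output dict also contains "sequence_output" with per-token embeddings. The same TF Hub handles can be wrapped in hub.KerasLayer to fine-tune BERT inside a Keras model. A minimal sketch, assuming a binary classification head (the dense head and compile settings are illustrative choices, not part of this gist):

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # registers ops needed by the preprocessing model

# Reuse the same TF Hub handles as above, but as Keras layers.
preprocess_layer = hub.KerasLayer(
    'https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/1')
encoder_layer = hub.KerasLayer(
    'https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/3',
    trainable=True)  # fine-tune the BERT weights along with the head

# Raw strings go in; the preprocessing layer tokenizes and packs them for BERT.
text_input = tf.keras.Input(shape=(), dtype=tf.string)
encoder_outputs = encoder_layer(preprocess_layer(text_input))

# "pooled_output" is the (batch, 768) sentence summary printed above;
# a single dense unit on top gives binary classification logits (illustrative).
logits = tf.keras.layers.Dense(1)(encoder_outputs['pooled_output'])

model = tf.keras.Model(text_input, logits)
model.compile(optimizer='adam',
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True))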