Skip to content

Instantly share code, notes, and snippets.

@albahnsen
Created April 20, 2020 20:20
Show Gist options
  • Star 8 You must be signed in to star a gist
  • Fork 0 You must be signed in to fork a gist
  • Save albahnsen/b02d2183c93067e3f248a428430c970e to your computer and use it in GitHub Desktop.
Save albahnsen/b02d2183c93067e3f248a428430c970e to your computer and use it in GitHub Desktop.
Display the source blob
Display the rendered blob
Raw
{"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"BERT Sentence Embeddings.ipynb","provenance":[{"file_id":"1ZQvuAVwA3IjybezQOXnrXMGAnMyZRuPU","timestamp":1587408600459},{"file_id":"1FsBCkREOaDopLF3PIYUuQxLR8wRfjQY1","timestamp":1559844903389},{"file_id":"1f_snPs--PVYgZJwT3GwjxqVALFJ0T2-y","timestamp":1554843110227}],"collapsed_sections":[],"toc_visible":true},"kernelspec":{"name":"python3","display_name":"Python 3"},"accelerator":"GPU"},"cells":[{"cell_type":"markdown","metadata":{"id":"78HE8FLsKN9Q","colab_type":"text"},"source":["In this post, I take an in-depth look at word embeddings produced by Google's BERT and show you how to get started with BERT by producing your own word embeddings.\n","\n","This post is presented in two forms--as a blog post [here](http://mccormickml.com/2019/05/14/BERT-word-embeddings-tutorial/) and as a Colab notebook [here](https://colab.research.google.com/drive/1ZQvuAVwA3IjybezQOXnrXMGAnMyZRuPU). \n","The content is identical in both, but: \n","\n","* The blog post format may be easier to read, and includes a comments section for discussion. \n","* The Colab Notebook will allow you to run the code and inspect it as you read through.\n"]},{"cell_type":"markdown","metadata":{"id":"dYapTjoYa0kO","colab_type":"text"},"source":["# Introduction\n","\n"]},{"cell_type":"markdown","metadata":{"id":"WoitNQMWA1bt","colab_type":"text"},"source":["\n","### What is BERT?\n","\n","BERT (Bidirectional Encoder Representations from Transformers), released in late 2018, is the model we will use in this tutorial to provide readers with a better understanding of and practical guidance for using transfer learning models in NLP. BERT is a method of pretraining language representations that was used to create models that NLP practicioners can then download and use for free. You can either use these models to extract high quality language features from your text data, or you can fine-tune these models on a specific task (classification, entity recognition, question answering, etc.) with your own data to produce state of the art predictions.\n"]},{"cell_type":"markdown","metadata":{"id":"q-dDVmXAA3At","colab_type":"text"},"source":["\n","### Why BERT embeddings?\n","\n","In this tutorial, we will use BERT to extract features, namely word and sentence embedding vectors, from text data. What can we do with these word and sentence embedding vectors? First, these embeddings are useful for keyword/search expansion, semantic search and information retrieval. For example, if you want to match customer questions or searches against already answered questions or well documented searches, these representations will help you accuratley retrieve results matching the customer's intent and contextual meaning, even if there's no keyword or phrase overlap.\n","\n","Second, and perhaps more importantly, these vectors are used as high-quality feature inputs to downstream models. NLP models such as LSTMs or CNNs require inputs in the form of numerical vectors, and this typically means translating features like the vocabulary and parts of speech into numerical representations. In the past, words have been represented either as uniquely indexed values (one-hot encoding), or more helpfully as neural word embeddings where vocabulary words are matched against the fixed-length feature embeddings that result from models like Word2Vec or Fasttext. BERT offers an advantage over models like Word2Vec, because while each word has a fixed representation under Word2Vec regardless of the context within which the word appears, BERT produces word representations that are dynamically informed by the words around them. For example, given two sentences:\n","\n","\"The man was accused of robbing a bank.\"\n","\"The man went fishing by the bank of the river.\"\n","\n","Word2Vec would produce the same word embedding for the word \"bank\" in both sentences, while under BERT the word embedding for \"bank\" would be different for each sentence. Aside from capturing obvious differences like polysemy, the context-informed word embeddings capture other forms of information that result in more accurate feature representations, which in turn results in better model performance.\n","\n","From an educational standpoint, a close examination of BERT word embeddings is a good way to get your feet wet with BERT and its family of transfer learning models, and sets us up with some practical knowledge and context to better understand the inner details of the model in later tutorials.\n","\n","Onward!"]},{"cell_type":"markdown","metadata":{"id":"Pqa-7WXBAw8q","colab_type":"text"},"source":["# 1. Loading Pre-Trained BERT"]},{"cell_type":"code","metadata":{"id":"1RfUN_KolV-f","colab_type":"code","outputId":"580ce134-70c9-4b20-b0ff-a9a246fb762d","executionInfo":{"status":"ok","timestamp":1587408672660,"user_tz":300,"elapsed":5995,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":34}},"source":["!pip install pytorch-pretrained-bert --quiet"],"execution_count":1,"outputs":[{"output_type":"stream","text":["\u001b[?25l\r\u001b[K |██▋ | 10kB 31.7MB/s eta 0:00:01\r\u001b[K |█████▎ | 20kB 3.3MB/s eta 0:00:01\r\u001b[K |████████ | 30kB 4.7MB/s eta 0:00:01\r\u001b[K |██████████▋ | 40kB 3.1MB/s eta 0:00:01\r\u001b[K |█████████████▎ | 51kB 3.8MB/s eta 0:00:01\r\u001b[K |███████████████▉ | 61kB 4.5MB/s eta 0:00:01\r\u001b[K |██████████████████▌ | 71kB 5.2MB/s eta 0:00:01\r\u001b[K |█████████████████████▏ | 81kB 4.0MB/s eta 0:00:01\r\u001b[K |███████████████████████▉ | 92kB 4.5MB/s eta 0:00:01\r\u001b[K |██████████████████████████▌ | 102kB 5.0MB/s eta 0:00:01\r\u001b[K |█████████████████████████████▏ | 112kB 5.0MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▊| 122kB 5.0MB/s eta 0:00:01\r\u001b[K |████████████████████████████████| 133kB 5.0MB/s \n","\u001b[?25h"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"lJEnBJ3gHTsQ","colab_type":"code","outputId":"4165dbb5-985e-486d-9117-b30b744d40ce","executionInfo":{"status":"ok","timestamp":1587408756720,"user_tz":300,"elapsed":4457,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":70}},"source":["import torch\n","from pytorch_pretrained_bert import BertTokenizer, BertModel, BertForMaskedLM\n","\n","# Load pre-trained model tokenizer (vocabulary-multilingual)\n","tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-cased')"],"execution_count":2,"outputs":[{"output_type":"stream","text":["The pre-trained model you are loading is a cased model but you have not set `do_lower_case` to False. We are setting `do_lower_case=False` for you but you may want to check this behavior.\n","100%|██████████| 995526/995526 [00:00<00:00, 3153884.96B/s]\n"],"name":"stderr"}]},{"cell_type":"markdown","metadata":{"id":"3gsyrAwYvBfC","colab_type":"text"},"source":["## 2. Sentence Tokenization\n"]},{"cell_type":"markdown","metadata":{"id":"2WafgQPLAWmo","colab_type":"text"},"source":["BERT provides its own tokenizer, which we imported above. Let's see how it handles the below sentence."]},{"cell_type":"code","metadata":{"id":"Pg0P9rFxJwwp","colab_type":"code","outputId":"401b5685-e8a8-4d08-b1e6-3261f67cb4ef","executionInfo":{"status":"ok","timestamp":1587409186699,"user_tz":300,"elapsed":582,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":54}},"source":["text = \"Agregar mucha salsa molcajeteada, limones, y totopos ... gracias \"\n","marked_text = \"[CLS] \" + text + \" [SEP]\"\n","\n","# Tokenize our sentence with the BERT tokenizer.\n","tokenized_text = tokenizer.tokenize(marked_text)\n","segments_ids = [1] * len(tokenized_text)\n","\n","# Map the token strings to their vocabulary indeces.\n","indexed_tokens = tokenizer.convert_tokens_to_ids(tokenized_text)\n","\n","# Print out the tokens.\n","print (tokenized_text)"],"execution_count":6,"outputs":[{"output_type":"stream","text":["['[CLS]', 'A', '##gre', '##gar', 'mucha', 'salsa', 'mol', '##ca', '##jete', '##ada', ',', 'li', '##mone', '##s', ',', 'y', 'toto', '##pos', '.', '.', '.', 'gracias', '[SEP]']\n"],"name":"stdout"}]},{"cell_type":"markdown","metadata":{"id":"c-nY9LASLr2L","colab_type":"text"},"source":["# 3. Extracting Embeddings \n","\n"]},{"cell_type":"code","metadata":{"id":"E_t4cM6KLc98","colab_type":"code","colab":{}},"source":["# Convert inputs to PyTorch tensors\n","tokens_tensor = torch.tensor([indexed_tokens])\n","segments_tensors = torch.tensor([segments_ids])\n","\n","# Load pre-trained model (weights)\n","model = BertModel.from_pretrained('bert-base-multilingual-cased')\n","\n","# Put the model in \"evaluation\" mode, meaning feed-forward operation.\n","model.eval()"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"En4JZ41fh6CI","colab_type":"code","outputId":"edeca445-c9c4-4bd4-a9b5-cff4b8409327","executionInfo":{"status":"ok","timestamp":1587409906520,"user_tz":300,"elapsed":529,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":34}},"source":["# Predict hidden states features for each layer\n","with torch.no_grad():\n"," encoded_layers, _ = model(tokens_tensor, segments_tensors)\n","\n","# Concatenate the tensors for all layers. We use `stack` here to\n","# create a new dimension in the tensor.\n","token_embeddings = torch.stack(encoded_layers, dim=0)\n","\n","# Remove dimension 1, the \"batches\".\n","token_embeddings = torch.squeeze(token_embeddings, dim=1)\n","\n","# Swap dimensions 0 and 1.\n","token_embeddings = token_embeddings.permute(1,0,2)\n","\n","token_embeddings.size()"],"execution_count":20,"outputs":[{"output_type":"execute_result","data":{"text/plain":["torch.Size([23, 12, 768])"]},"metadata":{"tags":[]},"execution_count":20}]},{"cell_type":"markdown","metadata":{"id":"76TdtFH8NM9q","colab_type":"text"},"source":["## 3.1 Word Vectors\n","\n","There are many methods to extract the word vectors from BERT. A simple solution is to create the word vectors by summing together the last four layers.\n","\n","\n"]},{"cell_type":"code","metadata":{"id":"pv42h9jANMRf","colab_type":"code","outputId":"0eaf3fbe-e3b8-49eb-a7c0-533d6e09771b","executionInfo":{"status":"ok","timestamp":1587410642292,"user_tz":300,"elapsed":680,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":34}},"source":["# Stores the token vectors, with shape [23 x 768]\n","token_vecs_sum = []\n","\n","# `token_embeddings` is a [23 x 12 x 768] tensor.\n","\n","# For each token in the sentence...\n","for token in token_embeddings:\n","\n"," # `token` is a [12 x 768] tensor\n","\n"," # Sum the vectors from the last four layers.\n"," sum_vec = torch.sum(token[-4:], dim=0)\n"," \n"," # Use `sum_vec` to represent `token`.\n"," token_vecs_sum.append(sum_vec)\n","\n","print ('Shape is: %d x %d' % (len(token_vecs_sum), len(token_vecs_sum[0])))"],"execution_count":28,"outputs":[{"output_type":"stream","text":["Shape is: 23 x 768\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"zHHr_Z0SBIc-","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":50},"outputId":"944d801c-1552-4d06-f8ec-45c23d7bc14c","executionInfo":{"status":"ok","timestamp":1587410659740,"user_tz":300,"elapsed":562,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}}},"source":["token_vecs_sum[3][:15]"],"execution_count":31,"outputs":[{"output_type":"execute_result","data":{"text/plain":["tensor([-1.6947, -1.0643, 1.2002, 0.7157, 2.4702, -0.1323, 2.2795, 2.8649,\n"," -1.2580, 0.9318, 1.1011, 5.5358, 3.1182, -0.4945, 2.5315])"]},"metadata":{"tags":[]},"execution_count":31}]},{"cell_type":"markdown","metadata":{"id":"mQaco6jRLkXn","colab_type":"text"},"source":["## 3.2 Sentence Vectors\n","\n","\n","To get a single vector for our entire sentence we have multiple application-dependent strategies, but a simple approach is to average the second to last hiden layer of each token producing a single 768 length vector."]},{"cell_type":"code","metadata":{"id":"Zn0n2S-FWZih","colab_type":"code","colab":{}},"source":["# `encoded_layers` has shape [12 x 1 x 23 x 768]\n","\n","# `token_vecs` is a tensor with shape [23 x 768]\n","token_vecs = encoded_layers[11][0]\n","\n","# Calculate the average of all 23 token vectors.\n","sentence_embedding = torch.mean(token_vecs, dim=0)"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"MQv0FL8VWadn","colab_type":"code","outputId":"9d2d8476-ee30-4821-8c6c-a73d4d01d510","executionInfo":{"status":"ok","timestamp":1587410718437,"user_tz":300,"elapsed":479,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}},"colab":{"base_uri":"https://localhost:8080/","height":34}},"source":["print (\"Our final sentence embedding vector of shape:\", sentence_embedding.size())"],"execution_count":33,"outputs":[{"output_type":"stream","text":["Our final sentence embedding vector of shape: torch.Size([768])\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"wRkpVuOFBjzn","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":1000},"outputId":"32f1efcb-01ab-4ff6-fbb6-9610f4ee9e4e","executionInfo":{"status":"ok","timestamp":1587410722689,"user_tz":300,"elapsed":466,"user":{"displayName":"Alejandro Correa Bähnsen","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14Gh2E-pkd5BGGS4mzLkoIKbUaeCaMINDfMJWrDDr-A=s64","userId":"04761850982152535886"}}},"source":["sentence_embedding"],"execution_count":34,"outputs":[{"output_type":"execute_result","data":{"text/plain":["tensor([-1.8472e-01, -3.1975e-01, 2.0524e-01, 1.8466e-01, 6.7442e-01,\n"," 6.5859e-02, 2.7543e-01, 3.7008e-01, 4.8470e-02, 6.4544e-02,\n"," -2.5896e-01, 3.8808e-01, 1.8641e-01, 1.8684e-01, 2.7901e-01,\n"," -1.8391e-01, 2.4382e-01, -1.4729e-02, -1.2464e-01, 3.1313e-01,\n"," -3.8217e-01, 4.9745e-02, 2.6459e-01, 1.6558e-01, 1.8589e-01,\n"," 3.9221e-01, -3.5080e-01, 2.2916e-01, -1.0496e-01, -5.7553e-01,\n"," 1.2044e-03, 2.6013e-01, -1.8011e-01, 1.1600e-01, 5.0010e-03,\n"," -4.6986e-02, -5.1652e-02, 2.4950e-01, 1.3273e-01, -1.1434e-01,\n"," -1.2804e-02, 3.4448e-01, 1.9676e-01, 1.7263e-01, 1.3718e-01,\n"," -1.2040e-01, 2.6909e-01, -4.9007e-02, 1.1978e-01, 1.9021e-01,\n"," 1.9566e-01, -2.2370e-01, -7.8432e-02, 3.2046e-02, 1.4038e-01,\n"," 3.3115e-01, 1.2462e-01, 1.8779e-01, -9.8005e-03, 2.0713e-01,\n"," -1.7347e-01, 1.4861e-01, 3.9736e-01, 3.6141e-02, -3.4722e-01,\n"," 2.5525e-01, 1.5488e-01, -1.0872e-01, 1.2644e-01, 3.1039e-01,\n"," 5.7924e-02, 6.7699e-02, 1.2709e-01, 1.6314e-01, -3.5837e-01,\n"," 7.5610e-01, 2.9489e-01, -2.9204e-01, -3.3524e-01, -2.4394e-01,\n"," 3.3921e-01, -3.2311e-02, -5.4324e-02, 2.8889e-01, -7.0762e-02,\n"," 1.9747e-01, 3.8569e-02, 5.2143e-02, 5.7869e-01, 2.7486e-01,\n"," 2.0389e-02, -8.4382e-02, 3.6380e-01, 3.1268e-01, -2.5431e-01,\n"," 7.7629e-01, 8.1681e-02, 1.0696e-01, -6.8906e-03, -3.0714e-02,\n"," -9.3182e-02, -1.0063e-01, 6.1430e-01, -1.7231e-01, 1.2712e-01,\n"," -2.0907e-01, 6.2317e-02, -1.1535e-01, -1.9046e-01, 4.5286e-01,\n"," 2.2685e-01, 2.3545e-01, 9.0525e-02, -4.1481e-01, -1.9423e-01,\n"," -3.2412e-01, -5.9183e-02, -2.4369e-01, 5.9888e-01, 4.9943e-02,\n"," 1.6175e-01, 4.8954e-01, -1.4800e-01, 1.9303e-01, 2.7567e-01,\n"," -1.3991e-02, -1.0644e-02, 2.7486e-01, -2.5166e-01, 8.6984e-02,\n"," -3.7520e-01, -2.5670e-01, -5.8578e-02, -1.6175e-01, 7.2529e-01,\n"," -9.2040e-02, 5.2522e-02, 1.5171e-01, 3.2878e-01, 2.1428e-01,\n"," 1.5756e-01, 2.5329e-01, -1.3089e-01, -3.9498e-01, -4.2697e-01,\n"," 1.6973e-01, -4.9383e-02, 3.8240e-01, -2.5166e-02, -2.1743e-02,\n"," 1.6511e-01, 1.3613e-01, 3.6428e-01, 3.6171e-02, -3.1096e-01,\n"," 1.1000e-01, -8.5889e-02, 6.6770e-02, -1.4994e-01, 1.7493e-01,\n"," 2.3595e-01, 3.0276e-01, -4.9262e-01, 1.8105e-01, -2.3963e-01,\n"," -1.2371e-01, 3.1706e-02, -3.9103e-01, -3.0992e-01, 2.6761e-01,\n"," 1.0006e-01, -1.9962e-01, -1.7967e-01, 2.9719e-01, 3.2995e-01,\n"," 3.5042e-01, -3.2674e-01, -8.8760e-02, -4.6347e-01, -1.1850e-01,\n"," -6.9532e-02, 3.7617e-01, -1.1953e-01, 6.2653e-04, -5.7204e-02,\n"," 5.7388e-02, -5.9660e-02, 2.3604e-02, -1.3457e-01, -7.8495e-03,\n"," -1.3176e-01, -1.4515e-01, -2.1889e-01, 2.0840e-01, -1.1697e-01,\n"," 4.1483e-01, -2.1924e-01, 2.9187e-01, 3.4116e-01, 6.4943e-02,\n"," -4.6726e-01, 9.7227e-02, -1.2627e-01, 5.9947e-02, -4.2103e-02,\n"," 2.8172e-01, -2.9853e-01, -1.4190e-01, -2.9978e-01, 5.1124e-01,\n"," -1.8857e-01, 9.6757e-02, -5.3368e-02, 1.1800e-01, 4.1340e-01,\n"," -9.2492e-02, 1.5677e-02, 6.1846e-02, 4.6823e-01, 9.0435e-02,\n"," -2.7844e-02, -7.7420e-02, -3.0944e-01, 7.8034e-02, -2.8021e-01,\n"," -2.1077e-01, -3.8906e-01, 5.1950e-02, -1.2247e-01, -3.4400e-01,\n"," -2.0278e-01, -4.6722e-01, -1.3053e-01, 1.6416e-02, -9.5711e-02,\n"," -1.8706e-02, 4.5931e-01, -1.1862e-01, -3.1755e-01, -3.0123e-01,\n"," 3.0937e-02, 3.5832e-01, -4.0719e-01, 7.2580e-02, 2.2241e-01,\n"," 2.0996e-01, -3.2010e-01, -5.6949e-01, 2.1039e-01, -3.7357e-01,\n"," -3.3611e-01, 1.7844e-01, 1.1987e-01, -1.5267e-01, -2.8394e-01,\n"," -3.6055e-02, -1.2303e-01, 2.8314e-01, 5.6110e-01, -4.3360e-01,\n"," 2.2199e-01, -5.6719e-01, -1.4400e-01, 3.1683e-01, 1.8077e-02,\n"," 4.2634e-01, -8.7194e-02, 1.2010e-01, 1.2126e-01, -1.0986e-02,\n"," 2.5405e-02, 6.2058e-01, 2.8179e-01, 2.2376e-01, -1.5765e-01,\n"," -9.9441e-02, 6.3375e-03, -2.1348e-02, -2.8064e-01, 6.8794e-02,\n"," -3.0017e-01, -3.1627e-01, 1.1263e-01, -2.9709e-01, 1.2642e-01,\n"," -3.0499e-01, -7.7246e-02, -5.9586e-02, -4.2003e-01, -1.0608e-01,\n"," -5.2178e-01, -4.3641e-01, -5.7505e-02, 2.9781e-01, -6.3053e-03,\n"," -2.2636e-01, -3.3298e-01, 6.3450e-02, -3.1625e-01, -2.6066e-01,\n"," 1.2011e-01, 1.1155e-01, -1.8560e-01, 9.3473e-02, -2.1773e-01,\n"," -1.0543e-01, -1.5438e-01, -4.8597e-01, 4.8935e-01, -1.1914e-01,\n"," -6.4251e-01, 7.0716e-02, -9.3549e-02, 9.2272e-02, 1.2473e-01,\n"," 8.7781e-03, 1.9223e-01, 6.4624e-02, -2.3412e-01, 1.1422e-01,\n"," 7.8621e-02, -1.7681e-01, 9.5742e-02, 2.4647e-01, -1.8688e-01,\n"," 3.0389e-02, 7.7957e-03, -4.2812e-01, 1.6308e-01, -4.9292e-02,\n"," -2.6215e-01, 7.8407e-01, -1.6900e-01, -2.8255e-01, 8.4484e-02,\n"," 1.7450e-01, -2.8875e-01, -1.7791e-01, 1.0784e-02, -3.4726e-01,\n"," -9.2580e-02, -5.1265e-02, 6.8444e-01, -3.5049e-01, 2.8899e-01,\n"," -1.3548e-01, -2.7032e-01, 5.0112e-01, -1.9291e-01, -4.0651e-01,\n"," -5.0507e-01, -5.9827e-02, 1.2365e-01, 1.1643e-02, 2.1023e-02,\n"," 1.1256e-01, -2.8345e-01, -7.4494e-02, -4.2080e-02, -1.8484e-01,\n"," 9.6973e-02, 2.8845e-01, -1.2645e-01, -3.6291e-01, -1.3303e-01,\n"," 1.1543e-01, -9.7732e-02, 9.6004e-02, 1.7043e-01, 7.3320e-01,\n"," 1.9439e-02, 7.0844e-01, -3.4338e-01, -5.3523e-01, 1.6517e-01,\n"," 3.1802e-01, 6.4000e-02, -5.3218e-01, 6.4329e-01, 1.4038e-01,\n"," 2.7253e-01, 2.2288e-02, -3.7220e-01, 1.3957e-02, 2.5965e-02,\n"," -6.9271e-02, 1.9530e-01, -1.5344e-01, 2.3666e-01, 1.4099e-01,\n"," 6.2222e-02, 5.0822e-01, 9.0492e-02, 2.4477e-01, 1.0373e-01,\n"," 1.1387e-01, 6.9165e-02, -2.5243e-02, -1.3971e-01, -4.9195e-02,\n"," 8.0140e-02, -9.2023e-02, -9.3849e-02, 4.6220e-02, -2.6207e-01,\n"," 2.3918e-01, 2.8808e-01, -5.8265e-02, -2.7857e-01, -1.5010e-01,\n"," 8.2621e-02, 7.7054e-02, 2.6570e-01, 5.7805e-01, 3.9910e-01,\n"," 2.4194e-01, -3.6002e-01, -9.4750e-02, -1.9963e-01, 2.9028e-01,\n"," 4.7464e-02, -2.0809e-01, -2.1146e-01, -1.0030e+00, 7.4292e-02,\n"," -7.6848e-02, -3.0279e-01, 1.3969e-01, -4.0051e-01, -3.8083e-01,\n"," -1.1246e-01, -1.3796e-01, -4.1523e-02, -2.8831e-01, -4.0044e-01,\n"," -2.4647e-01, -8.8503e-02, 7.1589e-02, 1.3024e-01, -3.7818e-03,\n"," -2.0455e-01, -1.3055e-01, -4.1618e-02, 1.6672e-01, 6.7958e-02,\n"," 1.3140e-01, 7.5546e-02, -2.5930e-01, -2.9910e-02, 3.2513e-01,\n"," 3.3824e-01, -1.0576e-01, -9.2584e-02, -1.0031e-01, -1.3110e-01,\n"," -3.1190e-01, -5.8103e-01, -2.0930e-01, 2.1296e-01, -5.9672e-01,\n"," 6.5020e-02, 3.1602e-01, 1.1511e-01, -6.2408e-02, -2.4838e-02,\n"," -2.0562e-01, -1.8375e-01, -1.0862e-02, 2.1676e-02, 1.1551e-01,\n"," -3.8128e-01, 4.6505e-01, 3.2864e-02, -1.7226e-01, -6.5484e-03,\n"," -7.5590e-02, 2.1847e-01, 1.5482e-01, -1.7993e-01, -2.9632e-01,\n"," -2.5808e-02, 2.8955e-01, 1.1267e+00, -7.1968e-02, 4.3966e-02,\n"," 1.2026e-01, 1.2282e-02, 1.1179e-02, -2.2804e-02, 2.5105e-01,\n"," 3.0695e-01, -9.8146e-02, -4.3016e-01, -4.1615e-01, 1.5284e-01,\n"," 3.2400e-01, 2.5436e-01, -1.1041e-01, 3.3722e-01, 2.8623e-01,\n"," 2.2994e-01, 1.1072e-01, -1.9588e-02, -4.4614e-02, 9.3248e-02,\n"," -1.8404e-01, -1.4134e-01, -1.6084e-01, 9.0789e-02, 1.4334e-01,\n"," -3.1443e-01, -6.6309e-01, -2.6948e-01, 3.1492e-01, 1.9613e-01,\n"," -5.7132e-02, -8.6362e-02, -8.4601e-02, -1.3526e-01, 1.9796e-02,\n"," 1.9403e-01, 3.0603e-01, 1.4662e-01, -2.1403e-01, -4.2749e-02,\n"," -1.9732e-01, -7.1498e-02, -1.6228e-01, 1.1908e-01, 1.6366e-01,\n"," -2.5140e-03, 8.9353e-02, -1.4462e-01, 1.4724e-01, 7.2549e-02,\n"," 4.4314e-01, 2.3441e-01, 3.8561e-02, 3.9683e-02, 4.4084e-01,\n"," 7.6466e-02, 4.6808e-02, -1.1589e-01, -9.1401e-02, 3.2124e-01,\n"," -6.6837e-02, 1.4658e-01, 2.0005e-01, -1.1892e-01, 7.0412e-02,\n"," 1.1342e-01, 9.8692e-02, 2.5994e-02, 3.3634e-02, -6.3411e-01,\n"," 2.0712e-01, 2.8033e-01, -4.4711e-01, 3.5803e-01, 1.9316e-01,\n"," -3.1182e-01, 1.9874e-01, 1.9976e-01, -5.1385e-02, 2.6822e-01,\n"," 2.6258e-01, -2.9459e-01, 1.5485e-01, 2.2129e-01, 1.7052e-01,\n"," 1.7854e-01, 2.9901e-01, -1.5497e-01, 3.0804e-01, 1.6285e-01,\n"," -3.6908e-02, -4.4349e-01, -1.2893e-01, 1.4772e-01, -1.7640e-01,\n"," -5.4158e-01, 2.5261e-01, 1.4366e-01, -1.4441e-01, -1.9142e-01,\n"," 1.1393e-01, -1.6392e-01, 1.2795e-01, -3.5259e-01, -1.4713e-01,\n"," 2.4146e-01, 7.1593e-02, 1.8710e-01, -4.1799e-02, 3.3410e-01,\n"," 7.2867e-02, 2.5829e-01, -3.3332e-01, 6.5738e-02, -2.5124e-01,\n"," 3.7760e-01, -5.8373e-01, 2.3463e-01, -5.0202e-01, -2.5784e-01,\n"," 3.4890e-02, 3.4695e-01, 1.4977e-01, -4.5025e-01, -4.2352e-01,\n"," -3.0331e-01, -1.9517e-01, -2.9222e-01, -1.1445e-01, 4.0854e-02,\n"," -3.9452e-01, -9.1322e-01, 4.1030e-02, 2.2493e-01, 3.6977e-01,\n"," 2.8613e-01, 1.0714e-01, 3.4583e-01, 1.8211e-01, -3.4417e-02,\n"," -2.3948e-02, 3.1289e-01, -2.7371e-01, 3.3591e-01, -1.8244e-01,\n"," 1.6288e-02, -9.3281e-02, 1.3320e-01, -1.4774e-01, 2.5891e-01,\n"," 2.4822e-01, -7.9205e-02, 5.9994e-02, -1.6356e-01, -2.0131e-01,\n"," 1.3503e-01, 8.8573e-02, -1.2327e-01, -5.8152e-01, 1.1491e-01,\n"," 2.7981e-01, 2.6527e-03, 1.5118e-01, -3.0204e-01, -2.5359e-01,\n"," 6.0852e-01, -3.1113e-01, 3.4722e-01, -1.9405e-01, 1.0377e-02,\n"," 3.5306e-01, 3.0108e-01, -3.2956e-01, -5.0085e-01, 2.2167e-01,\n"," 3.2193e-01, 5.2803e-01, 2.8040e-01, -3.3671e-01, -1.7194e-01,\n"," 2.3490e-01, -4.9871e-01, -2.8264e-01, 1.9016e-01, 1.3766e-01,\n"," -2.1446e-01, -1.3911e-01, 1.4757e-01, -6.1424e-02, 2.2370e-01,\n"," -3.1825e-01, 1.3458e-01, -1.7522e-01, -7.4187e-02, -6.8039e-02,\n"," 3.0977e-01, 1.8057e-01, 4.1088e-02, -5.8221e-02, 4.3792e-02,\n"," 2.3164e-01, 1.3266e-01, 7.2645e-02, -1.9730e-01, 1.2802e-01,\n"," 2.4883e-01, 3.2275e-03, 2.6869e-01, 6.9246e-02, -4.5734e-01,\n"," -7.8838e-02, -9.7958e-03, 6.3033e-02, 4.5887e-01, -2.5121e-01,\n"," 1.2462e-03, -3.0594e-01, -4.8102e-02, -4.7204e-01, -1.6430e-02,\n"," 3.7490e-01, -2.4748e-01, -1.9435e-02, 1.2495e-01, 1.4162e-01,\n"," 3.0226e-01, -2.8904e-01, 1.9648e-01, -2.2423e-01, 1.9335e-02,\n"," -1.5567e-01, 4.2223e-02, -7.9618e-01, 4.2353e-02, 4.4159e-01,\n"," 3.3022e-02, -2.0635e-01, -4.5773e-01, -5.2570e-03, 2.5613e-01,\n"," 7.0485e-02, -2.9704e-01, 7.2624e-02, 1.7797e-01, -8.4095e-02,\n"," 7.5870e-03, -1.1857e+00, -2.9231e-01, 8.9138e-03, 1.9026e-01,\n"," -2.9239e-01, -7.6378e-02, 5.0637e-02, -1.5500e-01, 3.1275e-01,\n"," 3.0466e-01, -7.6498e-01, -2.6796e-01, -1.0320e-01, -2.1724e-01,\n"," 1.3462e-01, 1.2024e-01, -1.6009e-02, 1.9848e-01, -1.4225e-01,\n"," -5.4985e-02, -2.6918e-01, -2.6752e-01, -3.3756e-01, 7.8736e-02,\n"," -3.3303e-02, 7.1563e-02, 3.5799e-01, 2.1849e-01, 3.0364e-01,\n"," -3.0524e-01, -1.7155e-01, -4.6672e-02, -2.5173e-01, 1.5638e-01,\n"," 6.1807e-04, -1.8095e-01, -3.9927e-02])"]},"metadata":{"tags":[]},"execution_count":34}]},{"cell_type":"code","metadata":{"id":"VuDY3Zv7DV0b","colab_type":"code","colab":{}},"source":[""],"execution_count":0,"outputs":[]}]}
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment