graph TD
A[Christmas] -->|Get money| B(Go shopping)
B --> C{Let me think}
C -->|One| D[Laptop]
C -->|Two| E[iPhone]
C -->|Three| F[fa:fa-car Car]
10:M 30 Apr 2021 21:27:20.002 * <rg> GEARS: QA cached called {'event': 'keymiss', 'key': 'cache_{06S}_{PMC261870.xml:{06S}:26}_Laser Correction', 'type': 'empty', 'value': None}
=== REDIS BUG REPORT START: Cut & paste starting from here ===
10:M 30 Apr 2021 21:27:20.002 # === ASSERTION FAILED ===
10:M 30 Apr 2021 21:27:20.002 # ==> module.c:4085 '(c->flags & CLIENT_BLOCKED) == 0' is not true
------ STACK TRACE ------
Backtrace:
### This gear will pre-compute (encode) all sentences using the BERT tokenizer for QA
tokenizer = None

def loadTokeniser():
    global tokenizer
    from transformers import BertTokenizerFast
    tokenizer = BertTokenizerFast.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
    return tokenizer
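For reference, a minimal sketch of how the loaded tokenizer could encode one sentence before caching; the sentence text and the padding/truncation settings are illustrative assumptions, not taken from the original gear:

```
tok = loadTokeniser()
# encode a single sentence to fixed-length token ids (illustrative settings)
encoded = tok("The patient underwent laser correction.",
              padding="max_length", truncation=True, max_length=512,
              return_tensors="np")
print(encoded["input_ids"].shape)  # (1, 512), ready to store in Redis
```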
import os

import numpy as np
import torch

tokenizer = None

config_switch = os.getenv('DOCKER', 'local')
if config_switch == 'local':
    startup_nodes = [{"host": "127.0.0.1", "port": "30001"},
                     {"host": "127.0.0.1", "port": "30002"},
                     {"host": "127.0.0.1", "port": "30003"}]
else:
    ...  # docker-side startup_nodes elided in the original
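Presumably these startup_nodes feed a Redis cluster client; a connection sketch using redis-py-cluster (the decode_responses flag is my assumption):

```
from rediscluster import RedisCluster

# connect to the local three-node cluster defined above
rc = RedisCluster(startup_nodes=startup_nodes, decode_responses=True)
print(rc.ping())  # True once all three local nodes are reachable
```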
# standard module
# peak memory usage (kilobytes on Linux, bytes on OS X)
import resource
resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# pip install psutil
import os, psutil; print(psutil.Process(os.getpid()).memory_info().rss / 1024 ** 2)
When I use Docker with a workspace shared with the host under Ubuntu, I find that files created by the container user are owned by root on the host. This does not happen on macOS.
This is probably because the Docker daemon runs as root and the default mapping sends container-root to host-root. So can I map container-root (or any container user) to the current host user?
Fortunately, recent Docker can re-map container users to host users via Linux user namespaces. Refer to this.
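Concretely, the remapping is switched on through the daemon's userns-remap setting; a minimal sketch of the documented default setup (the subordinate ID range shown is a typical default, not measured on my machine):

```
$ cat /etc/docker/daemon.json
{
  "userns-remap": "default"
}
$ sudo systemctl restart docker
$ grep dockremap /etc/subuid   # container UID 0 now maps into this host range
dockremap:100000:65536
```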
```
redis-cli -c -p 30001 -h 127.0.0.1
127.0.0.1:30001> keys *
1) "processed_docs_stage1_para"
2) "sentence:PMC293432.xml:{06S}"
3) "sentence:PMC270701.xml:{06S}"
4) "edges_matched_{06S}"
5) "sentence:PMC222961.xml:{06S}"
6) "processed_docs_stage3{06S}"
7) "processed_docs_stage2_para{06S}"
```
from transformers import AutoTokenizer, AutoModel

tokenizer = None

def loadTokeniser():
    global tokenizer
    from transformers import AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained("t5-base", torchscript=True)
    # Try RobertaTokenizerFast and BART
    # tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
    return tokenizer
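As written, loadTokeniser rebuilds the tokenizer on every call; a small guard (the wrapper name is mine) turns the module-level global into a one-time cache:

```
def getTokeniser():
    global tokenizer
    if tokenizer is None:
        loadTokeniser()  # populates the global only once
    return tokenizer

ids = getTokeniser()("translate English to German: hello", return_tensors="pt")
```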
from transformers import BertForQuestionAnswering
import torch

bert_name = "bert-large-uncased-whole-word-masking-finetuned-squad"
model = BertForQuestionAnswering.from_pretrained(bert_name, torchscript=True)
model.eval()
# dummy input_ids, attention_mask, token_type_ids for tracing; the third
# tensor and the trace call are assumed — the original snippet is truncated here
inputs = [torch.ones(1, 2, dtype=torch.int64),
          torch.ones(1, 2, dtype=torch.int64),
          torch.ones(1, 2, dtype=torch.int64)]
traced_model = torch.jit.trace(model, inputs)
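The usual point of torchscript=True is that the traced graph can be saved and reloaded with no transformers dependency; a short sketch (the file name is illustrative):

```
torch.jit.save(traced_model, "bert_qa_traced.pt")
reloaded = torch.jit.load("bert_qa_traced.pt")
start_logits, end_logits = reloaded(*inputs)
print(start_logits.shape, end_logits.shape)  # both torch.Size([1, 2]) for the dummies
```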