
@kurianbenoy
Created April 10, 2022 18:14
  • Save kurianbenoy/272f907c58c98fe5987d7d4dda622aed to your computer and use it in GitHub Desktop.

Downloading: 4.50kB [00:00, 5.51MB/s]

Downloading: 3.31kB [00:00, 4.10MB/s]

Fetching model from: https://huggingface.co/kurianbenoy/kde_en_ml_translation_model
Running on local URL: http://localhost:7860/

To create a public link, set share=True in launch().

Downloading: 100%|██████████| 1.09k/1.09k [00:00<00:00, 917kB/s]

Downloading: 100%|██████████| 42.0/42.0 [00:00<00:00, 25.1kB/s]

Downloading: 100%|██████████| 439k/439k [00:00<00:00, 66.4MB/s]

Downloading: 100%|██████████| 600k/600k [00:00<00:00, 77.4MB/s]

Downloading: 100%|██████████| 934k/934k [00:00<00:00, 76.3MB/s]

Downloading: 100%|██████████| 219M/219M [00:02<00:00, 83.1MB/s]

Downloading: 100%|██████████| 231M/231M [00:02<00:00, 83.4MB/s]

Traceback (most recent call last):
  File "/home/user/.local/lib/python3.8/site-packages/gradio/routes.py", line 265, in predict
    output = await run_in_threadpool(app.launchable.process_api, body, username)
  File "/home/user/.local/lib/python3.8/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "/home/user/.local/lib/python3.8/site-packages/anyio/to_thread.py", line 28, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
  File "/home/user/.local/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 818, in run_sync_in_worker_thread
    return await future
  File "/home/user/.local/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 754, in run
    result = context.run(func, *args)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 573, in process_api
    prediction, durations = self.process(raw_input)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 615, in process
    predictions, durations = self.run_prediction(
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 531, in run_prediction
    prediction = predict_fn(*processed_input)
  File "app.py", line 19, in translate_text
    model = download()
  File "app.py", line 15, in download
    model = load_learner(hf_hub_download("kurianbenoy/kde_en_ml_translation_model", "saved_model.pkl"))
  File "/home/user/.local/lib/python3.8/site-packages/fastai/learner.py", line 386, in load_learner
    try: res = torch.load(fname, map_location='cpu' if cpu else None, pickle_module=pickle_module)
  File "/home/user/.local/lib/python3.8/site-packages/torch/serialization.py", line 712, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "/home/user/.local/lib/python3.8/site-packages/torch/serialization.py", line 1046, in _load
    result = unpickler.load()
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/marian/tokenization_marian.py", line 367, in __setstate__
    self.spm_source, self.spm_target = (load_spm(f, self.sp_model_kwargs) for f in self.spm_files)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/marian/tokenization_marian.py", line 367, in <genexpr>
    self.spm_source, self.spm_target = (load_spm(f, self.sp_model_kwargs) for f in self.spm_files)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/marian/tokenization_marian.py", line 394, in load_spm
    spm.Load(path)
  File "/home/user/.local/lib/python3.8/site-packages/sentencepiece/__init__.py", line 367, in Load
    return self.LoadFromFile(model_file)
  File "/home/user/.local/lib/python3.8/site-packages/sentencepiece/__init__.py", line 171, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
OSError: Not found: "/root/.cache/huggingface/transformers/9c7933a8cb164e6207557926768a9a265d1bd79d39f5356e412c6abbe425b649.3b0201e0c6666c1f97cb872a091dba1baec985f9f13044c690a95a8af2f48ae6": Permission denied Error #13
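The traceback ends with SentencePiece failing to read a file under /root/.cache/huggingface — the process cannot access root's cache. A minimal sketch of a workaround, assuming the usual fix of redirecting the Hugging Face cache to a writable directory before any download (the `download()` body mirrors app.py line 15 from the traceback; the env-var approach itself is an assumption, not taken from this log):

```python
import os
import tempfile

# Assumption: pointing the HF cache at a directory the process owns avoids
# the "Permission denied" on /root/.cache/huggingface seen in the traceback.
cache_dir = os.path.join(tempfile.gettempdir(), "hf_cache")
os.makedirs(cache_dir, exist_ok=True)
os.environ["HF_HOME"] = cache_dir             # cache root used by huggingface_hub
os.environ["TRANSFORMERS_CACHE"] = cache_dir  # honored by older transformers releases

def download():
    # Same call pattern as app.py line 15 in the traceback; imports are kept
    # inside the function so this sketch stays self-contained.
    from huggingface_hub import hf_hub_download
    from fastai.learner import load_learner
    return load_learner(
        hf_hub_download("kurianbenoy/kde_en_ml_translation_model",
                        "saved_model.pkl")
    )
```

The environment variables must be set before the first `hf_hub_download` call, since the cache location is resolved when the download runs.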
