@gblazex
Created January 9, 2024 15:00
2024-01-09T14:51:49.894270414Z return fn(*args, **kwargs)
2024-01-09T14:51:49.894273580Z File "/lm-evaluation-harness/lm_eval/evaluator.py", line 69, in simple_evaluate
2024-01-09T14:51:49.894279732Z lm = lm_eval.models.get_model(model).create_from_arg_string(
2024-01-09T14:51:49.894283779Z File "/lm-evaluation-harness/lm_eval/base.py", line 115, in create_from_arg_string
2024-01-09T14:51:49.894316350Z return cls(**args, **args2)
2024-01-09T14:51:49.894323294Z File "/lm-evaluation-harness/lm_eval/models/gpt2.py", line 67, in __init__
2024-01-09T14:51:49.894355253Z self.tokenizer = transformers.AutoTokenizer.from_pretrained(
2024-01-09T14:51:49.894361435Z File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
2024-01-09T14:51:49.894470349Z return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
2024-01-09T14:51:49.894475349Z File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
2024-01-09T14:51:49.894690833Z return cls._from_pretrained(
2024-01-09T14:51:49.894695512Z File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
2024-01-09T14:51:49.894932426Z tokenizer = cls(*init_inputs, **init_kwargs)
2024-01-09T14:51:49.894937525Z File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/tokenization_llama_fast.py", line 124, in __init__
2024-01-09T14:51:49.894976649Z super().__init__(
2024-01-09T14:51:49.894983452Z File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_fast.py", line 114, in __init__
2024-01-09T14:51:49.895031091Z fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
2024-01-09T14:51:49.895036761Z File "/usr/local/lib/python3.10/dist-packages/transformers/convert_slow_tokenizer.py", line 1336, in convert_slow_tokenizer
2024-01-09T14:51:49.895208654Z return converter_class(transformer_tokenizer).converted()
2024-01-09T14:51:49.895213132Z File "/usr/local/lib/python3.10/dist-packages/transformers/convert_slow_tokenizer.py", line 459, in __init__
2024-01-09T14:51:49.895283534Z requires_backends(self, "protobuf")
2024-01-09T14:51:49.895288824Z File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1276, in requires_backends
2024-01-09T14:51:49.895446360Z raise ImportError("".join(failed))
2024-01-09T14:51:49.895450317Z ImportError:
2024-01-09T14:51:49.895452451Z LlamaConverter requires the protobuf library but it was not found in your environment. Checkout the instructions on the
2024-01-09T14:51:49.895454585Z installation page of its repo: https://github.com/protocolbuffers/protobuf/tree/master/python#installation and follow the ones
2024-01-09T14:51:49.895456338Z that match your environment. Please note that you may need to restart your runtime after installation.
2024-01-09T14:51:49.895457941Z
2024-01-09T14:51:50.594419423Z Elapsed Time: 297 seconds
2024-01-09T14:51:51.300514832Z Uploaded gist successfully! URL: https://gist.github.com/gblazex/5cc372cce1743c093c7e12a29dd5ab62
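The traceback fails at requires_backends(self, "protobuf") inside convert_slow_tokenizer: converting the slow SentencePiece-based LLaMA tokenizer to a fast one needs the protobuf library to parse the tokenizer's model proto. A minimal diagnostic sketch for checking the relevant backends before re-running the harness (the module list is an assumption — the error only names protobuf, but sentencepiece is also commonly required by the slow LLaMA tokenizer):

```python
import importlib.util


def backend_available(name: str) -> bool:
    """Return True if the module can be found without importing it fully."""
    try:
        # find_spec raises ModuleNotFoundError if a parent package
        # (e.g. "google" for "google.protobuf") is itself missing.
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False


# "google.protobuf" is the import name of the PyPI "protobuf" package.
required = ("google.protobuf", "sentencepiece")
missing = [name for name in required if not backend_available(name)]
print("missing backends:", missing or "none")
```

If google.protobuf turns up missing, installing it with pip (package name "protobuf") and restarting the runtime, as the error message itself suggests, is the usual remedy.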