Trying to load LlamaTokenizerFast in code like this:
import transformers
# Load the tokenizer (replace 'bert-base-uncased' with the tokenizer you prefer)
#tokenizer = transformers.LlamaTokenizerFast.from_pretrained('/var/models/meta-llama/Meta-Llama-3-8B-Instruct/')
tokenizer = transformers.LlamaTokenizerFast.from_pretrained('meta-llama/Meta-Llama-3-8B-Instruct')
# Predefine your favorite sayings
sayings = [
    "The quick brown fox",
    "jumps over the lazy dog",
    "Carpe diem",
    "Veni vidi vici",
]
# Tokenize each phrase into a list of tokens
tokenized_sayings = [tokenizer.encode(phrase, add_special_tokens=False) for phrase in sayings]
# Print out the tokenized representations
for idx, tokenized_phrase in enumerate(tokenized_sayings):
    print(f"Phrase {idx+1}: {tokenized_phrase}")
Results in this:
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Traceback (most recent call last):
File "/Users/matt/code/hftest/tokenize.py", line 1, in <module>
import transformers
File "/Users/matt/code/hftest/venv/lib/python3.10/site-packages/transformers/__init__.py", line 26, in <module>
from . import dependency_versions_check
File "/Users/matt/code/hftest/venv/lib/python3.10/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
from .utils.versions import require_version, require_version_core
File "/Users/matt/code/hftest/venv/lib/python3.10/site-packages/transformers/utils/__init__.py", line 18, in <module>
from huggingface_hub import get_full_repo_name # for backward compatibility
ImportError: cannot import name 'get_full_repo_name' from 'huggingface_hub' (/Users/matt/code/hftest/venv/lib/python3.10/site-packages/huggingface_hub/__init__.py)
Reproduction
mikoshi:code matt$ mkdir hftest
mikoshi:code matt$ cd hftest
mikoshi:hftest matt$ python --version
Python 3.10.14
mikoshi:hftest matt$ python -m venv venv
mikoshi:hftest matt$ source venv/bin/activate
(venv) mikoshi:hftest matt$ pip install transformers huggingface_hub
Collecting transformers
Using cached transformers-4.44.2-py3-none-any.whl (9.5 MB)
[snip]
(venv) mikoshi:hftest matt$ vi tokenize.py
(venv) mikoshi:hftest matt$ !py
python tokenize.py
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Traceback (most recent call last):
File "/Users/matt/code/hftest/tokenize.py", line 1, in <module>
[snip]
I've tried a number of downgrades here (huggingface_hub <0.25, transformers <4.44) but I get the same result with all of them.
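One generic way to confirm that each pinned downgrade actually took effect before re-running the script (a sketch, not specific to this issue; the package names are the only assumption):

```python
# Sketch: report which versions of the two packages are actually
# installed in the active venv
from importlib import metadata

def installed_version(pkg: str) -> str:
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"

for pkg in ("transformers", "huggingface_hub"):
    print(pkg, installed_version(pkg))
```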
The first batch was on macOS; I get an identical result on Ubuntu 22.04 LTS with Python 3.10.12:
(venv) matt@firestorm:~/nvme/code/hftest$ python tokenize.py
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Error importing huggingface_hub.hf_api: partially initialized module 'transformers' has no attribute 'LlamaTokenizerFast' (most likely due to a circular import)
Traceback (most recent call last):
File "/home/matt/nvme/code/hftest/tokenize.py", line 1, in <module>
import transformers
File "/home/matt/nvme/code/hftest/venv/lib/python3.10/site-packages/transformers/__init__.py", line 26, in <module>
from . import dependency_versions_check
File "/home/matt/nvme/code/hftest/venv/lib/python3.10/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
from .utils.versions import require_version, require_version_core
File "/home/matt/nvme/code/hftest/venv/lib/python3.10/site-packages/transformers/utils/__init__.py", line 18, in <module>
from huggingface_hub import get_full_repo_name # for backward compatibility
ImportError: cannot import name 'get_full_repo_name' from 'huggingface_hub' (/home/matt/nvme/code/hftest/venv/lib/python3.10/site-packages/huggingface_hub/__init__.py)
(venv) matt@firestorm:~/nvme/code/hftest$ uname -a
Linux firestorm 6.8.0-40-generic #40~22.04.3-Ubuntu SMP PREEMPT_DYNAMIC Tue Jul 30 17:30:19 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
(venv) matt@firestorm:~/nvme/code/hftest$
Logs
No response
System info
- huggingface_hub version: 0.25.1
- Platform: macOS-14.6.1-arm64-arm-64bit
- Python version: 3.10.14
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /Users/matt/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: m9e
- Configured git credential helpers: osxkeychain
- FastAI: N/A
- Tensorflow: N/A
- Torch: N/A
- Jinja2: N/A
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: N/A
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 2.1.1
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /Users/matt/.cache/huggingface/hub
- HF_ASSETS_CACHE: /Users/matt/.cache/huggingface/assets
- HF_TOKEN_PATH: /Users/matt/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
Linux version:
- huggingface_hub version: 0.25.1
- Platform: Linux-6.8.0-40-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /home/matt/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: m9e
- Configured git credential helpers: cache
- FastAI: N/A
- Tensorflow: N/A
- Torch: N/A
- Jinja2: N/A
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: N/A
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 2.1.1
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /home/matt/.cache/huggingface/hub
- HF_ASSETS_CACHE: /home/matt/.cache/huggingface/assets
- HF_TOKEN_PATH: /home/matt/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
I should add that I filed this here because I believe the root cause is, as the traceback shows:
ImportError: cannot import name 'get_full_repo_name' from 'huggingface_hub' (/home/matt/nvme/code/hftest/venv/lib/python3.10/site-packages/huggingface_hub/__init__.py)
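When an installed package appears to be missing a long-standing symbol like this, it can help to check where the module actually resolves from, in case a stale or shadowed copy sits earlier on sys.path (a generic sketch; the module name is the only assumption):

```python
# Sketch: print the file a module would be loaded from, to rule out a
# stale or shadowed copy earlier on sys.path
import importlib.util

def module_origin(name: str):
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(module_origin("huggingface_hub") or "huggingface_hub not found on sys.path")
```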
Hi @m9e 👋
I've tried to reproduce the issue with the same versions (transformers==4.44.2 and huggingface_hub==0.25.1), but I was unable to replicate the error. I suspect it is related to package compatibility or an installation issue rather than a bug in the code itself.
You could try creating a new virtual environment and then upgrading to the latest versions: pip install -U transformers huggingface_hub
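After rebuilding the environment, a quick sanity check on the exact import that failed may be useful (a sketch; only the module and attribute names are taken from the traceback above):

```python
# Sketch: verify that the symbol transformers expects is exposed by the
# installed huggingface_hub before running the full script
import importlib
import importlib.util

def check_symbol(module_name: str, symbol: str) -> str:
    spec = importlib.util.find_spec(module_name)
    if spec is None:
        return f"{module_name} is not installed"
    mod = importlib.import_module(module_name)
    return f"{symbol} present: {hasattr(mod, symbol)}"

print(check_symbol("huggingface_hub", "get_full_repo_name"))
```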