
Support env HF_ENDPOINT? #416

Open
yufeng97 opened this issue Sep 30, 2024 · 1 comment

Feature request

Is it possible to support a Hugging Face mirror site, e.g. via the env variable HF_ENDPOINT? The huggingface_hub library has an environment variable HF_ENDPOINT that lets it download models from a mirror site:

export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download --resume-download gpt2 --local-dir gpt2

https://hf-mirror.com/
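
For comparison, a minimal sketch of the same download through the huggingface_hub Python API, which reads the same environment variable (the target directory is only illustrative):

export HF_ENDPOINT=https://hf-mirror.com
python -c "from huggingface_hub import snapshot_download; snapshot_download('gpt2', local_dir='gpt2')"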

Motivation

It is difficult to download models in China; using an HF mirror accelerates model downloads.

Your contribution

Moral support

nbroad1881 commented Oct 4, 2024

You can download the model locally and then use TEI to load the local model.
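
For example, a minimal sketch of the download step, reusing the mirror endpoint from the request above (the model id and local path are only illustrative):

export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download --resume-download BAAI/bge-base-en-v1.5 --local-dir /path/to/model/weights

Then point TEI at the local copy: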

model=/path/to/model/weights
volume=/path/to/model

docker run --gpus all  -p 8080:80 -v $volume:/data --pull always ghcr.io/huggingface/text-embeddings-inference:1.5 --model-id /data/weights
