
Suggestion: We can add huggingface candle inference. #23

Open
akashicMarga opened this issue Dec 1, 2024 · 1 comment

Comments

@akashicMarga

I am using Hugging Face candle for SmolLM local inference. It's a fast-growing, optimized framework that can be used across different devices, so we could add it here. I can raise a PR; SmolLM already works in candle since its architecture is the same as Llama's.

https://github.com/huggingface/candle/tree/main/candle-examples/examples/quantized

https://github.com/huggingface/candle/tree/main/candle-examples/examples/llama
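For reference, the linked candle examples are typically run via cargo from a clone of the candle repo, along these lines (a sketch only; the exact CLI flags and model-selection options vary between candle versions and examples, so check the example's own `--help` output):

```shell
# Clone the candle repo and run the quantized-inference example.
# Model weights are downloaded from the Hugging Face Hub on first run.
git clone https://github.com/huggingface/candle
cd candle
cargo run --example quantized --release -- --prompt "Write a haiku about Rust"
```

A PR here would presumably follow the same pattern, pointing the example at the SmolLM checkpoints on the Hub.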

@eliebak (Collaborator) commented Dec 23, 2024

Hello! Yes, feel free to make a PR, thanks!
