CodeLlama generates weird tokens with TGI 0.0.24 #704
Comments
@pinak-p I can reproduce your issue, both on SageMaker and locally with a 0.0.24 image. I verified that deploying the model with neuronx-tgi 0.0.23 leads to meaningful results, so this seems to affect only the 0.0.24 version.
@pinak-p this is not only a TGI issue: I also get gibberish with
@pinak-p could you check with version 0.0.25?
What's the URL for 0.0.25? I don't see it here: https://github.com/aws/deep-learning-containers/blob/master/available_images.md, nor does the SageMaker SDK have that version.
@pinak-p it is still being deployed, but you can use the neuronx-tgi Docker image on an EC2 instance: https://github.com/huggingface/optimum-neuron/pkgs/container/neuronx-tgi. Alternatively, you can use directly
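For reference, once a neuronx-tgi container from that registry is running on an EC2 Inf2 instance (for example listening on port 8080), it can be queried with the huggingface_hub client. This is only a sketch: the endpoint URL and prompt below are assumptions for illustration, not values taken from this issue.

```python
# Hypothetical client call against a locally running neuronx-tgi container.
# The URL and prompt are assumptions; adjust them to the actual host/port used.
from huggingface_hub import InferenceClient

client = InferenceClient("http://localhost:8080")  # assumed container address

output = client.text_generation(
    "def fibonacci(n):",   # illustrative prompt
    max_new_tokens=64,
)
print(output)
```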
System Info
Who can help?
@dacorvo
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction (minimal, reproducible, runnable)
I'm using the configuration below to deploy the model on SageMaker.
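The original deployment configuration is not preserved above. Below is a minimal sketch of what such a deployment typically looks like with the SageMaker Python SDK; the model ID, image URI, instance type, and the HF_* environment values are assumptions for illustration, not the exact configuration used in this report.

```python
# Hypothetical sketch of a neuronx-tgi deployment on SageMaker.
# Model ID, image tag, instance type and compilation settings are assumptions,
# not the exact configuration from this issue.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()

# Environment understood by the neuronx-tgi container (values assumed).
env = {
    "HF_MODEL_ID": "codellama/CodeLlama-7b-hf",  # assumed model
    "HF_NUM_CORES": "2",                          # Neuron cores per replica
    "HF_BATCH_SIZE": "1",
    "HF_SEQUENCE_LENGTH": "2048",
    "HF_AUTO_CAST_TYPE": "fp16",
    "MAX_INPUT_LENGTH": "1024",
    "MAX_TOTAL_TOKENS": "2048",
}

model = HuggingFaceModel(
    role=role,
    # Image URI left as a placeholder; the issue compares the 0.0.23 (works)
    # and 0.0.24 (gibberish) neuronx-tgi containers.
    image_uri="<neuronx-tgi image URI for 0.0.23 or 0.0.24>",
    env=env,
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.inf2.xlarge",  # assumed instance type
    container_startup_health_check_timeout=1800,
)
```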
Text Generation:
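The actual request payload is not shown above. The sketch below only illustrates the general shape of a TGI-style text-generation request through the SageMaker predictor; the prompt and generation parameters are assumed for illustration.

```python
# Hypothetical text-generation request against the deployed endpoint.
# Prompt and generation parameters are illustrative assumptions only.
response = predictor.predict({
    "inputs": "def fibonacci(n):",
    "parameters": {
        "max_new_tokens": 128,
        "temperature": 0.2,
        "do_sample": True,
    },
})
print(response)
```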
Output:
Expected behavior
The expectation is to get coherent text that makes sense, rather than gibberish tokens.