If anyone is interested in contributing to tortoise.cpp, a great first task would be getting the tokenizer to always match the tokenization tortoise-tts produces. The tokenizer I'm using in tortoise.cpp can load the tokenizer vocab, but the regex had issues with some of the special characters, which I band-aided at least for spaces. More perplexingly, the tortoise-tts tokenizer isn't greedy about always choosing the longest possible next token, while the default tokenizer I copied from the ggml gpt-2 example does appear to be greedy. The task would be to study the tokenizer tortoise-tts uses and modify the tokenizer in tortoise.cpp to match its behavior exactly. Please reply here to claim this task, and feel free to ask any questions if you need help getting set up for development.
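For context on why the two tokenizers can disagree: tortoise-tts appears to ship its vocab as a BPE tokenizer (a HuggingFace `tokenizers` JSON file), and a BPE tokenizer applies its learned merge rules in rank order rather than always grabbing the longest matching vocab entry, so the two strategies can split the same string differently. Here is a minimal toy sketch of that difference; the vocab and merge list are made up for illustration, not the real tortoise-tts ones:

```python
# Toy illustration: the same string can tokenize differently under
# greedy longest-match vs. BPE merge order.

def greedy_tokenize(text, vocab):
    """Always take the longest vocab entry that matches at the current position."""
    out, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                out.append(text[i:j])
                i = j
                break
        else:
            out.append(text[i])  # fall back to a single character
            i += 1
    return out

def bpe_tokenize(text, merges):
    """Repeatedly apply the lowest-ranked (highest-priority) merge rule present."""
    ranks = {pair: r for r, pair in enumerate(merges)}
    seq = list(text)
    while True:
        pairs = [(ranks[p], idx) for idx, p in enumerate(zip(seq, seq[1:])) if p in ranks]
        if not pairs:
            return seq
        _, idx = min(pairs)
        seq = seq[:idx] + [seq[idx] + seq[idx + 1]] + seq[idx + 2:]

vocab = {"a", "b", "c", "ab", "bc"}
merges = [("b", "c"), ("a", "b")]     # ("b", "c") has higher merge priority
print(greedy_tokenize("abc", vocab))  # ['ab', 'c']
print(bpe_tokenize("abc", merges))    # ['a', 'bc']
```

If tortoise-tts really is doing standard BPE merges, the fix on the tortoise.cpp side is probably to replace the greedy longest-match loop with a merge-rank loop along the lines of the one above, driven by the real merge table from the vocab file.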
Part of solving this would be coming up with a set of test phrases and verifying that the tortoise.cpp tokenizer produces matching tokenizations for all of them; one way to generate the reference side is sketched below.
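A hedged sketch of dumping reference token ids from the tortoise-tts tokenizer, assuming the tokenizer JSON lives at tortoise/data/tokenizer.json in a tortoise-tts checkout (adjust the path if it's elsewhere) and that the HuggingFace `tokenizers` package is installed:

```python
# Dump reference tokenizations for a handful of test phrases so the
# tortoise.cpp tokenizer can be checked against them id-for-id.
from tokenizers import Tokenizer

TOKENIZER_JSON = "tortoise/data/tokenizer.json"  # assumed location; adjust as needed

test_phrases = [
    "Hello, world.",
    "The quick brown fox jumps over the lazy dog.",
    "It's a test with apostrophes and   extra   spaces.",
    "Punctuation stress test: commas, dashes, question marks?",
]

tok = Tokenizer.from_file(TOKENIZER_JSON)
for phrase in test_phrases:
    enc = tok.encode(phrase)
    # Print both ids and token strings; the C++ side should reproduce the ids exactly.
    print(phrase)
    print("  ids    :", enc.ids)
    print("  tokens :", enc.tokens)
```

One caveat: tortoise-tts may run its own text cleaning (lowercasing, number expansion, and so on) before encoding; if it does, the same preprocessing has to be applied on both sides before comparing ids.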