[BUG]: --resume flag is still invoking LLM #938
Comments
An LLM is used for other purposes throughout the application process (answering form questions, determining suitability, summarizing the job description, etc.), so you still need to provide a model name and a valid API key. Set the model info in config.py, lines 19 & 20. In your data_folder/secrets.yaml you need to specify a valid API key for that model.
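For reference, the relevant settings look roughly like the sketch below. The variable names (LLM_MODEL_TYPE, LLM_MODEL) and the llm_api_key field are assumptions for illustration; verify the actual names in your own config.py and data_folder/secrets.yaml.

```python
# Minimal sketch of the settings referenced above. The names used here
# (LLM_MODEL_TYPE, LLM_MODEL, llm_api_key) are assumptions for illustration;
# check them against your own config.py and data_folder/secrets.yaml.

# config.py (around lines 19-20): which provider and model the llm_manager uses
LLM_MODEL_TYPE = "openai"
LLM_MODEL = "gpt-4o"

# data_folder/secrets.yaml (a YAML file, reproduced here as a comment):
# llm_api_key: "sk-<your-valid-openai-key>"
```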
Keep in mind that the resume builder only works with GPT-4o for now, or at least that is what I have verified. If you want to use other models, you have to use --resume until someone fixes that.
This issue has been marked as stale due to inactivity. Please comment or update if this is still relevant.
This issue was closed due to prolonged inactivity. |
Describe the bug
--resume flag is still invoking LLM
Steps to reproduce
Run the command:
python main.py --resume D:/workspace/Auto_Jobs_Applier_AIHawk/data_folder/Resume_2024.pdf
This should bypass the LLM, because I do not need to generate a resume. However, I get the following error:
2024-11-25 09:32:14.354 | DEBUG | ai_hawk.llm.llm_manager:__call__:336 - Attempting to call the LLM with messages
2024-11-25 09:32:14.354 | DEBUG | ai_hawk.llm.llm_manager:invoke:95 - Invoking OpenAI API
2024-11-25 09:32:14.730 | ERROR | ai_hawk.llm.llm_manager:__call__:380 - Unexpected error occurred: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-11KRr***************************************u2FR. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
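For illustration, the behaviour expected here would look roughly like the sketch below: only the code path that actually generates a resume should need the LLM. This is a hypothetical outline, not the project's real code; the helpers build_resume_with_llm and apply_with_existing_resume are made-up placeholders.

```python
# Hypothetical sketch of the expected --resume behaviour: skip LLM-backed
# resume generation when an existing PDF is supplied. The helpers
# build_resume_with_llm() and apply_with_existing_resume() are illustrative
# placeholders, not functions from this repository.
import argparse
from pathlib import Path


def apply_with_existing_resume(resume_pdf: Path) -> None:
    # Placeholder: the real application flow would use this PDF directly.
    print(f"Applying with existing resume: {resume_pdf}")


def build_resume_with_llm() -> Path:
    # Placeholder: the real implementation would call the LLM here.
    print("Generating resume via LLM...")
    return Path("generated_resume.pdf")


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--resume", type=Path, default=None,
        help="Path to an existing resume PDF; skips LLM resume generation",
    )
    args = parser.parse_args()

    if args.resume is not None:
        # Use the provided PDF as-is; no LLM call is needed to build a resume.
        apply_with_existing_resume(args.resume)
    else:
        # Only this branch should require a model name and a valid API key.
        apply_with_existing_resume(build_resume_with_llm())


if __name__ == "__main__":
    main()
```

Note that, per the comment above, the LLM may still be needed elsewhere in the application flow (form answers, suitability checks), so --resume would only remove the resume-generation call, not every LLM call.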
Expected behavior
Bypass LLM
Actual behavior
LLM is invoked
Branch
None
Branch name
No response
Python version
No response
LLM Used
No response
Model used
No response
Additional context
No response