Parsing JSON error when increasing evidence_k with mixtral:8x7b #736
Comments
The JSON parsing error you're encountering when increasing `evidence_k` with the `mixtral:8x7b` model usually means the model's JSON output was truncated or malformed. Here are a few things you can try to address this issue:
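For example, keeping the retrieved-evidence count modest is often enough on its own; a minimal sketch, assuming paper-qa's `Settings`/`ask` API and a model served locally through Ollama (the model string and question are placeholders):

```python
from paperqa import Settings, ask

# Illustrative values only; "ollama/mixtral:8x7b" assumes a litellm-style
# model string pointing at a local Ollama server.
settings = Settings(
    llm="ollama/mixtral:8x7b",
    summary_llm="ollama/mixtral:8x7b",
    answer={"evidence_k": 10},  # a modest evidence_k keeps each summary call small
)
answer = ask("How does increasing evidence_k change runtime?", settings=settings)
```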
These steps should help mitigate the JSON parsing errors you're experiencing with the `mixtral:8x7b` model.
Hi @Snikch63200, looking at the stack:
Specifically:
You can see the model seems to have truncated the JSON; there is no closing brace.
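As a tiny illustration (the payload below is made up, not taken from the traceback), a completion cut off before the closing brace fails right at the end of the string:

```python
import json

raw = '{"summary": "Relevant to the question because", "relevance_score": 8'

try:
    json.loads(raw)
except json.JSONDecodeError as exc:
    # Failing at the final character is the signature of truncation (token
    # limit reached) rather than of structurally bad JSON earlier in the output.
    print(f"Parse failed at char {exc.pos}/{len(raw)}: {exc.msg}")
```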
Hi @jamesbraza, thanks for your answer. I guess you're right; it's a problem related to the model truncating its output. My LLM model is set up as follows:
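(A representative sketch of that kind of setup; the values below are illustrative placeholders, not the original configuration.)

```python
from paperqa import Settings

# Hypothetical reconstruction -- model name, api_base, and max_tokens are placeholders.
llm_config = {
    "model_list": [
        {
            "model_name": "ollama/mixtral:8x7b",
            "litellm_params": {
                "model": "ollama/mixtral:8x7b",
                "api_base": "http://localhost:11434",
                "max_tokens": 512,  # a low cap like this would explain truncated JSON
            },
        }
    ]
}

settings = Settings(
    llm="ollama/mixtral:8x7b",
    llm_config=llm_config,
    summary_llm="ollama/mixtral:8x7b",
    summary_llm_config=llm_config,
)
```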
Do you have an idea about this? Best regards.
@Snikch63200 yeah if you look at the error message:
You can try this out; basically it moves "evidence" from JSON to just text blobs.
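A minimal sketch of that change, assuming a `use_json` flag on the prompt settings (the field name is my assumption; check your installed paper-qa version):

```python
from paperqa import Settings

# Assumption: prompts.use_json controls whether evidence summaries are requested
# as structured JSON. With it off, a truncated completion is still usable text
# instead of an unparseable object.
settings = Settings(prompts={"use_json": False})
```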
Hello, I tried specifying the token limit explicitly. But I solved this issue (it's not a paperqa issue): it seems Llama3.1 accepts -1 or -2 as `num_predict`. Thanks for your help.

PS: Maybe it's a good idea to open a discussion about optimal LLM configuration settings for PaperQA. I'll try to find time to write a short synthesis about my own experience with several models.
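For reference, those limits can be exercised directly against Ollama's REST API; a small sketch assuming a default local install (`num_predict=-1` means unlimited generation, `-2` means fill the remaining context):

```python
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1:70b",
        "prompt": "Return a JSON object with keys 'a' and 'b'.",
        # num_ctx sets the context window; num_predict caps generated tokens.
        "options": {"num_ctx": 8192, "num_predict": -2},
        "stream": False,
    },
    timeout=600,
)
print(resp.json()["response"])
```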
Thanks for sharing your resolution, appreciated.
We do have a couple of pre-made settings in the repo. Also, I expanded #749 so that retrying covers this case as well. I am going to close this out; thanks for the report.
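For anyone landing here, those bundled profiles can be loaded by name; a quick sketch, assuming the `Settings.from_name` helper and a profile named "fast":

```python
from paperqa import Settings, ask

# Load a pre-made settings profile shipped with paper-qa.
answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings.from_name("fast"),
)
```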
Hello,
PaperQA returns a JSON parsing error when I increase `evidence_k` while using `mixtral:8x7b` as the LLM model.

Works fine with these parameters:

Doesn't work with these parameters:

Returns this error:

Works fine with llama3.1:70b with these parameters:
Does anyone have an idea about this issue?
@dosu.
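A hypothetical sketch of how such an `evidence_k` sweep might look (all values are placeholders, not the reporter's actual parameters):

```python
from paperqa import Settings, ask

# Placeholder sweep to find where a given local model starts failing.
for evidence_k in (5, 10, 15, 25, 30):
    try:
        ask(
            "Summarize the key findings.",
            settings=Settings(
                llm="ollama/mixtral:8x7b",  # placeholder model string
                answer={"evidence_k": evidence_k},
            ),
        )
        print(f"evidence_k={evidence_k}: ok")
    except Exception as exc:  # e.g. the JSON parsing failure reported above
        print(f"evidence_k={evidence_k}: failed ({exc})")
        break
```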