Try adjusting the settings. The nPredict option in createCompletion can be increased to allow a longer response from the LLM.
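As a minimal sketch of that suggestion: the snippet below builds the completion options separately so the token cap is explicit. It assumes the `gpt4all` npm bindings, whose `createCompletion` accepts an options object; the model file name in the usage comment is a placeholder, not a real recommendation.

```javascript
// Build the completion options separately so the output-length cap is visible.
// nPredict limits how many tokens the model may generate in its reply; if
// answers come back cut off mid-sentence, raise it.
function completionOptions(overrides = {}) {
  return {
    nPredict: 4096, // allow long answers instead of truncating (assumed default is lower)
    temp: 0.7,
    ...overrides,
  };
}

// Usage with the gpt4all bindings (assumed API; model name is a placeholder):
//
//   import { loadModel, createCompletion } from "gpt4all";
//   const model = await loadModel("<your-model>.gguf");
//   const res = await createCompletion(model, "Your prompt here",
//                                      completionOptions());
//   console.log(res.choices[0].message.content);
```

Passing `completionOptions({ nPredict: 64 })` would instead cap the reply at 64 tokens, which is the kind of low limit that produces incomplete answers.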
System Info
I don't think the modules listed below are related to this problem, but I'm including them just in case.
Information
Reproduction
Find the bugs in the code and show the fixes as improved code:
Expected behavior
I need to get the same answer from Node.js as the UI gives, or at least a complete (non-truncated) answer.