Is your feature request related to a problem? If so, please describe.
No response
Describe the solution you'd like
Use the new llama.cpp server API to save/load slots for group chats (and possibly also for regular chats and/or message histories via "Start new chat"). This avoids having to reprocess the whole context on every message from a different character.
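For reference, a minimal sketch of how a client might address the slot save/restore endpoints (endpoint shape as documented in the llama.cpp server; the base URL, slot id, and filename here are illustrative, and the server must be started with `--slot-save-path`):

```python
# Sketch: build the llama.cpp server slot save/restore request URLs.
# POST /slots/{id}?action=save  with body {"filename": "..."} persists a
# slot's KV cache; action=restore reloads it, skipping prompt reprocessing.
BASE_URL = "http://localhost:8080"  # assumed server address


def slot_action_url(slot_id: int, action: str) -> str:
    """Return the URL for a slot save/restore action."""
    return f"{BASE_URL}/slots/{slot_id}?action={action}"


# Example: save slot 0's cache under an illustrative per-character filename,
# then restore it before the next message from that character.
save_url = slot_action_url(0, "save")       # POST with {"filename": "char_a.bin"}
restore_url = slot_action_url(0, "restore")  # POST with the same filename
```

A frontend could keep one saved file per group-chat character and restore the matching slot before generating, so only the new turn needs to be evaluated.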
Describe alternatives you've considered
Using group chats as-is, with the context re-processed on each message from a different character.
Have you searched for similar requests?
Yes
Additional context
I can create a PR that implements this feature.
Related: ggerganov/llama.cpp#6341
Priority
Low (Nice-to-have)
Are you willing to test this on staging/unstable branch if this is implemented?
Yes