-
I have an application that requires async batch processing, and I am currently serving this LLM through LangServe, where I expose the batch endpoint to users. It is not clear to me whether this endpoint is implemented in an async or sync manner. A LangChain chain supports abatch out of the box, but the LangServe endpoint doesn't seem to expose abatch. If the batch endpoint is not async, are there any suggestions for how to serve an endpoint that accomplishes async batching? Thanks a lot!
Replies: 2 comments
-
Server code is always async in LangServe. It will use the async implementation of all code by default. If an async implementation is not available, it falls back on the sync implementation.
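The "prefer async, fall back to sync" dispatch described above can be sketched with nothing but the standard library. This is a hypothetical stand-in to illustrate the pattern, not LangServe's actual implementation; `BatchRunner`, `sync_upper`, and the method names are made up for the example.

```python
# Sketch (assumption: illustrative only, not LangServe source code) of the
# "use async if available, otherwise fall back to sync" dispatch pattern.
import asyncio
from typing import Callable, List, Optional


class BatchRunner:
    """Hypothetical runnable with a required sync batch and optional async batch."""

    def __init__(
        self,
        batch: Callable[[List[str]], List[str]],
        abatch: Optional[Callable] = None,
    ):
        self._batch = batch    # sync implementation (always present)
        self._abatch = abatch  # native async implementation (optional)

    async def abatch(self, inputs: List[str]) -> List[str]:
        if self._abatch is not None:
            # Native async path: await it directly on the event loop.
            return await self._abatch(inputs)
        # Fallback: run the sync implementation in a worker thread so the
        # event loop (and thus the server) is never blocked.
        return await asyncio.to_thread(self._batch, inputs)


def sync_upper(items: List[str]) -> List[str]:
    # A trivial sync "batch" implementation used for the demo.
    return [s.upper() for s in items]


async def main():
    runner = BatchRunner(batch=sync_upper)  # no native abatch provided
    print(await runner.abatch(["a", "b"]))  # → ['A', 'B']


asyncio.run(main())
```

Because the fallback goes through `asyncio.to_thread`, an async batch endpoint stays responsive even when the underlying chain only implements the sync path.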
-
Great, thanks for the clarity!
(Sent by email in reply to Eugene Yurtsev's comment above, Tue, Apr 30, 2024.)