-
This code is merged on master. If you see anything failing or want changes to the schema API, please post here!
-
Documentation is available. General Docs: https://python.langchain.com/docs/expression_language/streaming
-
@eyurtsev Linking to this discussion, as per your request.
-
For my use case I would like to stream only the final answer of the agent. However, `astream_events` returns all kinds of other events, and I never get the `on_chat_model_stream` event I would need for streaming the final generated output. Any clue why?
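One way to approach this (assuming the v1 event shape, where each event is a dict with `event`, `name`, and `data` keys) is to give the final model a distinctive run name and filter the stream on it. The sketch below uses a stubbed event stream and a hypothetical `final_answer` run name rather than a real agent; with a real chain you would set the name via something like `.with_config({"run_name": ...})` and iterate `agent.astream_events(...)` instead:

```python
import asyncio

# Stand-in for the events an agent's astream_events would yield.
# The run names ("planner", "final_answer") are hypothetical labels
# used only to make the filtering step concrete.
async def fake_astream_events():
    yield {"event": "on_chain_start", "name": "AgentExecutor", "data": {}}
    yield {"event": "on_chat_model_stream", "name": "planner", "data": {"chunk": "thinking..."}}
    yield {"event": "on_chat_model_stream", "name": "final_answer", "data": {"chunk": "Hello"}}
    yield {"event": "on_chat_model_stream", "name": "final_answer", "data": {"chunk": " world"}}
    yield {"event": "on_chain_end", "name": "AgentExecutor", "data": {}}

async def stream_final_answer():
    # Keep only token chunks emitted by the model tagged as the final step,
    # ignoring intermediate model calls and chain start/end events.
    parts = []
    async for event in fake_astream_events():
        if event["event"] == "on_chat_model_stream" and event["name"] == "final_answer":
            parts.append(event["data"]["chunk"])
    return "".join(parts)

print(asyncio.run(stream_final_answer()))  # → Hello world
```

The same filter works incrementally: instead of collecting the chunks, yield or print each one as it arrives to stream only the final answer to the user.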
-
```python
model = AutoModelForCausalLM.from_pretrained(model_id, ...)
pipe = pipeline(...)
llm = HuggingFacePipeline(pipeline=pipe)
rag_chain = (...)

async for chunk in rag_chain.astream("How are you?"):
    ...
```

Returns error:

```
SyntaxError: 'async for' outside async function
```
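A Python `async for` statement is only valid inside an `async def` function, which is why the snippet fails at module level. A minimal sketch of the fix, with a stub async generator standing in for `rag_chain.astream` so the example is self-contained:

```python
import asyncio

# Stub standing in for rag_chain.astream(...); the real chain would be
# the HuggingFacePipeline-based pipeline from the snippet above.
async def fake_astream(question):
    for token in ["I", " am", " fine", "."]:
        yield token

async def main():
    chunks = []
    # 'async for' is legal here because we are inside an async function.
    async for chunk in fake_astream("How are you?"):
        chunks.append(chunk)
        print(chunk, end="", flush=True)
    print()
    return chunks

asyncio.run(main())  # in a Jupyter notebook, use `await main()` instead
```

In a plain script the coroutine is driven with `asyncio.run(...)`; Jupyter already runs an event loop, so there you can `await` the coroutine directly at the top of a cell.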
-
Hi everyone!
We want to improve the streaming experience in LangChain. We're considering adding an `astream_events` method to the `Runnable` interface. The code below is from the following PR and has not yet been committed into langchain-core.

Does this API work for your use case? Anything you'd like to see done better or differently? We would love to hear any and all feedback.
Gist
First 8 events:
Example Notebook
Please check out this notebook.
The notebook shows how to get streaming working from LLMs used within tools.
Event Hooks Reference
Here is a reference table that shows some events that might be emitted by the various Runnable objects.
Definitions for some of the Runnables are included after the table.
Note that some outputs are only available at the `end` hook rather than the `start` event.

Here are declarations associated with the events shown above:
`format_docs`:
`some_tool`:
`prompt`: