ChatOpenAI lost with_structured_output #20268
Replies: 2 comments 3 replies
-
Hey @wangcailin, great to see you diving into LangChain again! Hope your project's shaping up excitingly. Let's tackle this issue together.

To address this issue, ensure your schema class inherits from `BaseModel`:

```python
from langchain_core.pydantic_v1 import BaseModel

class GradeHallucinations(BaseModel):
    # Define your fields here
    grade: str
    explanation: str
```

Then pass the class when building the structured LLM:

```python
structured_llm = llm.with_structured_output(
    GradeHallucinations,
    method="function_calling",  # or "json_mode" depending on your use case
    include_raw=False  # or True if you want the raw response as well
)
```

This approach should help you bypass the `NotImplementedError`. If you need further assistance or clarification, feel free to ask.
-
I have a similar problem: I'm trying to use the LLM with structured output for tool calling, and it's not working. This gives the error:
-
Description
```
File "/root/Dev/lookstar/lookstar-ai/packages/self_rag/hallucination_grader_chain.py", line 18, in <module>
  structured_llm_grader = llm.with_structured_output(GradeHallucinations)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/Dev/lookstar/lookstar-ai/.venv/lib/python3.11/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
  return wrapped(*args, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/Dev/lookstar/lookstar-ai/.venv/lib/python3.11/site-packages/langchain_core/language_models/base.py", line 204, in with_structured_output
  raise NotImplementedError()
NotImplementedError
```
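The traceback ends in `langchain_core/language_models/base.py`: the core base class's `with_structured_output` simply raises `NotImplementedError`, and it is the provider package (here `langchain-openai`) that must override it. A minimal sketch of that dispatch pattern, with simplified, hypothetical class names:

```python
class BaseChatModel:
    def with_structured_output(self, schema):
        # Default in the core package: no provider-specific implementation.
        raise NotImplementedError()

class ChatOpenAIWithSupport(BaseChatModel):
    def with_structured_output(self, schema):
        # A new-enough provider package overrides the core default.
        return f"runnable producing {schema.__name__}"

class GradeHallucinations:  # stand-in for the Pydantic model
    pass

# An old provider package (no override) hits the core default,
# which is exactly the error in the traceback above:
try:
    BaseChatModel().with_structured_output(GradeHallucinations)
except NotImplementedError:
    print("NotImplementedError, as in the traceback")

# With the override present, the call succeeds instead:
print(ChatOpenAIWithSupport().with_structured_output(GradeHallucinations))
```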
System Info
langchain==0.1.15
langchain-cli==0.0.20
langchain-community==0.0.32
langchain-core==0.1.41
langchain-openai==0.0.6
langchain-text-splitters==0.0.1
linux
Python 3.11.6
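Given the pins above, `langchain-openai==0.0.6` likely predates `ChatOpenAI.with_structured_output`, so upgrading `langchain-openai` (e.g. `pip install -U langchain-openai`) should make the override available. A small stdlib helper to confirm which versions are actually installed in the active environment (the helper name is my own, for illustration):

```python
from importlib import metadata
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

for pkg in ("langchain", "langchain-core", "langchain-openai"):
    print(pkg, installed_version(pkg))
```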