
using models as fastAPI response_model freezes fastAPI docs #840

Open

JoshDLane opened this issue Nov 22, 2023 · 1 comment

@JoshDLane

Bug description

When I try to use a model from `prisma.models` as a `response_model` on a FastAPI endpoint, the FastAPI docs freeze when I open that endpoint to inspect the response type. I have tried using partials to resolve this, but I can only get it to work when the partial has no relational fields. I find it hard to believe it's freezing just because the type is large. Does anyone have experience with this?

  • Is there a way to control the depth of the relations generated for partials, like we can do when querying? (See the sketch just below.)
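For reference, the prisma-client-py partial-types docs describe two `create_partial` arguments that look relevant here: `exclude_relational_fields` and `relations`. A minimal sketch, where the `Pillar` field names are invented for illustration:

```python
# Sketch based on the documented `exclude_relational_fields` and
# `relations` arguments to create_partial(); the Pillar field names
# below are assumptions, not taken from this report.
from prisma.models import Bot, Pillar

# Drop every relational field from the generated partial.
Bot.create_partial("ChatOverviewFlat", exclude_relational_fields=True)

# Or cap the relation "depth" by pointing the relation at a smaller,
# explicitly defined partial instead of the full model.
Pillar.create_partial("PillarSummary", include={"id", "name"})
Bot.create_partial("ChatOverviewShallow", relations={"pillar": "PillarSummary"})
```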

How to reproduce

Here is a simple partial I'm generating to test this, and it's failing. The pillar model also has some relations.

```python
# prisma/partial_types.py -- create_partial() registers the partial for
# generation; the generated type is then imported from prisma.partials
from prisma.models import Bot

Bot.create_partial("ChatOverview", include={"pillar"})
```

and in the FastAPI endpoint (with `ChatOverview` imported from `prisma.partials`):

```python
response_model=ChatOverview,
```
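For context, a fuller sketch of the shape of the failing endpoint. The route path, lookup field, and client wiring are assumptions for illustration, not taken from the report:

```python
# Minimal sketch of the endpoint being described; the route, `where`
# field, and startup wiring are assumptions.
from fastapi import FastAPI, HTTPException
from prisma import Prisma
from prisma.models import Bot
from prisma.partials import ChatOverview

app = FastAPI()
db = Prisma(auto_register=True)


@app.on_event("startup")
async def startup() -> None:
    await db.connect()


@app.get("/chats/{chat_id}", response_model=ChatOverview)
async def get_chat(chat_id: str) -> Bot:
    # Include the relation so the returned object carries the fields
    # the ChatOverview partial expects; FastAPI filters out the rest.
    bot = await db.bot.find_unique(where={"id": chat_id}, include={"pillar": True})
    if bot is None:
        raise HTTPException(status_code=404, detail="chat not found")
    return bot
```

Opening this route's entry at `/docs` is where the freeze reportedly happens.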
@RobertCraigie (Owner)

I haven't encountered this myself. What FastAPI version are you using?

If you haven't already, I'd recommend searching the FastAPI repository to see whether anyone else is running into this, and if you can reproduce it without Prisma, opening an issue with FastAPI.
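For anyone trying to isolate this, a Prisma-free reproduction attempt could look like the sketch below: two mutually recursive pydantic models standing in for the generated relational types (all names are invented). If expanding the endpoint at `/docs` also freezes with this, the problem lies with FastAPI / Swagger UI's handling of recursive schemas rather than with Prisma:

```python
# Prisma-free reproduction sketch: mutually recursive pydantic models
# that mimic how generated relational types reference each other.
from __future__ import annotations

from typing import List, Optional

from fastapi import FastAPI
from pydantic import BaseModel


class Pillar(BaseModel):
    id: str
    bots: List[Bot] = []


class Bot(BaseModel):
    id: str
    pillar: Optional[Pillar] = None


# Resolve the forward references (pydantic v2; on v1, call
# update_forward_refs() instead).
Pillar.model_rebuild()
Bot.model_rebuild()

app = FastAPI()


@app.get("/bots/{bot_id}", response_model=Bot)
async def get_bot(bot_id: str) -> Bot:
    return Bot(id=bot_id)
```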
