Feature Request - Limit recursion depth with Fetch Link #823
RomainDGrey started this conversation in Feature Request
When multiple collections are linked together by Links/BackLinks, problems can arise once there are many (large) documents.
Example: one inventory mission document contains 500 locations, each of which contains on average 5 products.
When one partially loads an inventory mission, it loads all ~2500 elements, whereas it could be useful to limit the fetch to the 500 locations.
When one loads a location, since there is a BackLink between location and inventory mission, it loads ~2500 elements.
When one loads a specific product, it also loads ~2500 elements, since there are BackLinks between all the elements.
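For concreteness, the fan-out in the example above can be tallied with plain arithmetic (the figures are the illustrative ones from the scenario, not measurements):

```python
# Sizes from the example scenario above (illustrative values).
locations_per_mission = 500
products_per_location = 5

# Fetching a mission with every link resolved pulls in all the products:
products_loaded = locations_per_mission * products_per_location
print(products_loaded)  # 2500

# Limiting the fetch to one level of links would stop at the locations:
print(locations_per_mission)  # 500
```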
It's possible to mitigate the issue by fetching only specific fields, but without that, the quantity of data loaded by a single query can exceed the maximum:
```
pymongo.errors.OperationFailure: PlanExecutor error during aggregation :: caused by :: Total size of documents in Location matching pipeline's $lookup stage exceeds 104857600 bytes, full error: {'ok': 0.0, 'errmsg': "PlanExecutor error during aggregation :: caused by :: Total size of documents in Location matching pipeline's $lookup stage exceeds 104857600 bytes", 'code': 4568, 'codeName': 'Location4568'}
```
Could you please implement a parameter on fetch links to limit the recursion depth?
Another way of doing it would be an option to avoid loading BackLinks after retrieving a linked document, or to avoid loading the Link after a BackLink retrieval.
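The requested behaviour could look like this minimal in-memory sketch (pure Python, no Beanie; `Doc`, `fetch_links`, and `max_depth` are all hypothetical names used for illustration, not the library's API):

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Doc:
    """A stand-in for a linked document (hypothetical, not Beanie's Document)."""
    name: str
    links: list["Doc"] = field(default_factory=list)

def fetch_links(doc: Doc, max_depth: int) -> list[Doc]:
    """Return every document reachable from `doc` within max_depth link hops."""
    fetched: list[Doc] = []

    def walk(d: Doc, depth: int) -> None:
        if depth >= max_depth:
            return  # stop recursing once the requested depth is reached
        for linked in d.links:
            fetched.append(linked)
            walk(linked, depth + 1)

    walk(doc, 0)
    return fetched

# Mission -> 500 locations -> 5 products each (the scenario from the example).
products = [[Doc(f"product {i}.{j}") for j in range(5)] for i in range(500)]
locations = [Doc(f"location {i}", links=products[i]) for i in range(500)]
mission = Doc("inventory mission", links=locations)

print(len(fetch_links(mission, max_depth=1)))  # 500: locations only
print(len(fetch_links(mission, max_depth=2)))  # 3000: locations + products
```

With `max_depth=1` only the first level of links is resolved, which is exactly the "stop at the 500 locations" case described above; deeper links would stay unfetched references.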
Thank you,
RD