Originally posted by yannispgs October 10, 2024
Hi!
I am trying to run an aggregation pipeline with multiple stages on one of my collections. The goal is to produce an array of data and then, depending on a condition, run (or not) a second aggregation pipeline on that transformed data.
We already have an aggregation pipeline that does this by using a `$facet` as the last stage, with one field `metrics` that stores all the documents as they are, and another field `tests` that stores the result of the second aggregation pipeline. The `tests` sub-pipeline only runs if certain conditions are met. That way, when the conditions are met, the second pipeline runs and we retrieve the results in both `metrics` and `tests`; otherwise, we just return the output in `metrics`.
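A minimal sketch of that conditional `$facet` shape (the field names `metrics` and `tests` come from the post; the `runTests` flag, stages, and grouping key are illustrative assumptions, not the actual pipeline):

```javascript
// Sketch: build a pipeline whose last stage is a $facet that always
// collects the raw documents in `metrics`, and only adds the `tests`
// sub-pipeline when the condition is met.
function buildPipeline({ runTests }) {
  const facet = {
    // `metrics` keeps every document produced by the preceding stages.
    metrics: [{ $match: {} }],
  };
  if (runTests) {
    // `tests` re-aggregates the same documents; this is the second
    // pipeline that only runs conditionally. The $group is illustrative.
    facet.tests = [{ $group: { _id: '$appId', count: { $sum: 1 } } }];
  }
  return [
    // ...the shared stages that produce the transformed data go here...
    { $facet: facet },
  ];
}

// Usage with a Mongoose model (name illustrative):
// const results = await Metric.aggregate(buildPipeline({ runTests: true }));
```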
The issue is that `$facet` loads all the documents of every sub-pipeline into a single output document. My first pipeline can return tens of thousands of results for some apps, which triggers the error about the maximum size of a single document (16 MB).
To work around that, I tried to run the two aggregation pipelines sequentially, but I don't want to use a temporary collection (via `$out` or `$merge`) to store the results of the first pipeline as input for the second one. However, I did not find another way to do it with Mongoose. The `$documents` stage could work, but it is not supported by Mongoose...
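For what it's worth, one possible workaround (an untested sketch, not an official Mongoose API) is to drop down to the native MongoDB Node.js driver that Mongoose wraps: the driver's db-level `db.aggregate()` accepts a pipeline starting with `$documents` (MongoDB 5.1+), so the output of the first pipeline can feed the second without a temporary collection. Note the caveat that the `$documents` array is embedded in the command itself, so the 16 MB BSON command limit still applies to the intermediate data:

```javascript
// Pure helper: prepend the in-memory documents as the input of the
// second pipeline. Kept separate so its shape is easy to test.
function withDocumentsStage(docs, secondPipeline) {
  return [{ $documents: docs }, ...secondPipeline];
}

// Untested sketch: run two pipelines sequentially without $out/$merge.
// Mongoose does not support $documents, but the native driver does.
async function runSequentially(model, firstPipeline, secondPipeline) {
  // 1) Run the first pipeline through Mongoose as usual.
  const docs = await model.aggregate(firstPipeline).exec();

  // 2) Feed the results into a db-level aggregation on the native driver.
  //    model.db is the Mongoose Connection; connection.db is the native Db.
  const db = model.db.db;
  return db.aggregate(withDocumentsStage(docs, secondPipeline)).toArray();
}
```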
Has someone managed to achieve something similar?
Similar to the db-level bulkWrite() that we added in 8.9.
Discussed in #14953