So the issue here is that we wrote special emitters to stream data from these endpoints, because the results are potentially very large. That involves sending a "200 OK" and an opening "[" up front, and then streaming the records one at a time.
The problem is that, for queries like the one you cite, Postgres takes long enough that the Django app times out during the streaming part, even though it has already sent you a "200 OK" response.
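For illustration, here is a minimal sketch of that pattern using Django's `StreamingHttpResponse`. The function and view names, and the toy row generator, are hypothetical; this is not the project's actual emitter code, just the general shape of it:

```python
import json

from django.http import StreamingHttpResponse


def stream_records(rows):
    """Yield a JSON array one record at a time."""
    yield "["                       # flushed right away, together with the 200 OK headers
    for i, row in enumerate(rows):  # rows arrive lazily from the database
        if i:
            yield ","
        yield json.dumps(row)
    yield "]"


def records_view(request):
    # Hypothetical query; in the real app this would be a large
    # Postgres-backed queryset iterated lazily.
    rows = ({"id": i} for i in range(3))
    return StreamingHttpResponse(stream_records(rows),
                                 content_type="application/json")
```

If the database stalls mid-iteration for longer than the app's timeout, the connection is cut after the "[" has already been flushed, so the client ends up with a 200 status and a truncated body.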
This is a pathological case, and although we'll keep this ticket open, it's possible that this won't be fixed. We're moving towards unifying our data sources (see this blog post), and that means active development on a new API, where we can make the kinds of fundamental decisions that will avoid this circumstance altogether. For the short term, though, "[" is going to have to suffice as an error message. Sorry about that.
reported by @sckott: