Problem:
If the list passed to this expectation through evaluation parameters (loaded from a Python list, not from the database) exceeds 100 items, it breaks the underlying JSON.
Error Message:
sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input syntax for type json
LINE 1: ...rendered_task_instance_fields SET rendered_fields='{"run_nam...
^
DETAIL: Token "Infinity" is invalid.
CONTEXT: JSON data, line 1: ...6e5", 150641421, "e6096a21", "e6093099", Infinity...
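The DETAIL line points at Python's json serializer: for float("inf") it emits the bare token Infinity, which is not part of the JSON standard and is rejected by Postgres's json/jsonb types exactly as shown above. A minimal sketch of that failure mode, independent of Great Expectations or Airflow (the sample values are taken from the error context above):

```python
import json

# Python's json module serializes float("inf") as the non-standard
# token "Infinity"; Postgres's json type rejects it with
# 'Token "Infinity" is invalid'.
payload = {"value_set": ["e6096a21", 150641421, float("inf")]}
print(json.dumps(payload))
# → {"value_set": ["e6096a21", 150641421, Infinity]}

# allow_nan=False surfaces the problem at serialization time
# instead of at INSERT time:
try:
    json.dumps(payload, allow_nan=False)
except ValueError as exc:
    print(exc)  # "Out of range float values are not JSON compliant"
```

So something in the pipeline is introducing float("inf") into the rendered fields before they are written to the database; where that Infinity comes from is the open question.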
Question:
Is it a bug that the expectation crashes on lists larger than 100 items, or is this desired behaviour?
I have verified that it only crashes when I pass the 100+ item value_set through evaluation parameters; the expectation works fine if I add the same 100+ item list directly into the expectation JSON. So it must be something related to how the operator handles large JSON objects.
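The two forms being contrasted can be sketched as plain expectation-configuration dicts. "$PARAMETER" is Great Expectations' evaluation-parameter syntax; the column name "run_name" (taken from the truncated rendered_fields in the error above) and the parameter name "allowed_run_names" are illustrative assumptions, not from the original report:

```python
import json

# Hypothetical 150-item set, above the ~100-item threshold reported.
big_set = [f"id_{i:04d}" for i in range(150)]

# Form 1: value_set inlined directly in the expectation JSON.
# Reported to work regardless of list size.
inline = {
    "expectation_type": "expect_column_values_to_be_in_set",
    "kwargs": {"column": "run_name", "value_set": big_set},
}

# Form 2: value_set supplied via an evaluation parameter.
# Reported to break once the resolved list exceeds ~100 items.
parameterized = {
    "expectation_type": "expect_column_values_to_be_in_set",
    "kwargs": {"column": "run_name",
               "value_set": {"$PARAMETER": "allowed_run_names"}},
}

# Both configurations serialize to valid JSON on their own, so the
# Infinity token must be introduced later, when the operator resolves
# the parameter and renders/stores the result.
assert json.loads(json.dumps(inline)) == inline
assert json.loads(json.dumps(parameterized)) == parameterized
```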
Dialect Used: Snowflake
Expectation Used: expect_column_values_to_be_in_set
Example of expectation: