Replies: 2 comments
See also #513 for original discussion.
I think I mostly agree with your analysis. More context to also keep in mind:
It would be great if we had a general, customisable way of generating inputs for tests.
At the moment, most tests only use one or a few handcrafted Variables and DataArrays. Those are good for probing special cases, but we should increase test coverage by adding ways to generate more data (semi-)automatically.
One can use explicit loops in the test cases to iterate over a large space of inputs.

+ Customisable.
+ Can cover the whole product space of parameters.
- Completely manual (could be slightly improved by having data generators outside of tests).
- Doesn't produce structured output.
- Bloats the test code.

Use a general fixture for related tests and instantiate it on multiple different sets of parameters (see #1488).

+ Set up once, use many times.
+ Integrated into gtest -> produces useful progress / failure reports.
+ Has 'parameter generators' that can create ranges, Cartesian products, and more.
- No integration of random number generators. Could seed one globally and use it to instantiate fixtures, but there would be no good way of recording its state to inspect and replay failed tests.
- Blanket set of parameters for all test cases; cannot exclude parameters that don't work in a particular test.
- Not composable (I think), e.g. how can we generate datasets from variables?
- Cannot combine type- and value-parameterised tests (I think).
- Generators are still configured manually and ignorant of the code to test, so they cannot adapt.

Use test frameworks that can generate input data intelligently to probe the code for weaknesses.
The gtest fixture option looks best to me right now. But we need to think of ways of building a good corpus of test data. We should also check whether there is a way to skip unusable combinations of parameters in a particular test without being too verbose.