Interop score for interop-2023-motion [stable] seems nonsense #201
I just did some investigating on this. It is caused by the number of tests in the focus area fluctuating over time (presumably new tests being added later in the year). The experimental chart shows the same strange interop score, which corrects itself sometime in May. Before then, the focus area was scored on 73 individual tests; after May, it moved to 93 tests, which is close to the number used today. A likely fix is to calculate the interop scores at the end of the entire yearly calculation, averaging by the number of tests present in the final runs rather than the tests present in each day's runs.
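A minimal sketch of the proposed fix, assuming a simplified data shape (the function name, the `daily_passes` structure, and the example dates are all hypothetical, not the dashboard's real API): score each day against the test count of the *final* runs of the year, so the denominator stays constant even as tests are added mid-year.

```python
# Hypothetical sketch: normalize every day's score by the final runs'
# test count instead of that day's own test count.
# `daily_passes` maps date -> set of tests passing in all browsers that day.

def interop_scores(daily_passes, final_test_count):
    """Average by the final runs' test count, not each day's count."""
    return {
        date: len(passing) / final_test_count
        for date, passing in daily_passes.items()
    }

# Example: early in the year only 73 tests existed; dividing by the final
# 93-test denominator keeps scores comparable across the whole year.
scores = interop_scores(
    {"2023-01-01": set(range(40)), "2023-06-01": set(range(80))},
    final_test_count=93,
)
```

With a fixed denominator, adding tests mid-year can only lower earlier scores (those tests did not pass then), never produce the impossible jumps seen in the chart.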
Currently the Motion Path [stable] graph looks like this (chart image omitted):
At the beginning of the year, the Interop score was higher than the Firefox or Chrome score, which seems impossible: 54.7% of tests cannot pass in all three browsers if one browser only passes 44.3%.
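The impossibility can be checked directly (with hypothetical data; the browser names and test IDs below are illustrative): since the interop score counts tests passing in *all* browsers, it is the size of an intersection and can never exceed any single browser's score.

```python
# Illustrative check with made-up data: the interop score is bounded
# above by the lowest individual browser score.
browser_passes = {
    "chrome": {"t1", "t2", "t3"},
    "firefox": {"t1", "t2"},
    "safari": {"t1", "t3"},
}
total_tests = 4  # t1..t4

# Tests passing in every browser: the intersection of all pass sets.
interop = set.intersection(*browser_passes.values())
interop_score = len(interop) / total_tests                      # 1/4
min_browser = min(len(p) / total_tests for p in browser_passes.values())  # 2/4

assert interop_score <= min_browser
```

So a chart showing Interop above the weakest browser, as the stable graph did early in the year, signals a denominator bug rather than real data.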