Tournament - add non-error test case with a different team name #1987

Open
norbs57 opened this issue Mar 17, 2022 · 3 comments

@norbs57 (Contributor) commented Mar 17, 2022

For the tournament exercise, one of my students took the team names from the instructions and hard-coded them in the source code. Despite this, his solution passed all the tests. I believe this is because in all of the "non-error" test cases, the team names are the same as in the instructions. To discourage this hard-coding of team names, I would suggest either

  1. adding a non-error test case with different team names (a sketch follows this list), or
  2. changing (at least) one of the team names in one of the existing non-error test cases.
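
To make option 1 concrete, here is a minimal, self-contained sketch (in Python) of what such a test could look like. The `tally` helper is only a stand-in so the example runs on its own; the real exercise's function name, signature, and exact table format vary by track, and the team names are invented precisely so that they do not appear in the instructions.

```python
def tally(rows):
    """Stand-in tally: 'home;away;outcome' rows -> formatted table lines."""
    points = {"W": 3, "D": 1, "L": 0}
    outcomes = {"win": ("W", "L"), "loss": ("L", "W"), "draw": ("D", "D")}
    table = {}
    for row in rows:
        home, away, outcome = row.split(";")
        home_result, away_result = outcomes[outcome]
        for team, result in ((home, home_result), (away, away_result)):
            stats = table.setdefault(team, {"MP": 0, "W": 0, "D": 0, "L": 0, "P": 0})
            stats["MP"] += 1
            stats[result] += 1
            stats["P"] += points[result]
    lines = ["Team".ljust(31) + "| MP |  W |  D |  L |  P"]
    for team, s in sorted(table.items(), key=lambda kv: (-kv[1]["P"], kv[0])):
        lines.append(
            team.ljust(31)
            + f"| {s['MP']:2} | {s['W']:2} | {s['D']:2} | {s['L']:2} | {s['P']:2}"
        )
    return lines


def test_different_team_names():
    # Team names deliberately absent from the instructions: a solution that
    # hard-codes the instruction teams cannot pass this case.
    rows = [
        "Fiery Foxes;Golden Geese;win",
        "Golden Geese;Fiery Foxes;draw",
    ]
    expected = [
        "Team".ljust(31) + "| MP |  W |  D |  L |  P",
        "Fiery Foxes".ljust(31) + "|  2 |  1 |  1 |  0 |  4",
        "Golden Geese".ljust(31) + "|  2 |  0 |  1 |  1 |  1",
    ]
    assert tally(rows) == expected


test_different_team_names()
```

In an actual PR this would of course just be an extra test case next to the existing ones, without its own `tally` implementation.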
@SaschaMann (Contributor) commented Mar 17, 2022

Changing test cases is not allowed according to the spec, so (2) is not an option here.

Perhaps you could open a PR that adds such a case?

As a side note, in case you're newish to Exercism: the tests on Exercism generally assume that the student is actually willing to learn and doesn't try to "cheat" the exercises; that's why a lot of exercises don't check for completeness and don't have guards against hardcoding solutions. You will encounter the same issue in many exercises.

@norbs57 (Contributor, Author) commented Mar 17, 2022

I will open a PR.

Thanks for the side note; I understand. I guess it still makes sense to have test cases that reduce the chance that certain poor coding practices happen. But maybe "hardcoding test data" is something the automatic code analysers could catch (sometime in the future)?
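
For illustration only, here is a rough, hypothetical sketch (not an existing Exercism analyser) of the kind of check I have in mind: flag string literals from the instructions' sample data that show up verbatim in the submitted solution. The team names listed are the ones I recall from the instructions, so treat them as an assumption.

```python
# Hypothetical check; the literal list and the reporting format are assumptions,
# not part of any existing Exercism analyser.
INSTRUCTION_TEAM_NAMES = (
    "Allegoric Alaskans",
    "Blithering Badgers",
    "Courageous Californians",
    "Devastating Donkeys",
)


def hardcoded_test_data(source_code: str) -> list[str]:
    """Return every instruction team name that appears verbatim in the solution."""
    return [name for name in INSTRUCTION_TEAM_NAMES if name in source_code]


# A solution that embeds team names from the instructions gets flagged.
solution = 'TEAMS = ["Allegoric Alaskans", "Blithering Badgers"]'
print(hardcoded_test_data(solution))
# -> ['Allegoric Alaskans', 'Blithering Badgers']
```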

@SaschaMann (Contributor) commented:

> I guess it still makes sense to have test cases that reduce the chance that certain poor coding practices happen.

Yea, I agree
