
[Darts]: Mention what should happen when a dart lands 'on the line' #3163

Closed
KaiAragaki opened this issue Aug 21, 2022 · 11 comments
Assignees
Labels
abandoned 🏚 claimed 🐾 For new exercises being written by contributors and maintainers. enhancement 🦄 ⭐ Changing current behaviour or enhancing/adding to what's already there.

Comments

@KaiAragaki
Contributor

Though it's implied in the tests, it might be nice to state that if a dart lands on the border between one ring and another (say it lands at (0, 5)), it should score the higher value. If this is acceptable, I can make a PR to update the exercise.
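For readers unfamiliar with the exercise, here is a minimal sketch of the scoring rule being discussed (this is an illustrative solution, not the track's official exemplar; the radii and point values are the standard Darts exercise ones):

```python
import math

def score(x: float, y: float) -> int:
    """Score a dart landing at (x, y) on a target centered at the origin.

    A dart exactly on a ring's border counts toward the higher-scoring
    (inner) region, which is why every comparison below uses <=.
    """
    distance = math.hypot(x, y)  # Euclidean distance from the center
    if distance <= 1:
        return 10  # inner circle
    if distance <= 5:
        return 5   # middle circle
    if distance <= 10:
        return 1   # outer circle
    return 0       # missed the target
```

With this rule, the example from the comment above, `score(0, 5)`, returns 5 (the middle circle) rather than 1, because the border belongs to the higher-value region.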

@github-actions

This comment was marked as resolved.

@bobahop
Member

bobahop commented Aug 22, 2022

I think the test_on_the_XXX_circle(self) tests make it specific that a border hit counts as the higher value, but I have nothing against making it clearer in the instructions. It's not my call, though.
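For context, such a boundary test looks roughly like the following sketch (the inline `score` is a hypothetical stand-in here; the real test suite imports `score` from the student's darts.py):

```python
import unittest

def score(x, y):
    # Hypothetical stand-in solution so this sketch is self-contained.
    distance = (x ** 2 + y ** 2) ** 0.5
    if distance <= 1:
        return 10
    if distance <= 5:
        return 5
    if distance <= 10:
        return 1
    return 0

class DartsTest(unittest.TestCase):
    # Mirrors the shape of the track's test_on_the_XXX_circle tests:
    # a dart exactly on a border scores as the higher-value region.
    def test_on_the_middle_circle(self):
        self.assertEqual(score(0, 5), 5)
```

The test name alone ("on the middle circle") encodes the rule that the border belongs to that circle, which is the detail the instructions currently leave implicit.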

@KaiAragaki
Contributor Author

A question more about the philosophy of these exercises: should we consider reading the tests as part of the exercise, or should one be able to successfully complete the exercise without ever reading the tests?

@IsaacG
Member

IsaacG commented Aug 22, 2022

See https://exercism.org/docs/tracks/python/test-driven-development where the track explicitly discusses this issue.

Exercism's Python track utilizes TDD methodology in its exercises. Unit tests are already written. The tests may be viewed by the student to gather a more detailed understanding of what is required for a solution to pass. A solution stub may be provided to the student.

The test is the definitive embodied exercise requirement. The prose simply provides an overview and context.

@KaiAragaki
Contributor Author

Good point. I'll close the issue.

@BethanyG
Member

BethanyG commented Aug 22, 2022

@KaiAragaki - I am going to reopen this issue, as I intended to have a discussion, but was delayed by other work. I'll chime in with more thoughts later today (tomorrow, my time). 😄 To your point about the exercise: students should be engaging with both the instructions and the tests. Instructions can't give every detail, but tests and stubs can't replace directions. As IsaacG has mentioned, we do strongly encourage TDD - we want students to look at and learn from tests. But we also want them to be able to parse specifications (directions) too.

@IsaacG - while you are technically correct, I think we should always consider suggestions from students - especially those new to the track when they say something needs to be clarified in the instructions or hints. Yes - we do indeed practice TDD, but that doesn't mean that because there are tests that instructions are frozen in stone, or that something can't be more detailed or amended. Please don't imply to students that that is the case unless we have a discussion first. It's likely that I and other maintainers of the track will agree with you - but some exercises have a history of issues or approaches that you might not know.

This exercise has already had a hints file added because of the issue linked above, as well as another issue on the same topic that I cannot locate right now. The upshot is that the instructions are hard to parse for some students, and in trying to figure out the whole Euclidean distance thing, they may miss the subtleties of what to do if the dart is on a line. I think the instructions might benefit from some clarification/further discussion in problem specifications, and might also benefit from an instructions append (maybe even with a diagram) here on the Python track. Remember: this exercise appears fairly early in the progression.

@BethanyG BethanyG reopened this Aug 22, 2022
@BethanyG
Member

@KaiAragaki - Because this is a practice exercise, we have less leeway (actually, more complication) in editing problem instructions and tests. We pull both descriptions and test specifications from the problem-specifications repo using a tool called configlet. Test data is then run through a generator script to produce the Python test files. Problem descriptions are mostly taken as-is, or amended via instruction_append files on the track.

Changes to the main descriptions/instructions and test data are proposed/discussed cross-language track in problem-specifications. We try to keep both fairly generic and language agnostic. Once three or more track maintainers agree that a change is needed, it is PR'd there and then pulled down into various tracks.

Tests and instructions that are language-specific are then added via additional_tests.json (example here) and instructions_append (involved example here).

TL;DR:

  1. I think you have a point, and like @bobahop, I do think the test name is pretty specific, but it wouldn't hurt for the instructions to mention a dart on the line.
  2. Since problem specifications #1971 is already open as an issue, my first recommendation would be to comment there, and mention that in addition to clarifying the math, you also think that the instructions might benefit from wording about darts on the line between sections.
  3. If people agree with you, the next step would be to propose a PR in problem-specs (and I think you'd get extra love for also addressing the math part).
  4. Additionally/instead, we can consider making an instructions_append specifically for the Python track. This could include words and/or diagrams. (We support mermaid and SVG. Here is an example of a diagram that can be included.)
  5. We can also consider adding to the existing hints.md file for this exercise.
  6. Commenting in the test file is a tad more involved, since it requires editing the test generation template and re-generating the test cases. But that is also an option, if we think a comment there would help.

So let me know what you'd like to do. 😄

@BethanyG BethanyG added claimed 🐾 For new exercises being written by contributors and maintainers. enhancement 🦄 ⭐ Changing current behaviour or enhancing/adding to what's already there. labels Aug 23, 2022
@KaiAragaki
Contributor Author

@BethanyG - Thanks for your detailed response.

It looks like #1976 in problem-specs already addresses the math issue, unless you believe more clarification is still needed. Should I still add my recommendation to this issue?

@BethanyG
Member

BethanyG commented Aug 23, 2022

@KaiAragaki -- 🤔 I somehow didn't track that Sascha had already made adjustments to the description text.

Maybe in that case we both close #1791 and open a new issue in problem-specs with your recommendations? Unless you think they'd be better as a Python-only addendum. Does that make sense?

@github-actions
Contributor

This issue has been automatically marked as abandoned 🏚 because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@github-actions
Contributor

Closing stale issue. If this issue is still relevant, please reopen it.
