Refactor/implement rewrite coverage #858
Conversation
Your PR's title isn't in the expected format. Please check the expected title format, and update yours to match. Reason: Wrong number of parts separated by |s. If this PR is not coursework, please add the NotCoursework label (and message on Slack in #cyf-curriculum or it will probably not be noticed).
- Add Jest coverage configuration with 100% threshold
- Create inline test verification script that:
  - Runs inline assertion tests with Node.js
  - Detects console.assert failures
  - Verifies test parity between inline and Jest tests
  - Reports disparities in both directions
- Add GitHub Actions workflow that:
  - Triggers on coursework/sprint-3-implement-and-rewrite* branches
  - Runs inline tests with parity check
  - Runs Jest tests with coverage
  - Generates coverage report on PRs
- Set branches, lines, statements to 40%
- Keep functions at 100% to ensure every function is tested
- More practical for educational purposes while ensuring core functionality is covered
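A minimal sketch of what such a threshold block can look like in a Jest config (the file name and surrounding options are assumptions; the exact config in this PR may differ):

```js
// jest.config.js — hypothetical sketch of the thresholds described above
module.exports = {
  collectCoverage: true,
  coverageThreshold: {
    global: {
      branches: 40,
      lines: 40,
      statements: 40,
      functions: 100, // every function must be hit by at least one test
    },
  },
};
```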
- Remove ArtiomTr/jest-coverage-report-action (was failing)
- Add simple coverage summary to GitHub Actions summary
- Upload coverage artifacts with if: always() to ensure they're saved even on failure
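For illustration, an upload step guarded with `if: always()` could look roughly like this (the step name, artifact name, and path are assumptions):

```yaml
# Hypothetical sketch: save the coverage folder even when earlier steps fail
- name: Upload coverage artifacts
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: coverage-report
    path: coverage/
```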
- Add inline test results table to summary (passed/failed counts)
- Add test parity comparison table showing mismatches
- Generate HTML coverage reports for download
- Parse and display coverage metrics in summary with threshold checks
- Visual status indicators (✅/❌) for all metrics
- Clear instructions to download HTML coverage report artifact
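Tables like these are normally produced by appending Markdown to `$GITHUB_STEP_SUMMARY`; a minimal sketch, assuming the counts are already available as `PASSED`/`FAILED` environment variables:

```yaml
# Hypothetical sketch: append an inline-test results table to the Actions summary
- name: Publish inline test results
  if: always()
  run: |
    {
      echo "## Inline Assertion Tests"
      echo "| Result | Count |"
      echo "| ------ | ----- |"
      echo "| ✅ Passed | ${PASSED:-0} |"
      echo "| ❌ Failed | ${FAILED:-0} |"
    } >> "$GITHUB_STEP_SUMMARY"
```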
- Replace custom parsing with maintained jest-coverage-comment action
- Displays coverage badge, summary table, and file-by-file coverage
- Shows in both PR comments and GitHub Actions summary
- More reliable than manual JSON parsing
- Add peaceiris/actions-gh-pages to deploy HTML reports
- Coverage reports accessible at github.io URL per branch
- Add coverage report link to GitHub Actions summary
- No need to download artifacts, view directly in browser
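A sketch of what that deployment step might have looked like (removed again in a later commit; the publish directory and per-branch destination are assumptions):

```yaml
# Hypothetical sketch: publish the HTML coverage report to GitHub Pages
- name: Deploy coverage report
  uses: peaceiris/actions-gh-pages@v3
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: ./coverage
    destination_dir: ${{ github.ref_name }} # one report folder per branch (assumed)
```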
- Remove GitHub Pages deployment
- Remove MishaKav/jest-coverage-comment
- Let ArtiomTr action run tests and generate coverage itself
- Simpler workflow with just inline tests + ArtiomTr coverage report
- Run Jest with required flags: --coverage --ci --json --testLocationInResults --outputFile=report.json
- Pass report.json to ArtiomTr action via coverage-file parameter
- Matches the documented example from ArtiomTr/jest-coverage-report-action
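Roughly, that pair of steps could look like the sketch below (the Jest flags and config path come from this PR; the action version and the `skip-step` option are assumptions based on the action's documented usage):

```yaml
# Hypothetical sketch: generate report.json ourselves, then hand it to the action
- name: Run Jest Tests with Coverage
  run: >
    npx jest --config=jest.config.jest-tests.js
    --coverage --ci --json --testLocationInResults --outputFile=report.json

- name: Coverage report
  uses: ArtiomTr/jest-coverage-report-action@v2
  with:
    coverage-file: report.json
    skip-step: all # tests already ran in the previous step
```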
- Add pull-requests: write permission to allow action to comment on PRs
- ArtiomTr action will now automatically post coverage reports as PR comments
- Comments will include coverage summary and file-by-file breakdown
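In workflow terms this is the standard permissions block (reverted in the next commit); `contents: read` is an assumption about the rest of the block:

```yaml
# Hypothetical sketch: allow the job's token to comment on pull requests
permissions:
  contents: read
  pull-requests: write
```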
- Remove pull-requests: write permission
- Coverage report will only show in GitHub Actions Summary
- Avoids permission issues when forking to upstream repos
Force-pushed from 0cccc97 to 003b0ec.
Force-pushed from 57372bb to 4c72c29.
Your PR's title isn't in the expected format. Please check the expected title format, and update yours to match. Reason: Wrong number of parts separated by |s. If this PR is not coursework, please add the NotCoursework label (and message on Slack in #cyf-curriculum or it will probably not be noticed). If this PR needs reviewed, please add the 'Needs Review' label to this PR after you have resolved the issues listed above.
LonMcGregor left a comment:
Hi! This is a good idea that helps reviewers verify trainees are doing the task, and could help to visualise why we are getting trainees to practice rewriting tests to use a framework.
It looks like you have changed the reference files that trainees will start working on in the Sprint-3 directory. How much of that is intended to be committed back to main?
I am looking at the sample output here: https://github.com/CodeYourFuture/Module-Structuring-and-Testing-Data/actions/runs/19738349167
I have some thoughts:
- Is there an easier way for trainees to see this output? Currently it exists in the "checks" tab, but I don't see any obvious means of directing the trainee / reviewer towards this output, other than some comments in the README.
- Can trainees run this in VS Code / the local CLI, as the README suggests for Jest on its own, so they can avoid pushing a failing PR? Could instructions be added to that effect?
- Are the tables going to be clear enough to read? I.e. are terms like "Inline Assertion Tests" and "Test Parity Check" understandable enough for trainees who may be brand new to the concept of testing?
- Following on from the two points above, is there some way we could make a comment appear under a PR with a clear explanation of what needs to be changed to make it pass? This would be in line with how automated PR checks currently work.
- In the linked run, the test finds issues, but GitHub is recording the test as having passed. That should probably be fixed so it shows a cross. This might surface issues better in the GitHub UI.
Extra question about the learning:
Do you know whether the concept of GitHub / CI test output is currently covered in a general sense, or is this the trainees' first introduction to it? Does the prep for this sprint give enough background for them to understand this new kind of PR submission?
```yaml
continue-on-error: true
# ...
- name: Run Jest Tests with Coverage
  run: npx jest --config=jest.config.jest-tests.js --coverage --ci
```
The npm test command is defined in package.json. Is there a reason you use npx to run the same command here?
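For comparison, if package.json already wraps the same command — an assumption about how the script is defined — the workflow could reuse it and forward the extra flags:

```json
{
  "scripts": {
    "test": "jest --config=jest.config.jest-tests.js"
  }
}
```

The workflow step could then run `npm test -- --coverage --ci`, so the command only lives in one place.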
```js
coveragePathIgnorePatterns: [
  "/node_modules/",
],
coverageReporters: ["text", "lcov", "json-summary", "html"],
```
Could we reduce the coverageReporters here, or do we need all of them?
This file looks the same as for coverage-2. Could they be combined?
Test coverage runs for task 2 and task 3 in this sprint. Either the instructions should be replicated in those READMEs, or a more general set of instructions should be given in the main Sprint-3 README rather than here.
This pull request adds coverage for the Sprint-3 coursework "IMPLEMENT AND REWRITE".
When users open a PR from their forks with a branch matching the pattern 'coursework/sprint-3-implement-and-rewrite', it runs the expected tests for the coursework scenarios so that the reviewer gets a clear view of the written test results.
In addition, it introduces a shell script that compares the inline assertion tests with the Jest tests to check that the test cases match (test parity).
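A minimal sketch of the idea behind that script, written here as a Node.js script for illustration (the file layout, names, and the reliance on the "Assertion failed" stderr marker are assumptions; the actual script in the PR may differ):

```js
// check-inline-tests.mjs — hypothetical sketch: run each implementation file with Node
// and flag console.assert failures, which print "Assertion failed" to stderr but do
// not change the exit code.
import { spawnSync } from "node:child_process";
import { readdirSync } from "node:fs";
import path from "node:path";

const dir = process.argv[2] ?? "."; // directory containing the inline-assertion files
const files = readdirSync(dir).filter(
  (f) => f.endsWith(".js") && !f.endsWith(".test.js") // skip the Jest rewrites
);

let failures = 0;
for (const file of files) {
  const result = spawnSync("node", [path.join(dir, file)], { encoding: "utf8" });
  if (result.status !== 0 || result.stderr.includes("Assertion failed")) {
    console.error(`✗ ${file}`);
    failures++;
  } else {
    console.log(`✓ ${file}`);
  }
}

process.exit(failures === 0 ? 0 : 1); // fail CI if any inline assertion failed
```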
The summary looks like the one below.
And this is how the Checks tab looks, where we can see whether the tests are actually passing or not.
