vakenbolt / go-test-report
Captures go test output and parses it into a single self-contained HTML file.
License: Apache License 2.0
In the Firefox browser, the test status icons that appear when a test group is expanded are rendered incorrectly. They should be centered vertically in the row.
This is most likely due to an incompatibility in the CSS that renders the test status icon in Firefox, found in the test_report.html file.
Make it possible to show test names as sentences:
https://bitfieldconsulting.com/golang/test-names
}
+ if flags.testdox {
+ status.DisplayName = gotestdox.Prettify(status.TestName)
+ }
allTests[key] = status
https://github.com/bitfield/gotestdox
TestValidIsFalseForInvalidInput
=> Valid is false for invalid input
TestValidIsTrueForValidInput
=> Valid is true for valid input
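The transformation above is what gotestdox.Prettify performs. A simplified sketch of the idea (the real library also handles acronyms, numbers, and subtest names):

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// prettify converts a CamelCase test name into a readable sentence.
// Simplified sketch only; gotestdox.Prettify covers many more cases.
func prettify(name string) string {
	name = strings.TrimPrefix(name, "Test")
	var words []string
	start := 0
	for i, r := range name {
		if i > 0 && unicode.IsUpper(r) {
			words = append(words, name[start:i])
			start = i
		}
	}
	words = append(words, name[start:])
	// Lowercase everything except the first word.
	for i := 1; i < len(words); i++ {
		words[i] = strings.ToLower(words[i])
	}
	return strings.Join(words, " ")
}

func main() {
	fmt.Println(prettify("TestValidIsFalseForInvalidInput"))
	// prints: Valid is false for invalid input
}
```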
It did not catch an error: panic: fatal error config file: open /xxx/xxx/etc/configs/xxx.yaml: no such file or directory. This error caused the test cases of a module to not be executed.
Currently this fork is not being developed or maintained with updates...
go:embed
gotestsum
parameters
Might want to start a new project for v1.0, to work on new features as well.
-go 1.13
+go 1.17
Flags:
+ -e, --elapsed string the test elapsed time from the original test run (in "0.000s" format)
+ -d, --executionDate string the test execution date of the original test run (in RFC 3339 format)
-g, --groupSize int the number of tests per test group indicator (default 20)
-h, --help help for go-test-report
+ -i, --input string the JSON input file
+ -l, --list string the JSON module list
-o, --output string the HTML output file (default "test_report.html")
Running go test -json | go-test-report
works as expected, but running go test ./... -json | go-test-report -v
does not.
Using the -v
flag for verbose output, I only see the test output. go-test-report
does not generate any logs, which makes debugging incredibly complicated.
Since go test ./...
is very common with modules that have subpackages, I think this is a must-have feature. I'm unsure what's causing the failure; it may be due to unexpected output, but since logging is non-existent, I can't debug it.
There are some features in go1.16 that would improve things.
Suggest going with go1.17 as a reasonable minimum version...
@vakenbolt Looking for a new maintainer, I guess ?
There seem to be some dependabot issues piling up.
Currently it takes around 10 seconds, for a project with around 1000 tests.
[go-test-report] finished in 11.258970993s
This means it takes almost as long to generate the report as to run the tests...
DONE 924 tests, 2 skipped in 15.500s
Most of the time is spent running go list -json.
This can be done in parallel (goroutines), or even cached (to a file).
Parallel: [go-test-report] finished in 8.929561069s
Cached: [go-test-report] finished in 109.123483ms
When generating the test report, maybe you want to link to other pages
One example would be the coverage report, from go tool cover -html -o
Another example I am working on: https://github.com/afbjorklund/go-test-trace
These could be made accessible by adding links populated by parameters?
Hello,
I think this is a very cool tool. My team is a Ginkgo BDD test shop and we are interested in representing the JSON results as an HTML report. Would you be willing to accept a feature that could generate an HTML report based on the JSON output the Ginkgo framework generates? I would be willing to submit it.
Thanks
Since it allows both progress and json, at the same time (not either / or).
It also has a default format that is friendlier on the eyes than go test.
$ gotestsum --jsonfile test_report.json
https://github.com/gotestyourself/gotestsum
Main issue is recording the current date and execution time "somewhere"...
Duration
(should be the test execution time, not the report execution time)
testExecutionDate
(should be the test execution date, not the current date)
I don't want the tests to be sorted by name before generating a report. Is it possible to sort only when a flag is provided?
I see that the tests are sorted here
https://github.com/vakenbolt/go-test-report/blob/master/main.go#L421
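A conditional sort behind a flag could look roughly like this (the --sort flag name and orderTests helper are suggestions, not existing options in go-test-report):

```go
package main

import (
	"flag"
	"fmt"
	"sort"
)

// sortFlag controls whether tests are sorted by name before rendering;
// defaulting to true would preserve the current behavior.
var sortFlag = flag.Bool("sort", true, "sort tests by name in the report")

// orderTests sorts the names in place only when enabled, otherwise
// keeping the original (execution) order.
func orderTests(names []string, enabled bool) {
	if enabled {
		sort.Strings(names)
	}
}

func main() {
	flag.Parse()
	tests := []string{"TestZeta", "TestAlpha", "TestMid"}
	orderTests(tests, *sortFlag)
	fmt.Println(tests)
}
```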
Trying to use this tool, I ran into the below issue using go get
❯ go get -u github.com/vakenbolt/go-test-report/
go get: github.com/vakenbolt/go-test-report@none updating to
github.com/vakenbolt/[email protected]: parsing go.mod:
module declares its path as: go-test-report
but was required as: github.com/vakenbolt/go-test-report
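The error means the repo's go.mod declares a short module path instead of the full import path, so `go get` refuses the mismatch. A repo-side fix would be to declare the full path in go.mod (sketch, matching the error message above):

```
module github.com/vakenbolt/go-test-report

go 1.13
```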
I am using a Mac M1 laptop and go 1.18.2.
The command go get github.com/vakenbolt/go-test-report/
finishes successfully, but the command isn't installed under the ~/go/bin/
directory.
When passing in a pre-generated JSON file, the reported duration is the time taken to read the file.
$ gotestsum --jsonfile test_report.json
✓ . (8ms)
∅ embed_assets
DONE 16 tests in 0.569s
$ go-test-report < test_report.json
[go-test-report] finished in 115.295321ms
⧉ Total: 16 Duration: 1ms ✓ Passed: 16 ‐ Skipped: 0 ✗ Failed: 0
If you want to show the original time it took to run the tests, it needs a parameter
--duration 0.569
perhaps parse any "s" or "ms" suffix in the parameter, for user convenience.
It would be nice if there was a search bar, that allowed you to find a particular test (by name)
Then it would jump directly to the page with the test (highlight?), without having to page through...
Have groups for specific test areas in the report, like a group for Login tests, a group for Home page tests, etc.
Maybe have a group for each go test file?
Example:
This will help to organize the report and make it easier to track failures in each area.
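One way to derive such groups automatically is from the Package field of the `go test -json` event stream; a minimal sketch (the event struct mirrors only the test2json fields needed here, and groupByPackage is an illustrative helper):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"sort"
	"strings"
)

// event mirrors the `go test -json` (test2json) fields needed for grouping.
type event struct {
	Action  string
	Package string
	Test    string
}

// groupByPackage collects finished test names per package, giving the
// report one group per package instead of one flat list.
func groupByPackage(input string) map[string][]string {
	groups := make(map[string][]string)
	sc := bufio.NewScanner(strings.NewReader(input))
	for sc.Scan() {
		var ev event
		if err := json.Unmarshal(sc.Bytes(), &ev); err != nil {
			continue // skip non-JSON lines
		}
		// Only count terminal per-test events, not package-level ones.
		if ev.Test == "" || (ev.Action != "pass" && ev.Action != "fail" && ev.Action != "skip") {
			continue
		}
		groups[ev.Package] = append(groups[ev.Package], ev.Test)
	}
	return groups
}

func main() {
	const sample = `{"Action":"pass","Package":"example.com/app/login","Test":"TestLogin"}
{"Action":"fail","Package":"example.com/app/home","Test":"TestHome"}`
	groups := groupByPackage(sample)
	var pkgs []string
	for pkg := range groups {
		pkgs = append(pkgs, pkg)
	}
	sort.Strings(pkgs)
	for _, pkg := range pkgs {
		fmt.Println(pkg, groups[pkg])
	}
}
```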
The excellent gotestsum
tool has a feature to show the slowest tests:
Usage:
gotestsum tool slowest [flags]
Read a json file and print or update tests which are slower than threshold.
The json file may be created with 'gotestsum --jsonfile' or 'go test -json'.
If a TestCase appears more than once in the json file, it will only appear once
in the output, and the median value of all the elapsed times will be used.
[...]
Flags:
--debug enable debug logging.
--jsonfile string path to test2json output, defaults to stdin
--skip-stmt string add this go statement to slow tests, instead of printing the list of slow tests
--threshold duration test cases with elapsed time greater than threshold are slow tests (default 100ms)
It would be nice if this list was available on a separate tab/page, with links ?
Example output of tool, after make out/unittest.json
and "DONE 899 tests, 2 skipped in 13.313s"
BTW: something is fishy here, go-test-report says 895 (not 899) "✓ Passed: 893 ‐ Skipped: 2 ✗ Failed: 0" ? --> filed Bug #19
$ gotestsum tool slowest < out/unittest.json
k8s.io/minikube/pkg/drivers/kic/oci TestRunCmdWarnSlowOnce 6.02s
k8s.io/minikube/pkg/minikube/perf TestTimeCommandLogs 4.03s
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerDelayAndContext 1.1s
k8s.io/minikube/pkg/minikube/bootstrapper TestSetupCerts 1.09s
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerEventHandling 420ms
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerEventHandling/tunnel_always_quits_when_ctrl_c_is_pressed 420ms
k8s.io/minikube/pkg/util TestGenerateSignedCert 380ms
k8s.io/minikube/pkg/util TestGenerateSignedCert/valid_cert 270ms
k8s.io/minikube/cmd/minikube/cmd TestDeleteAllProfiles 270ms
k8s.io/minikube/cmd/minikube/cmd TestDeleteProfile 270ms
k8s.io/minikube/pkg/util TestGenerateCACert 230ms
k8s.io/minikube/cmd/minikube/cmd TestValidateImageRepository 140ms
So it shows all tests that ran slower than 100ms (slider?), sorted by duration...
Most of them run in "0s" (as they should), so this a nice way of finding culprits.
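The filter-and-sort behavior described above can be sketched as follows (testResult and slowest are illustrative names, not gotestsum's actual implementation):

```go
package main

import (
	"fmt"
	"sort"
)

// testResult pairs a test name with its elapsed time in seconds,
// as reported in the `go test -json` output.
type testResult struct {
	Name    string
	Elapsed float64
}

// slowest returns the tests above the threshold (in seconds),
// sorted from slowest to fastest, mirroring `gotestsum tool slowest`.
func slowest(results []testResult, threshold float64) []testResult {
	var slow []testResult
	for _, r := range results {
		if r.Elapsed > threshold {
			slow = append(slow, r)
		}
	}
	sort.Slice(slow, func(i, j int) bool { return slow[i].Elapsed > slow[j].Elapsed })
	return slow
}

func main() {
	results := []testResult{
		{"TestRunCmdWarnSlowOnce", 6.02},
		{"TestFast", 0.0},
		{"TestTunnelManagerEventHandling", 0.42},
	}
	for _, r := range slowest(results, 0.1) {
		fmt.Printf("%s %.2fs\n", r.Name, r.Elapsed)
	}
}
```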
Web enhancements:
https://blog.golang.org/subtests
https://github.com/golang/go/wiki/TableDrivenTests
These will look like "Test/foo" and "Test/bar" in the results.
https://dave.cheney.net/2019/05/07/prefer-table-driven-tests
Maybe they could also be grouped somehow, in the report ?
Like nested/folded in a tree structure, or something similar.
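Since subtest names use "/" as a separator, folding them into a tree could be sketched like this (node and insert are illustrative names, not part of the existing code):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// node is one level of the test tree; children are keyed by the
// next "/"-separated segment of the subtest name.
type node struct {
	children map[string]*node
}

func newNode() *node { return &node{children: map[string]*node{}} }

// insert adds a test name like "Test/foo" to the tree,
// creating one nested level per path segment.
func (n *node) insert(name string) {
	for _, part := range strings.Split(name, "/") {
		child, ok := n.children[part]
		if !ok {
			child = newNode()
			n.children[part] = child
		}
		n = child
	}
}

// print renders the tree with indentation, one segment per line.
func (n *node) print(indent string) {
	var keys []string
	for k := range n.children {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Println(indent + k)
		n.children[k].print(indent + "  ")
	}
}

func main() {
	root := newNode()
	root.insert("Test/foo")
	root.insert("Test/bar")
	root.print("")
}
```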
Hi @vakenbolt , thank you for this project!
I'm using this project as part of my own GitHub Action, and the output of the test results is amazing!
Link to GitHub Action logs - https://github.com/unfor19/release-action-test/actions/runs/999217153
Link to test results artifacts -
Terraform one failed
Terraform all passed
I won't go into too many details, but I'm able to build any Golang repository with this action. I wanted to integrate tests including a test report as part of my action, and your project did the trick 🎉
It would be amazing if this repo would also support exporting to Markdown, so that PR comments and GitHub issues can be created from the reports.
Currently, if you have a test with the same name in multiple packages, it only shows up once.
This causes some tests to "disappear", and the total number of tests to differ from the actual test run.
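A sketch of the fix: key stored results by package and test name together rather than by test name alone (testKey is a hypothetical type, not from the actual codebase):

```go
package main

import "fmt"

// testKey identifies a test uniquely across packages. Using only the
// test name as a map key would merge same-named tests from different
// packages, which is the "disappearing tests" symptom described above.
type testKey struct {
	Package string
	Test    string
}

func main() {
	results := map[testKey]string{}
	results[testKey{"pkg/a", "TestParse"}] = "pass"
	results[testKey{"pkg/b", "TestParse"}] = "fail"
	fmt.Println(len(results)) // both entries survive
}
```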
As the title says, clicking each test group can become quite laborious.
Navigation via keyboard arrow keys would help with this.
I'll raise a PR when I get time.
Consider adding a metric to the report: test coverage, both total coverage and coverage per function. I actually have this functionality written locally, so I can submit code if you're interested.