vakenbolt / go-test-report

Captures go test output and parses it into a single self-contained HTML file.

License: Apache License 2.0

Dockerfile 0.66% Go 79.94% JavaScript 17.59% Makefile 1.80%
golang go testing-tools reporting-tool reporting command-line command-line-tool

go-test-report's People

Contributors

afbjorklund, dependabot[bot], vakenbolt


go-test-report's Issues

Improperly rendered test status icon in Firefox

In the Firefox browser, the test status icons that appear when a test group is expanded are rendered incorrectly: they should be centered vertically in the row.
(screenshot: go-test-report Firefox test status icon issue)

This is most likely due to an incompatibility in the CSS that renders the test status icon in Firefox, found in the test_report.html file.

Move project to add new features

Currently this fork is not being actively developed or maintained...

  • Update go version
  • Use go:embed
  • Update dependencies
  • Add gotestsum parameters

Might want to start a new project for v1.0, to work on new features as well.

  • Grouping
  • Sorting
  • Markdown
  • Testdox

-go 1.13
+go 1.17
 Flags:
+  -e, --elapsed string         the test elapsed time from the original test run (in "0.000s" format)
+  -d, --executionDate string   the test execution date of the original test run (in RFC 3339 format)
   -g, --groupSize int          the number of tests per test group indicator (default 20)
   -h, --help                   help for go-test-report
+  -i, --input string           the JSON input file
+  -l, --list string            the JSON module list
   -o, --output string          the HTML output file (default "test_report.html")
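
With the proposed -i, -e and -d flags above, a run against a pre-recorded gotestsum JSON file might look like the following (the file name and values are purely illustrative, not from an actual release):

$ gotestsum --jsonfile test_report.json
$ go-test-report -i test_report.json -e 0.569s -d 2021-06-01T12:00:00Z -o test_report.html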

Bug: Exit status 1 when generating Test Report for Subpackages

Issue

Running go test -json | go-test-report works as expected, but running go test ./... -json | go-test-report -v does not.

Output

Using the -v flag for verbose output, I only see the test output. go-test-report does not generate any logs, which makes debugging incredibly complicated.

Since go test ./... is very common with modules that have subpackages, I think this is a must-have feature, and I'm unsure what's causing the failure. It may be due to unexpected output, but since logging is non-existent, I can't debug this.

Update go version and make release

There are some features in go1.16 that would improve things.

Suggest going with go1.17 as a reasonable minimum version...

@vakenbolt Looking for a new maintainer, I guess?

There seem to be some dependabot issues piling up.

Speeding up execution time for large go projects

Currently it takes around 10 seconds for a project with around 1000 tests.

[go-test-report] finished in 11.258970993s

This means it takes almost as long to generate the report as to run the tests...

DONE 924 tests, 2 skipped in 15.500s


Most of the time is spent running go list -json.

This can be done in parallel (in Go), or even cached (to a file).

Parallel: [go-test-report] finished in 8.929561069s

Cached: [go-test-report] finished in 109.123483ms
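
A minimal sketch of the parallel variant, assuming the list of import paths is already known and ignoring error collection (this is not the project's actual implementation; the cached variant would simply persist the collected JSON to a file and reuse it on later runs):

package main

import (
	"fmt"
	"os/exec"
	"sync"
)

// listPackage runs `go list -json` for a single import path.
func listPackage(importPath string) ([]byte, error) {
	return exec.Command("go", "list", "-json", importPath).Output()
}

func main() {
	// Hypothetical import paths; in practice these would come from the test output.
	pkgs := []string{"./pkg/a", "./pkg/b", "./pkg/c"}

	var wg sync.WaitGroup
	results := make([][]byte, len(pkgs))
	for i, pkg := range pkgs {
		wg.Add(1)
		go func(i int, pkg string) {
			defer wg.Done()
			out, err := listPackage(pkg)
			if err != nil {
				return // simplified: real code should surface the error
			}
			results[i] = out
		}(i, pkg)
	}
	wg.Wait()

	for i, out := range results {
		fmt.Printf("%s: %d bytes of JSON\n", pkgs[i], len(out))
	}
}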

Support for HTML report based on Ginkgo Test JSON output

Hello,

I think this is a very cool tool. My team is a Ginkgo BDD test shop and we are interested in representing the JSON results as an HTML report. Would you be willing to accept a feature that could generate an HTML report based on the JSON output the Ginkgo framework generates? I would be willing to submit it.

Thanks

Advocate gotestsum for running go test

Since it allows both progress output and JSON at the same time (not either/or).

It also has a default format that is easier on the eyes than go test.

$ gotestsum --jsonfile test_report.json

https://github.com/gotestyourself/gotestsum


The main issue is recording the current date and execution time "somewhere"...

Duration (should be the test execution time, not the report execution time)

testExecutionDate (should be the test execution date, not the current date)

"Module Declares Its Path..." Error on go get

Trying to use this tool, I ran into the issue below using go get:

❯ go get -u github.com/vakenbolt/go-test-report/
go get: github.com/vakenbolt/go-test-report@none updating to
	github.com/vakenbolt/[email protected]: parsing go.mod:
	module declares its path as: go-test-report
	        but was required as: github.com/vakenbolt/go-test-report

Can't install the command on mac

I am using a Mac M1 laptop and go 1.18.2.

The command go get github.com/vakenbolt/go-test-report/ finishes successfully, but the command isn't installed under the ~/go/bin/ directory.

Need (optional) parameter for test duration

When passing in a pre-generated JSON file, the reported duration is the time spent reading the file:

$ gotestsum --jsonfile test_report.json
✓  . (8ms)
∅  embed_assets

DONE 16 tests in 0.569s
$ go-test-report < test_report.json 
[go-test-report] finished in 115.295321ms

⧉ Total: 16 Duration: 1ms ✓ Passed: 16 ‐ Skipped: 0 ✗ Failed: 0

If you want to show the original time it took to run the tests, a parameter is needed:

--duration 0.569

Perhaps parse any "s" or "ms" suffix in the parameter, for user convenience.
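
A small sketch of how such a parameter could be parsed with the Go standard library, accepting both a plain number of seconds and a duration string with a suffix (the function name and fallback behaviour are assumptions, not existing go-test-report code):

package main

import (
	"fmt"
	"strconv"
	"time"
)

// parseTestDuration accepts a plain number of seconds ("0.569") or a Go
// duration string ("0.569s", "115ms") and returns a time.Duration.
func parseTestDuration(s string) (time.Duration, error) {
	// Try a Go duration string first ("0.569s", "115ms", ...).
	if d, err := time.ParseDuration(s); err == nil {
		return d, nil
	}
	// Fall back to a plain number of seconds ("0.569").
	secs, err := strconv.ParseFloat(s, 64)
	if err != nil {
		return 0, fmt.Errorf("invalid duration %q: %w", s, err)
	}
	return time.Duration(secs * float64(time.Second)), nil
}

func main() {
	for _, in := range []string{"0.569", "0.569s", "115ms"} {
		d, _ := parseTestDuration(in)
		fmt.Println(in, "->", d)
	}
}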

Search and jump directly to a particular test

It would be nice if there was a search bar that allowed you to find a particular test by name.

It would then jump directly to the page with the test (highlighted?), without having to page through...

Group tests by area in report

Have groups for specific test areas in the report, like a group for Login tests, a group for Home page tests, etc.

Maybe have a group for each go test file?
Example:

  • home_test.go -> all tests in the home_test.go file will be grouped together.
  • login_test.go -> all tests in the login_test.go file will be grouped together.

This will help to organize the report and make it easier to track failures in each area.
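
Note that go test -json reports the package, not the source file, for each event, so grouping by package falls out directly from the existing input; grouping by file would need extra information (e.g. from go list). A minimal sketch of grouping test2json events by package (an illustration, not the project's actual code):

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// event mirrors the fields of a `go test -json` (test2json) record used here.
type event struct {
	Action  string
	Package string
	Test    string
}

func main() {
	groups := map[string][]string{} // package -> test names
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		var e event
		if err := json.Unmarshal(sc.Bytes(), &e); err != nil {
			continue // skip non-JSON lines
		}
		if e.Action == "run" && e.Test != "" {
			groups[e.Package] = append(groups[e.Package], e.Test)
		}
	}
	for pkg, tests := range groups {
		fmt.Printf("%s: %d tests\n", pkg, len(tests))
	}
}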

Show separate summary with the slowest tests

The excellent gotestsum tool has a feature to show the slowest tests:

Usage:
    gotestsum tool slowest [flags]

Read a json file and print or update tests which are slower than threshold.
The json file may be created with 'gotestsum --jsonfile' or 'go test -json'.
If a TestCase appears more than once in the json file, it will only appear once
in the output, and the median value of all the elapsed times will be used.

[...]

Flags:
      --debug                enable debug logging.
      --jsonfile string      path to test2json output, defaults to stdin
      --skip-stmt string     add this go statement to slow tests, instead of printing the list of slow tests
      --threshold duration   test cases with elapsed time greater than threshold are slow tests (default 100ms)

It would be nice if this list was available on a separate tab/page, with links?

Example output of the tool, after make out/unittest.json and "DONE 899 tests, 2 skipped in 13.313s".
BTW: something is fishy here; go-test-report says 895 (not 899): "✓ Passed: 893 ‐ Skipped: 2 ✗ Failed: 0"? --> filed Bug #19

$ gotestsum tool slowest < out/unittest.json
k8s.io/minikube/pkg/drivers/kic/oci TestRunCmdWarnSlowOnce 6.02s
k8s.io/minikube/pkg/minikube/perf TestTimeCommandLogs 4.03s
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerDelayAndContext 1.1s
k8s.io/minikube/pkg/minikube/bootstrapper TestSetupCerts 1.09s
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerEventHandling 420ms
k8s.io/minikube/pkg/minikube/tunnel TestTunnelManagerEventHandling/tunnel_always_quits_when_ctrl_c_is_pressed 420ms
k8s.io/minikube/pkg/util TestGenerateSignedCert 380ms
k8s.io/minikube/pkg/util TestGenerateSignedCert/valid_cert 270ms
k8s.io/minikube/cmd/minikube/cmd TestDeleteAllProfiles 270ms
k8s.io/minikube/cmd/minikube/cmd TestDeleteProfile 270ms
k8s.io/minikube/pkg/util TestGenerateCACert 230ms
k8s.io/minikube/cmd/minikube/cmd TestValidateImageRepository 140ms

So it shows all tests that ran slower than 100ms (slider?), sorted by duration...

Most of them run in "0s" (as they should), so this is a nice way of finding culprits.

Web enhancements:

  • show an inline bar of the times, for comparison
  • slider for dynamically choosing the threshold
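
A rough sketch of how such a list could be derived from the same test2json events go-test-report already reads, using the Elapsed field of pass/fail events (unlike gotestsum it does not deduplicate repeated test cases, and the hard-coded threshold stands in for the proposed slider):

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"sort"
)

// event mirrors the `go test -json` fields needed to rank tests by duration.
type event struct {
	Action  string
	Package string
	Test    string
	Elapsed float64 // seconds
}

func main() {
	const threshold = 0.1 // seconds; a UI slider would make this adjustable
	var slow []event
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		var e event
		if json.Unmarshal(sc.Bytes(), &e) != nil {
			continue
		}
		if (e.Action == "pass" || e.Action == "fail") && e.Test != "" && e.Elapsed > threshold {
			slow = append(slow, e)
		}
	}
	// Sort slowest first, as gotestsum does.
	sort.Slice(slow, func(i, j int) bool { return slow[i].Elapsed > slow[j].Elapsed })
	for _, e := range slow {
		fmt.Printf("%s %s %.2fs\n", e.Package, e.Test, e.Elapsed)
	}
}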

Kudos! I've integrated this project in my release-action

Hi @vakenbolt , thank you for this project!

I'm using this project as part of my own GitHub Action, and the output of the test results is amazing!

Link to GitHub Action logs - https://github.com/unfor19/release-action-test/actions/runs/999217153

Link to test results artifacts -

Terraform one failed (screenshot)

Terraform all passed (screenshot)

I won't go into too many details, but I'm able to build any Golang repository with this action. I wanted to integrate tests, including a test report, as part of my action, and your project did the trick 🎉

Suggestion: Export to Markdown

It would be amazing if this repo would also support exporting to Markdown, so that PR comments and GitHub issues can be created from the reports.

Allow duplicated test names in different packages

Currently, if you have a test that has the same name in multiple packages it only shows up once.

This causes some tests to "disappear" and the total number of tests to differ from the actual test run.
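
The likely cause is that results are keyed by test name alone; keying by package and test name keeps identically named tests apart. A tiny sketch of such a composite key (an assumption about the fix, not the project's actual data structure):

package main

import "fmt"

// testKey identifies a test by package and name, so identically named
// tests in different packages are counted separately.
type testKey struct {
	Package string
	Name    string
}

func main() {
	results := map[testKey]string{}
	results[testKey{"pkg/a", "TestLogin"}] = "pass"
	results[testKey{"pkg/b", "TestLogin"}] = "fail"
	fmt.Println(len(results)) // 2, not 1
}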

Suggestion: Report with coverage

Consider adding a metric to the report: test coverage, both total coverage and coverage per function. I actually have this functionality written locally, so I can submit code if you're interested.
