treosh / lighthouse-ci-action
Audit URLs using Lighthouse and test performance with Lighthouse CI.
License: MIT License
The configPath is loaded relative to the action's own source code.
In this commit ChromeDevTools/debugger-protocol-viewer@27154fb ...
you can see configPath: ./.lighthouserc.js
I was attempting to reference the .lighthouserc.js
you see in the same commit.
But it failed to find it:
(run link)
probably because these lines expect a path that resolves from cwd/pwd:
lighthouse-ci-action/src/config.js
Lines 28 to 29 in 65160cb
The readme has the example of configPath: ./lighthouserc.json,
so I figured it would be read relative to my checkout instead of relative to the action's source.
Some path.resolve() against the workspace should sort this out?
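One workaround that may help in the meantime (a sketch on my side, the version tag and file name below are only illustrative): pass an absolute path built from the workspace context, so no relative resolution has to happen inside the action at all.
- name: Audit URLs using Lighthouse
  uses: treosh/lighthouse-ci-action@v8   # version tag illustrative
  with:
    # absolute path, so it does not matter which directory the action resolves from
    configPath: ${{ github.workspace }}/.lighthouserc.js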
Hi,
We would like to run Lighthouse on a pre-production (pre-PRO) environment which is only available internally.
That's why we have a custom GitHub runner on on-premises Kubernetes.
The Google Chrome browser is already installed.
But it fails with the following error. Maybe someone has a clue about what went wrong.
Running Lighthouse 10 time(s) on https://www.mypage.de/
events.js:187
throw er; // Unhandled 'error' event
^
Error: spawn node ENOENT
at Process.ChildProcess._handle.onexit (internal/child_process.js:264:19)
at onErrorNT (internal/child_process.js:456:16)
at processTicksAndRejections (internal/process/task_queues.js:80:21)
Emitted 'error' event on ChildProcess instance at:
at Process.ChildProcess._handle.onexit (internal/child_process.js:270:12)
at onErrorNT (internal/child_process.js:456:16)
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
errno: 'ENOENT',
code: 'ENOENT',
syscall: 'spawn node',
path: 'node',
spawnargs: [
'/runner/_work/_actions/treosh/lighthouse-ci-action/v8/node_modules/lighthouse/lighthouse-cli/index.js',
'https://www.mypage.de/',
'--output',
'json',
'--output-path',
'stdout',
'--cli-flags-path',
'/runner/_work/mypage-ui/mypage-ui/.lighthouseci/flags-30755ad7-427c-4a5d-87fb-b48693f0278e.json'
]
}
Run #1...::error::LHCI 'collect' has encountered a problem.
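Error: spawn node ENOENT usually means the node binary is not on the PATH of the process that LHCI spawns. On a self-hosted runner, a sketch like the following might help, assuming Node simply is not installed on the Kubernetes image (the Node version is only an example):
- uses: actions/setup-node@v2
  with:
    node-version: 14
- name: Audit URLs using Lighthouse
  uses: treosh/lighthouse-ci-action@v8
  with:
    urls: https://www.mypage.de/
    runs: 10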
Hi there!
Cool work! I'm trying to audit my PR deploys, but they include a URL which differs per run.
I thought I'd reach out to see if you had any ideas.
Current (not working):
lighthouse:
  runs-on: ubuntu-latest
  needs: build
  steps:
    - uses: actions/checkout@master
    - name: Run Lighthouse and test budgets
      uses: treosh/lighthouse-ci-action@v1
      with:
        urls: |
          https://typescript-v2-$PR_NUMBER.ortam.now.sh
          https://typescript-v2-$PR_NUMBER.ortam.now.sh/tsconfig
          https://typescript-v2-$PR_NUMBER.ortam.now.sh/docs/handbook/integrating-with-build-tools.html
Fri, 08 Nov 2019 13:28:38 GMT GatherRunner:error DNS servers could not resolve the provided domain. https://typescript-v2-$pr_number.ortam.now.sh/
Some options:
straight up copy the shell style $THING replacement (which would mean looking through all env vars and seeing if they can be replaced in strings)
explicitly add a template variable replacement syntax (faster, & more explicit)
lighthouse:
  runs-on: ubuntu-latest
  needs: build
  steps:
    - uses: actions/checkout@master
    - name: Run Lighthouse and test budgets
      uses: treosh/lighthouse-ci-action@v1
      with:
        urls: |
          https://typescript-v2-$(PR_NUM).ortam.now.sh
          https://typescript-v2-$(PR_NUM).ortam.now.sh/tsconfig
          https://typescript-v2-$(PR_NUM).ortam.now.sh/docs/handbook/integrating-with-build-tools.html
      env:
        PR_NUM: ${{ secrets.$PR_NUMBER }}
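Another angle that may already work today, without new action features (a sketch, using the URL pattern from the example above): ${{ ... }} expressions in with: are evaluated by the Actions runner before the action ever sees the input, so for workflows triggered by pull_request the PR number can be inlined without any env-var substitution.
lighthouse:
  runs-on: ubuntu-latest
  needs: build
  steps:
    - uses: actions/checkout@master
    - name: Run Lighthouse and test budgets
      uses: treosh/lighthouse-ci-action@v1
      with:
        urls: |
          https://typescript-v2-${{ github.event.pull_request.number }}.ortam.now.sh
          https://typescript-v2-${{ github.event.pull_request.number }}.ortam.now.sh/tsconfig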
For some reason the action uses an incorrect path to budget.json. The repo name in the path is duplicated, like so: /home/runner/work/%REPO%/%REPO%/budget.json
Also happens in your demo here:
https://github.com/denar90/lightouse-ci-netlify-preact/runs/891728484?check_suite_focus=true
Asserting
Error: ENOENT: no such file or directory, open '/home/runner/work/lightouse-ci-netlify-preact/lightouse-ci-netlify-preact/budget.json'
at Object.openSync (fs.js:440:3)
at Object.readFileSync (fs.js:342:35)
at readBudgets (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/assert/assert.js:41:24)
at Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/assert/assert.js:54:57)
at run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/cli.js:88:23)
at Object.<anonymous> (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/cli.js:119:1)
at Module._compile (internal/modules/cjs/loader.js:959:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:995:10)
at Module.load (internal/modules/cjs/loader.js:815:32)
I deploy via Travis CI, so how do I express that I want this to run after the Travis deployment?
https://s.natalian.org/2020-01-24/order-dependency.mp4
Thank you in advance,
I'm having this error when I activate temporaryPublicStorage: true.
This is the run: https://github.com/frontity/frontity/pull/244/checks?check_run_id=339869140
And this is the workflow job:
lighthouse:
  runs-on: ubuntu-latest
  needs: deploy
  steps:
    - name: Audit URLs using Lighthouse
      uses: treosh/lighthouse-ci-action@v2
      with:
        temporaryPublicStorage: true
        urls: |
          https://mars-theme-frontity.worona.now.sh/
          https://mars-theme-frontity.worona.now.sh/2016/the-beauties-of-gullfoss/
By looking at the specific line that throws the error:
https://github.com/GoogleChrome/lighthouse-ci/blob/master/packages/utils/src/build-context.js#L54
it looks like Lighthouse CI is trying to use git rev-list -n1 HEAD to get the commit hash.
Just before that, it looks for some environment variables. I have solved the issue by adding the env variable LHCI_BUILD_CONTEXT__CURRENT_HASH populated with ${{ github.sha }}.
I'm not sure why I am having this problem, but in case you can reproduce it, maybe this package should add the LHCI_BUILD_CONTEXT__CURRENT_HASH env variable by default:
env:
  LHCI_BUILD_CONTEXT__CURRENT_HASH: ${{ github.sha }}
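For reference, a sketch of where that env block sits in the workflow step (step name, version, and URL are taken from the example earlier in this issue):
- name: Audit URLs using Lighthouse
  uses: treosh/lighthouse-ci-action@v2
  with:
    temporaryPublicStorage: true
    urls: |
      https://mars-theme-frontity.worona.now.sh/
  env:
    LHCI_BUILD_CONTEXT__CURRENT_HASH: ${{ github.sha }}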
It would be great to have the main LH scores available as output parameters of the GitHub Action.
That way it would be highly flexible so you could e.g.
How can I combine the GitHub Action with the field performance plugin https://github.com/treosh/lighthouse-plugin-field-performance?
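I have not verified this end to end, but the usual Lighthouse plugin mechanism is an npm install plus a plugins entry in the Lighthouse settings; a later issue in this list reports that module resolution from the action's own node_modules can still get in the way, so treat this only as a sketch. The config path below is an assumption.
- run: npm install lighthouse-plugin-field-performance
- uses: treosh/lighthouse-ci-action@v9
  with:
    configPath: ./lighthouserc.json
with a lighthouserc.json along these lines:
{
  "ci": {
    "collect": {
      "settings": { "plugins": ["lighthouse-plugin-field-performance"] }
    }
  }
}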
I've got a YAML lighthouserc file, and it throws the following error, which leads me to believe it currently only works with JSON files:
Run treosh/lighthouse-ci-action@v2
undefined:1
ci:
^
SyntaxError: Unexpected token c in JSON at position 0
at JSON.parse (<anonymous>)
@Snugug shared a cool GitHub Actions pattern he's using to run multiple jobs in parallel, but only after the project has been built. Basically it's a combo of the needs job property combined with uploading/downloading artifacts.
It gives two benefits:
https://github.com/chromeos/static-site-scaffold/blob/master/.github/workflows/nodejs.yml shows it in use (pretend the commented-out job is still there ;)).
Users of this action would probably like to use this kind of pattern; a sketch follows below.
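A minimal sketch of the pattern (job names, paths, and versions are illustrative): the build job uploads the compiled site as an artifact, and the lighthouse job declares needs: build, downloads the artifact, and runs the action against it.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: npm ci && npm run build
      - uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist
  lighthouse:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: dist
          path: dist
      - uses: treosh/lighthouse-ci-action@v3
        with:
          configPath: ./lighthouserc.json   # config pointing staticDistDir at ./dist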
Is there a way to invoke lighthouse with custom headers? The lighthouse docs say you can do this via a CLI flag: https://github.com/GoogleChrome/lighthouse/blob/master/docs/authenticated-pages.md#option-3-pass-custom-request-headers-with-lighthouse-cli, but I don't see a way to pass raw Lighthouse CLI flags. I have sensitive authorization headers stored in GitHub Secrets that I don't want to add to a file on a file system (or worse, source control).
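One possible compromise, if generating an ephemeral config on the runner at job time is acceptable (the secret name is an assumption, and I have not confirmed the action forwards settings.extraHeaders untouched): keep the value only in GitHub Secrets and write it into a throwaway lighthouserc.json right before the audit.
- name: Write Lighthouse config with auth header
  env:
    AUTH_HEADER: ${{ secrets.AUTH_HEADER }}   # hypothetical secret name
  run: |
    cat > lighthouserc.json <<EOF
    { "ci": { "collect": { "settings": { "extraHeaders": { "Authorization": "${AUTH_HEADER}" } } } } }
    EOF
- uses: treosh/lighthouse-ci-action@v9
  with:
    configPath: ./lighthouserc.json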
I'm getting the following error in github actions when I try to run this on push on an Eleventy site.
Full output
Run Lighthouse against a static dist dir
1s
at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:85:7)
Run treosh/lighthouse-ci-action@v2
Action config
Collecting
Started a web server on port 38615...
Error: ENOENT: no such file or directory, scandir '/home/runner/work/derekjdev/_site'
at Object.readdirSync (fs.js:854:3)
at FallbackServer.getAvailableUrls (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/fallback-server.js:62:26)
at startServerAndDetermineUrls (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:150:25)
at processTicksAndRejections (internal/process/task_queues.js:93:5)
at async Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:189:25)
##[error]LHCI 'collect' has encountered a problem.
at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:85:7)
.github/workflows/main.yml
name: Lighthouse
on: push
jobs:
  static-dist-dir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Run Lighthouse against a static dist dir
        uses: treosh/lighthouse-ci-action@v2
        with:
          # no urls needed, since it uses local folder to scan .html files
          configPath: '.github/lighthouse/lighthouserc.json'
.github/lighthouse/lighthouserc.json
{
  "ci": {
    "collect": {
      "staticDistDir": "../_site"
    }
  }
}
What am I missing?
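Two things that may be worth checking (guesses based on the error, not confirmed): the workflow never builds the Eleventy site, so _site does not exist on the runner, and judging by the resolved path in the error, staticDistDir seems to be resolved from the job's working directory (the repo root) rather than from the config file's directory. A sketch with both changes, assuming the site builds into ./_site:
steps:
  - uses: actions/checkout@v1
  - run: |
      npm ci
      npx @11ty/eleventy
  - name: Run Lighthouse against a static dist dir
    uses: treosh/lighthouse-ci-action@v2
    with:
      configPath: '.github/lighthouse/lighthouserc.json'
and in .github/lighthouse/lighthouserc.json:
{
  "ci": {
    "collect": {
      "staticDistDir": "./_site"
    }
  }
}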
I'd like the ability to have this post the links to the reports in the PR
I'm running the action on a static dir and get the following error:
Started a web server on port 44475...
Running Lighthouse 1 time(s) on http://localhost:44475/index.html
Run #1...failed!
Error: Lighthouse failed with exit code 1
at ChildProcess.<anonymous> (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/lighthouse-runner.js:103:21)
at ChildProcess.emit (events.js:210:5)
at Process.ChildProcess._handle.onexit (internal/child_process.js:272:12)
Fri, 18 Jun 2021 21:02:37 GMT ChromeLauncher Waiting for browser.
Fri, 18 Jun 2021 21:02:37 GMT ChromeLauncher Waiting for browser...
Fri, 18 Jun 2021 21:02:38 GMT ChromeLauncher Waiting for browser.....
Fri, 18 Jun 2021 21:02:38 GMT ChromeLauncher Waiting for browser.......
Fri, 18 Jun 2021 21:02:39 GMT ChromeLauncher Waiting for browser.........
Fri, 18 Jun 2021 21:02:39 GMT ChromeLauncher Waiting for browser...........
Fri, 18 Jun 2021 21:02:40 GMT ChromeLauncher Waiting for browser.............
Fri, 18 Jun 2021 21:02:40 GMT ChromeLauncher Waiting for browser...............
Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Waiting for browser.................
Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Waiting for browser.................
Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Killing Chrome instance 2616
Runtime error encountered: perf.getEntriesByName is not a function
TypeError: perf.getEntriesByName is not a function
at Object.exports.stop (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/marky/lib/marky.cjs.js:55:24)
at Function.timeEnd (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse-logger/index.js:128:11)
at Function.requireGatherers (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/config/config.js:819:9)
at new Config (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/config/config.js:337:27)
at generateConfig (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/index.js:60:10)
at lighthouse (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/index.js:43:18)
at runLighthouse (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-cli/run.js:193:32)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
Error: LHCI 'collect' has encountered a problem.
Here are the relevant files:
lighthouse.yaml
name: LH
on:
  pull_request:
    types: [opened, reopened, synchronize, ready_for_review]
jobs:
  static-dist-dir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Setup Node
        uses: dcodeIO/setup-node-nvm@master
        with:
          node-version: node
      - run: |
          npm ci --ignore-scripts
          npm run setup
          npm run build
      - name: Run Lighthouse on urls and validate with lighthouserc
        uses: treosh/lighthouse-ci-action@v2
        with:
          configPath: './.github/lighthouserc.json'
          uploadArtifacts: true
      - name: Report
        uses: manrueda/lighthouse-report-action@master
        with:
          reports: '.lighthouseci'
          github-token: ${{ secrets.GITHUB_TOKEN }}
{
  "ci": {
    "preset": "lighthouse:recommended",
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.5 }],
        "categories:accessibility": ["error", { "minScore": 0.7 }],
        "categories:best-practices": ["error", { "minScore": 0.85 }],
        "categories:seo": ["error", { "minScore": 0.8 }],
        "categories:pwa": ["error", { "minScore": 0.25 }]
      }
    },
    "collect": {
      "staticDistDir": "./packages/app/dist",
      "settings": {
        "chromeFlags": [
          "--enable-webgl2-compute-context",
          "--use-fake-device-for-media-stream",
          "--use-fake-ui-for-media-stream"
        ]
      }
    }
  }
}
Major change: Lighthouse 6.0 is a new major version. It brings new metrics, scores, budgets, and an LHCI upgrade.
Also, it would be great to ship all planned enhancements & docs updates with this release.
I'm trying to get my GitHub build to fail if my accessibility score falls below a certain threshold, but it never does.
I suspect it's because it's not reading my lighthouserc.json file correctly (or, more likely, my file is incorrect).
Please tell me what I'm doing wrong! (Also, please tell me if there's somewhere better to ask questions like this.)
My lighthouse.yml
name: Lighthouse
on: [push]
jobs:
  lighthouse:
    # Setup cut for brevity
    - name: Audit URLs using Lighthouse
      uses: treosh/lighthouse-ci-action@v7
      with:
        urls: |
          http://[my-url]/
          http://[my-url]/welcome
        configPath: ./.github/lighthouse/lighthouserc.json
        uploadArtifacts: true
        temporaryPublicStorage: true
And the config file (lighthouserc.json)
{
  "ci": {
    "assert": {
      "assertions": {
        "color-contrast": "off",
        // NOTE: I've tried both 0.99 and 99, and neither cause my build to fail
        "categories:accessibility": ["error", {"minScore": 99}],
        "categories:best-practices": "off",
        "categories:performance": "off",
        "categories:seo": "off"
      }
    }
  }
}
Even though everything's mostly off, all the assertions still get run, and even though my required A11y score is 99 (and the reported score is 92), my build still passes.
Any insight here would be greatly appreciated!
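Two details that may explain it (hedged guesses, not verified against this repo): JSON does not allow // comments, so the NOTE line makes the file invalid JSON and the config may be getting ignored, and LHCI category scores are asserted on a 0-1 scale, so a 99% threshold is written as 0.99. A corrected sketch of the same config:
{
  "ci": {
    "assert": {
      "assertions": {
        "color-contrast": "off",
        "categories:accessibility": ["error", { "minScore": 0.99 }],
        "categories:best-practices": "off",
        "categories:performance": "off",
        "categories:seo": "off"
      }
    }
  }
}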
Hi, I have a case where the Lighthouse score numbers in the report are not equal to the ones we see in the PR.
Type: Bug
The result variable below, inside the console.log, is retrieved via ${{ steps.lighthouse_audit.outputs.manifest }}[0].summary. lighthouse_audit is the identifier for the step above that runs the audit!
Part of lighthouse.yaml..
...
- name: Audit URLs using Lighthouse
id: lighthouse_audit
uses: treosh/lighthouse-ci-action@v7
with:
configPath: './lighthouserc.json'
uploadArtifacts: true
temporaryPublicStorage: true
- name: Format lighthouse score
id: format_lighthouse_score
uses: actions/github-script@v3
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const result = ${{ steps.lighthouse_audit.outputs.manifest }}[0].summary
console.log({ result })
const links = ${{ steps.lighthouse_audit.outputs.links }}
const formatResult = (res) => Math.round((res * 100))
Object.keys(result).forEach(key => result[key] = formatResult(result[key]))
const score = res => res >= 90 ? '🟢 ' : res >= 50 ? '🟠 ' : '🔴 '
const comment = [
`⚡️ [Lighthouse report](${Object.values(links)[0]}) for the changes in this PR:`,
'| Category | Score |',
'| --- | --- |',
`| ${score(result.performance)} Performance | ${result.performance} |`,
`| ${score(result.accessibility)} Accessibility | ${result.accessibility} |`,
`| ${score(result['best-practices'])} Best practices | ${result['best-practices']} |`,
`| ${score(result.seo)} SEO | ${result.seo} |`,
`| ${score(result.pwa)} PWA | ${result.pwa} |`,
].join('\n');
core.setOutput("comment", comment);
...
Expected
The results returned to the script are exactly the same as the ones in the generated report.
I am using the latest version of the Lighthouse CI GitHub Action. My goal is to assert the following:
I have configured budget.json (desktop) and mob-budget.json (mobile) for validation of metrics, and I also have separate config files for desktop and mobile.
Below is the desktop config file:
Below is the mobile config file:
Below is the workflow using the above-mentioned files:
Expectation:
The Lighthouse scan should be done for all pages mentioned in urls, and the metrics and categories should be asserted against the specified values.
Observed:
Only the metrics are getting asserted, not the categories.
Link to one of the executions: https://github.com/mkathu/ul-witelabel-visual-test/runs/4042090535?check_suite_focus=true
Sharing the report of one of the pages which does not match the configured performance value of 0.9 for desktop but does not get asserted.
Any pointers would be greatly appreciated in making this work.
Say I want to send the list of urls in a webhook, or calculate it based on some state in my repo, or define it using a javascript function.
It would be great to be able to dynamically build the list of urls.
Cheers
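A sketch of one way this can already be approximated, since urls is just a multi-line string input: compute the list in an earlier step (the script name below is hypothetical) and pass it through as a step output.
- id: urls
  run: |
    {
      echo 'urls<<EOF'
      node ./scripts/list-urls.js   # hypothetical script printing one URL per line
      echo 'EOF'
    } >> "$GITHUB_OUTPUT"
- uses: treosh/lighthouse-ci-action@v9
  with:
    urls: ${{ steps.urls.outputs.urls }}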
I can see the results were uploaded somewhere: https://github.com/kaihendry/ltabus/runs/251577950
Now how do I view them?
Thanks!
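If temporaryPublicStorage is enabled, the report URLs also come back through the action's links output (the same output used in the PR-comment example elsewhere in this list). A small sketch, with the step id and placeholder URL being my assumptions:
- name: Audit URLs using Lighthouse
  id: lighthouse
  uses: treosh/lighthouse-ci-action@v3
  with:
    urls: https://example.com/
    temporaryPublicStorage: true
- name: Print report links
  run: echo '${{ steps.lighthouse.outputs.links }}'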
Is there any sample using this action that returns a JSON response and not an HTML file?
I've been continuously getting the same error when trying to run this action with the configPath option:
Error: Unexpected token � in JSON at position 0
.github/workflows/on-push.yaml:
- name: run-lighthouse-ci
  uses: treosh/lighthouse-ci-action@v7
  with:
    configPath: './lighthouserc.json'
I've found that if I remove the configPath option I get a different error related to LHCI, which is probably due to me not passing url.
./lighthouserc.json
{
"ci": {
"collect": { "staticDistDir": "./dist/public/static" }
}
}
Public repo that is affected (at the commit that has the above example):
https://github.com/bradtaniguchi/bradtaniguchi.github.io/tree/dd4d980a2f9036b6e2e20333a4559cb4a64864f3
I'm using v3, running a live url 3 times. The URLs are all hitting Firebase's hosting CDN, so they're very stable performance-wise. In my latest run, my Performance score across 3 runs was 24, 73, 73. The exact same site, with identical code, run from the same infra, but at a different URL is reporting values of 89, 74, and 72. When running Lighthouse from Chrome, the same URLs consistently have a performance score of between 85-87. I'm curious as to why there is so much variance in the actions version of Lighthouse as compared to the browser version, and if there's any way to stabilize the results.
Thanks!
It seems that the default report is only for mobile. How can I generate both desktop and mobile reports, just like PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights/)?
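There is no built-in dual report as far as I know, but newer Lighthouse versions ship a desktop preset, so one approach (a sketch, assuming your Lighthouse version already has that preset) is to run the action twice: once with the default mobile settings, and once with a config along these lines:
{
  "ci": {
    "collect": {
      "settings": { "preset": "desktop" }
    }
  }
}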
I created a sample project with a Lighthouse Plugin for my blogpost: https://engineering.q42.nl/making-a-lighthouse-plugin-work-with-lighthouse-ci/.
I couldn't get it to work on GitHub Actions without adding the lines:
- run: npm install lighthouse
- run: mv node_modules/lighthouse-plugin-social-sharing
Otherwise it gave the error messages:
Runtime error encountered: Cannot find module 'lighthouse'
Runtime error encountered: Unable to locate plugin: lighthouse-plugin-field-social-sharing
My sample repo is: https://github.com/Q42/lighthouse-plugin-sample-project
@alekseykulikov are you able to release a new version?
Is it correct that the upload does not proceed if the budget fails? In your examples it looks like it still uploads.
My workflow
name: CI
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Lighthouse CI Action
        uses: treosh/lighthouse-ci-action@v1
        with:
          urls: 'http://milesalex.github.io/gatsby'
          budgetPath: .github/workflows/budget.json
      - name: Upload results
        uses: actions/upload-artifact@master
        with:
          name: lighthouse-results
          path: './results'
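Two things that might help here (hedged, since I have not re-run this workflow): the upload step only runs when the previous step succeeds unless it is given if: always(), and the collected reports land in .lighthouseci rather than ./results. A sketch of the adjusted upload step:
- name: Upload results
  if: always()
  uses: actions/upload-artifact@master
  with:
    name: lighthouse-results
    path: '.lighthouseci'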
I'd like to see my Lighthouse scores for each of my deployments. I use Next.js with Now for my site, so I have deployments for each push; it should be possible to test against the latest deployment in the PR and show the results inline. Is that possible?
Hi all. I have an error when trying to check a report. After I'm redirected to the report page from GitHub, I see the following error in the browser console: "Uncaught (in promise) DOMException: Failed to execute 'setItem' on 'Storage': Setting the value of 'lastCompareReport' exceeded the quota." with an error trace. Here is a screenshot of the console: http://i.imgur.com/W9GzJMo.png
I'm using Chrome to open the report pages, but the result is the same when opening them in Firefox or Chrome incognito.
Any help is appreciated, thanks.
Hi, I am working through the same steps to integrate Lighthouse CI (lhci) with Jenkins.
Building upon issue #31 from @denar90, I've attempted to use the approach of waiting for a preview site to be available from Netlify and then running lighthouse-ci-action to determine SEO and accessibility scores on PRs.
However, I'm encountering issues (as at https://github.com/chrisreddington/hugo-community/runs/1526907604?check_suite_focus=true#step:4:92). From some quick searching, it looks like there are some wider issues around this, potentially related to Chrome versions, if I understand rightly?
But I was curious whether anyone else is going down this path with Netlify and also encountering issues? I wondered, @denar90, if you're still using it / have any tips? Thanks!
Running this a few times with only 1 run on my personal site's GitHub repo, I got inconsistent results. This seemed to be due to the underlying VMs:
100 score & 1026 benchmark index
98 score & 347 benchmark index
88 score & 106 benchmark index
Ideas:
If the benchmark index is less than 500, the run should be killed. Those results won't be accurate and can't be asserted against.
We should run a pre-flight check with the benchmarker Lighthouse uses (it's some pretty simple JS to run, seen here: https://benchmark.exterkamp.codes) and check whether the VM we got for our action was DOA. If it was, then exit with code 1 and print out something about re-running the action because of a bad VM.
No matter what I try, I cannot get the configPath to work. It always results in this error:
##[error]ENOENT: no such file or directory, open '/home/runner/work/simpixelated.com/simpixelated.com/lighthouserc.json'
You can see the build here:
https://github.com/simpixelated/simpixelated.com/pull/19/checks?check_run_id=989101553
Seems similar to #47, and I had hoped the new release would fix it, but it's not working for me.
Here's my workflow:
name: Lighthouse CI for Netlify sites
on: pull_request
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Use Node.js 12.x
        uses: actions/setup-node@v1
        with:
          node-version: 12.x
      - name: Wait for the Netlify Preview
        uses: jakepartusch/wait-for-netlify-action@v1
        id: netlify
        with:
          site_name: "simpixelated"
          max_timeout: 120
      - uses: actions/checkout@v2
      - name: Audit URLs using Lighthouse
        uses: treosh/[email protected]
        with:
          urls: |
            ${{ steps.netlify.outputs.url }}
            ${{ steps.netlify.outputs.url }}/two-year-work-retrospective
          configPath: "./lighthouserc.json"
          temporaryPublicStorage: true
Is it possible to add flags to the lighthouse action? For the normal LHCI we can do the following:
lighthouse-ci [URL] --chrome-flags=--ignore-certificate-errors
Is this possible in the YAML?
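One way to do it with this action is through configPath: the Lighthouse settings block in lighthouserc accepts chromeFlags (the same option used in another config earlier in this list), for example:
{
  "ci": {
    "collect": {
      "settings": {
        "chromeFlags": ["--ignore-certificate-errors"]
      }
    }
  }
}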
My workflow links a config file, which in turn links a puppeteerScript. This works locally when running lhci with puppeteer installed globally, but I attempted to get it running with this Action in vain. I tried a regular npm dependency as well as the Action ianwalter/[email protected].
// workflow.yml
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - uses: treosh/lighthouse-ci-action@v2
        with:
          budgetPath: ./lighthouse-budget.json
          configPath: ./lighthouse.json
// ...
// lighthouse.json
{
  "ci": {
    "assert": {
      "preset": "lighthouse:recommended"
    },
    "collect": {
      "numberOfRuns": 3,
      "puppeteerScript": "./lighthouse-login.js"
    },
    "upload": {
      "target": "temporary-public-storage"
    }
  }
}
Run treosh/lighthouse-ci-action@v2
Action config
Collecting
Error: Cannot find module 'puppeteer'
Require stack:
- /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js
- /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js
- /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js
at Function.Module._resolveFilename (internal/modules/cjs/loader.js:797:15)
at Function.Module._load (internal/modules/cjs/loader.js:690:27)
at Module.require (internal/modules/cjs/loader.js:852:19)
at require (internal/modules/cjs/helpers.js:74:18)
at PuppeteerManager._getBrowser (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js:27:23)
at PuppeteerManager.invokePuppeteerScriptForUrl (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js:58:32)
at Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:176:23)
##[error]LHCI 'collect' has encountered a problem.
done in 1.17184977s
at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:84:7)
Ideas for new recipes:
An LHCI assertion supports an error or warning status.
Currently, every annotation has an error status, where it lists errors and warnings separated visually.
If a URL only has warnings, the status should be warning.
console.log("::warning some text%0Aafter new line")
We could also use ::debug annotations to report LH results or other useful information.
Could an option be added so that the action fails if an overall score drops (maybe by a certain margin, or just for a certain metric)? Seeing as the results can be stored as an artifact, this should be possible.
The action has at least two parameters that replicate ones present in the LHCI config:
Which of them has priority if both are specified? It looks like action params override options from the config file, but please confirm explicitly. What if an action param is not specified? Will it use the value from the config file or its own default (e.g. the action defaults to runs=1)?
Also, huge thanks for an awesome tool! Great job!
I looked into this, and the CLI arg is set regardless, and the CLI will prioritize its args above anything in the config (which makes sense IMO).
So we shouldn't set the CLI numberOfRuns if it was just falling back to the action's default of 1.
There's an alternative though: instead, the action could just default to the default of the CLI (which is 3 runs). If we did that, then you could just delete some code and this problem would work itself out. ;)
Thanks for this project!
Would it be possible to see the results of each Lighthouse audit as a comment on the PR (similar to the behaviour of the deprecated lighthousebot project)?
Lighthouse CI latest version uses Lighthouse 8 (https://github.com/GoogleChrome/lighthouse-ci/releases). It would be nice if the GitHub Action could use this version as well. Thanks in advance!
Hi,
Maybe I'm missing something, but is there any way to pass a username & password to this action for uploading?
See here: https://github.com/GoogleChrome/lighthouse-ci/blob/master/docs/server.md#basic-authentication
Thanks!
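LHCI's upload options include a basicAuth block for servers protected this way; a sketch of what the config could look like (the server URL is a placeholder, and feeding the values from secrets via LHCI_* environment variable overrides is an assumption on my part worth verifying):
{
  "ci": {
    "upload": {
      "target": "lhci",
      "serverBaseUrl": "https://lhci.example.com",
      "basicAuth": {
        "username": "someuser",
        "password": "somepassword"
      }
    }
  }
}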
I'm working on getting Lighthouse CI to work on our repo. However, the 404.html file is picked up and tested, and we don't want that to happen.
I noticed autodiscoverUrlBlocklist and tried that in the lighthouserc file, but I'm not sure how to use it.
I tried only the file name, and the full URL. But the port always changes, so I can't really specify the exact URL with the port.
Is there any other way I can exclude the error file?
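For reference, a sketch of how I would expect autodiscoverUrlBlocklist to be used (the staticDistDir value is a placeholder, and my understanding that entries are matched by path, so the changing port should not matter, is not something I have verified):
{
  "ci": {
    "collect": {
      "staticDistDir": "./public",
      "autodiscoverUrlBlocklist": ["/404.html"]
    }
  }
}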
Digging into the output, it would appear: No GitHub token set, skipping GitHub status check.
https://github.com/kaihendry/ltabus/runs/748414075
Which doesn't make sense to me; shouldn't it be set automatically?
https://github.com/kaihendry/ltabus/blob/master/.github/workflows/main.yml
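As far as I can tell it is not set automatically; LHCI only picks up a token from its own environment variables. A sketch of passing the workflow's token through (the exact variable name LHCI reads is my assumption and worth double-checking):
- uses: treosh/lighthouse-ci-action@v3
  with:
    urls: https://example.com/   # placeholder URL
  env:
    LHCI_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}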
Any interest from the community, so we can start initial research?
It would be nice to have a check for non-production (preview) URLs. Netlify has this feature: it builds the user's site for the branch/PR a user or collaborator pushed. The user then sees perf results for the new changes in the PR using the same action.
PoC PR - #30
The main problem is race conditions: Netlify can take longer to deploy than the action takes to start its check, so we need to wait for Netlify to finish and then run the action.
I've researched a bit, and it looks like it's tricky to wait for a Netlify build to finish.
Q: on a PR -> run a node script -> the script calls the GitHub Checks API -> creates a NEW completed check with a preview URL.
A: Actions currently doesn't support the behavior you're describing. On the other hand, a GitHub App could do the kind of thing that you're talking about by updating the details_url value on the check run that it creates.
https://github.community/t5/GitHub-Actions/Use-action-to-deploy-and-return-a-preview-URL/td-p/29214
I think it can be a bit complicated for a PoC. Let's just try pinging the preview URL for N minutes (we can even add an option for that in case of really long builds) until it returns 200.
Thoughts?
I have the following issue trying to run the GitHub Action on a "monorepo". So for example I have:
root/
|- Client
|  |- Lighthouse should run here
|
|- Server
With that folder structure, when I run Lighthouse it throws the following error:
error Couldn't find a package.json file in "/home/runner/work/***/***"
Error: LHCI 'collect' has encountered a problem.
And that makes sense, since the package.json is in "/home/runner/work/***/***/client". Right now I'm using configPath: './client/lighthouserc.js', but I wasn't able to find something like a runPath or workingDirectory option.
Thanks for your time, guys! You did an excellent job with this!
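There does not seem to be a working-directory style input on the action, so one workaround sketch (bypassing the action and calling the LHCI CLI directly, with paths and build commands assumed from the layout above) is a plain run step, which does honor working-directory:
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - name: Run LHCI from the client package
        working-directory: client
        run: |
          npm ci
          npm run build
          npx @lhci/cli autorun --config=./lighthouserc.js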