mdreizin / gatsby-plugin-robots-txt
Gatsby plugin that automatically creates robots.txt for your site
Home Page: https://mdreizin.github.io/gatsby-plugin-robots-txt
License: MIT License
The devDependency eslint-plugin-markdown was updated from 1.0.0-beta.8 to 1.0.0-rc.0. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
eslint-plugin-markdown is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
The new version differs by 12 commits.
f9258b7 1.0.0-rc.0
8bc7c0c Build: changelog update for 1.0.0-rc.0
8fe9a0e New: Enable autofix with --fix (fixes #58) (#97)
a5d0cce Fix: Ignore anything after space in code fence's language (fixes #98) (#99)
6fd340d Upgrade: [email protected] (#100)
dff8e9c Fix: Emit correct endLine numbers (#88)
83f00d0 Docs: Suggest disabling strict in .md files (fixes #94) (#95)
3b4ff95 Build: Test against Node v10 (#96)
6777977 Breaking: required node version 6+ (#89)
5582fce Docs: Updating CLA link (#93)
24070e6 Build: Upgrade to [email protected] (#92)
6cfd1f0 Docs: Add unicode-bom to list of unsatisfiable rules (#91)
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
Hello Marat, thank you for your work on the plugin! I read the source code and found that the plugin passes non-truthy host values straight to generate-robotstxt, which then doesn't add a Host directive. I think that behaviour should be documented as a feature in the README. Many people probably want to get rid of the Host directive, because Bing's sitemap checker otherwise complains about a syntax error, and AFAIK the only search engine supporting the directive is Yandex. The Host directive also lacks the ability to specify a protocol, and people should generally rely on 301 redirects instead.
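If the pass-through behaviour described above holds, this is what suppressing the Host directive would look like in gatsby-config.js. A sketch, not documented plugin API: the `host: null` trick and the example URLs are assumptions based on the behaviour reported in this issue.

```javascript
// gatsby-config.js — sketch relying on the behaviour described above:
// a falsy host is passed through to generate-robotstxt, which then
// emits no Host directive at all.
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: null, // assumption: falsy values suppress the Host line
        sitemap: 'https://www.example.com/sitemap.xml',
        policy: [{ userAgent: '*', allow: '/' }],
      },
    },
  ],
};
```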
Hello,
Because of the change in package.json requiring a peerDependency of gatsby 5.0.0, this plugin breaks on a Gatsby 2 website.
As you can see here, v1.7.1 is compatible with Gatsby 2, but 1.8.x is not:
https://github.com/mdreizin/gatsby-plugin-robots-txt/blob/v1.7.1/package.json
The master branch failed. 🚨 I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.
You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I'm sure you can resolve this 💪.
Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find an explanation and guidance to help you resolve it.
Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master branch. You can also manually restart the failed CI job that runs semantic-release.
If you are not sure how to resolve this, here are some links that can help you:
If those don't help, or if this issue is reporting something you think isn't right, you can always ask the humans behind semantic-release.
The npm token configured in the NPM_TOKEN environment variable must be a valid token allowing publishing to the registry https://registry.npmjs.org/.
If you are using Two-Factor Authentication, make sure the auth-only level is configured; semantic-release cannot publish with the default auth-and-writes level.
Please make sure to set the NPM_TOKEN environment variable in your CI with the exact value of the npm token.
Good luck with your project ✨
Your semantic-release bot 📦🚀
UNHANDLED REJECTION ENOENT: no such file or directory, open '/node_modules/gatsby-plugin-robots-txt/src/gatsby-node.js'
Gatsby didn't load the root directory of the plugin gatsby-plugin-robots-txt/gatsby-node.js.
I've tried two gatsby-plugin-robots-txt versions: 1.5.1 and 1.5.0.
Gatsby CLI version: 2.12.34
Gatsby version: 2.22.9
To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:
.travis.yml
If you're interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.
Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.
engines was only updated if it defined a single version, not a range.
.nvmrc was updated to Node.js 10.
.travis.yml was only changed if there was a root-level node_js that didn't already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn't touch job or matrix configurations because these tend to be quite specific and complex, and it's difficult to infer what the intentions were.
For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you're doing it may require additional work or may not be applicable at all. We're also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I'm a humble robot and won't feel rejected 🤖
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
Is there a way to configure Firebase deploys, e.g. to a staging site?
Package: yargs-parser
Patched-in: >=13.1.2 <14.0.0 || >=15.0.1 <16.0.0 || >=18.1.2
Dependency of: gatsby-plugin-robots-txt
Path: gatsby-plugin-robots-txt > generate-robotstxt > meow > yargs-parser
More-info: (https://npmjs.com/advisories/1500)
The devDependency eslint-plugin-babel was updated from 5.2.1 to 5.3.0. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
eslint-plugin-babel is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
The devDependency semantic-release was updated from 15.10.3 to 15.10.4. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
semantic-release is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
Since Gatsby 5.0.0 (and gatsby-plugin-sitemap 6.0.0), the default location of the sitemap has been changed from ${siteMetadata.siteUrl}/sitemap/sitemap.xml to ${siteMetadata.siteUrl}/sitemap.xml.
Should the default sitemap property of this plugin be changed as well? If so, I'm willing to make a pull request.
Relevant information:
👋 Thanks for creating this! With the default usage:
plugins: ['gatsby-plugin-robots-txt']
running gatsby develop or gatsby build doesn't seem to actually generate the robots.txt file anywhere. I started going into the plugin options and setting paths, etc., but I'm not seeing any output. Is the default output ./static/robots.txt?
I am just wondering if we can add "noindex" to the list of policy rules. For now it has only "disallow".
v4.0.0 of gatsby-plugin-sitemap changes where sitemaps are automatically generated: they now live inside /sitemap. This isn't the standard location for sitemaps, so I'm not sure if changing the default of this plugin is the right move, but maybe a line in the README could let users of both plugins know there's been a change and that they should update the sitemap URL?
There's an issue open following this change for more information: #31095
The devDependency eslint was updated from 5.12.0 to 5.12.1. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
eslint is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
eb5c401 Chore: use meta.messages in some rules (2/4) (refs #9870) (#10773) (薛定谔的猫)
aa56247 Fix: avoid loading core rules dynamically from FS in Linter (#11278) (Peter Metz)
04450bb Docs: clarify process for adding committers (#11272) (Kai Cataldo)
3ffcf26 Docs: add @g-plane as committer (#11277) (Kai Cataldo)
c403445 Fix: warn constant on RHS of || in no-constant-condition (fixes #11181) (#11253) (Merlin Mason)
9194f45 Fix: Manage severity of 1 with TAP reporter (fixes #11110) (#11221) (Gabriel Cousin)
000f495 Docs: fix example for sort-imports ignoreDeclarationSort (#11242) (Remco Haszing)
7c0bf2c Docs: Add npx usage to Getting Started guide (#11249) (eyal0803)
da9174e Docs: fixes typo peerDepencies (#11252) (Christian Kühl)
9c31625 Docs: Improve custom formatter docs (#11258) (Nicholas C. Zakas)
The new version differs by 12 commits.
faf3c4e 5.12.1
1010c98 Build: changelog update for 5.12.1
eb5c401 Chore: use meta.messages in some rules (2/4) (refs #9870) (#10773)
aa56247 Fix: avoid loading core rules dynamically from FS in Linter (#11278)
04450bb Docs: clarify process for adding committers (#11272)
3ffcf26 Docs: add @g-plane as committer (#11277)
c403445 Fix: warn constant on RHS of || in no-constant-condition (fixes #11181) (#11253)
9194f45 Fix: Manage severity of 1 with TAP reporter (fixes #11110) (#11221)
000f495 Docs: fix example for sort-imports ignoreDeclarationSort (#11242)
7c0bf2c Docs: Add npx usage to Getting Started guide (#11249)
da9174e Docs: fixes typo peerDepencies (#11252)
9c31625 Docs: Improve custom formatter docs (#11258)
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
I want to disallow specific pages that I have defined in environment variables. How can I do that?
Why not make sitemaps an array instead? Multiple sitemap entries are allowed according to the specification:
https://developers.google.com/search/reference/robots_txt
"[absoluteURL] points to a Sitemap, Sitemap Index file or equivalent URL. The URL does not have to be on the same host as the robots.txt file. Multiple sitemap entries may exist. As non-group-member records, these are not tied to any specific user-agents and may be followed by all crawlers, provided it is not disallowed."
Hi. After the plugin update, I'm getting an error when building my Gatsby page. In the configuration I use sitemap: null.
Error:
"gatsby-plugin-robots-txt" threw an error while running the onPostBuild lifecycle:
Invalid URL: null
73 | ) {
74 |
> 75 | mergedOptions.sitemap = new URL(path.posix.join(pathPrefix, 'sitemap', 'sitemap-index.xml'), mergedOptions.host).toString();
| ^
76 | } else {
77 | try {
78 | new URL(mergedOptions.sitemap)
File: node_modules/gatsby-plugin-robots-txt/src/gatsby-node.js:75:29
Error: TypeError [ERR_INVALID_URL]: Invalid URL: null
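For illustration, this is the kind of guard that would avoid the crash shown in the frame above. A hedged sketch only (normalizeSitemap is a hypothetical helper, not the maintainer's actual fix): skip URL normalisation whenever sitemap is not a usable string.

```javascript
// Sketch of a defensive guard around the failing code path: only attempt
// to normalise `sitemap` when it is a non-empty string.
function normalizeSitemap(sitemap, host) {
  if (typeof sitemap !== 'string' || sitemap.length === 0) {
    return undefined; // explicitly disabled: emit no Sitemap line
  }
  try {
    return new URL(sitemap).toString(); // already an absolute URL
  } catch {
    return new URL(sitemap, host).toString(); // resolve relative to host
  }
}
```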
The devDependency travis-deploy-once was updated from 5.0.9 to 5.0.10. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
travis-deploy-once is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
The new version differs by 3 commits.
b360e09 fix(package): update p-retry to version 3.0.0
fe68469 chore(package): update nyc and sinon
0f8d0d3 chore(package): update commitizen to version 3.0.0
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
When submitting my site to search engines, they report "Syntax not understood" for the line in robots.txt generated by this plugin...
Host: http://www.example.com
The master branch failed. 🚨 I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.
You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I'm sure you can resolve this 💪.
Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find an explanation and guidance to help you resolve it.
Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master branch. You can also manually restart the failed CI job that runs semantic-release.
If you are not sure how to resolve this, here are some links that can help you:
If those don't help, or if this issue is reporting something you think isn't right, you can always ask the humans behind semantic-release.
semantic-release cannot push the version tag to the branch master on the remote Git repository.
Please refer to the authentication configuration documentation to configure the Git credentials on your CI environment.
Good luck with your project ✨
Your semantic-release bot 📦🚀
This works fine if put in gatsby-config.js, the result being a robots.txt with Disallow: / in it.
resolveEnv: () => process.env.NODE_ENV,
env: {
  development: {
    policy: [{ userAgent: '*', disallow: ['/'] }],
  },
  production: {
    policy: [{ userAgent: '*', disallow: ['/'] }],
  },
},
But if I create a config file and reference it:
gatsby-config.js
configFile: 'robots-txt.config.js',
robots-txt.config.js
resolveEnv: () => process.env.NODE_ENV,
env: {
  development: {
    policy: [{ userAgent: '*', disallow: ['/'] }],
  },
  production: {
    policy: [{ userAgent: '*', disallow: ['/'] }],
  },
},
then the policies in the env object are ignored, the result being a robots.txt with Allow: / in it.
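Until the configFile handling is sorted out, one workaround sketch (an assumption, not a documented fix) is to require the shared file yourself, which guarantees the env block reaches the plugin's options:

```javascript
// gatsby-config.js — workaround sketch: bypass the configFile option by
// loading the shared configuration module directly.
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: require('./robots-txt.config.js'),
    },
  ],
};
```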
I'm using Strapi as a CMS for a website that I'm working on. I want to be able to set the policies array dynamically, using allStrapiPage data (which I can get via a GraphQL query).
Is there any way to set the policies array using GraphQL queries?
npm install seems to run fine, but the plugin never appears as a dependency in the package.json file.
$ npm install --save gatsby-plugin-robots-txt
npm WARN [email protected] requires a peer of [email protected] - 3 but none is installed. You must install peer dependencies yourself.
npm WARN [email protected] requires a peer of popper.js@^1.14.7 but none is installed. You must install peer dependencies yourself.
npm WARN [email protected] requires a peer of typescript@>=2.8.0 || >= 3.2.0-dev || >= 3.3.0-dev || >= 3.4.0-dev || >= 3.5.0-dev || >= 3.6.0-dev || >= 3.6.0-beta || >= 3.7.0-dev || >= 3.7.0-beta but none is installed. You must install peer dependencies yourself.
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\webpack-dev-server\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\watchpack\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\babel-plugin-add-module-exports\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
Hey,
For some reason, when building and deploying, the robots.txt gets removed:
https://www.craftmc.net/robots.txt just returns the homepage, whereas in development mode it returns a robots.txt file (localhost:8000/robots.txt).
Do you think this could be build-related, or is it my Apache2 delivery?
I unfortunately can't use this plugin because I'm still on Gatsby v4. Would it break if this line were changed?
From:
"peerDependencies": {
  "gatsby": "^5.0.0"
}
To:
"peerDependencies": {
  "gatsby": "^5.0.0 || ^4.0.0"
}
Happy to open a PR.
The devDependency prettier was updated from 1.14.3 to 1.15.0. 🚨 View failing branch.
This version is covered by your current version range, and after updating it in your project the build failed.
prettier is a devDependency of this project. It might not break your production code or affect downstream projects, but it probably breaks your build or test tools, which may prevent deploying or publishing.
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
I'm trying to include multiple sitemaps using the below, and I'm struggling to figure out why it's not working. Any help greatly appreciated. 🙏
{
  resolve: `gatsby-plugin-robots-txt`,
  options: {
    sitemap: ['https://domain.com/sitemap1.xml', 'https://domain.com/sitemap2.xml'],
  },
},
"gatsby-plugin-robots-txt": "^1.5.0",
When I build my project using [email protected] I get
warn Plugin gatsby-plugin-robots-txt is not compatible with your gatsby version 4.0.0 - It requires gatsby@^3.0.0 || ^2.0.0
version 1.6.13
I set the env property like this in gatsby-config.js and execute NODE_ENV=development gatsby build.
{
  resolve: "gatsby-plugin-robots-txt",
  options: {
    host: "https://hoge.com",
    sitemap: "https://hoge.com/sitemap.xml",
    env: {
      development: {
        policy: [{ userAgent: "*", disallow: "/" }]
      },
      production: {
        policy: [{ userAgent: "*", allow: "/" }]
      }
    }
  }
}
But the generated robots.txt is below; the development policy is not applied. Please help me.
User-agent: *
Allow: /
Sitemap: https://hoge.com/sitemap.xml
Host: https://hoge.com
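One likely culprit: Gatsby forces NODE_ENV to "production" during gatsby build, so NODE_ENV=development gatsby build never reaches the development branch. A sketch that switches on a custom variable instead (GATSBY_ACTIVE_ENV is just a conventional name here, not a plugin option) and is built with GATSBY_ACTIVE_ENV=development gatsby build:

```javascript
// Sketch: resolve the environment from a custom variable, since Gatsby
// overrides NODE_ENV to "production" during `gatsby build`.
{
  resolve: 'gatsby-plugin-robots-txt',
  options: {
    host: 'https://hoge.com',
    sitemap: 'https://hoge.com/sitemap.xml',
    resolveEnv: () => process.env.GATSBY_ACTIVE_ENV || process.env.NODE_ENV,
    env: {
      development: {
        policy: [{ userAgent: '*', disallow: '/' }],
      },
      production: {
        policy: [{ userAgent: '*', allow: '/' }],
      },
    },
  },
}
```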
Recently upgraded gatsby-plugin-robots-txt from 1.5.5 to 1.6.2 and I am getting this Netlify build error:
7:56:33 AM: error "gatsby-plugin-robots-txt" threw an error while running the onPostBuild lifecycle:
7:56:33 AM: Cannot read property 'startsWith' of null
7:56:33 AM: 78 | new URL(mergedOptions.sitemap)
7:56:33 AM: 79 | } catch {
7:56:33 AM: > 80 | mergedOptions.sitemap = new URL(mergedOptions.sitemap.startsWith(pathPrefix) ? mergedOptions.sitemap : path.posix.join(pathPrefix, mergedOptions.sitemap), mergedOptions.host).toString()
7:56:33 AM: | ^
7:56:33 AM: 81 | }
7:56:33 AM: 82 | }
7:56:33 AM: 83 |
My config is a one-to-one copy of the Netlify example in the docs. Went back to 1.5.6 and everything is fine. I'm assuming it has something to do with the changes in #441 and that the docs just need to be updated with a new explanation.
Upgrading a Gatsby v1.x project to use Gatsby v2.x where the project uses this plugin results in a peer dependency warning in NPM, because of this plugin's peer dependency on Gatsby v1, which excludes major versions:
"peerDependencies": {
"gatsby": "^1.0.0"
},
I don't think there is anything in V2 that should cause a problem with this plugin.
When deploying our Gatsby site to Firebase, we got this error. I believe it has to do with the new feature of switching fs for fs/promises.
Gatsby version in my package.json:
I was able to successfully build the project locally with Node version 14.17.1. Could the different Node versioning be the cause of the breakage during deployment?
Any help here would be appreciated!
Hi there,
I have the following options for this plugin:
{
  resolve: 'gatsby-plugin-robots-txt',
  options: {
    host: siteUrl,
    sitemap: `${siteUrl}/sitemap-index.xml`,
    policy: [
      {
        userAgent: '*',
        sitemap: `${siteUrl}/sitemap-index.xml`
      },
      {
        userAgent: 'Yandex',
        host: siteUrl,
        sitemap: `${siteUrl}/sitemap-index.xml`
      }
    ]
  }
}
But in my output file I have:
User-agent: *
User-agent: Yandex
Sitemap: https://www.website.com/sitemap-index.xml
Host: https://www.website.com
How can I add a Sitemap entry under User-agent: *?
The master branch failed. 🚨 I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.
You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I'm sure you can resolve this 💪.
Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find an explanation and guidance to help you resolve it.
Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master branch. You can also manually restart the failed CI job that runs semantic-release.
If you are not sure how to resolve this, here are some links that can help you:
If those don't help, or if this issue is reporting something you think isn't right, you can always ask the humans behind semantic-release.
The npm token configured in the NPM_TOKEN environment variable must be a valid token allowing publishing to the registry https://registry.npmjs.org/.
If you are using Two-Factor Authentication, make sure the auth-only level is configured; semantic-release cannot publish with the default auth-and-writes level.
Please make sure to set the NPM_TOKEN environment variable in your CI with the exact value of the npm token.
Good luck with your project ✨
Your semantic-release bot 📦🚀
Hi,
I'm trying to run a two-year-old Gatsby project, and I get this error on build:
#14 0.657 $ gatsby build
#14 5.021 Environment set: production
success open and validate gatsby-configs - 0.131s
#14 7.253 ERROR
#14 7.253
#14 7.253 Error in "/var/www/html/node_modules/gatsby-plugin-robots-txt/gatsby-node.js":
#14 7.253 Cannot find module 'fs/promises'
#14 7.253 Require stack:
#14 7.253 - /var/www/html/node_modules/gatsby-plugin-robots-txt/gatsby-node.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bootstrap/resolve-module-exports.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bootstrap/load-plugins/validate.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bootstrap/load-plugins/load.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bootstrap/load-plugins/index.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/services/initialize.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/services/index.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bootstrap/index.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/commands/build.js
#14 7.253 - /var/www/html/node_modules/gatsby-cli/lib/create-cli.js
#14 7.253 - /var/www/html/node_modules/gatsby-cli/lib/index.js
#14 7.253 - /var/www/html/node_modules/gatsby/dist/bin/gatsby.js
#14 7.253 - /var/www/html/node_modules/gatsby/cli.js
#14 7.253
This is my package.json
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"clean": "gatsby clean",
"lint": "eslint --fix src .js",
"start": "node server.js",
"serve": "gatsby serve",
"test": "echo \"Write tests! -> https://gatsby.dev/unit-testing\"",
"storybook": "NODE_ENV=production start-storybook -s ./public -p 49999",
"storybook:build": "NODE_ENV=production build-storybook -c .storybook -o storybook_public/ -s ./static && cp -r public/static storybook_public"
},
"husky": {
"hooks": {
"pre-commit": "lint-staged"
}
},
"lint-staged": {
"*.js": [
"eslint --fix",
"git add"
]
},
"dependencies": {
"@babel/core": "^7.5.5",
"@babel/runtime": "^7.5.5",
"@bugsnag/js": "^6.3.2",
"@bugsnag/plugin-react": "^6.2.0",
"@storybook/addons": "^5.0.11",
"@storybook/components": "^5.0.11",
"autoprefixer": "^9.6.1",
"bugsnag": "^2.4.3",
"classnames": "^2.2.6",
"core-js": "^2",
"dotenv": "^8.0.0",
"dotenv-webpack": "^1.7.0",
"eslint-plugin-graphql": "^3.0.3",
"event-source-polyfill": "^1.0.5",
"express": "^4.17.1",
"gatsby": "^2.13.44",
"gatsby-image": "^2.2.7",
"gatsby-plugin-i18n": "^1.0.1",
"gatsby-plugin-jss": "^2.1.3",
"gatsby-plugin-manifest": "^2.2.13",
"gatsby-plugin-offline": "2.2.9",
"gatsby-plugin-react-helmet": "^3.1.2",
"gatsby-plugin-react-redux": "^1.0.10",
"gatsby-plugin-remove-trailing-slashes": "^2.1.4",
"gatsby-plugin-robots-txt": "^1.5.0",
"gatsby-plugin-sharp": "^2.2.17",
"gatsby-plugin-sitemap": "^2.2.3",
"gatsby-plugin-styled-jsx": "^3.1.2",
"gatsby-source-apiserver": "^2.1.3",
"gatsby-source-contentful": "^2.1.15",
"gatsby-source-filesystem": "^2.1.16",
"gatsby-transformer-sharp": "^2.2.11",
"graphql-tag": "^2.10.1",
"hoist-non-react-statics": "^3.3.0",
"intersection-observer": "^0.7.0",
"keyboard-focus": "^1.0.1",
"lodash": "^4.17.15",
"markdown-it": "^9.1.0",
"path-to-regexp": "^3.0.0",
"picturefill": "^3.0.3",
"postcss-js": "^2.0.2",
"prop-types": "^15.7.2",
"ptz-i18n": "^1.0.0",
"react": "^16.8.6",
"react-dom": "^16.8.6",
"react-helmet": "^5.2.1",
"react-jss": "^10.0.0-alpha.23",
"react-redux": "^7.1.0",
"redux": "^4.0.4",
"redux-act": "^1.7.7",
"redux-devtools-extension": "^2.13.8",
"redux-saga": "^1.0.5",
"regenerator-runtime": "^0.13.2",
"seamless-immutable": "^7.1.4",
"styled-jsx": "^3.2.1",
"webpack-bugsnag-plugins": "^1.4.0"
},
"devDependencies": {
"@babel/plugin-proposal-decorators": "^7.4.4",
"@babel/plugin-proposal-optional-chaining": "^7.2.0",
"@storybook/addon-a11y": "^5.0.10",
"@storybook/addon-actions": "^5.0.3",
"@storybook/addon-console": "^1.1.0",
"@storybook/addon-notes": "^5.0.3",
"@storybook/addon-viewport": "^5.0.10",
"@storybook/react": "5.2.0-alpha.30",
"@storybook/theming": "^5.0.5",
"babel-loader": "^8.0.6",
"babel-preset-gatsby": "^0.2.8",
"eslint-config-airbnb": "^17.1.1",
"eslint-config-prettier": "^6.0.0",
"eslint-config-react-app": "^4.0.1",
"eslint-plugin-prettier": "^3.0.1",
"eslint-plugin-react-hooks": "^1.6.1",
"husky": "^3.0.2",
"lint-staged": "^9.2.1",
"prettier": "^1.18.2",
"storybook-addon-react-docgen": "^1.2.1"
}
I know that dependencies need to be updated, but I was told by CTO to not make any updates for the moment.
I'm running node 12.20 in a docker container.
Thanks for your help!