This project was created with create-react-app.
First, generate the React project, so you have files to be served from the ./build directory:
npm run build
Then start the Express-based server:
npm run server
Open localhost:8080 in your browser:
- Click the link; notice it loads another page/component
- Copy the "/page" link to your clipboard and open it in another browser window
Notice that the above works because of the statically defined handler in server.js:
.
.
.
app.get('/page', function (req, res, next) {
  var options = {
    root: __dirname + '/build/',
  };
  var fileName = 'index.html';
  res.sendFile(fileName, options, function (err) {
    if (err) {
      next(err);
    } else {
      console.log('Sent:', fileName);
    }
  });
});
.
.
.
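The same pattern can be generalized to several client-side routes. A minimal sketch, assuming a hypothetical list of routes (the '/about' path is invented for illustration; only '/page' exists in server.js above):

```javascript
// Client-side routes that should all be answered with build/index.html.
// '/page' comes from server.js above; '/about' is a hypothetical example.
var clientRoutes = ['/page', '/about'];

// Returns true when the requested path is one of the client-side routes.
function isClientRoute(path) {
  return clientRoutes.indexOf(path) !== -1;
}

// In server.js this could back a single Express handler, e.g.:
// app.get(clientRoutes, function (req, res, next) {
//   res.sendFile('index.html', { root: __dirname + '/build/' }, function (err) {
//     if (err) { next(err); }
//   });
// });
```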
Now the challenge is to serve something useful to a simple client, say a crawler: one that does not execute the React JavaScript.
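One way to decide when to serve pre-rendered markup is to inspect the request's User-Agent header. A minimal sketch (the bot-pattern list below is illustrative, not exhaustive):

```javascript
// Very small User-Agent check for some common crawlers.
// The pattern list is illustrative only, not exhaustive.
var BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

// Returns true when the User-Agent string looks like a known crawler.
function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// In an Express handler this could be used as, e.g.:
// if (isCrawler(req.get('User-Agent'))) { /* serve static markup */ }
```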
Just fetching and serving the main index.html file, as in the case above, does not help, because the file generated by npm run build (./build/index.html) contains no content specific to the actual URL path being requested. Here is an example of what this main index.html build file looks like:
<!doctype html>
<html lang="en">
<head><meta charset="utf-8">
<meta name="viewport" content="width=device-width,initial-scale=1,shrink-to-fit=no">
<meta name="theme-color" content="#000000">
<link rel="manifest" href="/manifest.json">
<link rel="shortcut icon" href="/favicon.ico">
<title>React App</title>
<link href="/static/css/main.c17080f1.css" rel="stylesheet">
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root">
</div>
<script type="text/javascript" src="/static/js/main.f95bdb9c.js"></script>
</body>
</html>
In a project built with create-react-app, notice that this static file includes links to other generated files, such as main.f95bdb9c.js and main.c17080f1.css shown above. The hashes in these filenames change every time you run npm run build.
Therefore, if you want to change the content of the served HTML, say to include a chunk of markup for a crawler, you must stay compatible with these generated files, so that you always serve the most up-to-date build output.
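One way to stay compatible is to read the freshly built ./build/index.html at serve time and inject the extra markup into the empty root div, rather than maintaining a separate, hand-edited copy of the file. A minimal sketch (the injectMarkup helper and the sample markup are hypothetical, not part of create-react-app):

```javascript
// Injects a chunk of markup into the empty <div id="root"> of a CRA build's
// index.html, leaving the generated hashed <script>/<link> tags untouched.
// This helper and its usage below are a sketch, not a drop-in solution.
function injectMarkup(indexHtml, markup) {
  return indexHtml.replace('<div id="root">', '<div id="root">' + markup);
}

// In server.js this could be applied per request, e.g.:
// var fs = require('fs');
// app.get('/page', function (req, res) {
//   var html = fs.readFileSync(__dirname + '/build/index.html', 'utf8');
//   res.send(injectMarkup(html, '<h1>Page</h1>'));
// });
```

Because the file is read from ./build on each request, the injected response always references the latest hashed main.*.js and main.*.css files.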