mike-marcacci / fs-capacitor
Filesystem-buffered, passthrough stream that buffers indefinitely rather than propagate backpressure from downstream consumers.
License: MIT License
When fs-capacitor is used frequently in the same process, it can trigger MaxListenersExceededWarning
(best debugged with the --trace-warnings
Node.js flag):
(node:72923) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added. Use emitter.setMaxListeners() to increase limit
at _addListener (events.js:243:17)
at process.addListener (events.js:259:10)
at _fs.default.open (/Users/jaydenseric/Sites/apollo-upload-server/node_modules/fs-capacitor/lib/index.js:140:17)
at FSReqWrap.oncomplete (fs.js:145:20)
(node:72923) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGINT listeners added. Use emitter.setMaxListeners() to increase limit
at _addListener (events.js:243:17)
at process.addListener (events.js:259:10)
at _fs.default.open (/Users/jaydenseric/Sites/apollo-upload-server/node_modules/fs-capacitor/lib/index.js:141:17)
at FSReqWrap.oncomplete (fs.js:145:20)
See https://travis-ci.org/jaydenseric/apollo-upload-server/jobs/405666803#L2191.
Here is where the relevant listeners are attached: https://github.com/mike-marcacci/fs-capacitor/blob/v0.0.3/src/index.mjs#L142
I'm not sure I understand why the classes are namespaced under tty instead of fs like you have in the module. Perhaps it's local to my machine? Either way, ReadStream / WriteStream from fs are interfaces, hence the error:
- import { ReadStream as FSReadStream, WriteStream as FSWriteStream } from "fs";
+ import { ReadStream as FSReadStream, WriteStream as FSWriteStream } from "tty";
Any insight would be great, thanks
This is from version 2.0.0
fs-capacitor registers an exit and a SIGINT listener for each WriteStream created. This means that when constructing a lot of WriteStreams at once, you'll often run into Node's MaxListenersExceededWarning:
(node:63179) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:63179) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGINT listeners added to [process]. Use emitter.setMaxListeners() to increase limit
An alternative would be to register a single handler and use a global list of objects to clean up. This technique is used by prominent modules like tmp.
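The single-handler technique mentioned above could be sketched roughly like this (the `registry`, `track`, and `handleExit` names are hypothetical illustrations, not fs-capacitor's or tmp's actual API):

```javascript
// Shared registry of cleanup functions for all streams in the process.
const registry = new Set();

// Run and clear every registered cleanup. Registered once on the
// process, no matter how many streams are created.
function handleExit() {
  for (const cleanup of registry) cleanup();
  registry.clear();
}
process.once("exit", handleExit);

// Each stream adds its cleanup function to the shared registry instead
// of attaching its own process-level listeners; the returned function
// unregisters it when the stream is destroyed normally.
function track(cleanup) {
  registry.add(cleanup);
  return () => registry.delete(cleanup);
}
```

With this shape, the process only ever has one exit listener, so creating many WriteStreams cannot trip MaxListenersExceededWarning.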
With freshly installed node_modules, running npm run prepare results in TypeScript errors:
https://travis-ci.org/github/mike-marcacci/fs-capacitor/jobs/690761205#L212-L225
Upon further investigation from jaydenseric/graphql-upload#318, there is a chance that fs.read might return zero bytes when processing a lot of files. I was able to replicate this issue with files of 100, 1024*8, 1024*16, 1024*16+1, and 1024*16*2 bytes.
I have pinpointed the issue to these lines of code: https://github.com/mike-marcacci/fs-capacitor/blob/main/src/index.ts#L61-L70. Basically it ends the stream whenever the write stream has already finished, even if the read stream has not yet fully read the file, as seen in the image.
Nearly 80% of active users of this library use the CJS version, presumably because many people are stuck on projects that cannot easily be migrated to ESM, or are consuming fs-capacitor via libraries that are still published using CJS.
This means that the large majority of users are potentially exposed to #75, which is pretty severe: a silent data corruption bug.
While I do not expect that you want to go back to CJS, and I understand that future feature development will likely be ESM only, it would seem reasonable to me to backport a fix for a bug that critical to the 6.x branch, which is by far the most widely used.
@mike-marcacci, is there any CommonJS support?
I am trying to use fs-capacitor in the following way and I am getting this weird error now.
import { ReadStream, ReadStreamOptions, WriteStream } from 'fs-capacitor'
Error [ERR_REQUIRE_ESM]: require() of ES Module .../node_modules/fs-capacitor/dist/index.js from .../server.ts not supported.
Instead change the require of index.js in .../server.ts to a dynamic import() which is available in all CommonJS modules.
...
I am using ts-node/ts-node-dev for development and here is the tsconfig.json file:
{
  "compilerOptions": {
    "module": "commonjs",
    "incremental": true,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "declaration": true,
    "target": "ES2018",
    "noImplicitAny": true,
    "moduleResolution": "node",
    "sourceMap": true,
    "skipLibCheck": true,
    "outDir": "dist",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "strict": true,
    "resolveJsonModule": true,
    "removeComments": true,
    "forceConsistentCasingInFileNames": true,
    "baseUrl": ".",
    "rootDir": "."
  },
  "include": ["src"],
  "exclude": ["node_modules/**", "dist"]
}
I am also using "module": "commonjs" in package.json.
Hello, I had these issues when updating to the latest version of this library.
I believe most Node code bases are written in CommonJS or use a bundler with TypeScript, so solving this issue would greatly reduce some headaches.
I can submit a pull request in case my worries are worth looking into; I just wanted to understand your reasons for dropping CommonJS support before making a pull request.
Thanks
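For what it's worth, from a CommonJS module the usual workaround for ERR_REQUIRE_ESM is a dynamic import(), which Node permits even where require() throws. A minimal sketch (loadEsmModule is a hypothetical helper; one caveat is that TypeScript compiled with "module": "commonjs" rewrites import() back into require(), so there the call may need to be hidden from the compiler, e.g. via a Function constructor wrapper):

```javascript
// Hedged sketch: load an ESM-only package from CommonJS code using a
// dynamic import(), which is allowed in all CommonJS modules.
const loadEsmModule = (specifier) => import(specifier);

// Usage (assumes fs-capacitor is installed):
// const { WriteStream } = await loadEsmModule("fs-capacitor");
```

This keeps the rest of the project on CommonJS while isolating the ESM-only dependency behind one async call site.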
Hi,
I'm using this package with nuxt (apollo-server-express) and experience the following issue:
If I don't set:
build: { analyze: false, extend: (config) => { config.node = { fs: 'empty' } } }
I get:
Module not found: Error: Can't resolve 'fs' in
Setting it resolved the issue until I updated all packages yesterday. Now the browser logs:
app.js:970 [nuxt] Error while initializing app TypeError: Class extends value undefined is not a constructor or null
    at Module.<anonymous> (core_data.8fb92f55.js:35632)
    at Module../node_modules/fs-capacitor/lib/index.mjs (core_data.8fb92f55.js:35816)
    at __webpack_require__ (runtime.js:787)
    at fn (runtime.js:150)
    at Module.<anonymous> (core_data.8fb92f55.js:41785)
    at Module../node_modules/graphql-upload/lib/processRequest.mjs (core_data.8fb92f55.js:42016)
    at __webpack_require__ (runtime.js:787)
    at fn (runtime.js:150)
    at Module../node_modules/graphql-upload/lib/index.mjs (core_data.8fb92f55.js:41754)
    at __webpack_require__ (runtime.js:787)
I know that your code is correct and that the error must be somewhat webpack / browser related. Maybe you have a hint for me. Otherwise I'll update this issue as soon as I find a solution.
.../clinic-node/dist/src/graphql_upload/processRequest.js:4
const fs_capacitor_1 = require("fs-capacitor");
^
Error [ERR_REQUIRE_ESM]: require() of ES Module .../clinic-node/node_modules/.pnpm/[email protected]/node_modules/fs-capacitor/dist/index.js from .../clinic-node/dist/src/graphql_upload/processRequest.js not supported.
Instead change the require of index.js in .../clinic-node/dist/src/graphql_upload/processRequest.js to a dynamic import() which is available in all CommonJS modules.
at Object. (.../clinic-node/dist/src/graphql_upload/processRequest.js:4:24)
at Object. (.../clinic-node/dist/src/graphql_upload/graphqlUploadExpress.js:14:29)
at Object. (.../clinic-node/dist/src/main.js:5:35)
Once the file is stored in the temp directory, its file size could be checked with fs.fstat(), right? It would probably be faster than reading the entire ReadStream to get the total byte count and would be useful for validating upload size.
Maybe the ReadStream interface of this package could be adjusted to also include a method for checking the file size, which would execute fs.fstat() behind the scenes.
ReadStream.prototype.open() has been deprecated in Node.js 13.0.1. I think this is the reason why WriteStream.prototype.createReadStream() causes: "RangeError: Maximum call stack size exceeded".
Code to reproduce:
const {WriteStream} = require('fs-capacitor')
const fs = require('fs')
const stream = fs.createReadStream('package-lock.json')
const capacitor = new WriteStream()
stream.pipe(capacitor)
capacitor.createReadStream()
I haven't worked out why yet, but for some reason unnecessary files such as dev tool config files and yarn.lock have been mistakenly published to npm:
When unread === 0, we add listeners for the write and finish events.
But it seems the write event is never emitted.
So the problem I have is that when my WriteStream has no more data to read, because the upload is too slow, it waits for the finish event (full upload) before it restarts writing.
Do you think it should be emitted manually? I cannot find a write event in the Node.js docs.
I am currently debugging weird behavior with graphql-upload where requests just get stuck indefinitely (not sure if I also need to open an issue there). I narrowed the issue down to the os.tmpdir() returned by Node not being present on the machine / in the Docker image. In this case fs-capacitor does not seem to forward any information about this to the stream created by createReadStream and just silently emits no events at all. This also applies to any case where fs.open fails, e.g. permission issues. It seems that with this behavior the error can never be handled (e.g. by graphql-upload). Is this intended behavior?
Reproduction:
1. Replace the SOME EXISTING FILE string in the example below with the path to an existing file.
2. node sample.js: events are emitted as expected.
3. TMPDIR=/tmp/definitely-does-not-exist node sample.js: no events are emitted at all, Node just terminates.
import fs from "fs";
import os from "os";
import { WriteStream } from "fs-capacitor";
const capacitor = new WriteStream();
const source = fs.createReadStream("SOME EXISTING FILE", { encoding: "utf-8" });
source.pipe(capacitor);
const readStream = capacitor.createReadStream();
console.log("Tempdir", os.tmpdir());
readStream.on("error", (e) => console.error("Read Error", e));
readStream.on("data", (d) => console.log("Read Data", d.toString('utf-8')));
readStream.on("end", () => console.log("Read End"));
I've done a rework of fs-capacitor to support Node 13 that involved using the readable-stream package instead of Node's stream library, simply to continue compatibility with Node v8.x, which we can drop at the end of 2019.
I spent the better part of a day trying to figure out exporting interoperable ES6 and CommonJS modules now that I have a CommonJS dependency, and I finally just gave up. Without any useful guidance or pathway put forward by Node for interoperability, the only choice here is to choose a single module system, which will be CommonJS for the immediate future.
I'm very unhappy with this situation and fear this is going to severely fragment the ecosystem, but don't have the time to get involved any further.
I receive the following error while executing tests with jest in TypeScript:
TypeError: Class extends value undefined is not a constructor or null
at Object.<anonymous> (node_modules/fs-capacitor/lib/index.js:22:38)
at Object.<anonymous> (node_modules/graphql-upload/lib/processRequest.js:10:20)
where fs-capacitor/lib/index.js contains
var _fs = _interopRequireDefault(require("fs"));
class ReadStream extends _fs.default.ReadStream {
// ^ the error
my tsconfig.json is
{
  "include": ["**/*.spec.ts", "**/*.d.ts"],
  "compilerOptions": {
    "strictFunctionTypes": false,
    "lib": ["es2017", "esnext.asynciterable"],
    "module": "commonjs",
    "target": "es2017",
    "outDir": "../dist/",
    "sourceMap": true,
    "typeRoots": ["node_modules/@types"],
    "types": ["jest", "supertest", "node"]
  }
}
I tried to use the capacitor stream directly with form-data, but there is no path property set on the resulting stream, causing form-data to fail to generate the multipart/form-data request.
https://nodejs.org/api/fs.html#fs_readstream_path
const FormData = require('form-data');
const { createReadStream } = require('fs');
const { WriteStream } = require('fs-capacitor');
const { resolve } = require('path');
const data1 = new FormData();
const data2 = new FormData();
const capacitor = new WriteStream();
const fsStream = createReadStream(resolve(__dirname, './testhandler.js'));
const capacitorStream = capacitor.createReadStream();
fsStream.pipe(capacitor);
data1.append(0, fsStream, 'testhandler.js');
data1.on('end', () => {
capacitor.destroy();
});
data1.getLength((err, len) => {
console.log('length 1', len);
});
data2.append(0, capacitorStream, 'testhandler.js');
data2.getLength((err, len) => {
console.log('length 2', len);
});
Data 1 returns a length of 254, which is correct; data 2 returns 221.
So if the stream is meant to be compatible with file streams, it should set the path property.
https://github.com/form-data/form-data/blob/master/lib/form_data.js#L104
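Since form-data's detection reads the source stream's path (see the link above), a possible workaround is to assign a path to the capacitor's read stream yourself before appending it. A sketch (withPath is a hypothetical helper):

```javascript
// Hedged workaround sketch: give a pathless stream (such as a capacitor
// read stream) a `path` property so libraries that rely on fs.ReadStream's
// path-based filename/length detection can handle it.
function withPath(stream, filePath) {
  stream.path = filePath;
  return stream;
}

// Usage sketch:
// data2.append(0, withPath(capacitorStream, "./testhandler.js"));
```

This only papers over the detection; having fs-capacitor set path on its own ReadStream, as proposed, would make the streams compatible out of the box.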
This library is currently only published as an ES module without a CommonJS counterpart. As a zero-dependency package, it had appeared less susceptible to the whiplash in guidance for migration to ES modules, and my hope was that doing so might enable consuming packages to leverage the benefits of this newer approach.
However, I want to make sure that this continues to follow best practice. I have a couple changes ready for release as v8.0.0
(most notably #76) but want to give @jaydenseric the chance to figure out if there is anything this library could do to improve the ability to be used here.
I'm not fully up to date on the ever-changing recommendations for how to designate the various types of package exports in Node, so I wanted to open a conversation for anyone with insight or strong preferences.
At the moment, the graphql-upload tests are failing for Node.js v14:
https://github.com/jaydenseric/graphql-upload/runs/701487224?check_suite_focus=true#step:4:82
The error is:
_stream_readable.js:636
throw new ERR_METHOD_NOT_IMPLEMENTED('_read()');
^
Error [ERR_METHOD_NOT_IMPLEMENTED]: The _read() method is not implemented
at Readable._read (_stream_readable.js:636:9)
at Readable.read (_stream_readable.js:475:10)
at resume_ (_stream_readable.js:962:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21) {
code: 'ERR_METHOD_NOT_IMPLEMENTED'
}
Is it because of this in fs-capacitor?:
Line 28 in 45ab8cb
Does it relate to this Node.js v14 change?:
Here are the Node.js v14 release notes: