module-deps's Introduction

module-deps

walk the dependency graph to generate json output that can be fed into browser-pack

example

var mdeps = require('module-deps');
var JSONStream = require('JSONStream');

var md = mdeps();
md.pipe(JSONStream.stringify()).pipe(process.stdout);
md.end({ file: __dirname + '/files/main.js' });

output:

$ node example/deps.js
[
{"id":"/home/substack/projects/module-deps/example/files/main.js","source":"var foo = require('./foo');\nconsole.log('main: ' + foo(5));\n","entry":true,"deps":{"./foo":"/home/substack/projects/module-deps/example/files/foo.js"}}
,
{"id":"/home/substack/projects/module-deps/example/files/foo.js","source":"var bar = require('./bar');\n\nmodule.exports = function (n) {\n    return n * 111 + bar(n);\n};\n","deps":{"./bar":"/home/substack/projects/module-deps/example/files/bar.js"}}
,
{"id":"/home/substack/projects/module-deps/example/files/bar.js","source":"module.exports = function (n) {\n    return n * 100;\n};\n","deps":{}}
]

and you can feed this json data into browser-pack:

$ node example/deps.js | browser-pack | node
main: 1055

usage

usage: module-deps [files]

  generate json output from each entry file

methods

var mdeps = require('module-deps')

var d = mdeps(opts={})

Return an object transform stream d that expects entry filenames or { id: ..., file: ... } objects as input and produces objects for every dependency from a recursive module traversal as output.

Each input file can be a string filename or a stream.

Optionally pass in some opts (a combined sketch follows the list):

  • opts.transform - a string or array of string transforms (see below)

  • opts.transformKey - an array path of strings showing where to look in the package.json for source transformations. If falsy, don't look at the package.json at all.

  • opts.resolve - custom resolve function using the opts.resolve(id, parent, cb) signature that browser-resolve has

  • opts.detect - a custom dependency detection function. opts.detect(source) should return an array of dependency module names. By default detective is used.

  • opts.filter - a function (id) to skip resolution of some module id strings. If defined, opts.filter(id) should return truthy for all the ids to include and falsey for all the ids to skip.

  • opts.postFilter - a function (id, file, pkg) that gets called after id has been resolved. Return false to skip this file.

  • opts.packageFilter - transform the parsed package.json contents before using the values. opts.packageFilter(pkg, dir) should return the new pkg object to use.

  • opts.noParse - an array of absolute paths to not parse for dependencies. Use this for large dependencies like jquery or threejs which take forever to parse.

  • opts.cache - an object mapping filenames to file objects to skip costly io

  • opts.packageCache - an object mapping filenames to their parent package.json contents for browser fields, main entries, and transforms

  • opts.fileCache - an object mapping filenames to raw source to avoid reading from disk.

  • opts.persistentCache - a complex cache handler that allows async and persistent caching of data. A persistentCache needs to follow this interface:

    function persistentCache (
        file, // the path to the file that is loaded
        id,   // the id that is used to reference this file
        pkg,  // the package that this file belongs to
        fallback, // async fallback handler to be called if the cache doesn't hold the given file 
        cb    // callback handler that receives the cache data
    ) {
        if (hasError()) {
            return cb(error) // Pass any error to the callback
        }
    
        var fileData = fs.readFileSync(file)
        var key = keyFromFile(file, fileData)
    
        if (db.has(key)) {
            return cb(null, {
                source: db.get(key).toString(),
                package: pkg, // The package for housekeeping
                deps: {
                    // keys are the ids used to reference required files
                    // (the strings passed to require()), values are the
                    // file paths of those required files
                }
            })
        }
        //
        // The fallback will process the file in case the file is not
        // in cache.
        //
        // Note that if your implementation doesn't need the file data
        // then you can pass `null` instead of the source and the fallback will
        // fetch the data by itself.
        //
        fallback(fileData, function (error, cacheableEntry) {
            if (error) {
                return cb(error)
            }
            db.addToCache(key, cacheableEntry)
            cb(null, cacheableEntry)
        })
    }
  • opts.paths - array of global paths to search. Defaults to splitting on ':' in process.env.NODE_PATH

  • opts.ignoreMissing - ignore files that failed to resolve
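
A minimal sketch combining a few of these options (the transform name, paths, and filter logic are only placeholders):

var mdeps = require('module-deps');
var JSONStream = require('JSONStream');

var md = mdeps({
    transform: ['brfs'],                   // placeholder transform name
    transformKey: ['browserify', 'transform'],
    noParse: [require.resolve('jquery')],  // skip parsing a large dependency
    paths: (process.env.NODE_PATH || '').split(':'),
    ignoreMissing: true,
    filter: function (id) {
        // skip resolving this particular id, as an example
        return id !== 'electron';
    }
});
md.pipe(JSONStream.stringify()).pipe(process.stdout);
md.end({ file: __dirname + '/files/main.js' });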

input objects

Input objects should be string filenames or objects with these parameters (a combined sketch follows the lists below):

  • row.file - filename
  • row.entry - whether to treat this file as an entry point, defaults to true. Set to false to include this file, but not run it automatically.
  • row.expose - name to be exposed as
  • row.noparse - when true, don't parse the file contents for dependencies

or objects can specify transforms:

  • row.transform - string name, path, or function
  • row.options - transform options as an object
  • row.global - boolean, whether the transform is global
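
A sketch using both kinds of input objects, assuming d is the stream returned by mdeps() above (paths and the inline transform are only illustrative):

var through = require('through2');

// include a file without running it automatically, exposed under a name
d.write({ file: __dirname + '/lib/util.js', entry: false, expose: 'util-lib' });

// add a global transform programmatically
d.write({
    transform: function (file) { return through(); },
    options: {},
    global: true
});

// finish with a plain entry file
d.end({ file: __dirname + '/files/main.js' });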

output objects

Output objects describe files with dependencies. They have these properties (an annotated example follows the list):

  • row.id - an identifier for the file, used in the row.deps property
  • row.file - path to the source file
  • row.entry - true if the file is an entry point
  • row.expose - name to be exposed as
  • row.source - source file content as a string
  • row.deps - object describing dependencies. The keys are strings as used in require() calls in the file, and values are the row IDs (file paths) of dependencies.
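
Put together, a single output row might look roughly like this (values taken from the example above; entry and expose appear only when relevant):

{
  "id": "/home/substack/projects/module-deps/example/files/foo.js",
  "file": "/home/substack/projects/module-deps/example/files/foo.js",
  "source": "var bar = require('./bar');\n...",
  "deps": {
    "./bar": "/home/substack/projects/module-deps/example/files/bar.js"
  }
}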

events

d.on('transform', function (tr, file) {})

Every time a transform is applied to a file, a 'transform' event fires with the instantiated transform stream tr.

d.on('file', function (file) {})

Every time a file is read, this event fires with the file path.

d.on('missing', function (id, parent) {})

When opts.ignoreMissing is enabled, this event fires for each missing package.

d.on('package', function (pkg) {})

Every time a package is read, this event fires. The directory name of the package is available in pkg.__dirname.
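
A sketch wiring up these events on the stream d from above:

d.on('file', function (file) {
    console.log('read', file);
});
d.on('missing', function (id, parent) {
    // only fires when opts.ignoreMissing is enabled
    console.warn('could not resolve', id);
});
d.on('package', function (pkg) {
    console.log('package', pkg.name, 'in', pkg.__dirname);
});
d.on('transform', function (tr, file) {
    console.log('transforming', file);
});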

transforms

module-deps can be configured to run source transformations on files before parsing them for require() calls. These transforms are useful if you want to compile a language like coffeescript on the fly or if you want to load static assets into your bundle by parsing the AST for fs.readFileSync() calls.

If the transform is a function, it should take the file name as an argument and return a through stream; the stream will be written the file contents and should output the new, transformed contents.

If the transform is a string, it is treated as a module name that will resolve to a module that is expected to follow this format:

var through = require('through2');
module.exports = function (file, opts) { return through() };

You don't necessarily need to use the through2 module to create a readable/writable filter stream for transforming file contents, but this is an easy way to do it.

module-deps looks for require() calls and adds their arguments as dependencies of a file. Transform streams can emit 'dep' events to include additional dependencies that are not consumed with require().
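
A sketch of a transform that leaves the source untouched but reports one extra dependency; the shape of the 'dep' event argument (a path string) is an assumption here, as is the extra.js path:

var through = require('through2');
var path = require('path');

module.exports = function (file, opts) {
    return through(function (buf, enc, next) {
        next(null, buf); // pass the source through unchanged
    }, function (done) {
        // report an additional dependency not visible as a require() call;
        // the argument shape (a resolved path) is an assumption
        this.emit('dep', path.resolve(path.dirname(file), 'extra.js'));
        done();
    });
};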

When you call mdeps() with an opts.transform, the transformations you specify will not be run for any files in node_modules/. This is because modules you include should be self-contained and not need to worry about guarding themselves against transformations that may happen upstream.

Modules can apply their own transformations by setting a transformation pipeline in their package.json at the opts.transformKey path. These transformations only apply to the files directly in the module itself, not to the module's dependants nor to its dependencies.

package.json transformKey

Transform keys live at a configurable location in the package.json denoted by the opts.transformKey array.

For a transformKey of ['foo','bar'], the value at that location in the package.json can be a single string ("fff"):

{
  "foo": {
    "bar": "fff"
  }
}

or an array of strings (["fff","ggg"]):

{
  "foo": {
    "bar": ["fff","ggg"]
  }
}

If you want to pass options to the transforms, you can use a 2-element array inside of the primary array. Here fff gets an options object with {"x":3} and ggg gets {"y":4}:

{
  "foo": {
    "bar": [["fff",{"x":3}],["ggg",{"y":4}]]
  }
}

Options sent to the module-deps constructor are also provided under opts._flags. These options are sometimes required if your transform needs to do something different when browserify is run in debug mode, for example.
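
For example, a transform might branch on the debug flag; this is a sketch and assumes such a flag was passed to the constructor:

var through = require('through2');

module.exports = function (file, opts) {
    var debug = opts && opts._flags && opts._flags.debug;
    return through(function (buf, enc, next) {
        // e.g. keep the source readable when debug is set,
        // minify or strip comments otherwise (omitted here)
        next(null, buf);
    });
};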

usage

module-deps [FILES] OPTIONS

  Generate json output for the entry point FILES.

OPTIONS are:

  -t TRANSFORM  Apply a TRANSFORM.
  -g TRANSFORM  Apply a global TRANSFORM.

install

With npm, to get the module do:

npm install module-deps

and to get the module-deps command do:

npm install -g module-deps

license

MIT

module-deps's People

Contributors

ahdinosaur, andreypopp, arlac77, bcomnes, bendrucker, deathcap, dgbeck, dominictarr, elnounch, goto-bus-stop, igorklopov, jardilio-kpmg, jaredhanson, jmm, joris-van-der-wel, k-matsuzaki, ljharb, mantoni, martinheidegger, mellowmelon, nlindley, pmowrer, royhowie, swang, tehshrike, terinjokes, volune, ximus, zertosh, zkochan

module-deps's Issues

error when encountering core packages?

I try to run module-deps on this file:

require('assert')

and I get this error message

§ module-deps test.js
[
{"id":"test.js","source":"require('assert')\n","deps":{"assert":"assert"},"entry":true}
stream.js:94
      throw er; // Unhandled stream error in pipe.
            ^
Error: ENOENT, open 'assert' while resolving "assert" from file /Users/dtrejo/Dropbox/dev/lazytemplating/test.js
    at /usr/local/lib/node_modules/module-deps/index.js:112:29
    at fs.js:207:20
    at Object.oncomplete (fs.js:107:15)

It seems that core modules are not correctly processed?
D

Transforms broken when using ES7 Decorators or YAML

Hello @substack. We use two transforms (babel and yamlify).
When we updated to module-deps 4.0.5, browserify threw the following error:

SyntaxError: Unexpected character '#'

This error occurs in a YAML file.

Here is a more detailed example with the error stack:

Error: Parsing file /Users/k.kaysarov/src/account/_components/statistics/general/common/navigation/statistics-navigation.js: Unexpected character '@' (5:2)
    at Deps.parseDeps (/Users/k.kaysarov/src/account/node_modules/module-deps/index.js:452:28)
    at fromSource (/Users/k.kaysarov/src/account/node_modules/module-deps/index.js:389:44)
    at /Users/k.kaysarov/src/account/node_modules/module-deps/index.js:383:17
    at ConcatStream.<anonymous> (/Users/k.kaysarov/src/account/node_modules/concat-stream/index.js:36:43)
    at emitNone (events.js:72:20)
    at ConcatStream.emit (events.js:166:7)
    at finishMaybe (/Users/k.kaysarov/src/account/node_modules/readable-stream/lib/_stream_writable.js:511:14)
    at endWritable (/Users/k.kaysarov/src/account/node_modules/readable-stream/lib/_stream_writable.js:521:3)
    at ConcatStream.Writable.end (/Users/k.kaysarov/src/account/node_modules/readable-stream/lib/_stream_writable.js:486:5)
    at DuplexWrapper.onend (/Users/k.kaysarov/src/account/node_modules/readable-stream/lib/_stream_readable.js:545:10)

This error occurs in a JS file.

Example code that triggers the error:

var { cut, extend } = require('catbee-utils');

class CampaignBudget {
  @extend({ css })
  render () {
    return this.$context.getWatcherData();
  }
}

module.exports = CampaignBudget;

If you need it, I can provide an example with code.
The error reproduces only in 4.0.5; 4.0.4 works correctly.

Browserify's b.transform is broken by Deps.prototype.getTransforms implementation

I was noticing some odd behavior and decided to investigate. When I add a transform to my package.json:

{
  "browserify": {
    "transform": [
      "coffeeify"
    ]
  }
}

It works fine. But when I use b.transform('coffeeify'), as documented here, I get an error:

Uncaught Error: Parsing file /app_path/app.coffee: Line 1: Unexpected string
    at Deps.parseDeps (/app_path/node_modules/browserify/node_modules/module-deps/index.js:363:28)

I put some log statements in Deps.prototype.getTransforms and I notice that the var transforms does not include 'coffeeify'. It is however present in this.transforms. It's not getting copied over because the file is not 'top level'. But none of the files in my compilation are top level, as @smrq also notes.

As an experiment, I changed the line:

    var transforms = [].concat(isTopLevel ? this.transforms : [])

to:

    var transforms = [].concat(this.transforms)

And my compilation works successfully.

The current logic doesn't seem correct to me. I would expect the transforms to work consistently regardless of whether they come from command line, programmatically added, or from the package.json.

Recent Commit is Crashing Build

To reproduce, install the latest browserify (16.1.1) and run it on the following index.js:

// --> need a require(), or a commented out require() somewhere
// require('foo-bar');
var foo = 1;
const foo = 2;

browserify index.js

Resulting error comes from module-deps:

Error: Parsing file /Users/matt/Desktop/foobar/index.js: Identifier 'foo' has already been declared (4:6)
    at Deps.parseDeps (/Users/matt/Desktop/foobar/node_modules/module-deps/index.js:506:28)
    at getDeps (/Users/matt/Desktop/foobar/node_modules/module-deps/index.js:436:44)
    at /Users/matt/Desktop/foobar/node_modules/module-deps/index.js:420:32
    at ConcatStream.<anonymous> (/Users/matt/Desktop/foobar/node_modules/concat-stream/index.js:37:43)
    at emitNone (events.js:111:20)
    at ConcatStream.emit (events.js:208:7)
    at finishMaybe (/Users/matt/Desktop/foobar/node_modules/readable-stream/lib/_stream_writable.js:620:14)
    at endWritable (/Users/matt/Desktop/foobar/node_modules/readable-stream/lib/_stream_writable.js:628:3)
    at ConcatStream.Writable.end (/Users/matt/Desktop/foobar/node_modules/readable-stream/lib/_stream_writable.js:584:41)
    at DuplexWrapper.onend (/Users/matt/Desktop/foobar/node_modules/readable-stream/lib/_stream_readable.js:577:10)

provide a CHANGELOG

I'm trying to debug a problem that appears to have cropped up when module-deps was upgraded from 3.7.3 to 3.9.1. It would help me narrow down on the issue faster if there was a CHANGELOG or HISTORY.md, as is common for many projects. Would you please consider this?

windows files are all topLevel

var isTopLevel = mains.some(function (main) {
        var m = path.relative(path.dirname(main), file);
        return m.split('/').indexOf('node_modules') < 0;
    });

This code fails on Windows (8.1 in my case) since m contains backslashes (\) rather than forward slashes (/), so dependencies inside node_modules are still treated as topLevel and transforms are applied to them.

Maybe use path.sep as the separator?
http://nodejs.org/docs/latest/api/path.html#path_path_sep
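
A sketch of the suggested fix, splitting on both separators instead of only '/':

var isTopLevel = mains.some(function (main) {
    var m = path.relative(path.dirname(main), file);
    // split on both '/' and '\' so Windows paths are handled too
    return m.split(/[\\\/]/).indexOf('node_modules') < 0;
});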

Chokes on ES6 modules.

Currently, running module-deps on source with ES6 module syntax throws Error: Parsing file /blah/dee/dah.js 'import' and 'export' may appear only with 'sourceType: module' (4:0).

The reason for this is that acorn, via detective, needs a {ecmaVersion: 6, sourceType: 'module'}, but currently, there's no way to pass those options through.
#63 would address this, although it would only be a superficial fix: module-deps still wouldn't actually traverse ES6 modules. I'm assuming ES6 modules are out of scope here?

Also: anything I could do to help get #63 merged?

Support NODE_PATH for resolving files

I know my use case is a bit rare, but in my (complex) build system I copy entry files to temporary locations.

When building with browserify I can use NODE_PATH="/path/to/my/project/app:/path/to/my/project/node_modules" browserify /tmp/entry.js and it works like expected.

However if I attempt to do the same with module-deps it fails to find my dependencies.

Can this issue be fixed / worked around somehow?

post transform file cache

Hi folks, I have some tooling built on top of module-deps. Often I already have post transformed source and deps tree information ready to go. It looks like if I provide opts.cache it skips reading/parsing source and applying transforms. However, it's unclear to me what the format of a "file object" looks like. Poking through the walk() function it looks like I need at least source, package, and deps, so would the format be something like this?

opts.cache = {
  '/User/me/foo.js': {
    source: <string> (post transformed source code), 
    package: ?,
    deps: {
      './fooDep1': '/User/me/fooDep1.js',
      '../fooDep2': '/User/fooDep2.js'
    }
  }
};

My questions are:

  • I see the transform stream emits other fields too (file, id, entry, etc.) but are those needed in the cache?
  • What should the value of package field be?
  • Any gotchas when it comes to providing cache for files in node_modules?

I'll start running my own tests and exploring, but any guidance would be appreciated. Thanks!

document output format

Unless I'm missing something, it appears that the only place the format of the stream's output objects is given is in the example output of example code in the README.

Please document this better: what are the properties on output objects and their meaning?

browser-resolve 1.7.0 breaks pkg.__dirname

If you try to use module-deps with browser-resolve 1.7.0, the value of pkg.__dirname emitted with the 'package' event can be wrong. Instead of giving the actual directory of the package, it's giving the filename inside the package.

esprima error: illegal return

esprima errors on parsing some files that work in node.
Node wraps a file in a closure (to pass in require, exports, and module),
so a top-level return is valid inside a Node.js .js file,

but the build step with browserify breaks early because esprima can't handle that.

`browserify.transform` in `package.json` does not apply to all files in respective module

If I have a module, derp, that has a package.json that defines a browserify.transform, that transform is applied to the main file for that module, but not to any files that the main file require()'s, even if they're in the same module.


package.json

{
  "name": "derp",
  "version": "0.0.0",
  "description": "derp",
  "main": "index.js",
  "browserify": {
    "transform": "./derp-transformer.js"
  }
}

index.js

var something = require('./test.notjs');

module.exports = function() {
    console.log('my main file in my module');
};

test.notjs

derp
------------
this
is
not
javascript

derp-transformer.js

var through = require('through');

module.exports = function(file) {
    console.log('*****derp-transformer called on "' + file + '"*****');
    if(!/\.njs$/.test(file)) return through();

    // ...create a through stream to handle this njs file 
};

If I require('derp') from somewhere that has access to it, either using a relative path, or if derp is installed in node_modules, the transformer gets applied for the main, index.js, but is not applied for test.njs, which is being required inside of index.js.

Since test.njs is at the same level, and part of the derp module, it feels like it should get any transforms that the package.json has defined. Is that a correct assumption? If so, it seems that the fix would be to pass transforms parsed from package.json to child dependencies (maybe only deps that are part of the same "module", though I'm not sure what logic would determine that).

Question about top-level modules

I'm working on browserify/browserify#937, and have some questions about how top-level modules should be detected. Right now the problem is caused because without any entry point files, all modules in the bundle are detected as non-top-level. I think this should be changed, and I'm trying to think through all the edge cases so I change it to the right thing.

Consider some dependency trees, and see which modules you think transforms should run against.


1.

/foo/main.js (entry: true)
└─┬ /foo/x.js
  └── /foo/node_modules/bar/main.js

No-brainer, should be /foo/main.js and /foo/x.js.


2.

/quux/node_modules/foo/main.js (entry: true)
└─┬ /quux/node_modules/foo/x.js
  └── /quux/node_modules/foo/node_modules/bar/main.js

Still should be foo/main.js and foo/x.js, even though they are inside a node_modules folder, because all of the files bundled are in that folder?


3.

/quux/node_modules/foo/main.js (entry: true)
└── /quux/node_modules/foo/x.js
/quux/node_modules/foo/node_modules/bar/main.js (expose: true, entry: false)

I think this should still be foo/main.js and foo/x.js. Even though bar/main.js is explicitly included in the bundle with expose: true, it's still in a node_modules folder relative to the other files, which indicates that it's an external module.


4.

/quux/node_modules/foo/main.js (entry: true)
└── /quux/node_modules/foo/x.js
/quux/node_modules/foo/node_modules/bar/main.js (expose: true, entry: true)

I'm not sure that changing bar/main.js to entry: true should have an effect on whether transforms run against it. Current behavior is that it would be detected as top-level, but I feel like the same reasoning as in 3 applies here.


5.

/quux/node_modules/foo/main.js (expose: true, entry: false)
└── /quux/node_modules/foo/x.js
/quux/node_modules/foo/node_modules/bar/main.js (expose: true, entry: false)

Still think that it should be foo/main.js and foo/x.js getting transformed. (This is an example of the issue I'm working through.) I think the basic algorithm in my mind here is that modules should have their paths resolved against the "main" source directory, which is defined as the shallowest directory of any file passed explicitly to module-deps. So, in this case, the main directory would be /quux/node_modules/foo. Then any module with a relative path to the main directory containing node_modules would be considered non-top-level (as in the existing algorithm).


6.

/quux/node_modules/foo/node_modules/bar/main.js (expose: true, entry: false)
/quux/node_modules/foo/node_modules/baz/main.js (expose: true, entry: false)

Sort of a weird use case here-- I guess it's making a bundle of just npm dependencies and exposing them for use elsewhere. I could see it happening though. Based on all of the previous examples and what we know of the folder structure at this point, transforms shouldn't run against these files at all... but in that case, why would the user specify transforms to pass to module-deps in the first place? I think it's probably fine to transform all files here. It does bring up a little bit of a wrinkle to the algorithm posed in 5, which is that multiple explicitly specified files may have equally valid but different "main" directories.


So, that's a lot of text for a basic proposal. The current isTopLevel algorithm tests each file against all entry point files; I'm suggesting instead testing against all explicitly specified files that are themselves top-level relative to each other. Does that seem consistent with the intent of module-deps?

Alternatively, maybe something with opts.basedir?

example doesn't work

it seems to be a bug in browser-pack, but the example was in this readme.

~/c/module-deps>module-deps example/deps.js | browser-pack | node

[stdin]:1845
module.exports = require('./core.json').reduce(function (acc, x) {
                                        ^
TypeError: Object #<Object> has no method 'reduce'
    at Object./Users/dominictarr/c/module-deps/node_modules/resolve/lib/core.js../core.json ([stdin]:1845:41)
    at i ([stdin]:1:218)
    at [stdin]:1:269
    at Object./Users/dominictarr/c/module-deps/node_modules/resolve/index.js../lib/core ([stdin]:1369:12)
    at i ([stdin]:1:218)
    at [stdin]:1:269
    at Object./Users/dominictarr/c/module-deps/node_modules/browser-resolve/index.js.fs ([stdin]:1856:12)
    at i ([stdin]:1:218)
    at [stdin]:1:269
    at Object./Users/dominictarr/c/module-deps/index.js.fs ([stdin]:321:22)

_isTopLevel() check for "node_modules" transform exclusion is too "loose", should be more specific

module-deps currently contains logic that results in excluding files from transformation if any segment in the file's package-relative path is the string "node_modules". This logic lives within the _isTopLevel(file) function, which determines if the specified file should be considered "top-level" (meaning, a file belonging to the package's own source code). Elsewhere in module-deps, only files considered top-level are eligible for transformation.

But, the current implementation of this logic appears to be far too "loose", in that it can end up excluding files from transformation that should indeed be transformed!

Here is the problem...

In some projects, internal "node_modules" folders are used simply to facilitate use of the Node Module Resolution Algorithm (NMRA) locally within the package's own source code. Of course, depending on the needs of the project, any or all of this source code may require some kind of transformation.

Consider this example package file structure which uses an internal "node_modules" folder:

package.json
/src
    main.js
    /node_modules
       foo.js
       bar.js
/node_modules
    (npm-installed packages)

Here, the /src/node_modules folder is intended to facilitate the use of NMRA between main.js, foo.js, and bar.js. For example, with this organization, both main.js and bar.js can access foo.js via require("foo").

To be clear, the content of the /src/node_modules folder is NOT managed by npm. This particular "node_modules" folder has just been strategically placed to enable use of NMRA internally between the files under /src.

The above is a very simple, contrived example, but in larger, actual projects, such internal use of NMRA can eliminate "require path hell" and aid modular design.

Anyway... it appears to me that the logic in _isTopLevel() is likely intended to assume that anything "outside" of the root /node_modules (which normally contains already-built packages that should not be transformed) is a "top-level" file. Normally, top-level files end up being anything in /src or some such location.

The fly in the ointment, though, is that _isTopLevel() doesn't just check for "node_modules" specifically as the 0th element of the candidate file's path... it checks for "node_modules" anywhere in the path. Thus, _isTopLevel() erroneously concludes that files like /src/node_modules/foo.js are NOT top-level, and are thus excluded from transformation.

IMO, _isTopLevel() really should only check to see if a file lives below /node_modules specifically. Files everywhere else should be considered top-level, and eligible for transformation.

The proposed change in logic is this:

Deps.prototype._isTopLevel = function (file) {
    var isTopLevel = this.entries.some(function (main) {
        var m = relativePath(path.dirname(main), file);
        // EXISTING:
        // return m.split(/[\\\/]/).indexOf('node_modules') < 0;
        // PROPOSED:
        return m.split(/[\\\/]/).indexOf('node_modules') !== 0;
    });
    if (!isTopLevel) {
        var m = relativePath(this.basedir, file);
        // EXISTING:
        // isTopLevel = m.split(/[\\\/]/).indexOf('node_modules') < 0;
        // PROPOSED:
        isTopLevel = m.split(/[\\\/]/).indexOf('node_modules') !== 0;
    }
    return isTopLevel;
};

That is, "tightly" check for "node_modules" appearing only as the 0th element of the path (as it will for any file within /node_modules/**), rather than "loosely" triggering off "node_modules" appearing anywhere in the path (which, as shown in the above example, probably doesn't mean what _isTopLevel() thinks it means).

Alternatively, it would be an improvement if the above "loose" assumption was simply not "baked into" module-deps, and instead was configurable (for those projects that need it).

If `row.file` and `row.expose` are equal, transforms are not applied

If row.file and row.expose are equal, transforms are not applied, unless it is a global transform.

'use strict';
const Deps = require('module-deps');
const through = require('through2');

const deps = new Deps({
  transform: [
    [
      function(file) {
        console.log('transform:', file);
        return through();
      }
    ]
  ]
});

let file = './foo.js';
deps.end({ file: file, expose: file });

In this example, the log line is never reached.

This is because the value of expose is regarded as a builtin. See lines 319 and 369

Potential of Caching Transform Results

A discussion on twitter resulted in me suggesting that we look at the potential of caching transform results (hashing the input, doing a comparison before retransforming, etc, etc).

This isn't the first time I've heard similar feedback from folks who have looked into using Browserify (transforms are awesome) but then been turned off because coffee-script takes too long to recompile.

While I understand that the problem isn't browserify / module-deps itself, I am wondering whether the hashing + caching solution has any merit and if so, what would be a good way of implementing it. I'm happy to write the code and supporting tests, etc. Just keen to get a feel for feasibility prior to doing the work.

Cheers,
Damon.

Transforms from multiple dependencies not applied correctly.

I've run into an issue where I have two (or more) dependencies, each of which are modules with a transform. However, only the transform from last required module is applied.

I have a failing test case (modified from your tr_module test) that demonstrates the issue in this commit:
https://github.com/jaredhanson/module-deps/commit/da4c813656bb6d2c6e8f1ad033429e4d57bc180a

See the file here:
https://github.com/jaredhanson/module-deps/blob/master/test/tr_2dep_module.js

The transform key is getting parsed correctly. At this line:
https://github.com/substack/module-deps/blob/master/index.js#L51

trx is correctly set.

However, once we go to apply the transforms here:
https://github.com/substack/module-deps/blob/master/index.js#L83

trx has been invalidated.

I suspect this is a closure scope problem related to when the packageFilter function gets called, but I haven't entirely nailed it down yet.

I'll keep digging on it, but was hoping you could take a look and perhaps spot the obvious.

Large files are not compiled correctly

The UTF-8 encoding of large files ends up garbled after compiling.
With the change below, the garbling disappears;
is there an issue here?

var rs = fs.createReadStream(file, {
    encoding: 'utf8'
});

Reading from STDIN does not seem to work

I'm trying to pipe data into module-deps but it fails:

$ cat robot.js | ./node_modules/.bin/module-deps -

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: path must be a string
    at /home/username/node_modules/module-deps/node_modules/browser-resolve/node_modules/resolve/lib/async.js:15:16
    at process._tickCallback (node.js:419:13)

Running module-deps with "robots.js" as input on the other hand works like a charm:

$ ./node_modules/.bin/module-deps robot.js 
[
{"file":"/home/username/robot.js","id":"/home/username/robot.js","source":"module.exports = function (s) { return s.toUpperCase() + '!' };\n","deps":{},"entry":true}
]

I'm not sure how the output should look when you pipe though... How will it know the path to the entry point? Will it simply have {"file":"-","id":"-",...}

$ nodejs --version
v0.10.33
$ npm info module-deps | grep version:
  version: '3.6.4',

Cannot set detective opts from module-deps and browserify

Would it be possible to expose detective opts from module-deps and ultimately browserify?

https://github.com/substack/module-deps/blob/f3114e180737b5456971fa144b0a62c3c27abf40/index.js#L442

try { var deps = detective(src) }

Notice that the second argument (opts) is not supplied to detective().

I'd like to override detective's opts.isRequire function from browserify (via module-deps' opts). Or is there another way to achieve this without patching module-deps and/or detective code?
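
One possible workaround without patching, via the opts.detect hook documented above; whether detective accepts an isRequire option in exactly this shape depends on the detective version, so treat it as a sketch:

var mdeps = require('module-deps');
var detective = require('detective');

var md = mdeps({
    detect: function (source) {
        return detective(source, {
            // custom require-call detection; the option name is assumed
            isRequire: function (node) {
                return node.type === 'CallExpression'
                    && node.callee.type === 'Identifier'
                    && node.callee.name === 'require';
            }
        });
    }
});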

ignoreMissing not working

I'm having pull request issues, but if you add this at line 115:

        if( !~file.indexOf(path.sep) ) return cb(new Error(
            'no path to module, and not in cache: "'+id+'" from file '
            + parent.filename
        ))

it will work... and if you need a test to verify

var mdeps = require('../');
var test = require('tape');

test('ignoreMissing should skip native modules if not cached', function (t) {
    t.plan(1);
    var p = mdeps({ignoreMissing: true});
    p.on('error', function(err){ t.fail(err) } );
    p.on('missing', function(msg){ t.pass(msg) } );
    p.write(__dirname + '/files/ignore.js');
    p.end();
});

with test/files/ignore.js:

require('fs');

applyTransforms doesn't apply some transforms

Regarding this issue: jnordberg/coffeeify#10

The coffeeify transform, or a transform I write myself, doesn't get applied before global-insert-modules which assumes that the input stream is JavaScript. The problem seems to be here:

function applyTransforms (file, trx, src, pkg) {
    var isTopLevel = mains.some(function (main) {
        var m = path.relative(path.dirname(main), file);
        return m.split('/').indexOf('node_modules') < 0;
    });
    var transf = (isTopLevel ? transforms : []).concat(trx);
    if (transf.length === 0) return done();
    ...

I'm not sure how this works. If I force isTopLevel to true, my CoffeeScript transform gets applied before any global insertion, and my bundle compiles properly.

Provide untransformed source in output

I'm using module-deps in documentationjs and just started to use babelify as a transform to support JSX. I want to crawl dependencies of ES6 code, but I don't want to transform that code in the output - otherwise the inline code examples of modules will be different than the actual code in the repository.

Is it possible that module-deps could return the untransformed source in addition to the transformed source in the output stream?

local transforms

it should be possible to put local paths into the browserify.transform-array inside the package.json

currently when the /home/me/mymodule/package.json looks like this:

{ "name" : "foo"
, "version" : "0.0.0"
, "browserify": { "transform": ["./some/local.js"] }
}

then when i run:

cd /
browserify /home/me/mymodule/entry.js > bundle.js

it throws an error: Error: Cannot find module './some/local.js' from '/'

__dirname points to wrong package causing 'package' event to be omitted

I'm using browserify 9.0.3 using module-deps 3.7.2 and my code listens on the bundle.on('package') event.
I noticed that the event is sometimes not triggered for some packages, although the package gets included properly.
It does not occur reliably even if the same code is executed twice, so it seems to be some sort of async race condition.

Example:

require('package1');
require('package2');
require('package3');

These events are fired:

bundle.on('package',function(){ //package1 });
bundle.on('package',function(){ //package3 });

I tracked the problem down to this line where the _self.emittedPkg object already contains 'package2', although the 'package' event was not yet emitted for this package.

Digging further I found that in the resolver function the following situation may occur:

// pkgdir holds the directory to 'package1'
// pkg.name holds the name of 'package2'
if (pkg && pkgdir) pkg.__dirname = pkgdir;

So it seems that the packageFilter function is called by another callback before the resolver function executes for this package? Just guessing though, I was not able to track it down further.

It seems to be similar to #67 and maybe also connected to (browserify/resolve#69)
But it differs to both issues in the fact, that __dirname does not simply point to a wrong path in the package but to a complete wrong package.

JSON file that doesn't conform to package.json causes an infinite loop

I just ran into a weird issue where Deps.prototype.lookupPackage was called in an infinite loop.
I generated some json files programmatically and one of them was named package.json.

How to reproduce it:

echo '"somefile"' > file.json
echo '"anotherfile"' > package.json
browserify file.json

The file was requireable in node but not in browserify, so the lookup logic probably needs a fix.

no way to dynamically handle missing modules

Hi!

I am trying to find module deps from an entry point, but ignore missing requires. These typically come about because people conditionally include coverage stuff that's in devDependencies. So the filter option would not work.

At the moment, if one file fails to resolve, the best option I've found is to catch it by binding a handler to the 'error' event on the stream, but if this occurs the 'end' event never fires.

Is this a stream issue, or is there's a way to do this?
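
With opts.ignoreMissing and the 'missing' event described above, a sketch of this might look like (the entry path is a placeholder):

var mdeps = require('module-deps');

var md = mdeps({ ignoreMissing: true });
md.on('missing', function (id, parent) {
    console.warn('skipping unresolved require:', id);
});
md.on('data', function (row) { /* collect rows */ });
md.on('end', function () { console.log('done'); });
md.end({ file: '/path/to/entry.js' });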

tr is undefined in readFile

Deps.prototype.readFile = function (file, pkg) {
    if (this.cache && this.cache[file]) {
        var tr = through();
        tr.push(this.cache[file].source);
        tr.push(null);
        return tr;
    }
    var rs = fs.createReadStream(file);
    HERE -->  rs.on('error', function (err) { tr.emit('error', err) });
    this.emit('file', file);
    return rs.pipe(this.getTransforms(file, pkg));
};
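
A sketch of one possible fix, forwarding read errors to the Deps instance instead of the undefined tr (names follow the snippet above):

Deps.prototype.readFile = function (file, pkg) {
    var self = this;
    if (this.cache && this.cache[file]) {
        var tr = through();
        tr.push(this.cache[file].source);
        tr.push(null);
        return tr;
    }
    var rs = fs.createReadStream(file);
    // emit the error on the Deps stream itself; `tr` does not exist here
    rs.on('error', function (err) { self.emit('error', err) });
    this.emit('file', file);
    return rs.pipe(this.getTransforms(file, pkg));
};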

pass pkg.__dirname to transform functions

As we cannot write __dirname to package.json, transforms which require directories must be given relative paths.

package.json

{
  "browserify": {
    "transform": [
      ["transform1", {"dirname": "relative/path"} ]
    ]
  }
}

However, there is no way to resolve absolute paths; process.cwd() varies depending on where the node process is started.

Pass pkg.__dirname to transform functions and we can calculate absolute paths from where the package.json file is located.

Process hangs when passing module-deps options to transforms

This is a dependency of browserify and is causing our installs to fail.

This commit doesn't check for undefined.
b22b584

TypeError: Cannot call method 'hasOwnProperty' of undefined

I don't really have much more context on this, mainly due to it being a dependency of browserify, but I can see in debugging that in some cases trOpts is an empty object, has a key, or is undefined.

I can try to provide more details if required but I guess a simple check for undefined would make sense (although I'm not certain on the purpose of the change which is why I've not made a PR).

Configurable depth?

Is it possible to configure how deep module-deps looks, or grab the depth when it iterates to ignore dependencies beyond a certain level?

Order of output

When I run the example, the order of the output is reversed compared to the example output in the readme. Generally I'd like to know whether the order is guaranteed, in the sense that a dep always comes after (or before, in my case) the files which require it.

Pass ecmaVersion to detective (To enable support for async/await +)

Node v7 supports async/await with --harmony-async-await. Chrome stable supports it now. module-deps (and therefore Browserify) chokes on it.

Could module-deps accept an input parameter to set the ecmaVersion property on the call to detective()? If it was set to 8, browserify could handle async/await just fine.
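
Until such a parameter exists, one workaround sketch is the opts.detect hook; the exact way detective forwards parser options (if at all) is an assumption here:

var mdeps = require('module-deps');
var detective = require('detective');

var md = mdeps({
    detect: function (source) {
        // forward a newer ecmaVersion to the underlying parser; the
        // option shape may differ between detective versions
        return detective(source, { parse: { ecmaVersion: 8 } });
    }
});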
