
feature: Local Modules #146

Closed
srcspider opened this issue Dec 11, 2014 · 27 comments
@srcspider

Currently one of the biggest pain points in node development is the problem of referring to local modules (i.e. how to avoid the unreadable require('../../../../../something')).

The more modules you create in your project, the bigger the issue becomes.

Node convention recommends splitting things into modules and uploading them to npm, however that has some issues of its own:

  • it assumes your code is generic enough to be an npm module
  • it doesn't allow you to maintain a library of modules; you're always maintaining a single module. That works great for public modules, but not so great for something private--it makes more sense to split into libraries before splitting everything into individual functions (maintaining 2-3 libraries vs. 100 functions as 100 modules)
  • it requires you to use npm to pull the code in, or a private solution that involves npm; if you're pulling private modules you should have the choice of avoiding anything but your own system

There's a simple solution for both, and that's to create "fake" node_modules directories. Something like this:

ProjectRoot/
  node_modules/       # your npm modules
  src/
    node_modules/     # place all your project modules here
      yourModule1/    # example module       
      yourModule2/    # example module
      yourModule3/    # example module
      main.js
    entrypoint.js

This works great since any code inside src/node_modules can call internal modules, and node will resolve them correctly once its search algorithm walks up into src/.
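For example, a minimal sketch (yourModule1/yourModule2 are the placeholder names from the tree above; doSomething is a hypothetical export):

// file: src/node_modules/yourModule2/index.js
// node walks up from this file looking for node_modules/ and finds
// src/node_modules/ first, so sibling modules resolve by bare name
var mod1 = require('yourModule1');

module.exports = function () {
  return mod1.doSomething();
};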

It also doesn't require any path aliasing (which I believe is frowned upon?), so you can just stick everything in src/ into a module, point to src/node_modules/main.js as the entry point, and your entire application is now reusable (either by the world or by members of your private group).

The problem with that solution is that it's annoying in several ways:

  • it's name biased; even if you use only camel case in your code, for presumably good reasons, you'll still have these random "node_modules" folders being an eyesore
  • "node_modules" is literally "vendor"
  • it causes confusion between npm and non-npm modules
  • it's really annoying to have multiples of, e.g. project_modules, shared_modules1, shared_modules2, etc., because you can't have them in the same directory; you have to put each at a different directory level

Solution

Changes to module search algorithm

  • if a node_modules directory is not found in a directory, node should search for a .iojs file
  • if a .iojs file is found then the configuration is read and the rules in it apply

Example .iojs files that resolve our problem

In each custom module library you place the following, unless it's literally called "node_modules":

module.exports = {
  // tell node that current directory is root for modules
  modules_root: true
};

In the root you place a .iojs file if you wish to expose modules inside these libraries directly to other modules:

module.exports = {
  // directories specified are resolved in order
  node_modules_dirs: [ 'sharedLibrary1', 'sharedLibrary2' ]
};

We can now have something like:

ProjectRoot/
  node_modules/       #  your npm modules
  src/
    sharedLibrary1/   # private accessible modules
    sharedLibrary2/   # private accessible modules
    sharedPrivate1/   # private modules, non accessible outside themselves
    sharedPrivate2/   # private modules, non accessible outside themselves
    someAppModules/   # place all your project modules here
      yourModule1/    # example module       
      yourModule2/    # example module
      yourModule3/    # example module
      main.js
    entrypoint.js

Each of the directories in src/ would have a copy of the "modules_root" .iojs file, and in src/ itself you would have a copy of the "node_modules_dirs" .iojs.

Let's say, for example, that sharedLibrary1, sharedLibrary2, sharedPrivate1 and sharedPrivate2 all have a file lib/merge.js.

Because the .iojs applies while still inside them, calling require('lib/merge') would resolve to sharedPrivate1/lib/merge in one and sharedPrivate2/lib/merge in the other; which is exactly what we want.

If you call require('lib/merge') in someAppModules, it will resolve to sharedLibrary1/lib/merge, because that's the first one specified in the src/.iojs file. However, if you call it in sharedLibrary2, it will resolve to sharedLibrary2/lib/merge, since it gets resolved by src/sharedLibrary2/.iojs before it gets resolved by src/.iojs.
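A short sketch of how that would play out under the proposal (file paths assumed from the tree above):

// file: src/someAppModules/yourModule1/index.js
var merge = require('lib/merge');
// -> resolves to src/sharedLibrary1/lib/merge.js,
//    the first entry in src/.iojs

// file: src/sharedLibrary2/util/clone.js (hypothetical file)
var merge = require('lib/merge');
// -> resolves to src/sharedLibrary2/lib/merge.js,
//    because src/sharedLibrary2/.iojs applies first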

Backwards-compatible conversion of someAppModules to a single application npm module

npm_module_name/
  node_modules/       # your npm dependencies (created by package.json resolution)
  src/ 
    node_modules/     # you rename someAppModules to node_modules
      yourModule1/    # example module       
      yourModule2/    # example module
      yourModule3/    # example module
      main.js
  index.js            # module.exports = require('./src/node_modules/main.js')
  package.json

Of course this assumes you convert some parts of your code to rely on generic modules as opposed to local ones. Easy peasy, since everything has a nice grep-able name.

@phpnode commented Dec 11, 2014

@srcspider I agree that require('../../../../../../../foo/bar') is painful but I think your solution is a bit too complicated. What if we could refer to the project root by using a special symbol instead, e.g.

require('#foo/bar')

I would have preferred @ but it's been taken by org namespaces in npm

@juliangruber (Member)

i still agree with @Raynos on this one (trying to cite correctly): write applications in a flat structure (read: lib/*.js and lib/*/*.js). if an application can't be written using that structure, it's too big. don't write it or rethink it.

@rvagg (Member) commented Dec 11, 2014

I second that, if your application is getting out of hand with nested directories it's a solid sign that you need to split it up even more. You can even do this within an application and avoid the need to publish small, single-use packages by using an approach such as linklocal (I'm hoping that the npm client will add first-class support for features like this, but that might be a pipe dream).

@phpnode commented Dec 11, 2014

@rvagg there are many scenarios where that approach is not practical; some apps really are big and complicated, and separating them into smaller modules brings with it its own set of problems. If iojs is to grow it must be possible (and fun!) to write these bigger apps using it. Your linklocal suggestion sounds like a hack around the problem - what is the difference between writing a module like that and simply having another folder under /lib?

@phpnode commented Dec 11, 2014

Fwiw I've seen people writing their own global appquire() functions to achieve this in bigger projects, which works but is not nice

@benjamingr (Member)

@phpnode can you find an example of a big app where you'd do require("../../../../x") legitimately and it would not be beneficial to refactor it into smaller separate modules (preferably on GitHub)?

I think that would really help the case here.

@seidtgeist

I've thought about this a lot (and I'm probably biased, as I've written an editor where instead of require-ing modules/packages it finds the packages/modules you reference by name). I think the problem is not just relative requires, but also having very repetitive requires in every file, e.g. requiring lodash, React (browserified) or any other utility package in every module.

So just fixing the relative requires isn't going to solve the bigger issue of having repetitive and/or hard-to-reason-about requires all over the place.

My own conclusion is that we probably shouldn't change the module loader or resolution algorithm, which is working wonderfully well (I mean, come on, it's fascinating that multiple versions of packages in a tree work as well as they do without coordination!). Instead we should create new tools to better manage our namespaces.

Again, here's why I think that way:

  • Module names change. Sucks to update references everywhere.
  • Module locations change. Again, sucks to update references everywhere.
  • Some modules are required everywhere. Not so nice to have repetitive requires in every module.
  • Can't use global.

Instead of fixing this in core I'd rather think about how to fix this in the ecosystem, and determine whether core needs to change in some small way to allow more ecosystem experimentation.

@timoxley (Contributor)

I've been struggling with this as well. It doesn't always make sense to break everything up into isolated packages if each piece is only relevant in the context of its sibling packages. The reality of "business" means not every component can be open-sourced or is immediately of open-source quality. Problem domains are often messy, and it's hard to get the ideal tree structure correct from the outset. Publishing stuff to a registry incurs a lot of overhead and makes architectural refactoring very difficult.

As Rod mentioned, I've been trying to solve these problems with linklocal. I don't think this belongs in core at all, though I wouldn't mind seeing it in npm one day.

linklocal allows you to structure your app exactly like so:

app
  component1
  component2
      component2.2
  component3
  component4

Then you can pull dependencies in between components like any other dependency: require('component1'). You just specify those dependencies explicitly in each component's package.json using standard file: dependencies (these exist as of npm@2.0.0):

"name": "component1",
"dependencies": {
  "component2":"file: ../component2",
  "component4":"file: ../component4"
}

Then you call linklocal and everything is symlinked into the appropriate node_modules locations (not just at the top level):

> linklocal -r
component1
component2
component2.2
component3
component4

Linked 5 dependencies

We've also been combining this with scoped packages to prevent future namespace collisions, and only require package.json changes if we eventually want to take advantage of scopes to publish to a private registry.

linklocal is basically a local, recursive, npm link – it doesn't clobber/clutter npm's global namespace, and will recursively symlink any local file: dependencies for you automatically.

linklocal fits a nice middle ground between a lib/* directory and a private npm registry. lib/* gives you high flexibility but minimal structure or encapsulation. On the other hand, a private npm registry (& git urls/tarballs to a lesser degree) gives you encapsulation, structure and localised dependencies, but creates great overhead during development. This is an especially arduous process when cross-project changes are required, e.g. cd component1; git add; git commit; npm version; git tag; git push --all; cd component2; npm install --save component1@latest; git add; git commit; etc….

linklocal gives you the flexibility of lib/* with the structure of a registry, but without the management overheads. It's currently being used in two moderate-size projects and works well, though it currently requires you to jump through a few hoops with npm. It's not for libs, it's for apps. Yes, ideally apps should be no different from libs, but that is very hard to achieve, at least for me. I don't think I could go back to building apps without linklocal or an equivalent.

</advertisement>

@phpnode commented Dec 11, 2014

@benjamingr I was typing a long reply here but @timoxley did a better job of explaining the problem, even though I disagree with his solution. I'm not aware of any large projects that do this legitimately on GitHub; I do it myself in oriento, but I think I could potentially solve those issues by using the (apparently now discouraged) peerDependencies feature. However, that is an open source module, not a proprietary application.

When developing applications it's very common for changes to affect more than one module, much more so than when developing small open source components, and keeping these changes in sync is a lot of overhead. It's exactly the same problem faced by people who make use of git submodules in their apps.

It is conceptual purity at the expense of pragmatism and developer productivity. When there is no hope of reusing the various parts of the application between projects what is gained by having distinct modules?

@juliangruber (Member)

There's NODE_PATH as well, so NODE_PATH=lib/ node will let you require lib/foo as require('foo') from all over your code... however it's being deprecated afaik.
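For what it's worth, a minimal sketch of the NODE_PATH approach (file names made up):

// file: lib/foo.js
module.exports = function foo() {
  return 'resolved via NODE_PATH';
};

// file: main.js -- run with: NODE_PATH=lib node main.js
var foo = require('foo'); // found in lib/, not in any node_modules/
console.log(foo());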

@yoshuawuyts

We've been using linklocal in production for a few months now and it's pretty much solved all our local require() issues. I'd be very happy to see something similar being built into the npm client.

@srcspider (Author)

@phpnode

@srcspider I agree that require('../../../../../../../foo/bar') is painful but I think your solution is a bit too complicated. What if we could refer to the project root by using a special symbol instead, e.g.

require('#foo/bar')

Sorry, I'm not familiar with how the # there is meant to work. Is it just searching for package.json? If so, that requires an extra file that's hard to stick in an ignore clause in the editor. Note that you can safely tell your editor not to show the .iojs files I suggested, and terminals will ignore them automatically.

There are several other problems.

You said that refers to the root of the project. But that seems less flexible than the existing workaround I suggested, which is just to name the directory you stick your modules in "node_modules": on resolution, names bubble up until they hit it (node treats it as if it were the npm one and jumps back in). That satisfies the functionality you describe with require('#foo/bar') but doesn't require your modules to be at the root. It doesn't break if you move the code around, it doesn't require special files, and it all works already.

My problem wasn't referring to local modules; the node_modules solution works well enough. My problem is "managing local modules": essentially having more freedom in the layout of the project (which the #module syntax, if I understood it correctly, just makes worse) and the ability to manage shared local libraries of modules without much fuss.

Note that in my example with lib/merge I could have had lib/merge inside someAppModules and moved it later into either of the two library modules without any code needing to change.

With regard to the custom require function: I use browserify a lot and have abused aliasing to achieve the same effect, but I prefer the "no build configuration required" option when possible.

@juliangruber

This doesn't actually achieve what I described in the example. Note that at the end I describe how the different libraries can easily choose whether they want their own lib/merge; it's just a matter of having the module inside their structure. The PATH solution doesn't solve that.

I also hate it for another reason: it only works if you're going to work on a single project all your life.

I often find projects go into waiting periods while I work on others. Or you're working on an isolated piece of code. Having a magical lib/merge is just plain annoying, and managing PATH is annoying as well. It's a "becomes everyone's problem" solution.

@timoxley @yoshuawuyts

This doesn't solve my problem. It essentially creates the same problem as PATH: it doesn't dynamically resolve, and it creates a lot of configuration. I would prefer to configure the locations of many modules, not individual module locations. It's much easier to reason about.

Note that even right now you can do this:

project/
  node_modules/      # npm modules
  src/
    node_modules/    # shared private code
    client/
      node_modules/  # project specific code

No configuration. Dynamically resolves. No extra tooling required. And you can easily transition the code to an npm module if you ever need to, with little to no changes to the code, just moving files.

The issue is only that you can't have that same structure be flat, and you can't rename the node_modules directories in any way to describe more accurately what they actually are. Obviously it also gets really awkward when you need more than one shared library.

@srcspider (Author)

off-topic, but to answer this

@ehd

I think the problem is not just relative requires, but also having very repetitive requires in every file, e.g., requiring lodash, React (browserified) or any other utility package in every module.

You can easily solve this by bundling at the code level. For example, if you have something like Chicken, and that chicken has a Model, Repo, View and whatever other types associated with it, you can bundle everything under the name Chicken by just having a Chicken/index.js that looks like this:

module.exports = {
  Model: require('./Model'),
  Repo: require('./Repo'),
  View: require('./View'),
  Domestic: require('./Domestic'), // object with its own Model, Repo, etc.
  // ...
};

You can now just do require('system/Chicken') and you have access to Chicken.Model, Chicken.Repo, Chicken.View, etc. React accepts the . in recent versions, so you can use <Chicken.View /> no problem.

If you feel like being optimal you can always still do var ChickenModel = require('system/Chicken/Model').

This principle applies to utility functions as well:

// file: lib/index.js
module.exports = {
  merge: require('./merge'),
  href: require('./href'),
  hsl: require('./hsl'),
  Model: require('./Model'),
  // ...
};

// file: Chicken.js
var lib = require('lib');

var Chicken = function (data) {
  lib.Model.call(this, data);
};

Chicken.prototype = lib.merge(lib.Model.prototype, {
  // ...
});

module.exports = Chicken;

Personally, I like aliasing locally any important functions like Promises, request management, logging, etc., so their implementation is in only one place in the code and can easily be swapped out for a different one. I also prefer referring to and loading very specialized libraries like React, lodash, etc. directly, and most of the time I use explicit declaration for utility functions just to keep them short.

@Raynos (Contributor) commented Dec 11, 2014

There are a few techniques that can be applied to large applications to make require('../../../../../x') go away.

Technique 1: Have multiple test folders

One of the common problems I see in large apps is files like

  • endpoints/user/routes/x.js
  • test/endpoints/user/routes/x.js
  • which leads to require('../../../../endpoints/user/routes/x.js')

Instead I recommend:

  • endpoints/user/routes/x.js
  • endpoints/user/test/routes/x.js
  • which leads to require('../../routes/x.js')

Technique 2: Dependency injection.

One of the common problems I see in large applications is statefully requiring completely different subsystems of your application.

Let's say you have files like

  • business/user/repo/save-user.js
  • endpoints/user/routes/save.js
  • which leads to require('../../../business/user/repo/save-user.js')

We've found that requiring other subsystems statefully has always been a terrible idea. It's inflexible and hard to test. The coupling also makes refactors a nightmare.

The better practice is dependency injection. In our case we would have something like

// BAD: var saveUser = require('../../../business/user/repo/save-user.js')

module.exports = saveUserRoute;

function saveUserRoute(req, res, opts) {
  var saveUser = opts.services.user.repo.saveUser;

  /* real code */
}

There are other techniques as well, but these two, around tests and subsystems, get rid of most of the nasty require statements.
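For completeness, a hypothetical sketch of how the injected opts might be wired up at a composition root (the file paths match the example above; the rest is assumed):

// file: app.js
var saveUserRoute = require('./endpoints/user/routes/save.js');
var saveUser = require('./business/user/repo/save-user.js');

// build the service graph once, at startup
var opts = {
  services: { user: { repo: { saveUser: saveUser } } }
};

// the HTTP layer then calls the route with the injected services:
// saveUserRoute(req, res, opts);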

Note that this separation into "multiple packages in one git repository" does not have the overhead of multiple git repositories but has all the benefits of small packages. Especially if you keep one package.json per git repository.

@srcspider (Author)

@Raynos

We've found that requiring other subsystems statefully has always been a terrible idea. It's inflexible and hard to test. The coupling also makes refactors a nightmare.

What's the difference between pushing opts into every method call like you did there and just doing:

Module-Localized Testing

project/
  node_modules/     # npm modules
  src/
    node_modules/
        module1
            tests/
                node_modules/
                    ...dependency injection...
                index.js
        module2
            tests/
                node_modules/
                    ...dependency injection...
                index.js
        module3
            tests/
                node_modules/
                    ...dependency injection...
                index.js

Your test file (example is simplified):

var Tested = require('./temp/Tested');

// ...run test code for Tested...

And the shell steps that run it:

cd src/node_modules/module1
mkdir tests/temp
cp Something.js tests/temp/Tested.js
run-tests
rm -rf tests/temp

Here's a version with the "modules_root" .iojs file suggested in the first post:

project/
  node_modules/     # npm modules
  src/  
    module1
      tests/
        ...dependency injection...
        index.js
        .iojs
    module2
      tests/
        ...dependency injection...
        index.js
        .iojs
    module3
      tests/
        ...dependency injection...
        index.js
        .iojs
    .iojs

(note: it's pretty easy to make the .iojs files invisible in most editors)

You can create a specialized "mock" module and then just do module.exports = require('mock/Something'); to reuse test-related code. It shouldn't result in any more code than the dependency injection method, only your dependencies would be dynamically resolved.

Project-Localized Testing

You can save more code, and avoid false positives, by having the test code outside the main source tree, so that after the temporary copy your piece of code can't access anything in the normal source tree, just mocks and whatever else you tell it.

project/
  node_modules/     # npm modules
  src/
    node_modules/
      module1
      module2
      module3
  tests/
    node_modules/
      Workspace/
        ...you copy the tested file(s) here...
      lib/
        Promise.js     # mock
      test/
        module1Tests
        module2Tests
Your test file:

var Module1 = require('Workspace/Module1');

// ...test Module1...

So long as you wrap any important 3rd-party libraries into a local namespace, e.g. require('bluebird') -> require('lib/Promise'), you don't have to worry about global or npm dependencies getting in the way of your tests (aliasing locally is just good in general, even if you don't intend to write tests).
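The alias module itself is a one-liner; a sketch, assuming bluebird and the src/node_modules layout from earlier so that 'lib/Promise' resolves:

// file: src/node_modules/lib/Promise.js
// the only place in the codebase that names the implementation
module.exports = require('bluebird');

// anywhere else:
var Promise = require('lib/Promise'); // swap bluebird out by editing one file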

The nice thing about the two methods above is that they can be applied without much effort, so long as you haven't created a tangled mess of things. So you can use them without refactoring too much of your project or including boilerplate.

You would also have no need for all the wrapping, configuration and deferring, and, more importantly, since the resolution is based on the directory tree it's very clear how the resolution happens (anything could happen with dependency injection, on the other hand; though I trust most people keep it sane).

Just for comparison, here's the code for creating the Chicken type with the two methods:

Dynamically Resolved

var merge = require('lib/merge');
var Model = require('lib/Model');

var Chicken = function (data) {
  Model.call(this, data)
};

Chicken.prototype = merge(Model.prototype, {
  // ...
});

module.exports = Chicken;

// file: example.js
var Chicken = require('models/Chicken');
var food = new Chicken();

Dependency Injection

module.exports = function (di) {
  var merge = di.merge;
  var Model = di.Model;

  var Chicken = function (data) {
    Model.call(this, data);
  };

  Chicken.prototype = merge(Model.prototype, {
    // ...
  });

  return Chicken;
};

// file: di.conf.js
var di = {};

// Load Utilities
// --------------

di.merge = require('lib/merge')();
// ...

// Load Utility Types
// ------------------

di.Model = require('lib/Model')(di);
// ...

// file: main.js
var di = require('di.conf');

var example = require('example');
example(di);

// file: example.js
module.exports = function (di) {
  var Chicken = di.Chicken;
  var food = new Chicken();
};

Personally, everything in my project, including non-javascript stuff, is in some build pipeline, so isolating code in a testable context is practically just as complicated to run as the dependency injection example; initial costs of writing build scripts not considered.

@kessler commented Dec 12, 2014

It's worth mentioning this technique from substack's browserify handbook:
https://github.com/substack/browserify-handbook#avoiding-
I find it very useful at times.

@ibc commented Dec 13, 2014

I agree that require('../../../../../../../foo/bar') is painful but I think your solution is a bit too complicated. What if we could refer to the project root by using a special symbol instead, e.g.

require('#foo/bar')

That would be really good to have.

@Raynos (Contributor) commented Dec 13, 2014

You realize the module system is frozen, right?

We cannot make any changes to the require function.

@ibc commented Dec 13, 2014

No, I did not realize that. May I ask why anything is "frozen" at this stage, and why changes are not allowed in the require function? Thanks a lot.

@Raynos (Contributor) commented Dec 13, 2014

@ibc

Apologies for my tone.

The module system is "locked" ( not frozen, wrong terminology ).

See https://github.com/iojs/io.js/blob/v0.12/doc/api/modules.markdown#modules

If we change the semantics of require, you can publish a module to npm that will only be usable with iojs version 1.0. This means that you can no longer use modules from npm with node 0.10 or node 0.12.

This might not seem scary; however, because npm supports nested dependencies and ranges, you might require a module A that supports node 0.10 while a dependency of A three levels deep, say module E, breaks backwards compatibility by using the new require semantics in a patch or minor version change. This means module A no longer works. That can be a very unexpected and frustrating change.

node has made an enormous effort to not break backwards compatibility between 0.8 and 0.10, and between 0.10 and 0.12.

I doubt the folks involved with iojs would want a backwards-incompatible change between node 0.12 and io 1.0.

@ibc commented Dec 13, 2014

Thanks for the comment.

If we change the semantics of require, you can publish a module to npm that will only be usable with iojs version 1.0. This means that you can no longer use modules from npm with node 0.10 or node 0.12.

Clear. Being honest, I was just considering this new feature (using "#" to indicate the root of the project) for custom projects rather than for npm libraries (as typical libraries are small enough not to require features like that one).

But yes... there is no way to categorize/distinguish them, so it would be dangerous anyhow.

@srcspider (Author)

@Raynos

If we change the semantics of require, you can publish a module to npm that will only be usable with iojs version 1.0. This means that you can no longer use modules from npm with node 0.10 or node 0.12.

The semantics can be locked specifically to the node_modules directory and require. Since we're talking about local modules, so long as you correctly draw the distinction between "these are universal package conventions" and "these are the conventions of your flavor of node," you can pretty much do anything.

Many languages have this distinction: 3rd-party modules load one way, your local modules load whichever way you feel is best, and the system works fine.

The issue is really "how would npm block misuse of the conventions," as well as how stuff like browserify would handle such conventions without much fuss.

Here is, in my opinion, the simplest solution to these problems:

  • introduce an alternative loading system for local packages only, e.g. a local function
  • add a local-to-require transform plugin for use by common build systems and things like browserify
  • this would be standardized as how all flavors of node, not just iojs, bypass the problem

Example

var Model = local('lib/Model');

var Example = function (data) {
  Model.call(this, data);
};

module.exports = Example;

The local function above is identical to require, with the exception that it also allows for extra node-flavour rules, such as the .iojs files suggested at the start of the topic.

npm explicitly doesn't allow for the use of local (it's easy to scan for), thereby ensuring modules can be universally used.

Example usage, obviously very little changes:

node ./src/main.js
browserify -t iojs ./src/main.js -o ./public/main.js

// in whatever build system you're using
browserify({ transform: [ 'iojs' ] })

So in summary: require stays locked as-is, npm can keep serving universal modules, browserify stays as-is, and different flavors of node can implement whatever local-modules management strategy they wish, so long as it uses the function local (or whatever is convened as the standard "local modules loading" require-function).

Are there any other political issues with adding local-only modules functionality?

@bodokaiser

I also never had any problems with too-long paths. Actually, you should just pass a variable (like a koa instance) around anyway. See here for an example.

@j-san commented Dec 15, 2014

If my package is named "foo" in the package.json, could it be as simple as require('foo/somethings')? It assumes that the foo directory is named foo and that node also searches from dirname($PWD).
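To illustrate the suggestion (a sketch; the file names are hypothetical):

// package.json: { "name": "foo", ... }

// file: deep/nested/thing.js
var somethings = require('foo/somethings');
// instead of: require('../../somethings')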

@phpnode commented Dec 15, 2014

@j-san yes that'd be pretty nice too, but @Raynos's argument still holds - this would be a breaking change.

@chrisdickinson (Contributor)

@srcspider The module system is, as @Raynos mentioned, locked: no changes will be made to it. This precludes changing the API for modules, both explicitly (modifying require directly) and implicitly (changing the expectations of what's available to modules -- e.g., new globals). Much of Node's value is tied up in the node package ecosystem; the easiest way to destroy that value would be to change the module system and introduce incompatibilities, however subtle or unintentional.

That being said, as I understand the proposal, it should be possible to implement it as a standalone package on npm using require.extensions and perhaps a bit of monkey-patching. An exploration of what code written against such an API would look like, in a more fleshed-out fashion, would be valuable for future module systems.
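For instance, a minimal userland sketch of the local() idea that leaves require untouched (the .iojs semantics are reduced to a single configured root; everything here is hypothetical):

// file: local.js
var path = require('path');

// returns a local() bound to a fixed modules root
module.exports = function makeLocal(root) {
  return function local(name) {
    return require(path.resolve(root, name));
  };
};

// file: entrypoint.js
// var local = require('./local.js')(path.join(__dirname, 'src'));
// var Model = local('lib/Model');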

Closing this for now. It is very unlikely that we'll make such a drastic change to such a delicate, locked API, but feel free to continue the conversation here. I would be interested in the results of your experiment, if you choose to release a package on npm!

@othiym23 (Contributor)

I would like to make a brief plug for using this problem space as an opportunity to model ways of introducing ES6 modules into Node, and writing a flexible, potentially metadata-driven System.loader implementation that allows you to use something like your .iojs manifest approach, @srcspider. The whole point of decoupling ES6 module loading from parsing and linking is to make it possible to write alternative loaders using the ES6 reflection API, so I bet it would be educational to see how far you could take this building on top of current transpilation and shimming tools like esnext and 6to5.
