4f #10
My current understanding is that deep linking require statements will still be allowed with native […]
@jokeyrhyme Yes, deeplinking would still work, but packages that port from […]
You may not want to mock us if you actually want buy-in from the CTC. (Which whatever is accepted in the end will require, obviously.)
I should probably make it clear that the CTC doesn't really like any of the solutions but that the extension is the most favorable of the bunch. Ideally we wouldn't need anything special but that isn't possible when you have two possible filemodes for a file and need to avoid double-parsing.
It is probably preferable if npm does not need to do anything. I think we'd probably cause even more of a mess if that were necessary?
Not exactly positive imo
It's pretty important that […]

Updating the cli to support new registry semantics and updating the npm registry to support them is, relatively, "easy". (I say that as the person who only has to be concerned with the former. =p) But how do you select variants for git dependencies? What about tarball dependencies? What do you do when talking to a third-party registry that hasn't implemented this yet? You might say "well, don't publish things to the third-party registry using this feature", except that many of them act as transparent caches for the npm registry.

Making changes to package selection semantics is, I think, best kept outside the scope of any es6 module proposal.
@Fishrock123 I did not mean to mock you in any way. I just hoped it is not a final decision and that it can still be changed. That is why I posted it as a comment here, in this repo, which seems to share the same sentiment.
I would be curious about why not?
Thank you for the insight @iarna. I think, though, that this could also be easily taken care of. Let me extend the proposal a bit:

**Git/Tarball dependencies**

The […]

**Support for old registries/proxies**

The current registry and proxies should have no problem supporting the "syntax" property but do not support package variants. When npm@4 tries to publish a package with multiple variants to a server that doesn't support multiple variants (a different endpoint), the package owner will be informed of this and needs to specify […]. If the proxies or registries don't upgrade they will by default get the CommonJS variant.
npm could help make this situation better. I think it is worth exploring this path. After all, this decision will affect pretty much everyone.
Ah, alright. Apology accepted. Generally we don't do specs and so anything that isn't existing functionality is up for change, to a degree of course.
This is a very, very big misinterpretation of the size of the ecosystem, of how much time people have to transition, and generally of the ability of downstream users to adapt on a dime. We'd rather be a bit "wrong" than hurt everyone hugely.
Perhaps, but why should they have to? If they must, node core has failed in a sense. Plus npm is already super busy so I'd rather we don't hold them up on this too.
Transitional packages are packages that contain both es6 and commonjs modules. There is no penalty for commonjs-only packages. Also, it needs to be clear that this "penalty" consists of a compile script that compiles mixed sources to commonjs. (Not at all a huge penalty imho.)
This could also make way for implementing frontend packages, which is on npm's agenda. Also, I think that implementing es6 modules will affect them quite a bit (I guess) in their work. So: @iarna's opinion surely matters a lot.
Any time I think about es6 modules I get uncomfortable feelings around forward compatibility for published modules. The current status quo, which is that you write your modules in whatever and compile to es5 at publish time, pretty much guarantees forward compatibility. That is, not only do newer versions of node & npm continue to work with older modules (backward compatibility), but older versions of node & npm will work with newer modules. This is critical because our usage data shows that users have been very slow to move forward. In fact, given the community growth rate, my gut feeling is that much of the migration to v4+ has been mostly new users. I would very much like any solution to work with npm@2 and Node 0.10 (or maaaybe 0.12). That is, the package.json should be set up such that if I want to make a multi-mode module w/ both es6 and es5, an older npm can install the module and an older node can load it.
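To make that concrete, here is a rough sketch (not from the proposal text itself; it borrows the `module` field from the `4d`-style proposals purely for illustration, and the package name and paths are placeholders) of a multi-mode package.json along those lines: `main` points at compiled es5 that an old npm can install and an old node can load, while newer tooling could pick up the es6 entry.

```json
{
  "name": "example-multi-mode",
  "version": "1.0.0",
  "main": "lib/index.js",
  "module": "src/index.js"
}
```

Older npm versions simply ignore fields they do not understand, which is what would make this shape forward compatible in the sense described above.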
As an aside: npm's support guarantees are broader than Node.js'. Currently npm only supports two major releases of npm itself. But it supports using them on versions of node going all the way back to 0.8. We're very much hoping to drop 0.8 support this year but inevitably I suspect we will always be supporting more versions of Node.js than Node.js does itself.
@iarna I totally agree with that sentiment and I see this proposal as the best proposal yet to actually support this:

> […]
(Note: I see this proposal `4f` a little bit like the 32-bit and 64-bit packages that are distributed in Linux package managers.)
There's nothing intrinsically wrong with that, but pushing it into an already established ecosystem feels untenable to me. I must be missing something… I'm gonna write through this... so you say:

> […]
So that implies, I guess, that the registry would return package metadata including something like this:

[…]

(Or maybe […].)

Or if the module ONLY supported es6 then it'd return:

[…]
Except I'm concerned that would set older versions of npm on fire, as they make lots of assumptions about shasum and tarball being set, so they wouldn't have an opportunity to give a reasonable error message. Sure, it wasn't gonna work for them, but I don't want to introduce modules to the ecosystem that, just by existing, are guaranteed to break older npms. And of course, if you swap that around and make […]
(None of this is to even speak about all-in-one-file packages, which are just JS files with the package.json embedded in the top. But I would be very happy to call those cjs ONLY.)
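Purely as an illustration of the two shapes being contrasted above (none of the values below come from the thread; the `variants` key, URLs and shasums are invented for this sketch): today's metadata keeps `shasum` and `tarball` under a top-level `dist`, which is what older npm clients assume, while a variant-aware layout might nest a `dist` per variant.

```json
{
  "name": "example-pkg",
  "version": "1.0.0",
  "dist": {
    "shasum": "0000000000000000000000000000000000000000",
    "tarball": "https://registry.example.org/example-pkg/-/example-pkg-1.0.0.tgz"
  },
  "variants": {
    "commonjs": {
      "dist": {
        "shasum": "1111111111111111111111111111111111111111",
        "tarball": "https://registry.example.org/example-pkg/-/example-pkg-1.0.0-commonjs.tgz"
      }
    },
    "es6": {
      "dist": {
        "shasum": "2222222222222222222222222222222222222222",
        "tarball": "https://registry.example.org/example-pkg/-/example-pkg-1.0.0-es6.tgz"
      }
    }
  }
}
```

Keeping the top-level `dist` pointing at one of the variants, as in this sketch, is one way an older npm could keep working unchanged; dropping it in favour of `variants` alone is the shape that raises the concern above.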
@iarna I see two different implementations, but neither of them would touch the response of the npm server:

One would be using HTTP headers: the npm client requests it either with an additional header […]

Another implementation would be using a different endpoint. If the endpoint doesn't exist there would be an error from old servers, and npm would fall back to commonjs and the old endpoint.

Edit: Of course the old endpoint would still respond with the old data, in order for old versions of npm to work as expected.

In short, this is just an extension of the current server.
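As a sketch of the different-endpoint idea (the payload below is hypothetical, not anything the registry actually serves): the existing metadata endpoint keeps returning the unchanged CommonJS data, while a new, optional endpoint could list the available variants; a 404 from it simply means "old server" and npm falls back to commonjs and the old endpoint.

```json
{
  "name": "example-pkg",
  "version": "1.0.0",
  "variants": {
    "commonjs": {
      "tarball": "https://registry.example.org/example-pkg/-/example-pkg-1.0.0-commonjs.tgz"
    },
    "es6": {
      "tarball": "https://registry.example.org/example-pkg/-/example-pkg-1.0.0-es6.tgz"
    }
  }
}
```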
@iarna Wouldn't that work?
I really liked that you stand up to the TC "decision" (hopefully), since I am also not quite happy with the current state.
That being said, given that I call the current proposal `4d`, I would like to add a few concerns I have with `4d` and would like to offer a `4f` (jumping over e). Maybe it could be useful?!

**Concerns**

- New properties (`modules`, `module` and `modules.root`) are added to `package.json` which have to be learned.
- The difference between `module` and `main` is not clear by looking at the code. (It is easy to mistake `module` for `main` when skimming the package.)
- […] `modules` property definition. […] `.tar.gz`, which means that when downloading the package that supports both it will be twice as big.

**Proposal `4f`**

Based on those concerns I thought about whether there might be a way to incorporate them into your work, and this is what I have come up with:
**One new property to `package.json`**

In order to figure out if the `.js` files in a package are CommonJS or ES6 we add a new property `"syntax": "es6"` to the `package.json`. All files in this package will automatically be treated as ES6, while `"syntax": "commonjs"` would specify that the files are treated as CommonJS.
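For concreteness, a minimal sketch of a package.json using the proposed property (the package name is a placeholder):

```json
{
  "name": "example-es6-package",
  "version": "1.0.0",
  "main": "index.js",
  "syntax": "es6"
}
```

Every `.js` file in such a package would be parsed as an ES6 module; `"syntax": "commonjs"` keeps today's behaviour, and existing packages without the property would presumably keep being treated as CommonJS.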
**Backwards compatibility through syntax variants**

Old Node.js versions do not support ES6. Loading a package that contains es6 would thus fail. In order to support this case we add support for syntax variants to `npm`! A new version of npm (say npm@4) supports a new "publish-time-only" property of the package.json: `"variants": {"commonjs": "./commonjs"}`. Every key in this property points to a folder. On publish, `npm` stores two variants of the package in the package storage: one with all folders except the `./commonjs` folder, and one where the `./commonjs` folder content is treated as the root.

This way we have two packages stored in the npm registry. The old packages would all be treated as commonjs.
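A sketch of what a transitional package might declare under this scheme (the name and layout are placeholders): ES6 sources at the package root, a pre-built CommonJS copy under `./commonjs`, and the publish-time-only `variants` mapping pointing at it.

```json
{
  "name": "example-transitional-package",
  "version": "1.0.0",
  "main": "index.js",
  "syntax": "es6",
  "variants": {
    "commonjs": "./commonjs"
  }
}
```

On publish, npm@4 would then store two packages: one with everything except `./commonjs`, and one where the contents of `./commonjs` become the package root (presumably with `syntax` set to `"commonjs"` in that variant's package.json).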
Old npm versions would automatically install the CommonJS variant of the package. The new version of npm would by default install the es6 variant of the package; a new `--syntax=CommonJS` flag would allow requesting the CommonJS variant in case npm@4 runs on an old version of node.

(Of course npm could simply assume `--syntax=CommonJS` if npm runs on an old version of node. Yet I still think a flag is good to have.)

npm could even go as far as installing the latest fitting CommonJS version of a package in the `--syntax=CommonJS` case.
**Transition support through Babel**

Packages that want to port their code from CommonJS to ES6 but are hard to update all at once could, for the intermediate time (until all modules are ported), simply use `Babel` (and `"syntax": "commonjs"`). This would add a processing penalty and more devDependencies to this case, but it would work.
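One possible shape of such a setup, assuming babel-cli with the es2015 preset compiles the mixed/ES6 sources to CommonJS before publishing; the folder names and the prepublish script are illustrative, not part of the proposal:

```json
{
  "name": "example-transitional-package",
  "version": "1.0.0",
  "syntax": "commonjs",
  "main": "lib/index.js",
  "scripts": {
    "prepublish": "babel src --out-dir lib --presets es2015"
  },
  "devDependencies": {
    "babel-cli": "^6.0.0",
    "babel-preset-es2015": "^6.0.0"
  }
}
```

That is the extra devDependency and processing penalty mentioned above; consumers still just see a CommonJS package.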
require("lodash/array.js
) would still work..js
file ending.syntax
is easier to understand and differentiate from the other properties in the package.json.One last concern
**One last concern**

With the simplicity of this comes the obvious lack of the possibility to properly mix CommonJS and ES6 modules. The reason why this would be required is that ES6 doesn't support all the features of CommonJS (namely: dynamic imports). I see it as a task for TC39 to improve the ES6 specification to support dynamic imports.
I would be very happy to hear your comments.