Pluggable Loaders to support multiple use cases #82
this is pretty much possible with the current system's hooks. using a combo of resolve and dynamic instantiate you can do everything from hot reloading to babel-style cjs named exports. |
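For context, a rough sketch of the kind of loader being referred to, written against the resolve/dynamicInstantiate hooks of the experimental --experimental-modules --loader API of that era. The hook names and signatures are recalled from the experimental docs and may not match exactly, and createRequire is used here only as a stand-in for "obtain a CJS require inside a loader"; treat the whole thing as an illustration, not a drop-in implementation.

```js
// cjs-named-exports-loader.mjs: hedged sketch of a "babel-style CJS named exports" loader.
// Usage (old experimental flags): node --experimental-modules --loader ./cjs-named-exports-loader.mjs app.mjs
import { createRequire } from 'module'; // stand-in for "get a CJS require" inside a loader
import { fileURLToPath } from 'url';

export async function resolve(specifier, parentModuleURL, defaultResolve) {
  // Only rewrite relative .js specifiers; defer everything else to the default behavior.
  if (/^\.{0,2}\//.test(specifier) && specifier.endsWith('.js')) {
    return { url: new URL(specifier, parentModuleURL).href, format: 'dynamic' };
  }
  return defaultResolve(specifier, parentModuleURL);
}

export async function dynamicInstantiate(url) {
  const path = fileURLToPath(url);
  // Load the file as CommonJS and mirror its exports object as named ESM exports.
  const cjsExports = createRequire(path)(path);
  const names = Object.keys(cjsExports);
  return {
    exports: ['default', ...names],
    execute(exports) {
      // Fill the pre-declared export slots from the CJS exports object.
      exports.default.set(cjsExports);
      for (const name of names) exports[name].set(cjsExports[name]);
    },
  };
}
```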
Like I wrote in #70, if developers need to still use Babel or I can see the benefit of maybe the package to be loaded having a flag in its |
@GeoffreyBooth I'm of the opinion that standardizing on ESM is of great value as it has the potential of sharing code without a build step between a variety of environments (not even just Node/ Browsers). If people are wishing to perform transformations that are not supported in all environments it makes some sense to me that it requires some configuration. I think continuing as we are with build tooling and required configuration should be seen as a red flag for your code working in all environments since some things like web browsers are not seeking to add such hooks at this time. I firmly believe getting unification of hosting environments and developer ecosystems is of higher value than anything else that ESM can provide. We should seek to provide solutions to use cases, but some use cases are served in simple ways like how importing bindings by name may not be necessary if you can read properties of an object similar to how the We can solve use cases in multiple ways, and that leads to the potential of multiple loaders which are tailored to specific use cases or environments in specific ways. I don't think shipping a specific loader is the best way to unify developers and actually encourages using tooling to continue to be relied on to assist in managing differences between environments. |
@bmeck The issue is, ESM doesn’t offer many benefits for Node users over CommonJS. There’s no reason the many thousands of Node developers will rush to use it. Some will, sure, but many others won’t see the need, just as many packages on NPM are still unapologetically using CommonJS is part of Node. It’s not just some other loader, like some userland thing that maybe Node tries not to break. If you want people to start using ESM, you need to provide them a way to use it in the world we live in now, where almost every dependency they import is a CommonJS module. And if that way is to push users off on userland solutions like Babel or Look, I’m all for getting to the nirvana of an ESM-only future. But you need to provide a path for people to get there, because they’re not going to just throw out every app they’ve ever written and every library they’ve ever used to start using some new spec that has no obvious advantages for them. |
@GeoffreyBooth I'm not suggesting they throw out applications they have written, and they can continue to use whatever compile to JS system they already do. I'm stating that the satisfaction of use cases might not match a specific compile to JS system and custom tailoring those loaders is probably going to continue. Node providing a compatible system with other environments is a higher priority to me than matching any given compile to JS system of today since people can continue to use those as they migrate. I don't think standardizing on a compile to JS system is a good idea. We have many tools using the ESM syntax for different semantics and need to admit that adopting and/or excluding loaders will have the same effect that you are seeking to avoid. You will break some amount of application, and also are breaking compatibility with other environments. I see unity in allowing loaders and creating a safe system for usability in other environments, not in encouraging semantics that enforce a specific behavior at odds with other environments. |
What’s the definition of “loader” here? Can a loader be something that converts CommonJS into ESM? Because that would be fine. That would solve the “import from CommonJS” use case. |
that could be something that a loader could in theory do, however it causes me great stress that node transpiling code is somehow something that people would be okay with. |
@GeoffreyBooth there are a ton of topics covered in that paragraph. Let's go with: it could be? it might not be? to all those questions. In general a loader is something that is given control of HostResolveImportedModule in some way. "Pluggable Loaders" as described above do something that isn't the default behavior. Having Node do something automatically would mean it is the default, so it probably wouldn't be called a loader in the context of "Pluggable Loaders" as described at the top of this issue. |
@bmeck So work with me here. Describe to me a solution that you would find acceptable. My criteria for success are:
I’m not a fan of transpilation either, that’s just meant as a shorthand of explaining what I want the loader to achieve. Let’s say we create a loader called
I don’t see the point of NPM not just adding it automatically unless there’s some other loader already configured, but if that makes a difference, fine. The point is that it’s not Babel or

It does seem odd that this wouldn’t be part of Node, though, as CommonJS is part of Node. This feels like something Node should solve or have a solution for built in, like it has core modules like |
These seem in conflict as you need to use tools to get their behavior if it has problems being converted to ESM. I would say to keep using tooling if you need the behavior of tooling and don't want ESM.
As with the first two points you are mandating the behavior of existing tooling, so keep using that tooling. Given those 3 points the only solution is to completely adopt one of the compiler chains and do it at runtime rather than implementing ESM. I don't think mandating the behavior of tools and then saying not to use the tools makes sense. The solution of just always using a runtime compiler similar to how Meteor does things satisfies your points but doesn't seem desirable to me. You could change the specification to comply with some specific tooling, but I'm not going to go into that since this was about what can be done today, I presume.
This is done through some level of compilation/manual hooking even with loaders since it has to manipulate how code functions. I think there might be confusion on how loading code is affected by loaders. Loaders just instrument the various parts of ESM. Breaking ESM constraints requires manipulation of behaviors in some way and not using ESM and/or doing code transforms. |
@bmeck Is there a solution that you would find acceptable where |
@GeoffreyBooth We can load the module namespace still, whatever that means. |
What if we ship a command line flag (like we do for esm) today that supports named exports of commonjs modules so users get the familiar user experience, but we show a warning when it is used, and ship Node with a tool that users can run on their project to convert "cjs named imports" to destructuring assignments? That would let users start by running their transpiled code natively with a flag, as well as give them an automatic tool to transition to more compliant ESM in the future. We can give babel and TypeScript users transforms that do this automatically too. Named exports would work between ESM modules anyway so the UX on those isn't hurt. |
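To make the proposed codemod concrete, here is a minimal before/after sketch; the package and export names are made up for illustration.

```js
// Before: relies on the loader synthesizing named exports for a CJS dependency.
import { readConfig, writeConfig } from 'legacy-cjs-pkg';

// After (what such a tool could emit): import the CJS exports object and destructure it,
// which works with plain default-only CJS interop.
import legacyCjsPkg from 'legacy-cjs-pkg';
const { readConfig, writeConfig } = legacyCjsPkg;
```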
@benjamingr the problem is the nature of "supports named exports" with that. If we can do it, it should be on by default and stay supported. It requires code transformation and I would be against it by default since it currently breaks how the specification requires things to act by doing behaviors like late binding which are not supported by the VM, or synthetic imports. Those breakages lead me to not want the idea to ever land if it is not on standards track for support by ESM. Shipping a tool that does this transform ahead of runtime for you seems fine to me. The tool needs to not do destructuring assignment as that loses liveness and doesn't have a clear way to determine if a module namespace being imported is CJS or ESM though. The biggest problem is setting up the transformation without knowing if the import is CJS or ESM. You have to determine the mode of your dependencies in order to transform your module. That requires all dependencies be available and is not suitable for localized isolation in a package manager if that changes over time (people move to/from one format to another). So, it probably wouldn't work at the library level, but it probably would work as an application tool. |
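A small illustration of the liveness point, assuming current CJS/ESM interop behavior; the file names and exports are hypothetical.

```js
// counter.cjs: hypothetical CJS module that mutates its exports object over time
exports.count = 0;
exports.increment = () => { exports.count += 1; };

// main.mjs
import counter from './counter.cjs';
const { count } = counter;   // destructuring copies the value at import time
counter.increment();
console.log(counter.count);  // 1: reading through the exports object sees the update
console.log(count);          // 0: the destructured copy is not a live binding
```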
Can you elaborate on why a preprocessor tool would not know that? |
@benjamingr not if it's |
So . . . what does that mean? 😄 It sounds like we have two options here regarding getting
Either or both options can be pursued. @bmeck you’re an expert on the spec, so I would encourage you to propose solutions for either option. Maybe the group doesn’t decide to pursue those solutions for one reason or another, or maybe you think they’re bad ideas, but if you were forced to come up with them, what would they be?
|
@GeoffreyBooth I think we need to also accept that there is another possibility... we don't support transparent interoperability, and to use common-js one must use import.meta.require. |
I have never considered that live bindings need to work with dynamic imports... somehow that makes them even more magical 😮 That said - wouldn't the overhead of wrapping it "in case" be negligible if it's only done for dynamic imports? |
Given a library Applications don't have this problem as they generally have dependencies pinned / the entire source tree when they are run. They are a great time to run tools that require pinned versions and the source tree to not change. |
What do you mean? You can import CJS with it.

```js
// main.mjs
import dep from './dep.js';
console.log(dep.x);

// dep.js
module.exports = {x: 12345};
```
|
Thanks - that explains things, but couldn't this be alleviated by requiring running the tool on install? |
@benjamingr even if you do it on install, that kind of workflow doesn't work when you run with symlinked dependencies, are manually updating things in your source tree, and/or don't use |
@MylesBorins That’s always an option, of course, but I’d prefer that be a last resort. That means users need to keep track of which of their dependencies are CommonJS and which are ESM, and refactor their code whenever a dependency switches from one format to another. When projects usually have dozens or hundreds of dependencies, that’s a burden; though I’m sure someone will write a tool to automatically convert import and require statements back and forth as necessary depending on the dependency. At the very least, I think such a tool should be built and released before Node’s module support is publicized, so that the release notes can say “use this tool to rewrite your code for you”.

But obviously it’s more user friendly if existing code works as is without needing refactoring, automatic or otherwise. |
I personally see it as the more compelling approach for a number of reasons, but am waiting until we finish determining features before pushing any particular approach.
|
You’d have to set up hot reloading prior to a module being imported, just like with CJS. |
@naturalethic you can't completely invalidate ESM due to spec idempotency requirements. It is just as @ljharb said: you have to make a module that does delegation to the currently "hot" module instead of replacing module records. Edit: A crude example of a module that does delegation, which only works for new calls to import():

```js
// hot-wrapper.mjs (hypothetical filename): delegates to whatever module is currently "hot"
export async function then (r) {
  r(import(await getHotModuleURL(...)));
}
```
|
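A hedged sketch of how such a delegating module might be consumed; the file name and the render export are hypothetical, and getHotModuleURL is assumed to be supplied by the hot-reload machinery. Because the wrapper exports a then function, every dynamic import() of it is assimilated as a thenable, so each call re-resolves to whatever module is currently "hot".

```js
// app.mjs: each import() of the wrapper goes back through its `then`,
// so it picks up the implementation that is "hot" at that moment.
const hot = await import('./hot-wrapper.mjs');
hot.render(); // hypothetical export on the currently "hot" module
```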
One thing I'd like to offer is leveraging an already vetted, tried, and completely extensible resolver that is async and leverages the default node algorithm oob. |
@TheLarkInn it would be good to support such a loader, but I think the discussion is more around how to enable essentially what the |
Most of the discussions here have been focused on the specific use case of using the loaders to transpile the code in some capacity. I have a quite different use case in mind, so please feel free to ask me to post this message in a separate thread if you think it'd keep discussions more manageable. We recently unveiled Yarn Plug'n'Play, whose goal is to generate static maps that Node would then be able to consume. Since Yarn knows everything about the dependency tree, including the location of the packages on the disk, it makes sense to fetch information directly from it - and it makes it possible to avoid creating the node_modules folders, amongst other benefits (whitepaper here). In order to achieve this, we currently require our users to use the All this to say: the |
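For readers who haven't seen the approach, here is a minimal sketch of the kind of static resolution data such a generated file could expose. The locators, paths, and map shape below are invented for illustration and are not the actual .pnp.js format; resolveToUnqualified is the only name taken from the real PnP API (it comes up again further down).

```js
// Hypothetical shape of the static data a package manager could generate at install time.
const packageMap = new Map([
  ['react-calendar@npm:2.1.0', {
    location: '/path/to/cache/react-calendar-2.1.0/',
    dependencies: { react: 'react@npm:16.4.0' },
  }],
  ['react@npm:16.4.0', {
    location: '/path/to/cache/react-16.4.0/',
    dependencies: {},
  }],
]);

// A resolver hook can then answer "where does `request` resolve from `issuerLocator`?"
// with a pure lookup, with no node_modules directory walking required.
// (Subpath handling, peer dependencies, etc. omitted.)
export function resolveToUnqualified(request, issuerLocator) {
  const issuer = packageMap.get(issuerLocator);
  const target = issuer && issuer.dependencies[request];
  if (!target) throw new Error(`Unsatisfied dependency: ${request} from ${issuerLocator}`);
  return packageMap.get(target).location;
}
```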
@arcanis would nodejs/node#18233 be sufficient? We are currently unable to land it if we were to PR it, but just check the design for now and see if anything is missing. With |
@arcanis the difficulty with this model is that it becomes very tricky to distinguish between a loader that is necessary for a package to work and a loader that is necessary for a specific project to work. For example, I might in development use Yarn Plug'n'Play, then publish that package to npm with the same Plug'n'Play loader declaration in the package.json. Now anyone installing this package would get the Yarn loader applied, even if they were consuming the package via node_modules with npm. So this is the current snag here on loaders: making this distinction clear and simple. Suggestions very welcome here! |
@bmeck I don't think it would be enough unfortunately, because of the distinction @guybedford mentioned - your PR adds support for per-package hooks, but in the case of PnP the hook must be registered globally (since all packages will need to use it in order to resolve their own dependencies). The use case would be covered by the "Global Composition" section, except that it would be a production loader, not a development-only one. Using the environment to set the global hooks is nice for debugging, but more impractical for production (it doesn't remove the need for

@guybedford What about a |
@arcanis I would be against adding non-package (application level) data to package.json. What exactly is the reason that this cannot be done on a per-package level?
|
|
if your individual package depends on yarn pnp it should still be package level. i don't want my deps being handled by some other package randomly. |
@bmeck Some reasons why I think
Plug'n'Play is enabled on an application-basis, not package-basis. You cannot have an individual package depend on PnP (hence why |
@arcanis all of those are reasons why I believe that it should be done on a per package level. You state that it is enabled on an application-basis, but why can it not be enabled on a per package basis? Most packages when you install them don't come with |
@arcanis For your use case, are there multiple entry points into the application that need the same loader-hook treatment, or is there generally only one entry point? |
@bmeck There are a few reasons:
In case it's not clear why per-package configuration wouldn't work, I recommend taking a look at the whitepaper - I think it might help clarify some points that can still be confusing, especially the Detailed Design section.
@zenparsing It's up to the user, they can have as many as they want, and it cannot be statically determined since they can add new ones after the install. Basically, each time they would usually run a script using |
@arcanis I did see the design, but I'm still unclear on why it needs to be that way. Maybe we should set up a call. I'm not sure, but it seems like there is some confusion or I'm missing something. I agree that the current whitepaper/RFC would not be an ideal fit for either global or per-package loaders. I'm interested in solving the use case, but if we must perfectly match that RFC it seems unlikely to be pleasant.
So can't a Loader just find that file/generate it as needed? A package manager could even make a big single shared
That sounds like you want a global hook, but as described in both the comments and a few notes in the RFC, using this resolution algorithm would be a breaking change. Wouldn't that mean that users should opt into this behavior? And if they should opt into this behavior, how would they do so to state they are using non-standard behavior, if not on a per-package level?
I don't understand this comment, why can't you have either linkage via symlinks like pnpm or use the default resolution algorithm to get to your loader? |
My goal in being here is to help make this pleasant to everyone. If the RFC has to consider new facts, so be it. I want to stress that we're really open to feedback and don't want to push this on anyone 🙂
I'm not entirely sure what you mean - a loader cannot generate that file, since it is meant to be generated by the package manager. Yarn already does make a big single shared .pnp.js file that works across all packages, so I'm not sure either that I understand correctly. If you mean "across all projects", this isn't possible - multiple projects have different resolutions (different lockfiles, in short) that cannot be merged, and the
There are a few points here that can be discussed (maybe it'd be better to mention them on the RFC thread, since people here might not be interested in Plug'n'Play's details?):
How do we achieve this compatibility? By strictly following the rules of
Neither symlinks nor the node resolution would solve the problem. Consider the following hierarchy:
What would you put in |
@arcanis I'm getting quite confused with a lot of implementation details of how your system works right now being thrown around. I'm going to just state how I would expect things to look given what I'm understanding:

```
/path/to/cache/[email protected]/package.json -> {"loader":"./.pnp.js"}
/path/to/cache/[email protected]/package.json -> {"loader":"./.pnp.js"}
/path/to/cache/[email protected]/package.json -> {"loader":"./.pnp.js"}
/path/to/cache/[email protected]/package.json -> {"loader":"./.pnp.js", "dependencies": {"foo":"2.0.0"}}
/path/to/cache/[email protected]/package.json -> {"loader":"./.pnp.js", "dependencies": {"foo":"^2"}}

/path/to/project-1/app.js -> require(`foo`) require(`bar`)
/path/to/project-1/package.json -> {"loader": "./.pnp.js"}
# unclear how this handles the `foo@2` nesting?
/path/to/project-1/.pnp.js -> [email protected], [email protected]

/path/to/project-2/app.js -> require(`foo`) require(`bar`)
/path/to/project-2/package.json -> {"loader": "./.pnp.js"}
# no nesting, simple
/path/to/project-2/.pnp.js => [email protected], [email protected]

/path/to/project-3/app.js -> require(`foo`) require(`bar`)
/path/to/project-3/package.json -> {"loader": "./.pnp.js"}
# no nesting if bar using [email protected]
# nesting if bar using [email protected]
/path/to/project-3/.pnp.js => [email protected], [email protected]
```

Given that

This could be done in a ton of other ways, the

We also face some problems from taint if we use methods of finding

We also face some problems of creating a distinction of

```
$ # using the repl
$ node
> require('/path/to/project-1')

$ # as a preload
$ node -r /path/to/project-1 sidecar

$ # bad APIs
$ node -e '...;require("module").runMain(...)'
```

I have serious concerns in creating systems that create this divergence, since it makes behavior very dependent on how things are used, and we don't currently have well defined paths for a variety of things in ESM. Things like

It seems like if you really need that distinction we should clarify what an |
I must be missing something: how come the |
@arcanis i would assume they exist in the cache as well, i am still unclear on what is project specific hence wanting a meeting. |
Some highlights from my understanding of what we discussed (@bmeck, @MylesBorins, please correct me if I'm mistaken somewhere):
A pseudo-implementation of what such a PnP loader would look like is below (@bmeck, let me know if I made a conceptual error here). Note that I made PnP more complex than it actually is (it currently returns actual folder paths, not hashes) to illustrate in the next snippet the use of the asset api.

```js
import {resolveToUnqualified} from './.pnp.js';

export default function pnpLoader(parentLoader) {
  return {
    // request=react-calendar/component
    loaderStepOne: (request, issuer) => {
      // return=TZhsV3bGQV2KZIjFIObr/component
      return resolveToUnqualified(request, issuer);
    },
    // request=TZhsV3bGQV2KZIjFIObr/component
    loaderStepTwo: request => {
      const {loader, ...rest} = parentLoader.loaderStepTwo(request);
      // Wrap the loader to substitute it by our own
      return {loader: pnpLoader(loader), ...rest};
    }
  };
}
```

And a pseudo-implementation for the default loader would be something like this (keep in mind this is SUPER pseudo-code; we haven't discussed code samples and this is just based on my rough understanding of how the asset api could work):

```js
import * as fs from 'fs';

export default function defaultNodeLoader(parentLoader) {
  return {
    // Note that the PnP loader would entirely shortcut this step, since
    // it implements its own step one.
    loaderStepOne: (request, issuer) => {
      // We need to run the full resolution since the extension and index.js
      // must be resolved in order to select the right node_modules
      return runNodeResolution(request, issuer, fs);
    },
    loaderStepTwo: async (request, issuer) => {
      // If there's a parent resolver, use it to resolve the assets (since it's
      // the only one that'll know how to use the special identifier
      // TZhsV3bGQV2KZIjFIObr/component that's been returned by
      // the PnP loader)
      const selectedFs = parentLoader ? parentLoader.assets : fs;
      // Then use the FS we've selected to run the resolution; we need to run
      // it again (even if we do it in the step one of this loader), because the
      // step one is not guaranteed to return a fully qualified path (the PnP
      // override wouldn't, for example)
      const qualifiedPath = runNodeResolution(request, issuer, selectedFs);
      // without PnP = /.../node_modules/react-calendar/component/index.js
      // with PnP = TZhsV3bGQV2KZIjFIObr/component/index.js
      // And then, once it has finished resolving the fully qualified path, it
      // can load it (still using the parent loader FS if needed)
      const body = await selectedFs.readFile(qualifiedPath);
      const loader = undefined; /* not sure how it will be loaded? */
      return {body, loader};
    }
  };
}
```
|
Closing this as there has been no movement in a while, please feel free to re-open or ask me to do so if you are unable to. |
As #81 is getting into implementation details of the loader I thought it might be a good idea to start discussing an idea I have been kicking around for a robust solution. "Pluggable Loaders". This is heavily inspired by nodejs/node#18392
TLDR; (bluesky)
Example loaders include
Questions to be answered
Thoughts?