Current Roadmap #17475

Closed
12 of 13 tasks
bartlomieju opened this issue Jan 20, 2023 · 45 comments

Comments

@bartlomieju
Member

bartlomieju commented Jan 20, 2023

We’ll be writing up a more detailed explanation of this soon, but for now take a look at these examples to see how these features play together:

  1. node-compat-app. package.json specifies dependencies, which can be imported from npm or deno.land/x. "type": "module" is necessary. Lockfile used is deno.lock. deno task reads from deno.json first, then falls back to package.json.

  2. deno-canonical-app. deno.json supports the import map keys. Can specify an npm package via npm specifiers or deno.land/x package via deno specifiers. Lockfile used is deno.lock. main.ts is identical in this to the first example.

  3. deno-lib
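As a rough sketch of how the first two setups differ, the key files could look something like this (file contents, package names, and versions here are illustrative guesses, not copied from the actual example repos):

```jsonc
// 1. node-compat-app: dependencies live in package.json ("type": "module" required).
// package.json
{
  "type": "module",
  "dependencies": {
    "chalk": "^5.0.0"
  },
  "scripts": {
    "dev": "deno run -A main.ts"
  }
}

// 2. deno-canonical-app: deno.json carries the import map keys instead,
// using npm: and https:// specifiers directly.
// deno.json
{
  "imports": {
    "chalk": "npm:chalk@^5.0.0",
    "oak": "https://deno.land/x/oak/mod.ts"
  }
}
```

In both cases the lockfile is deno.lock, and, as noted above, main.ts itself can stay identical between the two.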

Feedback is welcome!

Tasks:

@albnnc
Contributor

albnnc commented Jan 20, 2023

To begin with, I'm not trying to be rude or anything like that; maybe my English is not fluent enough. But yes, the following questions are very important to me personally and to a few people I know in person.

<questions>

Option to embed an import map in deno.json

Why? Import maps were perfect precisely because, in contrast with package.json, they are meant for application-level usage.

deno: specifiers

It looks really bad because of the previously noted ideology of decentralised module (not package) management in the Deno ecosystem. Why is deno.land/x better than nest.land or cdn.skypack.dev? If your answer is "because it's official and trusted", then people mostly won't try to use anything other than deno: or npm:. It seems that the ideology is going to crack even more. I also don't see any problem with just typing deno.land/x.

Moreover, Node's node: specifiers refer to Node's built-in modules, not remote packages/modules. To be honest, I would expect deno: specifiers to serve things located in the Deno.<...> namespace. Is that wrong?

It feels like deno.land/x/ could be something like x:, but still, the first argument seems valid. Why was the deno: name chosen? And what for?

When built-in Node.js modules are imported without node: prefix a friendly error message is shown:

What if I have my own fs module? Why should Node's fs module take precedence over Deno-specific code? I mean, why would Deno even try to advocate for Node-specific code? Again, it feels weird to use node:fs in a Deno environment (outside of npm:-imported packages).

Unflag npm: specifier usage in http/https imports.

Could you please consider adding NO_NPM environment variable support?

Bare specifiers via package.json and import map.

Future support for { "type": "commonjs" } for running legacy code.

One of the key reasons my company switched to Deno was the fact that Deno didn't support CJS. In the context of npm:-specifiers and Node compatibility, this point of the roadmap is OK; I just find the above-noted fact somewhat funny.

Replace deno lint with embedded ESLint

Is it going to be as fast as rust-based deno lint? Just kidding.

it's unclear if the community is willing to rewrite/port existing plugins to different API

The community is lazy, but those who aren't won't try to do anything for deno lint after these changes. You are doing so much heavy lifting to support npm:-specifiers, but you're going to drop another piece of existing beauty. If you want deno lint to be enterprise-ready that much, just add the ability to use eslint:<...>-style rules in the deno lint rules config as a temporary solution.

</questions>

Just before the first Deno release with npm:-specifiers support, you released a post where you said that "large number of [Deno users] have called out how refreshing it is to get away from all the pain of Node". You were right, and I was really worried about npm:-specifiers support. Take a look at that post:

There will be no node_modules folder

Yes, it looks right, but in many (maybe even most) cases the end user will end up typing --node-modules-dir in their console.

All the Deno tooling will work with this, from type checking, to the LSP, to deno vendor.

That also seems to be false.

Why would we want to use, let's be real, unstable npm:-specifiers with such a complex module-resolution strategy?

I tried to think of npm:-specifiers as a tidy Deno extension, but it looks like they're overtaking the whole ideology that inspired me before. The problem is that in some cases I won't be able to abstract away from Node and NPM, even if I just don't use them. Please, make me change my mind.

@imjamesb
Contributor

https://www.youtube.com/watch?v=M3BM9TB-8yA

@lino-levan
Contributor

lino-levan commented Jan 20, 2023

Second impression, after some time waiting for my impression to settle:

Most of these changes are starting to make sense to me. I understand that many people have ideological concerns with this one, and I still believe that there are more important things to work out, but at a fundamental level most of these make sense to me. Some people are treating this as if it were a random list of ideas, but I'm sure the Deno core team has done a lot of internal discussion on these.

The one thing I absolutely have questions about is deno: specifiers. I understand that there is demand for them, but I have no idea how they are practically going to be implemented. Let me break down my concerns, and hopefully we will be able to have a productive community discussion around them.

Will support semver resolution

How? Will deno: specifiers only work for modules that are using semver themselves? This sounds like a massive pain to deal with code-wise, but I think it's still useful to clarify what this means.

Edit 2: Looks like this isn't nailed down yet.

Will map to deno.land/x/ packages by default

The "by default" in this line is tripping me up. I guess that this implies that other registries will be supported (which I am a big fan of), but the question of how? comes up again.

deno.lock gets support for deno: specifiers, similar to npm: specifiers.

Not quite sure how this will work in relation to semver, but this SGTM. If I specify deno:some_module^1.2.2 and some_module releases 1.2.3, will the version be bumped? If so, this comes back to the problem of supply chain attacks à la faker/uwebsockets. I guess the response to this could be "don't use them" but why even build this if it obviously introduces security issues. I guess something like faker wouldn't be as bad because of the deno permissions model but accidental breakages will still happen if the semver isn't strictly followed by the module authors.

Import map unfurling on publish

I don't know what this means, please clarify?

An index of all versions of a package (maybe plus dependency info per version)

Cool for extendability, I like this one.

/x/package redirects to [list of possible resolution paths]

Are these in order of resolution? Like, will mod.ts get resolved before mod.js?

Also, doesn't this kind of reinforce the idea that mod.[ext] is the "official" entry-point to deno projects more than it already has? Not against that but just noting this down somewhere.

Edit: @crowlKats has clarified some stuff on discord. Do not quote him on this one, I assume it's all tentative, just writing this for others to hear

  • deno: specifiers are the best way to solve a hard/bad problem.
  • deno: specifiers will support semver via something like deno:some_lib^1.2.3
  • On packages not following semver, "trying to use semver syntax (like ^) would error I guess"
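For concreteness, the tentative syntax quoted above (deno:some_lib^1.2.3) could be handled by something like the following sketch. This is purely hypothetical: the specifier form is not nailed down, and nothing here reflects how Deno actually implements resolution.

```typescript
// Parse a hypothetical "deno:" specifier such as "deno:some_lib^1.2.3".
// "^" and "~" follow the usual npm-style meaning: same major / same major.minor.
type Triple = [number, number, number];

interface DenoSpecifier {
  name: string;
  op: "^" | "~" | "exact";
  version: Triple;
}

function parseDenoSpecifier(spec: string): DenoSpecifier | null {
  const m = /^deno:([A-Za-z0-9_]+)(\^|~)?(\d+)\.(\d+)\.(\d+)$/.exec(spec);
  if (!m) return null;
  return {
    name: m[1],
    op: (m[2] as "^" | "~" | undefined) ?? "exact",
    version: [Number(m[3]), Number(m[4]), Number(m[5])],
  };
}

// True if a concrete version `v` satisfies the parsed constraint.
function satisfies(v: Triple, s: DenoSpecifier): boolean {
  const cmp = v[0] - s.version[0] || v[1] - s.version[1] || v[2] - s.version[2];
  if (s.op === "exact") return cmp === 0;
  if (cmp < 0) return false; // below the stated minimum
  if (s.op === "^") return v[0] === s.version[0];
  return v[0] === s.version[0] && v[1] === s.version[1]; // "~"
}
```

A registry resolver would then pick the highest published version for which `satisfies` returns true; for a package not following semver, range syntax would simply error, matching the tentative answer quoted above.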

@crowlKats
Member

Import map unfurling on publish

I don't know what this means, please clarify?

It means that /x/ would resolve imports that use specifiers defined in an import map, so a module that uses import maps could be imported and used properly.

Also, doesn't this kind of reinforce the idea that mod.[ext] is the "official" entry-point to deno projects more than it already has? Not against that but just noting this down somewhere.

We already mention the convention of mod.ts in the manual, and the registry already has similar resolution for the doc view.

@albnnc
Contributor

albnnc commented Jan 20, 2023

A few more thoughts on deno:-specifiers.

Import map unfurling on publish

It doesn't feel OK to do this magically during publication on /x/.

I understand the idea, but it feels more natural to say that import maps are just not suitable for libraries and to suggest a few options to users:

  • Use deps.ts.

  • Use some sort of user-space (but denoland-related possibly) tool to transform code via import map. Btw, it should be fairly easy to implement.

An index of all versions of a package (maybe plus dependency info per version)

At first sight, I see two options:

  • The index could be stored on a CDN, like a package registry. This option is bad, since hosting a library or a module becomes a non-trivial task. Currently, you can just spin up an HTTP server, and you're ready.

  • The index could be stored as a file (like versions.ts) in the repo / library files. This option probably excludes dependency info from the index, but that looks like a non-problem, since we can always look for deps in the specific source code version directly. But what if a particular repo hosts only a limited subset of versions?

/x/package redirects to

What happened to this rule?

Again, this is going to break many ideological decisions that made Deno beautiful. We simply don't need it, since it's OK to write .../mod.ts.

Again, Deno replicates browser behaviour in many ways; Deno uses import maps because of the browser. Obviously, browsers won't support deno:-specifiers. I feel that Deno shouldn't introduce another standard here. I still think that library version problems are not Deno-specific. Please, keep the module-resolution process simple.

The Deno core team has said multiple times that the version-resolution problem should be solved in user space. Don't you believe in that anymore?

If the core team wants to handle versions, it might introduce another user-space tool (again, possibly related to a denoland repo). Let's say you have multiple modules with specific dependency URLs. You can just walk through the tree of dependencies and remap them according to semver rules by generating an import map (and possibly merging it with a user-defined one).

Interestingly, one will need to parse each URL slightly differently based on the particular CDN's URL pattern, but it could still be configurable and not bound to deno.land/x/. That being said, the PoC worked pretty well for me.

One might say that there is another problem besides bundling or runtime: type checks. Personally, I would prefer to ignore this "problem", because I've never encountered unsolvable type problems between dependencies during development for Deno.

What I'm trying to say is that it's not quite clear which problem is unsolvable without deno:-specifiers.

@bartlomieju
Member Author

bartlomieju commented Jan 21, 2023

@albnnc

I'll try to answer some of your questions now, as we are still working on the roadmap:

Option to embed an import map in deno.json
Why? Import maps were perfect precisely because, in contrast with package.json, they are meant for application-level usage.

The team's thinking is that it would be handier. I'm not a fan of this solution myself; I would prefer the import map to be embedded under an importMap key (so it could be either a string pointing to an import map file or an object).
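The two shapes being weighed here could look roughly like this (sketched from the comment, not from a finalized design):

```jsonc
// deno.json, variant A: "importMap" as a string pointing to an external file
{
  "importMap": "./import_map.json"
}

// deno.json, variant B: "importMap" as an inline object
{
  "importMap": {
    "imports": {
      "oak": "https://deno.land/x/oak/mod.ts"
    }
  }
}
```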

It feels that deno.land/x/ could be something like x:, but still, the first argument seems valid. Why is deno: name chosen? And what for?

The main point is that we need to provide semver resolution, which we were unable to figure out with http/https imports. We are still thinking about other names and about providing a way for other registries to register their own prefix.

When built-in Node.js modules are imported without node: prefix a friendly error message is shown:
What if I have my own fs module? Why should Node's fs module take precedence over Deno-specific code? I mean, why would Deno even try to advocate for Node-specific code? Again, it feels weird to use node:fs under Deno environment (not inside npm:-imported packages).

If you have an entry in an import map that specifies an fs module, then the error would not be changed in any way. That point should have been clarified.

Unflag npm: specifier usage in http/https imports.
Could you please consider adding NO_NPM environment variable support?

There's --no-npm flag that disables support for npm packages altogether. Is that enough for your use case?

Bare specifiers via package.json and import map.
Future support for { "type": "commonjs" } for running legacy code.
One of the key reasons my company switched to Deno was the fact that Deno didn't support CJS. In the context of npm:-specifiers and Node compatibility, this point of the roadmap is OK; I just find the above-noted fact somewhat funny.

Some frameworks/build tools from npm still produce CommonJS output (e.g. Next.js). We are still evaluating whether this is needed, but with our current knowledge it is; it would be silly to support developing projects with these frameworks but not allow actually deploying/running the project. No one's happy about it, but it's an escape hatch (we're considering adding a warning about running CommonJS modules).

Replace deno lint with embedded ESLint
Is it going to be as fast as rust-based deno lint? Just kidding.
it's unclear if the community is willing to rewrite/port existing plugins to different API
The community is lazy, but those who aren't won't try to do anything for deno lint after these changes. You are doing so much heavy lifting to support npm:-specifiers, but you're going to drop another piece of existing beauty. If you want deno lint to be enterprise-ready that much, just add the ability to use eslint:<...>-style rules in the deno lint rules config as a temporary solution.

This was removed from the roadmap for now. deno lint already supports all built-in ESLint rules, but has no plugin functionality and there's no way forward to provide compatibility with existing ESLint plugins. I did an experiment with ESLint that you can find here: https://github.com/bartlomieju/eslint_binary Hope it clears some points for you (there is a path to improve performance of ESLint a lot, I know other people are working on that as well: https://twitter.com/marvinhagemeist/status/1616766023235174402).

There will be no node_modules folder
Yes, it looks right, but in many (maybe even most) cases the end user will end up typing --node-modules-dir in their console.

That's true, but the fact is many frameworks from npm assume that there will be a local node_modules directory, and they use this assumption heavily when resolving packages (e.g. Vite). We were looking at providing plugins for such frameworks, but it's not always possible, as some of that resolution logic is hard-coded and cannot be overridden.

EDIT: Questions from other comment

Import map unfurling on publish

It doesn't feel OK to do this magically during publication on /x/.

That could happen locally via a CLI.

At first sight, I see two options:

The index could be stored on a CDN, like a package registry. This option is bad, since hosting a library or a module becomes a non-trivial task. Currently, you can just spin up an HTTP server, and you're ready.

The index could be stored as a file (like versions.ts) in the repo / library files. This option probably excludes dependency info from the index, but that looks like a non-problem, since we can always look for deps in the specific source code version directly. But what if a particular repo hosts only a limited subset of versions?

The registry would be open source, and users would be free to host their own instances. Still, these are all additive changes. You will be able to use existing http/https imports just fine if you don't like this solution.

Again, this is going to break many ideological decisions that made Deno beautiful. We simply don't need it, since it's OK to write .../mod.ts.

It's not set in stone that this redirect will happen. It's a current roadmap, but we'll be evaluating all these points as they are about to be implemented.

Again, Deno replicates browser behaviour in many ways; Deno uses import maps because of the browser. Obviously, browsers won't support deno:-specifiers. I feel that Deno shouldn't introduce another standard here. I still think that library version problems are not Deno-specific. Please, keep the module-resolution process simple.

It's not another standard; all of this is based on the import maps standard. Since import maps are not composable, we need to find another way to make it easier for users to create "a collection of ES modules", which is essentially a library or package (off-topic: it's been growing on me, and I think it's a missing bit in the ES modules standard that could have saved the community a whole lot of headaches if it had been defined).

@bartlomieju
Member Author

@lino-levan

Same deal here:

Will support semver resolution

How? Will deno: specifiers only work for modules that are using semver themselves? This sounds like a massive pain to deal with code-wise, but I think it's still useful to clarify what this means.

I'm not sure I understand the question. AFAIK no registry validates whether a package follows semver (i.e. ensures there are no breaking changes in patch and minor releases). It's up to the library author.

Will map to deno.land/x/ packages by default

The "by default" in this line is tripping me up. I guess that this implies that other registries will be supported (which I am a big fan of), but the question of how? comes up again.

We are looking at making the registry protocol public so that other registries can implement it, and at allowing users to register their own prefixes with the runtime.

Import map unfurling on publish

I don't know what this means, please clarify?

Libraries in the registry could use bare specifiers and have an import map. During the publishing step, processing would be applied that replaces bare specifiers with the relevant entries from the import map.
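A minimal sketch of that publishing-time rewrite could look like this. A real registry would use a proper parser (e.g. swc) rather than this deliberately naive regex, which only catches static quoted specifiers after `import` or `from`; the std version in the example below is arbitrary.

```typescript
// Replace bare specifiers in import statements with their import map entries,
// so the published source no longer needs the map at runtime.
function unfurl(
  source: string,
  importMap: { imports: Record<string, string> },
): string {
  return source.replace(
    /((?:from|import)\s+)(["'])([^"']+)\2/g,
    (full, kw: string, q: string, spec: string) => {
      const mapped = importMap.imports[spec];
      // Specifiers not in the map (relative paths, full URLs) pass through.
      return mapped ? `${kw}${q}${mapped}${q}` : full;
    },
  );
}
```

So `import { assert } from "asserts";` plus a map entry for `"asserts"` would be rewritten to the full URL, while `import "./local.ts";` stays untouched.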

/x/package redirects to [list of possible resolution paths]

Are these in order of resolution? Like, will mod.ts get resolved before mod.js?

Also, doesn't this kind of reinforce the idea that mod.[ext] is the "official" entry-point to deno projects more than it already has? Not against that but just noting this down somewhere.

This needs to be specified further. I believe it would be a setting on the registry (effectively it would be a redirect: you configure your library in the registry to redirect to the module ./<module_name> if there's no path).

@lino-levan
Contributor

@bartlomieju

I'm not sure I understand the question. AFAIK no registry validates whether a package follows semver (i.e. ensures there are no breaking changes in patch and minor releases). It's up to the library author.

Ah sorry, I should have been more clear. My question was regarding how you would deal with libraries that aren't versioned using semver. A lot of libraries use something like calver or literally just update names.

We are looking at making the registry protocol public so that other registries can implement it, and at allowing users to register their own prefixes with the runtime.

This sounds good to me but yet again this feels not very well defined.

Libraries in the registry could use bare specifiers and have an import map. During the publishing step, processing would be applied that replaces bare specifiers with the relevant entries from the import map.

I need to think about this one to really determine how I feel about it. Definitely mixed feelings on this one. On one hand, it would be really convenient, on the other hand I share the concerns of @albnnc.

This needs to be specified further. I believe it would be a setting on the registry (effectively it would be a redirect: you configure your library in the registry to redirect to the module ./<module_name> if there's no path).

As far as I can tell, the registry currently has no settings page for module authors to interact with. In my view, this is a good thing (opinionated > non-opinionated). I agree that this does need to be specified further.

@ayame113
Contributor

ayame113 commented Jan 21, 2023

I agree with the following suggestions to improve compatibility with Node.js.

  • Bare specifiers via package.json and import map.
  • package.json is auto-discovered, similar to deno.json
  • deno task can run scripts from package.json
  • Node built-in modules are available to be imported with node: prefix
  • When global Node.js objects are referenced, Deno will error out with a friendly error message
  • When built-in Node.js modules are imported without node: prefix a friendly error message is shown
  • Unflag npm: specifier usage in http/https imports.

However, I don't feel the need to introduce the following behavior, which is incompatible with both Node.js and browsers.

  • Add support for deno: specifiers
    • Will support semver resolution
    • Will map to deno.land/x/ packages by default

The Deno-specific module resolution with deno: feels a bit too complicated. Importing via an http: URL is not too bad thanks to the completion provided by the LSP.
Also, using semver can lead to security vulnerabilities. (reference: Open source maintainer pulls the plug on npm packages colors and faker, now what?)

Updating a version-pinned module is a bit of a pain at the moment. However, I think this should be solved not by introducing semver, but by officially providing automatic update tools such as deno-udd and dependabot.

@albnnc
Contributor

albnnc commented Jan 21, 2023

@bartlomieju

Thank you for the reply, I appreciate this.

Option to embed an import map in deno.json

Why? Import maps were perfect precisely because, in contrast with package.json, they are meant for application-level usage.

It is team's thinking that it would be more handy - I'm myself not a fan of this solution - I would prefer if import map was embedded under importMap key (so it could either be a string pointing to an import map file or an object).

importMap.imports would be great indeed, but it is likely a matter of taste. It should be fine, since nobody is forced to use this feature.

Why was the deno: name chosen? And what for?

The main point is that we need to provide semver resolution, which we were unable to figure out with http/https imports. We are still thinking about other names and about providing a way for other registries to register their own prefix.

I will try to ask some concrete questions about this at the end of this comment.

If you have an entry in an import map that specifies an fs module, then the error would not be changed in any way. That point should have been clarified.

My idea is that Deno should probably suggest deno:fs or custom:fs rather than node:fs. It feels very controversial that Deno would suggest something from Node over anything else. If deno:-like specifiers are going to be implemented, then it would be nice to search for suggestions in any of the namespaces (URIs) and to figure out a way of determining the priority of a particular suggestion.

There's --no-npm flag that disables support for npm packages altogether. Is that enough for your use case?

Use case: the team wants to forbid the usage of npm:-specifiers in the code, even in transitive dependencies. Since one doesn't want to specify --no-npm all the time, the only way to validate the absence of npm:-based URLs is some sort of CI job that runs something like deno check --no-npm. It would be nice to be able to set NO_NPM=1 in .env or .zshrc.

This feels a little off-topic. If this idea is adequate, I can submit a separate feature request.
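The CI-side policy described here could also be approximated with a tiny user-space check. This is a hypothetical helper, not an existing flag or tool; in practice the specifier list could come from the module graph (e.g. deno info output), but here it is just passed in:

```typescript
// Fail when any specifier in a dependency list uses the npm: scheme,
// enforcing the kind of team policy described above.
function findNpmSpecifiers(specifiers: string[]): string[] {
  return specifiers.filter((s) => s.startsWith("npm:"));
}

function assertNoNpm(specifiers: string[]): void {
  const offenders = findNpmSpecifiers(specifiers);
  if (offenders.length > 0) {
    throw new Error(`npm: specifiers are forbidden: ${offenders.join(", ")}`);
  }
}
```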

Some frameworks/build tools from npm still produce CommonJS output (e.g. Next.js). We are still evaluating whether this is needed, but with our current knowledge it is; it would be silly to support developing projects with these frameworks but not allow actually deploying/running the project. No one's happy about it, but it's an escape hatch (we're considering adding a warning about running CommonJS modules).

Yes, for now, Next.js produces legacy CJS. Yes, for some people it would be better to just wait for Next.js to drop CJS support, or to just not use Next.js at all. Supporting all (possibly legacy) frameworks has nothing in common with building an elegant and stable platform. Why would anyone even try to build something like fresh or Aleph.js if Next.js worked perfectly?

The really bad thing about it is not only that fresh / Aleph.js projects are going to be disadvantaged, but also that we will have fewer reasons to use deno.json, since package.json is going to be more universal.

This idea is somewhat acceptable, but it seems very strange to implement it if "no one's happy about it". Maybe it's better to just not support Next.js officially.

This was removed from the roadmap for now. deno lint already supports all built-in ESLint rules, but has no plugin functionality and there's no way forward to provide compatibility with existing ESLint plugins. I did an experiment with ESLint that you can find here: https://github.com/bartlomieju/eslint_binary Hope it clears some points for you (there is a path to improve performance of ESLint a lot, I know other people are working on that as well: https://twitter.com/marvinhagemeist/status/1616766023235174402).

Thanks for the careful answers and the interesting links. I think the reasoning for ESLint looks clearer now.

It doesn't feel OK to do this magically during publication on /x/.

That could happen locally via a CLI.

Although the integration of deno unfurling with tools like deno vendor looks questionable, this feature might be really useful. My vote is for this to stay a tool, not a magical feature of /x/.

The registry would be open source and users would be free to host their own instances.

That's good, but it looks really similar to NPM. How many other NPM registries are popular?

Still, these are all additive changes. You will be able to use existing http/https imports just fine if you don't like this solution.

This is probably only partially true. Yes, we'll be able to use raw URLs. But once semver resolution becomes popular enough, it will be impossible. I mean, how would one use a library with deno:-specifiers in its code without a registry?

...this is going to break many ideological decisions that made Deno beautiful. We simply don't need it, since it's OK to write .../mod.ts.

It's not set in stone that this redirect will happen. It's a current roadmap, but we'll be evaluating all these points as they are about to be implemented.

Thanks for that. Please continue to use GitHub for this too, since it's sometimes almost impossible to track things on Discord. GitHub is also very good for keeping history.

Again, Deno replicates browser behaviour in many ways; Deno uses import maps because of the browser. Obviously, browsers won't support deno:-specifiers. I feel that Deno shouldn't introduce another standard here. I still think that library version problems are not Deno-specific. Please, keep the module-resolution process simple.

It's not another standard, all of this is based on import maps standard. Since import maps are not composable we need to find another way to make it easier for users to create "a collection of ES modules" which is essentially a library or package (offtopic: it's been growing on me and I think it's a missing bit in ES modules standard that could have saved the community a whole lot of headaches if it had been defined).

I tend to disagree.

To begin with, I think that ESM spec is OK, because it defines how modules are written and combined and not how registries (or collections) are built. The good part of it is that it's simple.

What I don't understand is how the proposed idea is completely based on the import maps spec. Yes, you can say that module resolution involves magical internal import map generation. But that's really controversial, since this process will probably be hidden and unclear, while import maps are supposed to be application-level and mostly unchangeable. Moreover, user-defined import maps should somehow be taken into account during this process.

The most abnormal part of it (which is really hard to accept) is that the result isn't fully deterministic and depends on the current state of the registry index.

Analysing this further, it looks like deno:-specifiers idea splits into the following parts:

  • deno:-like URI handling;
  • @<semverRange> support and dynamic resolution.

Again, some of us just don't see any need for support of deno:-like URIs, since it looks like they aren't required even for semver-based module resolution. Could you please explain why we need deno: or custom: prefix support?

Talking about @<semverRange> support, the reasoning is understandable. But isn't it solvable in user space by generating that magic deduping import map on the user's side, just before using some sort of deno unfurling?

Yes, you might think that we need to support some sort of ^{version} or >={version}, but, to be honest, this might just be configurable on the end user's side when they decide to shrink the module tree.

@KnorpelSenf
Contributor

Regarding import maps:

Import map unfurling on publish

I don't know what this means, please clarify?

Libraries in the registry could use bare specifiers and have an import map. During publishing step the processing would be applied that replaces bare specifiers with relevant entries from the import map.

Please don't add anything like a build step. I love that I can write a source file, publish it anywhere, and run it again via its URL. I love even more that this is the only way and it's the right way. It's so straightforward and dead simple.

As soon as you start introducing weird features that mess around with the source files at any point along this chain, you're breaking this trust. I could then no longer be sure about what I publish and what people run (unless I understand the internals of import maps and make sure it doesn't affect me).

This means

  • import maps for modules on /x are bad (I'm mostly a library author and I have never ever felt like we needed a solution here beyond deps.ts)
  • automatic redirects to certain file names are bad (it's not ‘cleaner’ if you can suddenly import a directory of files, let's just write the damn filename in there)

I share the hesitation for some of these features. Yes, we really need a way to do semver and remote module deduplication, but please let's not rush this, keep up the communication with us, and let's build the right system. It would be a shame to repeat the mistakes of Node.

@dsherret
Member

We should break this issue up into multiple issues so the conversation is focused on each point.

Again, Deno replicates browser behaviour in many ways, Deno uses import maps because of browser. Obviously, browsers won't support deno:-specifiers.

@albnnc they can, via an import map. It will be trivial to generate an import map from the output of deno info --json and use modules that use semvered specifiers (deno: specifiers, name pending) with the browser and other tools.

Again, some of us just don't see any need for the support of deno:-like URIs, since it looks like they aren't required even for semver-based module resolution. Could you please explain, why we need deno: or custom: prefix support?

The key here is that these semvered specifiers provide a way for module authors to publish the constraints of their dependencies rather than exact versions. Module authors usually have the best knowledge of what semver range of their dependencies work well with their module instead of module consumers.

Another important part is that Deno will automatically handle this so users don't need to worry about creating an import map themselves--when Deno reads semvered specifiers in loaded code, it will create an implicit import map that Deno handles for the end users out of the box. We could ask people to try following a convention for this and then manually generate an import map, but this is not a great experience.

Please don't add anything like a build step. I love that I can write a source file, publish it anywhere, and run it again via its URL. I love even more that this is the only way and it's the right way. It's so straightforward and dead simple.

@KnorpelSenf You will still be able to do this. This is an optional, registry-only (non-CLI change) build step for people who want to use bare specifiers while authoring modules.

@albnnc
Contributor

albnnc commented Jan 22, 2023

We should break this issue up into multiple issues so the conversation is focused on each point.

That would be nice.

Obviously, browsers won't support deno:-specifiers.

they can via an import map.

This exact line is about the complete behaviour of deno:-specifiers, including semver-based module resolution. The browser won't treat @>=1.2.3 as something special; module resolution will still be dead simple.

Obviously, one will be able to make the browser resolve @>=1.2.3 to the exact version that Deno will target itself (e.g. using deno info, like you said), but this is not the same module resolution process at all. Deno may change the tree (e.g. on registry index changes), while the browser won't do such hidden things.

I wrote a little more about deno: URIs and semver-based resolution in a later comment.

The key here is that these semvered specifiers provide a way for module authors to publish the constraints

Good point, but the counterarguments are the following:

  • A casual Deno app doesn't need to dedupe its deps. It's OK to just use exact versions and URLs.

  • Even if you have a reason to use a single version of the same library (e.g. for a frontend project), you can handle this using user-space tooling.

That tooling might, for example, treat @x.y.z as @^x.y.z by default. This would be quite a good default behavior, since this is the way semver is supposed to work.

We could ask people to try following a convention for this and then manually generate an import map, but this is not a great experience.

I agree, and this is what @KnorpelSenf was writing about. Build steps aren't good, but I still think they're better than registry magic. That being said, I think our common point is that Deno doesn't need semver-based module-resolution logic at all.

@petamoriken
Contributor

petamoriken commented Jan 22, 2023

It feels strange to me that deno: is mapped to deno.land/x. I think deno: should be mapped to deno.land/std, with another specifier for third-party modules.

@KnorpelSenf
Contributor

You will still be able to do this. This is an optional, registry-only (non-CLI change) build step for people who want to use bare specifiers while authoring modules.

I'm aware that this won't break existing functionality. I just disagree that it's a good idea to provide a second and less explicit way to do the same thing, just because it looks cute.

@dsherret
Member

dsherret commented Jan 22, 2023

I'm aware that this won't break existing functionality. I just disagree that it's a good idea to provide a second and less explicit way to do the same thing, just because it looks cute.

It's already technically possible for people to do this. We will just be providing an easier way to do it. Also, IMO using import maps isn't something that is functionally reduced down to "looks cute". Import maps provide a way to map bare specifiers to all your dependencies stored in a single file.

@KnorpelSenf
Contributor

KnorpelSenf commented Jan 22, 2023

We should break this issue up into multiple issues so the conversation is focused on each point.

I agree, could you split it up and link to the created issues here? :)

It's already technically possible for people to do this. We will just be providing an easier way to do it. Also, IMO using import maps isn't something that is functionally reduced down to "looks cute". Import maps provide a way to map bare specifiers to all your dependencies stored in a single file.

Right, my bad: the part that's useless and looks cute is being able to drop /mod.ts. I still don't really like the build step part, but one point that makes it more okay is that the actual code, as one would import it, will still be visible on /x. It's merely the GitHub repo that's no longer perfectly in sync with the module on /x, but that's certainly something I could live with.

@aapoalas
Collaborator

aapoalas commented Jan 22, 2023

Pre-warning: I disagree with adding deno: imports and adding automatic import map "unfurling" to /x/ publishing. I shall try to pose a few questions that I believe are significant enough to really put into question the benefits of these, especially deno: imports as a means to implement semver-based deduplication of dependencies.

Pulling in two directions

The first issue I see is that plain deno: imports with semver support, and import map unfurling seem to be orthogonal to one another.

The first promises to make importing from the /x/ registry a bit easier, while also providing semver support that Deno can then use to deduplicate dependencies in the same vein as NPM and Deno's NPM support do. The ease of importing, while cute, doesn't seem to be that big of a deal: Saving the keystrokes between deno:library@tag/customEntry.ts and https://deno.land/x/library@tag/customEntry.ts is not a great benefit, it's mostly just a bit of convenience (note here I am specifically referring to static version imports, not loose semver). I feel the same could be achieved with an LSP suggestion to import-map a URL import.

Semver support is an added feature that, as @dsherret mentions, can already be done either manually or by an external tool. Achieving deduplication with this feature requires that all of your (deep) dependencies also use semver imports, however. (See below for more discussion on common usage.)

The second promises to make authoring libraries with import maps easier. Fair enough, but unfurling an import map means that eg. a deno:library@^semver/ entry in the import map will now be unfurled into a specific version in the published library. This directly fights against the above deduplication-via-semver feature.

Common usage

NPM achieves deduplication of dependencies through a common usage of the registry with common (though occasionally misused or broken) semver based package version naming. Each package will declare its dependencies with the same semver format, and thus deduplication becomes possible.

The /x/ registry does not do this, and the obvious intention seems to be that this would not be forced, or even necessarily recommended. Import maps unfurling would even do the exact opposite of this.

As a result, if you wanted to see your /x/ imports deduplicated, you'd need to use only such modules (or are they now libraries?) on /x/ that use semver. Conversely, if you wanted to only use static version imports you'd need to select your dependencies in such a way as to only use statically importing dependencies. This is an issue on NPM as well, where I believe the usual choice is to include deep dependencies as your own dependencies to force the version. With Deno the answer would of course be to manually import-map these dependencies.

For semver deduplication, any library you import that uses static imports would always cause a duplication to occur. For example, you import deno:Alib@^3.0.0/mod.ts and some other library, which happens to import https://deno.land/x/[email protected]/mod.ts. Deno would probably map the ^3.0.0 import to the newest 3-series version of this library, which may or may not be 3.1.6. A duplication may occur.
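The duplication scenario can be made concrete with a small sketch. The available version list and the resolution rule below are assumptions for illustration; real semver resolution is more involved.

```typescript
// Versions of Alib assumed to exist on the registry.
const available = ["3.0.0", "3.1.6", "3.2.0"];

// Naive caret resolution: highest version sharing the major number.
// (Lexicographic sort is good enough for this toy version list.)
function resolveCaret(range: string, versions: string[]): string {
  const major = range.replace("^", "").split(".")[0];
  return versions.filter((v) => v.startsWith(major + ".")).sort().at(-1)!;
}

const fromRange = resolveCaret("^3.0.0", available); // "3.2.0"
const fromPinnedUrl = "3.1.6"; // e.g. https://deno.land/x/[email protected]/mod.ts
const duplicated = fromRange !== fromPinnedUrl; // true: two copies loaded
```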

Counter point: Why wouldn't Deno just notice that Alib is already imported with version 3.1.6 through an HTTP import, and then choose to use that version? Well, that would mean that Deno would need to essentially resolve the HTTP import backwards into a deno: import, and deduplicate the two imports from there. But that means that Deno inherently knows how the /x/ registry's version tagging works, meaning that Deno is now strongly bound to the /x/ registry just like Node is bound to NPM, repeating that mistake. Furthermore, the same couldn't (easily) be done for other registries, as their versioning (tagging) strategy may not be the same. And if Deno can work back from a statically versioned HTTP import to a plain import with semver, then is there actually any need for the plain imports to begin with? Could Deno or the /x/ registry simply add support for semver in the HTTP import path?

I ran out of time so I'll come write some more later.

P.S. I support going all-in on NPM for semver stuff. Deno has already spent good hours supporting NPM, why not lean in on that? Is there truly a need to make /x/ be a big thing, competing with NPM? Can /x/ not be its own, opinionated registry for those of us who'd rather not have all the extra?

@aapoalas
Collaborator

Continuing from the previous:

Playing to your strengths

As I mentioned in the afterword of my first comment, I have not really understood the intended benefits of adding further automation and features to /x/, specifically as it comes to semver.

I can kind of understand new users not liking that automatic index.js sniffing isn't there, or that there's no package.json to declare dependencies, or having to use a deps.ts for dependency imports when authoring libraries. These can be inconvenient and odd if one is used to NPM, but they don't seem like deal-breakers to me. Most importantly, with Deno supporting NPM imports, I do not see any issue with these sorts of users simply continuing to publish to NPM and having their library be imported from there.

NPM has great strengths in its wide usage, its audit tooling, and its automatic semver-based deduplication. Deno supports NPM now, so it seems wise to me to utilise this strength and lean into it instead of trying to reinvent it.

The /x/ registry has its own strengths: its simplicity, and the runtime independence brought about by that (excepting runtime-specific API usage, which one cannot really guard against, as we see with Node-only modules being published there). Its intentional lack of metadata to eg. guide users by engine / runtime, by peer dependency etc. can be viewed as a fault, but it is also very much a choice made by the registry, presumably with full knowledge of what it will mean.

In trying to change /x/ to be more like NPM, the Deno team seems to be playing to the weaknesses of the registry and fighting against the strengths of NPM. Given that NPM even explicitly invites non-Node code through its engines field, it would seem to me much more reasonable to play into the strengths instead:

Make publishing Deno libraries to NPM easy and convenient, and keep on improving the NPM support. Keep the /x/ registry as a simple, opinionated registry with static version dependencies. If in the future userland or the ECMA spec moves in a direction where library imports and semver become possible, then take that step there, with the spec, instead of with magic built into the runtime. There already exists some work to this end with Web Bundles.

As said in the beginning, I don't understand what the intended benefit of adding semver into /x/ is. If it is for more registry usage then why? I doubt there's a financial benefit. If it is for Deno usage then I expect NPM is a stronger route to that. For Deno Deploy as well I expect getting further NPM support has a better return on investment than adding another NPM contender.

Binding Deno strongly to /x/ in addition to the already existing NPM binding seems like it would only add more complexity to both the runtime binary and the registry, neither of which really needs it, as NPM support already does it all in a better way with a bigger addressable market.

@aapoalas
Collaborator

aapoalas commented Jan 22, 2023

Another important part is that Deno will automatically handle this so users don't need to worry about creating an import map themselves--when Deno reads semvered specifiers in loaded code, it will create an implicit import map that Deno handles for the end users out of the box. We could ask people to try following a convention for this and then manually generate an import map, but this is not a great experience.

Couldn't Deno just as well handle this automatically in such a way that the /x/ registry would send in its responses a special header if a particular published module was contained in a library where an import map was present? Deno could then automatically download said import map (the header contains a URL or relative path) and include it as a scoped map in the user's (implied) import map. A CLI command could be used to create a complete import map this way, possibly with automatic helpers for user-based deduplication. The best part here would be that this would not tie Deno to /x/ at all, as any other registry could pretty trivially include the same header in its response to get the same effect.
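The header-based idea could end up producing a scoped import map like the sketch below. The shapes follow the import-maps format, but the merge rule, names, and URLs are assumptions for illustration.

```typescript
// Shapes follow the import-maps format; the merge rule is an assumption.
interface ImportMap {
  imports: Record<string, string>;
  scopes?: Record<string, Record<string, string>>;
}

// Attach a library's own import map as a scope keyed by the library's base
// URL, so its bare specifiers only apply to modules under that URL.
function addScopedMap(
  user: ImportMap,
  libraryBaseUrl: string,
  libraryImports: Record<string, string>,
): ImportMap {
  return {
    ...user,
    scopes: { ...user.scopes, [libraryBaseUrl]: libraryImports },
  };
}

const merged = addScopedMap(
  { imports: {} },
  "https://deno.land/x/[email protected]/",
  { "collections/": "https://deno.land/[email protected]/collections/" },
);
// The library's bare "collections/" prefix now resolves only within its scope.
```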

It would also work as a relatively neat halfway point between NPM-like automatic semver deduplication, and /x/'s strict static imports: By default scoped import maps would retain the static nature of the imports but either manual manipulation or automatic helpers could be used to do deduplication when and where it seems possible. Given the free-form nature of /x/ publishing it seems like semver-deduplication would need to be best-effort anyway, as eg. deno:Alib@^3.0.0 could've been import mapped to something else entirely, which would of course break the basic assumptions of the deno: semver imports.

EDIT: There's an issue I didn't really consider here properly: automatic joining of import maps is not yet defined in the spec (a proposal exists but it's not yet fully decided on), and thus including the algorithm in Deno would risk possible future breaking changes. If the automatic joining was only provided as a CLI command or script instead, i.e. "generate an import map that joins my current dependencies' import maps", then future breaking changes could be avoided: any import map created today would still work in the future, even if the specific composition algorithm that Deno uses to create these import maps changed in between.

@KnorpelSenf
Contributor

When taking a step back, Deno really just runs the ES modules that work on the web. Any problem related to dependencies will appear both for Deno modules and for ES modules on the web. Consequently, we should use/invent a web standard for this. What would the ECMA spec say about semantic versioning and dependency deduplication?

@Kycermann

Is this issue planned to be fixed in Q1?

@bartlomieju
Member Author

bartlomieju commented Jan 23, 2023

Is this issue planned to be fixed in Q1?

@Kycermann Yes, it's covered by #17264

@aapoalas
Collaborator

When taking a step back, Deno really just runs the ES modules that work on the web. Any problem related to dependencies will appear both for Deno modules and for ES modules on the web. Consequently, we should use/invent a web standard for this. What would the ECMA spec say about semantic versioning and dependency deduplication?

The ECMA spec doesn't really consider dependencies, libraries, or semantic versioning to be part of its cup of tea. The closest we get to libraries is probably Web Bundles (https://github.com/WICG/webpackage) and bundle preloading (https://github.com/WICG/bundle-preloading), but there too the answer is pretty much just a lock-file-like "is this the same file I got last time" type of checking.

Browsers really do not have the "time" to ponder such things as a dependency graph and its possible deduplications. The only tools for the job, really, are the import map or a ServiceWorker since either of those can be used to reroute imports to another location. I think that's also the only thing ECMA or WICG will ever give us. WinterCG might change that though.

@abflow

This comment was marked as off-topic.

@dsherret

This comment was marked as off-topic.

@abflow

This comment was marked as off-topic.

@dsherret

This comment was marked as off-topic.

@yi-huan

yi-huan commented Apr 15, 2023

Feels like it's become another nodejs

@PoignardAzur

We're way past Q1 2023 now, soon into Q3. Is another roadmap coming?

@KnorpelSenf
Contributor

#18836 provides some insight already

@PoignardAzur

Right, but of the two issues that are pinned to the repo, one is for Q1 2023 and one is for the month of April. Since we're almost at mid-June, maybe a new roadmap should be published, or these issues should at least be renamed?

@KnorpelSenf
Contributor

I agree very much with you that there's a lack of transparency about where this project will be taken :)

@dsherret dsherret changed the title Q1 Roadmap Current Roadmap Jun 10, 2023
@dsherret
Member

This is the current roadmap. I updated the title.

@Kycermann

Kycermann commented Jun 25, 2023

Custom import resolvers, please, so that we could import:

  • Files from other languages (the custom resolver would compile them to WASM, wrap that in a JavaScript WASM loader, and return that file)
  • .less or .sass or .css as a react hook (import { useStyles } from "./style.less")
  • Markdown files could be imported as React components, etc
  • Files matching *.worker.js|ts could be imported as classes for creating new instances of web workers

@bombillazo

bombillazo commented Aug 25, 2023

Hey, is Resource Management and the using keyword (now supported in TS 5.2) something in the roadmap for Deno as well?

@bartlomieju
Member Author

Hey, is Resource Management and the using keyword (now supported in TS 5.2) something in the roadmap for Deno as well?

Yes, we plan to ship it in v1.37 next month, but with a caveat that it will only work in TypeScript files.

@bombillazo

1.37 released! I was testing it and wondering: is destructuring possible with the using keyword?

await using { client } = await getConnection();

The deno-ts linter doesn't like it:

'await using' declarations may not have binding patterns.deno-ts(1492)

@bartlomieju
Member Author

@bombillazo using is currently only supported in TS files (V8 lacks support for it). If you encounter more problems, please open a new issue.
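For anyone hitting TS1492: a using or await using declaration must bind a plain identifier, so one workaround is to bind the resource and destructure on the next line. Everything below (Connection, getConnection) is a hypothetical stand-in, and it assumes a runtime with Symbol.asyncDispose (e.g. Deno 1.37+ or Node 20.4+).

```typescript
// `using`/`await using` declarations may only bind a plain identifier
// (TS error 1492), so bind the resource first, then destructure.
class Connection {
  client = { query: (sql: string) => `ran: ${sql}` };
  disposed = false;
  async [Symbol.asyncDispose]() {
    this.disposed = true; // cleanup runs when the declaring block exits
  }
}

async function getConnection(): Promise<Connection> {
  return new Connection();
}

async function main(): Promise<string> {
  await using conn = await getConnection();
  const { client } = conn; // destructure separately; `conn` stays disposable
  return client.query("select 1");
}
```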

@PoignardAzur

PoignardAzur commented Sep 26, 2023

What does "only supported in TS files" mean? Does Deno do some transpilation to convert using directives to try-finally-style code?

@bartlomieju
Member Author

What does "only supported in TS files" mean? Does Deno do some transpilation to convert using directives to try-finally-style code?

Correct; just like in TS 5.2, these declarations are down-emitted, and the same happens in Deno using SWC.
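As a rough sketch of what that down-emit amounts to: a using declaration becomes a try/finally around the resource's Symbol.dispose method. This is a hand-written approximation, not the actual tsc or SWC output (the real emit uses helper functions and carefully merges errors thrown from the body and the disposer), and it assumes a runtime with Symbol.dispose (e.g. Deno 1.37+ or Node 20.4+).

```typescript
// What `using res = open(); ...` roughly becomes after the down-emit.
const order: string[] = [];

function open() {
  return {
    use() {
      order.push("body");
    },
    [Symbol.dispose]() {
      order.push("disposed");
    },
  };
}

function run() {
  const res = open();
  try {
    res.use(); // the block body that declared `using res`
  } finally {
    res[Symbol.dispose](); // always runs, even if the body throws
  }
}

run();
// order is now ["body", "disposed"]
```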

@PoignardAzur

Oh, that's pretty cool!

@bartlomieju
Member Author

Going to close this one now. We'll come up with a new roadmap after new year.

@dsherret dsherret unpinned this issue Dec 6, 2023