From 62f78dc080e2f97ad302a25a7c6c4dcd1ac33826 Mon Sep 17 00:00:00 2001 From: MrBBot Date: Thu, 8 Feb 2024 22:07:04 +0000 Subject: [PATCH] [edge-preview-authenticated-proxy] Deploy validation fixes (#4940) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * ci: only deploy Workers when pushing to special branches (#4848) Previously every push to `main` that contained changes to a Worker package would cause that Worker to be deployed "to production". This meant that we had to be very careful about merging PRs that touch these Workers' files. Now production deployment is only run for a Worker on a push to its special deployment branch (e.g. `deploy-worker/playground-preview-worker`). This allows us to land PRs to `main` and then collect up changes to release in a single go by updating the special deployment branch in a controlled way. Moreover, it is also easy to see what changes will be released by looking at the difference between `main` and the deployment branch for the Worker package directory. The three workers in this repo now follow this pattern: - edge-preview-authenticated-proxy - format-errors - playground-preview-worker In addition, the Playground Preview Worker has end-to-end tests that are run on every push to the `main` branch, and also on any PRs that have the "playground-worker" label. For these tests the Worker is deployed to the testing environment before running the tests against this deployment. * C3: bump `create-next-app` and handle new `next.config.mjs` defaults (#4863) * bump create-next-app from 14.0.4 to 14.1.0 * C3: handle new next.config.mjs default files * [wrangler] Support ai bindings in getBindingsProxy (#4869) * Retain AI service binding from miniflare options * Update changeset * chore: bump `workerd` to `1.20240129.0` (#4874) * [C3] Bump create-vue from 3.9.1 to 3.9.2 in /packages/create-cloudflare/src/frameworks (#4870) * [C3] Bump create-vue in /packages/create-cloudflare/src/frameworks Bumps [create-vue](https://github.com/vuejs/create-vue) from 3.9.1 to 3.9.2. - [Release notes](https://github.com/vuejs/create-vue/releases) - [Commits](https://github.com/vuejs/create-vue/compare/v3.9.1...v3.9.2) --- updated-dependencies: - dependency-name: create-vue dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] * [C3] Update frameworks cli dependencies --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater * Exclude standardPricingWarning from dryrun (#4880) * Exclude standardPricingWarning from dryrun That way we do not need authentication for dryrun, again. * fixup! Exclude standardPricingWarning from dryrun * fixup! Exclude standardPricingWarning from dryrun --------- Co-authored-by: petero-dk <2478689+petero-dk@users.noreply.github.com> * Implements Python support in miniflare. (#4873) * Fix changesets (#4885) * fix dario-piotrowicz's changesets * fix existing C3 changesets and amend the changeset text in the creation logic * Version Packages (#4854) Co-authored-by: github-actions[bot] * [C3] Bump create-qwik from 1.4.2 to 1.4.3 in /packages/create-cloudflare/src/frameworks (#4881) * [C3] Bump create-qwik in /packages/create-cloudflare/src/frameworks Bumps [create-qwik](https://github.com/BuilderIO/qwik/tree/HEAD/packages/create-qwik) from 1.4.2 to 1.4.3.
- [Release notes](https://github.com/BuilderIO/qwik/releases) - [Commits](https://github.com/BuilderIO/qwik/commits/v1.4.3/packages/create-qwik) --- updated-dependencies: - dependency-name: create-qwik dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] * [C3] Update frameworks cli dependencies --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater * [wrangler] fix: listen on loopback for wrangler dev port check and login (#4830) The `wrangler dev` port availability check triggers a firewall prompt on macOS while it briefly opens and closes listeners. The OAuth callback server from `wrangler login` has the same issue. Fix both cases by listening on the loopback address only by default. Fixed some new test failures by using locally available IP addresses: wrangler:test: ● wrangler dev › ip › should use to `ip` from `wrangler.toml`, if available wrangler:test: listen EADDRNOTAVAIL: address not available 1.2.3.4:8787 wrangler:test: wrangler:test: ● wrangler dev › ip › should use --ip command line arg, if provided wrangler:test: listen EADDRNOTAVAIL: address not available 5.6.7.8:8787 Relates to #4430 * remove outdated and no longer valid miniflare development section from CONTRIBUTING.md (#4893) * [C3] Bump @angular/create from 17.1.1 to 17.1.2 in /packages/create-cloudflare/src/frameworks (#4892) * [C3] Bump @angular/create in /packages/create-cloudflare/src/frameworks Bumps [@angular/create](https://github.com/angular/angular-cli) from 17.1.1 to 17.1.2. - [Release notes](https://github.com/angular/angular-cli/releases) - [Changelog](https://github.com/angular/angular-cli/blob/main/CHANGELOG.md) - [Commits](https://github.com/angular/angular-cli/compare/17.1.1...17.1.2) --- updated-dependencies: - dependency-name: "@angular/create" dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] * [C3] Update frameworks cli dependencies --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater * [Stream] WebRTC Template Improvements (#4157) * Add port numbers to fix stackblitz automatic startup * Pull the variables up and show clear instructions * fixup! Pull the variables up and show clear instructions --------- Co-authored-by: Peter Bacon Darwin * fix: return 4xx error from playground preview worker on invalid tokens (#4904) Previously, we were passing user-controllable token inputs (`X-CF-Token` header, `user` search param, `token` cookie) directly to the `DurableObjectNamespace#idFromString()` function. This validated the token to be a valid Durable Object ID, but threw a reportable `TypeError`, generating a 500 response and increasing the errored request count. This is a user error, so shouldn't be reported. This change wraps calls to `idFromString()` inside `try`/`catch`es, and re-throws non-reportable user errors that generate 4xx responses. We could just validate that the tokens were 64 hex digits before calling `idFromString()`, but this approach ensures the IDs are for the `UserSession` object by validating the signature encoded in the ID too. * chore: rename deprecated Vitest `TestContext.meta` properties to `TestContext.task` (#4897) This is a precursor to updating to latest Vitest. 
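To illustrate the loopback-only approach described for #4830 above, here is a minimal sketch of a port-availability probe that binds to 127.0.0.1 rather than all interfaces; the helper name is hypothetical and this is not wrangler's actual implementation:

```ts
// Minimal sketch: probe port availability on the loopback address only,
// so the short-lived listener never binds to all interfaces.
import net from "node:net";

// Hypothetical helper name, for illustration only.
function isPortAvailable(port: number, host = "127.0.0.1"): Promise<boolean> {
	return new Promise((resolve) => {
		const server = net.createServer();
		// e.g. EADDRINUSE if taken, EADDRNOTAVAIL if the host is not local
		server.once("error", () => resolve(false));
		server.listen(port, host, () => {
			server.close(() => resolve(true));
		});
	});
}

// Usage sketch: pass the host from `--ip`/`wrangler.toml` when set,
// otherwise fall back to loopback.
isPortAvailable(8787).then((free) => console.log({ free }));
```

Because the probe only ever listens on loopback, macOS no longer treats it as a listener accepting external connections, which is what triggered the firewall prompt.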
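Similarly, a minimal sketch of the 4xx validation pattern described for #4904: the `InvalidTokenError` class and handler shape here are assumptions for the example (not the worker's actual code); `UserSession` and `X-CF-Token` are taken from the description above.

```ts
// Illustrative sketch: turn invalid user-supplied tokens into 4xx responses
// instead of letting `idFromString()` throw a reportable TypeError (500).
class InvalidTokenError extends Error {
	// Marked as a user error so it is not reported (assumption for this sketch).
	readonly reportable = false;
	readonly status = 400;
}

function parseUserSessionId(
	namespace: DurableObjectNamespace,
	token: string
): DurableObjectId {
	try {
		// Throws a TypeError if `token` is not a valid Durable Object ID for this
		// namespace; this also checks the signature encoded in the ID.
		return namespace.idFromString(token);
	} catch {
		throw new InvalidTokenError("Invalid token");
	}
}

export default {
	async fetch(request: Request, env: { UserSession: DurableObjectNamespace }) {
		try {
			const token = request.headers.get("X-CF-Token") ?? "";
			const id = parseUserSessionId(env.UserSession, token);
			return env.UserSession.get(id).fetch(request);
		} catch (e) {
			if (e instanceof InvalidTokenError) {
				return new Response(e.message, { status: e.status });
			}
			throw e; // unexpected errors remain reportable
		}
	},
};
```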
* [wrangler] fix: stop rebuild attempts when sources are missing (#3886) (#4877) * [wrangler] fix: stop rebuild attempts when sources are missing (#3886) * fixup! [wrangler] fix: stop rebuild attempts when sources are missing (#3886) * fixup! [wrangler] fix: stop rebuild attempts when sources are missing (#3886) * fixup! [wrangler] fix: stop rebuild attempts when sources are missing (#3886) * Update .changeset/hip-files-count.md Co-authored-by: Pete Bacon Darwin * !fixup: [wrangler] fix: stop rebuild attempts when sources are missing (#3886) * !fixup: [wrangler] fix: stop rebuild attempts when sources are missing (#3886) --------- Co-authored-by: Peter Bacon Darwin Co-authored-by: Pete Bacon Darwin * [C3] Bump create-remix from 2.5.1 to 2.6.0 in /packages/create-cloudflare/src/frameworks (#4903) * [C3] Bump create-remix in /packages/create-cloudflare/src/frameworks Bumps [create-remix](https://github.com/remix-run/remix/tree/HEAD/packages/create-remix) from 2.5.1 to 2.6.0. - [Release notes](https://github.com/remix-run/remix/releases) - [Changelog](https://github.com/remix-run/remix/blob/main/packages/create-remix/CHANGELOG.md) - [Commits](https://github.com/remix-run/remix/commits/create-remix@2.6.0/packages/create-remix) --- updated-dependencies: - dependency-name: create-remix dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] * [C3] Update frameworks cli dependencies --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater * fix: ensure that the Pages dev proxy server does not change the Host header (#4888) * fix: ensure that the Pages dev proxy server does not change the Host header Previously, when configuring `wrangler pages dev` to use a proxy to a 3rd party dev server, the proxy would replace the Host header, resulting in problems at the dev server if it was checking for cross-site scripting attacks. Now the proxy server passes through the Host header unaltered making it invisible to the 3rd party dev server. Fixes #4799 * fixup! fix: ensure that the Pages dev proxy server does not change the Host header * fixup! fix: ensure that the Pages dev proxy server does not change the Host header * fixup! 
fix: ensure that the Pages dev proxy server does not change the Host header * fix: fallback to returning stack trace if `format-errors` broken (#4908) * expose cf object from Miniflare instances (#4905) * expose cf object from Miniflare instances --------- Co-authored-by: MrBBot * Add a `ctx` field to the `getBindingsProxy` result (#4922) * split get-bindings-proxy fixture tests in multiple files * add ctx to getBindingsProxy * add ctx tests to get-bindings-proxy fixture --------- Co-authored-by: MrBBot * fix(d1): intercept and stringify errors thrown in --json mode (#4872) * fix(d1): intercept and stringify errors thrown in --json mode * fix: add light parsing on d1 execute error message * add test * wip * implement JsonFriendlyFatalError * address PR feedback * fix: don't report invalid `format-errors` input (#4911) * [D1] Add user friendly D1 validation error messages for `dev --remote` and `deploy` (#4914) * chore: refactor user friendly errors and add 10063 code handling * chore: add user friendly error message to deployments * chore: lint * chore: cleanup directory * chore: fix remote error handling * chore: add patch change set * chore: move ABORT_ERR logic out of the user friendly error handler * chore: fix syntax * Update .changeset/fair-shoes-melt.md Co-authored-by: Pete Bacon Darwin --------- Co-authored-by: Pete Bacon Darwin * [D1] teach wrangler how to fetch insights about D1's queries (#4909) * [D1] teach wrangler how to fetch insights about D1's queries * add options for sorting and number of results * add a means of filtering the datetime geq/leq * add more options, fix tests * add wrangler banner, add warning text * add changeset * rename last to timePeriod * use string format over numbers as >1mo's data could be possible * Update red-icons-flow.md * make it possible to order by count * Update red-icons-flow.md * PR feedback * Update .changeset/red-icons-flow.md Co-authored-by: Pete Bacon Darwin * address PR feedback --------- Co-authored-by: Pete Bacon Darwin * [wrangler] test: fix E2E tests (#4907) * ci: ensure E2E tests failures not reported as successes #4458 accidentally left `continue-on-error: true` in the workflow. This meant that E2E test failures weren't reported as check failures. * test: remove redundant `deploy.test.ts` E2E test This test was a strict subset of `deployments.test.ts`, and wasn't providing any additional value. * test: remove standard pricing warning from E2E test output This warning may not be shown if an account has opted-in to standard pricing. This change normalises output to remove it, meaning tests can be run on any account, regardless of opt-in state. * test: update format of writing logs message in E2E tests * test: use `fallback` instead of `default` in E2E tests #4571 updated the message logged in non-interactive contexts for default values to use the word `fallback` instead of `default`. * fix: ensure `--log-level` argument applied immediately Some messages were being logged before the `--log-level` argument was applied. In particular, this meant the "writing logs to" message at `debug` level, was not output when using `--log-level=debug`. 
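For the Pages dev proxy change in #4888 above, a minimal sketch of a pass-through proxy that keeps the client's Host header intact, assuming an upstream dev server on port 8791 as in the accompanying `pages-proxy-app` fixture (illustrative only, not wrangler's proxy code):

```ts
// Illustrative sketch: a pass-through proxy that preserves the incoming Host
// header so the upstream dev server sees the same Host the browser sent.
import { createServer, request as httpRequest } from "node:http";

const UPSTREAM_PORT = 8791; // assumed port of the 3rd-party dev server

createServer((req, res) => {
	const upstream = httpRequest(
		{
			host: "127.0.0.1",
			port: UPSTREAM_PORT,
			method: req.method,
			path: req.url,
			// Copy headers verbatim, including Host, rather than letting the
			// client library replace Host with "127.0.0.1:8791".
			headers: req.headers,
		},
		(upstreamRes) => {
			res.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
			upstreamRes.pipe(res);
		}
	);
	req.pipe(upstream);
}).listen(8790);
```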
* test: improve reliability of `dev.test.ts` E2E test This change makes a few improvements to `dev.test.ts`: - Shorter timeouts for `fetch()` retries - Replaces `get-port` with native port `0` implementation - Waits for stdout handlers to be registered before starting `wrangler dev` - Ignores errors if we failed to find a process to kill (assumes the process was already killed) * fix: throw `UserError`s for R2 object/bucket not-found errors These errors should not be reported to Sentry. * global install * Ensure internal workers listen on ipv4 only * shorter timeouts --------- Co-authored-by: Samuel Macleod * Python support (#4901) * Implements wrangler deploy/dev support for Python. * Migrate to `--no-bundle` approach * Basic support for `requirements.txt` * address review comments * Strip package versions * Support deploying * Address comments * Fix tests --------- Co-authored-by: Dominik Picheta * Version Packages (#4891) Co-authored-by: github-actions[bot] * [C3] Add `getBindingsProxy` support to qwik template (#4927) * Split package.json update into multiple phases * Add getBindingsProxy to qwik template * Add support for testing dev scripts in c3 e2e tests * Refactor framework e2e verification helpers * Add support for verifying build script in framework e2es * Refactor e2e helpers * Refactor workers e2e tests to re-align with frameworks tests * Refactor RunnerConfig in e2e tests * Fixing cli e2e tests * changeset * remove leftover test value * Fix issue with npm tests & fix e2e logging * Addressing PR feedback * [playground-preview-worker] fix: don't report invalid upload input (#4937) * fix: don't report invalid `edge-preview-authenticated-proxy` URLs (#4939) * [D1] print wrangler banner at the start of every D1 command (#4938) * fix: make the entrypoint optional for the `types` command (#4931) --------- Co-authored-by: MrBBot * add a `cf` field to the `getBindingsProxy` result (#4926) --------- Co-authored-by: James Culveyhouse * Improve DX with `node:*` modules (#4499) * Turn node build failures into warnings * Allow files to suppress warnings for specific modules * Remove comment-based allow-listing * Add changeset * Update .changeset/chatty-balloons-impress.md Co-authored-by: James M Snell * Pass through `defineNavigatorUserAgent` * Add tests * lockfile * Linting * fix tests --------- Co-authored-by: James M Snell * improve(r2): Update Sippy endpoint request payloads to match new schema (#4928) * fix(r2): update Sippy API request payloads * improve(r2): rename sippy flags for clarity * improve(r2): stricter existence in Sippy * [C3] Bump create-qwik from 1.4.3 to 1.4.4 in /packages/create-cloudflare/src/frameworks (#4935) * [C3] Bump create-qwik in /packages/create-cloudflare/src/frameworks Bumps [create-qwik](https://github.com/BuilderIO/qwik/tree/HEAD/packages/create-qwik) from 1.4.3 to 1.4.4. - [Release notes](https://github.com/BuilderIO/qwik/releases) - [Commits](https://github.com/BuilderIO/qwik/commits/v1.4.4/packages/create-qwik) --- updated-dependencies: - dependency-name: create-qwik dependency-type: direct:production update-type: version-update:semver-patch ...
Signed-off-by: dependabot[bot] * [C3] Update frameworks cli dependencies --------- Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater * fix: allow `port` option to be specified with `unstable_dev()` (#4953) * Extend error handling of proxy request errors in ProxyWorker (#4867) * ignore stale proxy errors * attempt to recover from ProxyWorker fetch errors by requeuing the request (only if it is GET or HEAD) * add test * add changeset * refactor to flatten nested if-else blocks * requeue request for retry at front of queue * sort batch of requests in queue by order of arrival * Revert "sort batch of requests in queue by order of arrival" This reverts commit 3329f19b195a58637e8f0de62ef179bb90adbc29. * Revert "requeue request for retry at front of queue" This reverts commit f0377b7c26575805d9f974403446cf8adee2f34a. * prioritise requests being retried over requests being proxied for the first time * better comments * update changeset to match recommeded format * Update five-cooks-share.md (#4956) * Version Packages (#4934) Co-authored-by: github-actions[bot] --------- Signed-off-by: dependabot[bot] Co-authored-by: Pete Bacon Darwin Co-authored-by: Dario Piotrowicz Co-authored-by: James Culveyhouse Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Wrangler automated PR updater Co-authored-by: petero-dk <2478689+petero-dk@users.noreply.github.com> Co-authored-by: Dominik Picheta Co-authored-by: workers-devprod@cloudflare.com <116369605+workers-devprod@users.noreply.github.com> Co-authored-by: github-actions[bot] Co-authored-by: Peter Wu Co-authored-by: Taylor Smith Co-authored-by: Peter Bacon Darwin Co-authored-by: Magnus Dahlstrand Co-authored-by: Max Rozen <3822106+rozenmd@users.noreply.github.com> Co-authored-by: Nora Söderlund Co-authored-by: Samuel Macleod Co-authored-by: Dominik Picheta Co-authored-by: James M Snell Co-authored-by: Siddhant Co-authored-by: Rahul Sethi <5822355+RamIdeas@users.noreply.github.com> --- .changeset/c3-frameworks-update-4855.md | 5 - .changeset/c3-frameworks-update-4856.md | 5 - .changeset/c3-frameworks-update-4857.md | 5 - .changeset/c3-frameworks-update-4858.md | 5 - .changeset/eleven-carrots-happen.md | 7 - .changeset/forty-needles-turn.md | 7 - .changeset/four-teachers-push.md | 8 - .changeset/happy-pandas-sparkle.md | 12 - .changeset/wet-lemons-wash.md | 6 - .../generate-c3-dependabot-pr-changeset.mjs | 2 +- .github/workflows/e2e.yml | 19 +- ...ploy-edge-preview-authenticated-proxy.yml} | 27 +- ...rs.yml => worker-format-errors-deploy.yml} | 30 +- ...-playground-preview-deploy-production.yml} | 27 +- ...ker-playground-preview-deploy-testing.yml} | 36 +- CONTRIBUTING.md | 35 - fixtures/dev-env/package.json | 2 +- fixtures/dev-env/tests/index.test.ts | 53 +- ...ts => get-bindings-proxy.bindings.test.ts} | 36 +- .../tests/get-bindings-proxy.caches.test.ts | 34 + .../tests/get-bindings-proxy.cf.test.ts | 43 + .../tests/get-bindings-proxy.ctx.test.ts | 55 + fixtures/get-bindings-proxy/tests/shared.ts | 13 + .../tests/specified-port.test.ts | 47 + fixtures/pages-proxy-app/package.json | 25 + fixtures/pages-proxy-app/server/index.ts | 10 + fixtures/pages-proxy-app/tests/index.test.ts | 36 + fixtures/pages-proxy-app/tests/tsconfig.json | 7 + fixtures/pages-proxy-app/tsconfig.json | 13 + fixtures/pages-proxy-app/turbo.json | 9 + fixtures/pages-proxy-app/vitest.config.ts | 9 + 
.../pages-workerjs-app/tests/index.test.ts | 93 + .../workerjs-test/_routes.json | 6 + fixtures/python-worker/requirements.txt | 1 + fixtures/python-worker/src/arith.py | 2 + fixtures/python-worker/src/index.py | 8 + fixtures/python-worker/src/other.py | 2 + fixtures/python-worker/wrangler.toml | 4 + .../shared/src/run-wrangler-long-lived.ts | 11 +- packages/create-cloudflare/.eslintrc.js | 1 + packages/create-cloudflare/CHANGELOG.md | 53 + .../create-cloudflare/e2e-tests/cli.test.ts | 61 +- .../fixtures/qwik/src/routes/test/index.ts | 10 + .../e2e-tests/fixtures/qwik/wrangler.toml | 2 + .../e2e-tests/frameworks.test.ts | 610 ++- .../create-cloudflare/e2e-tests/helpers.ts | 193 +- .../e2e-tests/workers.test.ts | 239 +- packages/create-cloudflare/package.json | 2 +- packages/create-cloudflare/src/cli.ts | 7 +- .../src/frameworks/package.json | 10 +- .../create-cloudflare/src/helpers/codemod.ts | 21 + packages/create-cloudflare/src/templates.ts | 38 +- .../create-cloudflare/templates/next/c3.ts | 2 +- .../templates/next/templates.ts | 36 +- .../create-cloudflare/templates/qwik/c3.ts | 64 +- packages/create-cloudflare/tsconfig.json | 1 + .../package.json | 2 +- .../src/index.ts | 17 + .../tests/index.test.ts | 32 + packages/format-errors/package.json | 2 +- packages/format-errors/src/index.ts | 10 +- packages/miniflare/CHANGELOG.md | 48 + packages/miniflare/README.md | 6 +- packages/miniflare/package.json | 4 +- packages/miniflare/src/index.ts | 19 +- .../miniflare/src/plugins/core/modules.ts | 14 + .../src/plugins/core/proxy/client.ts | 1 + .../src/runtime/config/workerd.capnp | 26 +- .../src/runtime/config/workerd.capnp.d.ts | 1796 ++++---- .../src/runtime/config/workerd.capnp.js | 3683 +++++------------ .../miniflare/src/runtime/config/workerd.ts | 3 + .../miniflare/src/workers/core/devalue.ts | 1 + packages/miniflare/test/index.spec.ts | 48 + packages/pages-shared/CHANGELOG.md | 16 + packages/pages-shared/package.json | 2 +- .../playground-preview-worker/.eslintrc.js | 4 + .../playground-preview-worker/package.json | 7 +- .../playground-preview-worker/src/errors.ts | 124 + .../playground-preview-worker/src/index.ts | 179 +- .../definitions/json.module.template | 25 +- .../playground-preview-worker/src/realish.ts | 2 +- .../playground-preview-worker/src/sentry.ts | 4 +- .../playground-preview-worker/src/user.do.ts | 108 +- .../tests/index.test.ts | 306 +- packages/wrangler/CHANGELOG.md | 237 ++ packages/wrangler/e2e/c3-integration.test.ts | 5 +- packages/wrangler/e2e/deploy.test.ts | 118 - packages/wrangler/e2e/deployments.test.ts | 8 +- packages/wrangler/e2e/dev.test.ts | 108 +- packages/wrangler/e2e/helpers/normalize.ts | 21 +- packages/wrangler/e2e/r2.test.ts | 8 +- packages/wrangler/package.json | 10 +- packages/wrangler/scripts/deps.ts | 3 + packages/wrangler/scripts/emit-types.ts | 73 +- packages/wrangler/src/__tests__/d1/d1.test.ts | 2 + .../wrangler/src/__tests__/d1/execute.test.ts | 13 + .../wrangler/src/__tests__/deploy.test.ts | 89 +- packages/wrangler/src/__tests__/dev.test.tsx | 12 +- .../__tests__/navigator-user-agent.test.ts | 193 + .../__tests__/pages/functions-build.test.ts | 23 +- packages/wrangler/src/__tests__/r2.test.ts | 45 +- .../src/__tests__/type-generation.test.ts | 20 + .../integrations/bindings/executionContext.ts | 5 + .../src/api/integrations/bindings/index.ts | 43 +- packages/wrangler/src/api/pages/deploy.tsx | 8 + packages/wrangler/src/cfetch/internal.ts | 4 +- packages/wrangler/src/config/environment.ts | 4 +- packages/wrangler/src/config/index.ts | 11 + 
packages/wrangler/src/d1/create.tsx | 2 + packages/wrangler/src/d1/delete.ts | 2 + packages/wrangler/src/d1/execute.tsx | 97 +- packages/wrangler/src/d1/index.ts | 7 + packages/wrangler/src/d1/info.tsx | 5 +- packages/wrangler/src/d1/insights.ts | 170 + packages/wrangler/src/d1/list.tsx | 2 + packages/wrangler/src/d1/migrations/apply.tsx | 2 + .../wrangler/src/d1/migrations/create.tsx | 2 + packages/wrangler/src/d1/migrations/list.tsx | 2 + packages/wrangler/src/d1/types.ts | 32 + packages/wrangler/src/deploy/deploy.ts | 52 +- packages/wrangler/src/deploy/index.ts | 4 +- .../src/deployment-bundle/bundle-type.ts | 10 +- .../wrangler/src/deployment-bundle/bundle.ts | 7 +- .../create-worker-upload-form.ts | 4 + .../esbuild-plugins/nodejs-compat.ts | 71 +- .../find-additional-modules.ts | 36 +- .../deployment-bundle/guess-worker-format.ts | 11 + .../deployment-bundle/module-collection.ts | 2 + .../wrangler/src/deployment-bundle/worker.ts | 4 +- packages/wrangler/src/dev.tsx | 16 +- packages/wrangler/src/dev/dev.tsx | 5 + packages/wrangler/src/dev/miniflare.ts | 34 +- packages/wrangler/src/dev/proxy.ts | 10 +- packages/wrangler/src/dev/remote.tsx | 98 +- packages/wrangler/src/dev/start-server.ts | 12 +- packages/wrangler/src/dev/use-esbuild.ts | 6 +- packages/wrangler/src/errors.ts | 14 + packages/wrangler/src/index.ts | 12 +- packages/wrangler/src/miniflare-cli/assets.ts | 65 +- packages/wrangler/src/navigator-user-agent.ts | 21 + packages/wrangler/src/pages/build.ts | 15 +- packages/wrangler/src/pages/buildFunctions.ts | 4 + packages/wrangler/src/pages/dev.ts | 18 +- .../src/pages/functions/buildPlugin.ts | 2 + .../src/pages/functions/buildWorker.ts | 9 + packages/wrangler/src/r2/helpers.ts | 66 +- packages/wrangler/src/r2/index.ts | 3 + packages/wrangler/src/r2/sippy.ts | 110 +- packages/wrangler/src/type-generation.ts | 13 +- packages/wrangler/src/user/user.ts | 2 +- packages/wrangler/src/versions/upload.ts | 5 + .../templates/startDevWorker/ProxyWorker.ts | 106 +- pnpm-lock.yaml | 205 +- templates/stream/webrtc/package.json | 4 +- templates/stream/webrtc/src/index.html | 26 +- 155 files changed, 6188 insertions(+), 4874 deletions(-) delete mode 100644 .changeset/c3-frameworks-update-4855.md delete mode 100644 .changeset/c3-frameworks-update-4856.md delete mode 100644 .changeset/c3-frameworks-update-4857.md delete mode 100644 .changeset/c3-frameworks-update-4858.md delete mode 100644 .changeset/eleven-carrots-happen.md delete mode 100644 .changeset/forty-needles-turn.md delete mode 100644 .changeset/four-teachers-push.md delete mode 100644 .changeset/happy-pandas-sparkle.md delete mode 100644 .changeset/wet-lemons-wash.md rename .github/workflows/{edge-preview-authenticated-proxy.yml => worker-deploy-edge-preview-authenticated-proxy.yml} (62%) rename .github/workflows/{format-errors.yml => worker-format-errors-deploy.yml} (62%) rename .github/workflows/{playground-preview-worker.yml => worker-playground-preview-deploy-production.yml} (63%) rename .github/workflows/{playground-worker-tests.yml => worker-playground-preview-deploy-testing.yml} (60%) rename fixtures/get-bindings-proxy/tests/{get-bindings-proxy.test.ts => get-bindings-proxy.bindings.test.ts} (89%) create mode 100644 fixtures/get-bindings-proxy/tests/get-bindings-proxy.caches.test.ts create mode 100644 fixtures/get-bindings-proxy/tests/get-bindings-proxy.cf.test.ts create mode 100644 fixtures/get-bindings-proxy/tests/get-bindings-proxy.ctx.test.ts create mode 100644 fixtures/get-bindings-proxy/tests/shared.ts create mode 100644 
fixtures/local-mode-tests/tests/specified-port.test.ts create mode 100644 fixtures/pages-proxy-app/package.json create mode 100644 fixtures/pages-proxy-app/server/index.ts create mode 100644 fixtures/pages-proxy-app/tests/index.test.ts create mode 100644 fixtures/pages-proxy-app/tests/tsconfig.json create mode 100644 fixtures/pages-proxy-app/tsconfig.json create mode 100644 fixtures/pages-proxy-app/turbo.json create mode 100644 fixtures/pages-proxy-app/vitest.config.ts create mode 100644 fixtures/pages-workerjs-app/workerjs-test/_routes.json create mode 100644 fixtures/python-worker/requirements.txt create mode 100644 fixtures/python-worker/src/arith.py create mode 100644 fixtures/python-worker/src/index.py create mode 100644 fixtures/python-worker/src/other.py create mode 100644 fixtures/python-worker/wrangler.toml create mode 100644 packages/create-cloudflare/e2e-tests/fixtures/qwik/src/routes/test/index.ts create mode 100644 packages/create-cloudflare/e2e-tests/fixtures/qwik/wrangler.toml create mode 100644 packages/playground-preview-worker/.eslintrc.js create mode 100644 packages/playground-preview-worker/src/errors.ts delete mode 100644 packages/wrangler/e2e/deploy.test.ts create mode 100644 packages/wrangler/src/__tests__/navigator-user-agent.test.ts create mode 100644 packages/wrangler/src/api/integrations/bindings/executionContext.ts create mode 100644 packages/wrangler/src/d1/insights.ts create mode 100644 packages/wrangler/src/navigator-user-agent.ts diff --git a/.changeset/c3-frameworks-update-4855.md b/.changeset/c3-frameworks-update-4855.md deleted file mode 100644 index d6399620b9e3..000000000000 --- a/.changeset/c3-frameworks-update-4855.md +++ /dev/null @@ -1,5 +0,0 @@ ---- -"create-cloudflare": patch ---- - -C3: Bumped `create-docusaurus` from `3.1.0` to `3.1.1` diff --git a/.changeset/c3-frameworks-update-4856.md b/.changeset/c3-frameworks-update-4856.md deleted file mode 100644 index 0d3ac24f5e69..000000000000 --- a/.changeset/c3-frameworks-update-4856.md +++ /dev/null @@ -1,5 +0,0 @@ ---- -"create-cloudflare": patch ---- - -C3: Bumped `create-astro` from `4.7.1` to `4.7.2` diff --git a/.changeset/c3-frameworks-update-4857.md b/.changeset/c3-frameworks-update-4857.md deleted file mode 100644 index c427883294bf..000000000000 --- a/.changeset/c3-frameworks-update-4857.md +++ /dev/null @@ -1,5 +0,0 @@ ---- -"create-cloudflare": patch ---- - -C3: Bumped `create-qwik` from `1.4.1` to `1.4.2` diff --git a/.changeset/c3-frameworks-update-4858.md b/.changeset/c3-frameworks-update-4858.md deleted file mode 100644 index 08c9adcd2aa3..000000000000 --- a/.changeset/c3-frameworks-update-4858.md +++ /dev/null @@ -1,5 +0,0 @@ ---- -"create-cloudflare": patch ---- - -C3: Bumped `create-solid` from `0.3.10` to `0.4.10` diff --git a/.changeset/eleven-carrots-happen.md b/.changeset/eleven-carrots-happen.md deleted file mode 100644 index b593dce6f13f..000000000000 --- a/.changeset/eleven-carrots-happen.md +++ /dev/null @@ -1,7 +0,0 @@ ---- -"wrangler": patch ---- - -fix: allow empty strings in secret:bulk upload - -Previously, the `secret:bulk` command would fail if any of the secrets in the secret.json file were empty strings and they already existed remotely. 
diff --git a/.changeset/forty-needles-turn.md b/.changeset/forty-needles-turn.md deleted file mode 100644 index fcb9a23bf7ef..000000000000 --- a/.changeset/forty-needles-turn.md +++ /dev/null @@ -1,7 +0,0 @@ ---- -"create-cloudflare": minor ---- - -introduce a new optional `previewScript` to the C3 summary - -such script is to be used to locally preview the application (using wrangler) diff --git a/.changeset/four-teachers-push.md b/.changeset/four-teachers-push.md deleted file mode 100644 index 43cb0ab994fd..000000000000 --- a/.changeset/four-teachers-push.md +++ /dev/null @@ -1,8 +0,0 @@ ---- -"create-cloudflare": patch ---- - -update the svelteKit c3 scripts - -replace the incorrect `pages:dev` with a new `pages:preview` script -(and use the standard `dev` script as the `devScript`) diff --git a/.changeset/happy-pandas-sparkle.md b/.changeset/happy-pandas-sparkle.md deleted file mode 100644 index 3a9417ace9b3..000000000000 --- a/.changeset/happy-pandas-sparkle.md +++ /dev/null @@ -1,12 +0,0 @@ ---- -"wrangler": minor ---- - -feat: expose new (no-op) `caches` field in `getBindingsProxy` result - -Add a new `caches` field to the `getBindingsProxy` result, such field implements a -no operation (no-op) implementation of the runtime `caches` - -Note: Miniflare exposes a proper `caches` mock, we will want to use that one in -the future but issues regarding it must be ironed out first, so for the -time being a no-op will have to do diff --git a/.changeset/wet-lemons-wash.md b/.changeset/wet-lemons-wash.md deleted file mode 100644 index 784359da9f7b..000000000000 --- a/.changeset/wet-lemons-wash.md +++ /dev/null @@ -1,6 +0,0 @@ ---- -"@cloudflare/pages-shared": patch -"wrangler": patch ---- - -fix: Use appropriate logging levels when parsing headers and redirects in `wrangler pages dev`. 
diff --git a/.github/generate-c3-dependabot-pr-changeset.mjs b/.github/generate-c3-dependabot-pr-changeset.mjs index 4b729e402d84..d53cb7619e1b 100644 --- a/.github/generate-c3-dependabot-pr-changeset.mjs +++ b/.github/generate-c3-dependabot-pr-changeset.mjs @@ -54,7 +54,7 @@ ${generateChangesetBody(changes)} function generateChangesetBody(changes) { if (changes.length === 1) { const { package: pkg, from, to } = changes[0]; - return `C3: Bumped \`${pkg}\` from \`${from}\` to \`${to}\``; + return `chore: Bumped \`${pkg}\` from \`${from}\` to \`${to}\``; } return `Framework CLI versions updated in C3 diff --git a/.github/workflows/e2e.yml b/.github/workflows/e2e.yml index 34b38959b915..55d80f421a42 100644 --- a/.github/workflows/e2e.yml +++ b/.github/workflows/e2e.yml @@ -13,7 +13,7 @@ jobs: concurrency: group: ${{ github.workflow }}-${{ github.ref }}-${{ matrix.os }}-${{ matrix.node }} cancel-in-progress: true - timeout-minutes: 30 + timeout-minutes: 15 if: github.repository_owner == 'cloudflare' && (github.event_name != 'pull_request' || (github.event_name == 'pull_request' && contains(github.event.*.labels.*.name, 'e2e' ))) name: "E2E Test" strategy: @@ -73,9 +73,20 @@ jobs: id: "find-wrangler" run: echo "dir=$(ls $HOME/wrangler-*.tgz)" >> $GITHUB_OUTPUT; - - name: Run tests - id: e2e-1 - continue-on-error: true + - name: Run tests (unix) + if: matrix.os == 'macos-13' || matrix.os == 'ubuntu-22.04' + run: | + pnpm add ${{ steps.find-wrangler.outputs.dir}} --global + pnpm run --filter wrangler test:e2e + env: + CLOUDFLARE_API_TOKEN: ${{ secrets.TEST_CLOUDFLARE_API_TOKEN }} + CLOUDFLARE_ACCOUNT_ID: ${{ secrets.TEST_CLOUDFLARE_ACCOUNT_ID }} + WRANGLER: wrangler + NODE_OPTIONS: "--max_old_space_size=8192" + WRANGLER_LOG_PATH: ${{ runner.temp }}/wrangler-debug-logs/ + + - name: Run tests (windows) + if: matrix.os == 'windows-2022' run: pnpm run --filter wrangler test:e2e env: CLOUDFLARE_API_TOKEN: ${{ secrets.TEST_CLOUDFLARE_API_TOKEN }} diff --git a/.github/workflows/edge-preview-authenticated-proxy.yml b/.github/workflows/worker-deploy-edge-preview-authenticated-proxy.yml similarity index 62% rename from .github/workflows/edge-preview-authenticated-proxy.yml rename to .github/workflows/worker-deploy-edge-preview-authenticated-proxy.yml index 470fd3e2f6c1..e7794e00ea45 100644 --- a/.github/workflows/edge-preview-authenticated-proxy.yml +++ b/.github/workflows/worker-deploy-edge-preview-authenticated-proxy.yml @@ -1,16 +1,16 @@ -name: Edge Preview Authenticated Proxy Worker +name: Deploy Edge Preview Authenticated Proxy Worker (production) +# On a push to `deploy-worker/edge-preview-authenticated-proxy`, on Cloudflare, +# deploy to production. 
on: push: branches: - - main - paths: - - "packages/edge-preview-authenticated-proxy/**" + - deploy-worker/edge-preview-authenticated-proxy jobs: - publish_worker: + deploy_worker: if: ${{ github.repository_owner == 'cloudflare' }} - name: Publish Worker + name: Deploy Edge Preview Authenticated Proxy (production) runs-on: ubuntu-latest steps: @@ -18,33 +18,36 @@ jobs: uses: actions/checkout@v3 with: fetch-depth: 0 - - uses: pnpm/action-setup@v2 + + - name: Use pnpm 8.8.0 + uses: pnpm/action-setup@v2 with: version: 8.8.0 + - name: Use Node.js 16.18 uses: actions/setup-node@v3 with: node-version: 16.18 cache: "pnpm" - - name: Install workerd Dependencies + - name: Install workerd dependencies if: ${{ runner.os == 'Linux' }} run: | export DEBIAN_FRONTEND=noninteractive sudo apt-get update sudo apt-get install -y libc++1 - - name: Install NPM Dependencies + - name: Install NPM dependencies run: pnpm install --frozen-lockfile - - name: Build wrangler + - name: Build tools and libraries run: pnpm run build env: NODE_ENV: "production" CI_OS: ${{ runner.os }} - - name: Build & Publish Worker - run: pnpm run publish + - name: Build & deploy Worker + run: pnpm run deploy env: NODE_ENV: "production" CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }} diff --git a/.github/workflows/format-errors.yml b/.github/workflows/worker-format-errors-deploy.yml similarity index 62% rename from .github/workflows/format-errors.yml rename to .github/workflows/worker-format-errors-deploy.yml index a91824994016..4f9fce7ce29c 100644 --- a/.github/workflows/format-errors.yml +++ b/.github/workflows/worker-format-errors-deploy.yml @@ -1,49 +1,53 @@ -name: Error Formatting Worker +name: Deploy Format Errors Worker (production) +# On a push to `deploy-worker/format-errors`, on Cloudflare, +# deploy to production. 
on: push: branches: - - main - paths: - - "packages/format-errors/**" + - deploy-worker/format-errors jobs: - publish_worker: + deploy_worker: if: ${{ github.repository_owner == 'cloudflare' }} - name: Publish Worker + name: Deploy Format Errors Worker (production) runs-on: ubuntu-latest steps: - - name: Checkout Repo + - name: Checkout repo uses: actions/checkout@v3 with: fetch-depth: 0 - - uses: pnpm/action-setup@v2 + + - name: Use pnpm 8.8.0 + uses: pnpm/action-setup@v2 with: version: 8.8.0 + - name: Use Node.js 16.18 uses: actions/setup-node@v3 with: node-version: 16.18 cache: "pnpm" - - name: Install workerd Dependencies + - name: Install workerd dependencies if: ${{ runner.os == 'Linux' }} run: | export DEBIAN_FRONTEND=noninteractive sudo apt-get update sudo apt-get install -y libc++1 - - name: Install NPM Dependencies + + - name: Install NPM dependencies run: pnpm install --frozen-lockfile - - name: Build wrangler + - name: Build tools and libraries run: pnpm run build env: NODE_ENV: "production" CI_OS: ${{ runner.os }} - - name: Build & Publish Worker - run: pnpm run publish + - name: Build & deploy Worker + run: pnpm run deploy env: NODE_ENV: "production" CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }} diff --git a/.github/workflows/playground-preview-worker.yml b/.github/workflows/worker-playground-preview-deploy-production.yml similarity index 63% rename from .github/workflows/playground-preview-worker.yml rename to .github/workflows/worker-playground-preview-deploy-production.yml index 49d653edbe40..965c989c4c41 100644 --- a/.github/workflows/playground-preview-worker.yml +++ b/.github/workflows/worker-playground-preview-deploy-production.yml @@ -1,49 +1,52 @@ -name: Playground Preview Worker +name: Deploy Playground Preview Worker (production) +# On a push to `deploy-worker/playground-preview-worker`, on Cloudflare, +# deploy to production. 
on: push: branches: - - main - paths: - - "packages/playground-preview-worker/**" + - deploy-worker/playground-preview-worker jobs: - publish_worker: + deploy_worker: if: ${{ github.repository_owner == 'cloudflare' }} - name: Publish Worker + name: Deploy Playground Preview Worker (production) runs-on: ubuntu-latest steps: - - name: Checkout Repo + - name: Checkout repo uses: actions/checkout@v3 with: fetch-depth: 0 - - uses: pnpm/action-setup@v2 + + - name: Use pnpm 8.8.0 + uses: pnpm/action-setup@v2 with: version: 8.8.0 + - name: Use Node.js 16.18 uses: actions/setup-node@v3 with: node-version: 16.18 cache: "pnpm" - - name: Install workerd Dependencies + - name: Install workerd dependencies if: ${{ runner.os == 'Linux' }} run: | export DEBIAN_FRONTEND=noninteractive sudo apt-get update sudo apt-get install -y libc++1 - - name: Install NPM Dependencies + - name: Install NPM dependencies run: pnpm install --frozen-lockfile - - name: Build wrangler + - name: Build tools and libraries run: pnpm run build env: NODE_ENV: "production" CI_OS: ${{ runner.os }} - - name: Build & Publish Worker + - name: Build & deploy Worker run: pnpm run deploy env: NODE_ENV: "production" diff --git a/.github/workflows/playground-worker-tests.yml b/.github/workflows/worker-playground-preview-deploy-testing.yml similarity index 60% rename from .github/workflows/playground-worker-tests.yml rename to .github/workflows/worker-playground-preview-deploy-testing.yml index 2d8d8cca3106..203fc6de6253 100644 --- a/.github/workflows/playground-worker-tests.yml +++ b/.github/workflows/worker-playground-preview-deploy-testing.yml @@ -1,50 +1,58 @@ -name: Playground Worker tests +name: Deploy Playground Preview Worker (testing) +# On a push to `main`, on Cloudflare, where there are changes to the files in this worker's package, +# or an update to a PR, on Cloudflare, labelled as `playground-worker`, +# deploy to testing and then run the end-to-end tests against this deployment. 
on: push: branches: - main - - changeset-release/main + paths: + - "packages/playground-preview-worker/**" + pull_request: types: [synchronize, opened, reopened, labeled, unlabeled] - repository_dispatch: jobs: e2e-test: - if: github.repository_owner == 'cloudflare' && (github.event_name != 'pull_request' || (github.event_name == 'pull_request' && contains(github.event.*.labels.*.name, 'playground-worker' )) || (github.event_name == 'pull_request' && github.head_ref == 'changeset-release/main')) - name: "Playground Worker Test" + if: github.repository_owner == 'cloudflare' && (github.event_name != 'pull_request' || contains(github.event.*.labels.*.name, 'playground-worker')) + name: "Deploy Playground Preview Worker (testing)" runs-on: ubuntu-latest + steps: - - name: Checkout Repo + - name: Checkout repo uses: actions/checkout@v3 with: fetch-depth: 0 - - uses: pnpm/action-setup@v2 + + - name: Use pnpm 8.8.0 + uses: pnpm/action-setup@v2 with: version: 8.8.0 - - name: Use Node.js 18 + + - name: Use Node.js 16.18 uses: actions/setup-node@v3 with: - node-version: 18 + node-version: 16.18 cache: "pnpm" - - name: Install workerd Dependencies + - name: Install workerd dependencies if: ${{ runner.os == 'Linux' }} run: | export DEBIAN_FRONTEND=noninteractive sudo apt-get update sudo apt-get install -y libc++1 - - name: Install NPM Dependencies + - name: Install NPM dependencies run: pnpm install --frozen-lockfile - - name: Run builds + - name: Build tools and libraries run: pnpm run build env: NODE_ENV: "production" CI_OS: ${{ runner.os }} - - name: Build & Publish Testing Playground Worker + - name: Build & deploy Worker run: pnpm run deploy:testing env: NODE_ENV: "production" @@ -52,7 +60,7 @@ jobs: working-directory: packages/playground-preview-worker - name: Run tests & collect coverage - run: pnpm run test:ci + run: pnpm run test:e2e env: TMP_CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }} TMP_CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }} diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 5a708ffe0a28..ec21195f2bbd 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -319,38 +319,3 @@ We use the following guidelines to determine the kind of change for a PR: ## Releases We generally cut Wrangler releases at the start of each week. If you need a release cut outside of the regular cadence, please reach out to the [@cloudflare/wrangler-admins](https://github.com/orgs/cloudflare/teams/wrangler-admins) team. - -## Miniflare Development - -Wrangler builds upon, and provides a new entry point for, [Miniflare](https://github.com/cloudflare/miniflare), a local Cloudflare Workers simulator. 
To develop on both Wrangler and Miniflare together, you need to link the two projects, but as of NodeJS `v18.3.0` and NPM `v8.15.0`, relative NPM installs between two workspaces don't work, so you need things to be manual: - -Assume you have the two directories checked out right beside each other: - -``` -❯ ll src -drwxr-xr-x - user 30 Jun 14:12 src -drwxr-xr-x - user 26 Jul 17:34 ├── miniflare -drwxr-xr-x - user 27 Jul 17:51 └── workers-sdk -``` - -> Note: recommend using [exa](https://the.exa.website/) and `alias ll='exa --icons -laTL 1'` for the above output - -Inside `packages/wrangler/package.json`, replace: - -``` -"@miniflare/d1": "^2.x.x", -"@miniflare/core": "^2.x.x", -"@miniflare/durable-objects": "^2.x.x", -"miniflare": "^2.x.x", -``` - -with - -``` -"miniflare": "file:../../../miniflare/packages/miniflare", -"@miniflare/d1": "file:../../../miniflare/packages/d1", -"@miniflare/core": "file:../../../miniflare/packages/core", -"@miniflare/durable-objects": "file:../../../miniflare/packages/durable-objects", -``` - -Then run `npm install` in the root of this monorepo. diff --git a/fixtures/dev-env/package.json b/fixtures/dev-env/package.json index 2cf95197a8fe..d216a30fc2ef 100644 --- a/fixtures/dev-env/package.json +++ b/fixtures/dev-env/package.json @@ -16,7 +16,7 @@ "@types/ws": "^8.5.7", "@cloudflare/workers-tsconfig": "workspace:^", "get-port": "^7.0.0", - "miniflare": "3.20231218.4", + "miniflare": "3.20240129.1", "undici": "^5.28.2", "wrangler": "workspace:*", "ws": "^8.14.2" diff --git a/fixtures/dev-env/tests/index.test.ts b/fixtures/dev-env/tests/index.test.ts index 908f44e0e456..aa2747da0aa5 100644 --- a/fixtures/dev-env/tests/index.test.ts +++ b/fixtures/dev-env/tests/index.test.ts @@ -230,7 +230,7 @@ describe("startDevWorker: ProxyController", () => { fireAndForgetFakeUserWorkerChanges({ mfOpts: run.mfOpts, config: run.config, - script: run.mfOpts.script.replace("1", "2"), + script: run.mfOpts.script.replace("body:1", "body:2"), }); res = await run.worker.fetch("http://dummy"); @@ -295,7 +295,7 @@ describe("startDevWorker: ProxyController", () => { fireAndForgetFakeUserWorkerChanges({ mfOpts: run.mfOpts, config: run.config, - script: run.mfOpts.script.replace("1", "2"), + script: run.mfOpts.script.replace("body:1", "body:2"), }); await executionContextClearedPromise; }); @@ -599,4 +599,53 @@ describe("startDevWorker: ProxyController", () => { "URL: https://mybank.co.uk/test/path/2" ); }); + + test("inflight requests are retried during UserWorker reloads", async () => { + // to simulate inflight requests failing during UserWorker reloads, + // we will use a UserWorker with a longish `await setTimeout(...)` + // so that we can guarantee the race condition is hit + // when workerd is eventually terminated + + const run = await fakeStartUserWorker({ + script: ` + export default { + async fetch(request) { + const url = new URL(request.url); + + if (url.pathname === '/long') { + await new Promise(r => setTimeout(r, 30_000)); + } + return new Response("UserWorker:1"); + } + } + `, + }); + + res = await run.worker.fetch("http://dummy/short"); // implicitly waits for UserWorker:1 to be ready + await expect(res.text()).resolves.toBe("UserWorker:1"); + + const inflightDuringReloads = run.worker.fetch("http://dummy/long"); + + // this will cause workerd for UserWorker:1 to terminate (eventually, but soon) + fireAndForgetFakeUserWorkerChanges({ + mfOpts: run.mfOpts, + config: run.config, + script: run.mfOpts.script.replace("UserWorker:1", "UserWorker:2"), // change response so it 
can be identified + }); + + res = await run.worker.fetch("http://dummy/short"); // implicitly waits for UserWorker:2 to be ready + await expect(res.text()).resolves.toBe("UserWorker:2"); + + // this will cause workerd for UserWorker:2 to terminate (eventually, but soon) + fireAndForgetFakeUserWorkerChanges({ + mfOpts: run.mfOpts, + config: run.config, + script: run.mfOpts.script + .replace("UserWorker:1", "UserWorker:3") // change response so it can be identified + .replace("30_000", "0"), // remove the long wait as we won't reload this UserWorker + }); + + res = await inflightDuringReloads; + await expect(res.text()).resolves.toBe("UserWorker:3"); + }); }); diff --git a/fixtures/get-bindings-proxy/tests/get-bindings-proxy.test.ts b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.bindings.test.ts similarity index 89% rename from fixtures/get-bindings-proxy/tests/get-bindings-proxy.test.ts rename to fixtures/get-bindings-proxy/tests/get-bindings-proxy.bindings.test.ts index 560840f2323d..e9c97442b039 100644 --- a/fixtures/get-bindings-proxy/tests/get-bindings-proxy.test.ts +++ b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.bindings.test.ts @@ -6,7 +6,6 @@ import { Fetcher, R2Bucket, } from "@cloudflare/workers-types"; -import { Request, Response } from "undici"; import { afterAll, beforeAll, describe, expect, it } from "vitest"; import { getBindingsProxy as originalGetBindingsProxy, @@ -42,7 +41,7 @@ function getBindingsProxy( }); } -describe("getBindingsProxy", () => { +describe("getBindingsProxy - bindings", () => { let devWorkers: UnstableDevWorker[]; beforeAll(async () => { @@ -228,25 +227,6 @@ describe("getBindingsProxy", () => { await dispose(); } }); - - describe("caches", () => { - (["default", "named"] as const).forEach((cacheType) => - it(`correctly obtains a no-op ${cacheType} cache`, async () => { - const { caches, dispose } = await getBindingsProxy({ - configPath: wranglerTomlFilePath, - }); - try { - const cache = - cacheType === "default" - ? caches.default - : await caches.open("my-cache"); - testNoOpCache(cache); - } finally { - await dispose(); - } - }) - ); - }); }); /** @@ -283,17 +263,3 @@ async function testDoBinding( const doRespText = await doResp.text(); expect(doRespText).toBe(expectedResponse); } - -async function testNoOpCache( - cache: Awaited>["caches"]["default"] -) { - let match = await cache.match("http://0.0.0.0/test"); - expect(match).toBeUndefined(); - - const req = new Request("http://0.0.0.0/test"); - await cache.put(req, new Response("test")); - const resp = await cache.match(req); - expect(resp).toBeUndefined(); - const deleted = await cache.delete(req); - expect(deleted).toBe(false); -} diff --git a/fixtures/get-bindings-proxy/tests/get-bindings-proxy.caches.test.ts b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.caches.test.ts new file mode 100644 index 000000000000..8df2a3ac65aa --- /dev/null +++ b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.caches.test.ts @@ -0,0 +1,34 @@ +import { Request, Response } from "undici"; +import { describe, expect, it } from "vitest"; +import { getBindingsProxy } from "./shared"; + +describe("getBindingsProxy - caches", () => { + (["default", "named"] as const).forEach((cacheType) => + it(`correctly obtains a no-op ${cacheType} cache`, async () => { + const { caches, dispose } = await getBindingsProxy(); + try { + const cache = + cacheType === "default" + ? 
caches.default + : await caches.open("my-cache"); + testNoOpCache(cache); + } finally { + await dispose(); + } + }) + ); +}); + +async function testNoOpCache( + cache: Awaited>["caches"]["default"] +) { + let match = await cache.match("http://0.0.0.0/test"); + expect(match).toBeUndefined(); + + const req = new Request("http://0.0.0.0/test"); + await cache.put(req, new Response("test")); + const resp = await cache.match(req); + expect(resp).toBeUndefined(); + const deleted = await cache.delete(req); + expect(deleted).toBe(false); +} diff --git a/fixtures/get-bindings-proxy/tests/get-bindings-proxy.cf.test.ts b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.cf.test.ts new file mode 100644 index 000000000000..b22e360b1a39 --- /dev/null +++ b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.cf.test.ts @@ -0,0 +1,43 @@ +import { describe, expect, it } from "vitest"; +import { getBindingsProxy } from "./shared"; + +describe("getBindingsProxy - cf", () => { + it("should provide mock data", async () => { + const { cf, dispose } = await getBindingsProxy(); + try { + expect(cf).toMatchObject({ + colo: "DFW", + city: "Austin", + regionCode: "TX", + }); + } finally { + await dispose(); + } + }); + + it("should match the production runtime cf object", async () => { + const { cf, dispose } = await getBindingsProxy(); + try { + expect(cf.constructor.name).toBe("Object"); + + expect(() => { + cf.city = "test city"; + }).toThrowError( + "Cannot assign to read only property 'city' of object '#'" + ); + expect(cf.city).not.toBe("test city"); + + expect(() => { + cf.newField = "test new field"; + }).toThrowError("Cannot add property newField, object is not extensible"); + expect("newField" in cf).toBe(false); + + expect(cf.botManagement).toMatchObject({ + score: 99, + }); + expect(Object.isFrozen(cf.botManagement)).toBe(true); + } finally { + await dispose(); + } + }); +}); diff --git a/fixtures/get-bindings-proxy/tests/get-bindings-proxy.ctx.test.ts b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.ctx.test.ts new file mode 100644 index 000000000000..61f2d53a4d66 --- /dev/null +++ b/fixtures/get-bindings-proxy/tests/get-bindings-proxy.ctx.test.ts @@ -0,0 +1,55 @@ +import { describe, expect, it } from "vitest"; +import { getBindingsProxy } from "./shared"; + +describe("getBindingsProxy - ctx", () => { + it("should provide a no-op waitUntil method", async () => { + const { ctx, dispose } = await getBindingsProxy(); + try { + let value = 4; + ctx.waitUntil( + new Promise((resolve) => { + value++; + resolve(value); + }) + ); + expect(value).toBe(5); + } finally { + await dispose(); + } + }); + + it("should provide a no-op passThroughOnException method", async () => { + const { ctx, dispose } = await getBindingsProxy(); + try { + expect(ctx.passThroughOnException()).toBe(undefined); + } finally { + await dispose(); + } + }); + + it("should match the production runtime ctx object", async () => { + const { ctx, dispose } = await getBindingsProxy(); + try { + expect(ctx.constructor.name).toBe("ExecutionContext"); + expect(typeof ctx.waitUntil).toBe("function"); + expect(typeof ctx.passThroughOnException).toBe("function"); + + ctx.waitUntil = ((str: string) => `- ${str} -`) as any; + expect(ctx.waitUntil("waitUntil can be overridden" as any)).toBe( + "- waitUntil can be overridden -" + ); + + ctx.passThroughOnException = ((str: string) => `_ ${str} _`) as any; + expect( + (ctx.passThroughOnException as any)( + "passThroughOnException can be overridden" + ) + ).toBe("_ passThroughOnException can be 
overridden _"); + + (ctx as any).text = "the ExecutionContext can be extended"; + expect((ctx as any).text).toBe("the ExecutionContext can be extended"); + } finally { + await dispose(); + } + }); +}); diff --git a/fixtures/get-bindings-proxy/tests/shared.ts b/fixtures/get-bindings-proxy/tests/shared.ts new file mode 100644 index 000000000000..f54262466cac --- /dev/null +++ b/fixtures/get-bindings-proxy/tests/shared.ts @@ -0,0 +1,13 @@ +import { getBindingsProxy as originalGetBindingsProxy } from "wrangler"; +import type { GetBindingsProxyOptions } from "wrangler"; + +// Here we wrap the actual original getBindingsProxy function and disable its persistance, this is to make sure +// that we don't implement any persistance during these tests (which would add unnecessary extra complexity) +export function getBindingsProxy( + options: Omit = {} +): ReturnType> { + return originalGetBindingsProxy({ + ...options, + persist: false, + }); +} diff --git a/fixtures/local-mode-tests/tests/specified-port.test.ts b/fixtures/local-mode-tests/tests/specified-port.test.ts new file mode 100644 index 000000000000..94f596f582da --- /dev/null +++ b/fixtures/local-mode-tests/tests/specified-port.test.ts @@ -0,0 +1,47 @@ +import assert from "node:assert"; +import nodeNet from "node:net"; +import path from "path"; +import { afterAll, beforeAll, describe, expect, it } from "vitest"; +import { unstable_dev } from "wrangler"; +import type { UnstableDevWorker } from "wrangler"; + +function getPort() { + return new Promise((resolve, reject) => { + const server = nodeNet.createServer((socket) => socket.destroy()); + server.listen(0, () => { + const address = server.address(); + assert(typeof address === "object" && address !== null); + server.close((err) => { + if (err) reject(err); + else resolve(address.port); + }); + }); + }); +} + +describe("specific port", () => { + let worker: UnstableDevWorker; + + beforeAll(async () => { + worker = await unstable_dev( + path.resolve(__dirname, "..", "src", "module.ts"), + { + config: path.resolve(__dirname, "..", "wrangler.module.toml"), + port: await getPort(), + experimental: { + disableExperimentalWarning: true, + disableDevRegistry: true, + }, + } + ); + }); + + afterAll(async () => { + await worker?.stop(); + }); + + it("fetches worker", async () => { + const resp = await worker.fetch("/"); + expect(resp.status).toBe(200); + }); +}); diff --git a/fixtures/pages-proxy-app/package.json b/fixtures/pages-proxy-app/package.json new file mode 100644 index 000000000000..cd989ceb1eb6 --- /dev/null +++ b/fixtures/pages-proxy-app/package.json @@ -0,0 +1,25 @@ +{ + "name": "pages-proxy-app", + "version": "0.1.2", + "private": true, + "sideEffects": false, + "main": "server/index.js", + "scripts": { + "build": "esbuild --bundle --platform=node server/index.ts --outfile=dist/index.js", + "check:type": "tsc", + "dev": "npx wrangler pages dev --compatibility-date=2024-01-17 --port 8790 --proxy 8791 -- pnpm run server", + "server": "node dist/index.js", + "test": "vitest run", + "test:watch": "vitest", + "type:tests": "tsc -p ./tests/tsconfig.json" + }, + "devDependencies": { + "@cloudflare/workers-tsconfig": "workspace:*", + "miniflare": "workspace:*", + "undici": "^5.28.2", + "wrangler": "workspace:*" + }, + "engines": { + "node": ">=14" + } +} diff --git a/fixtures/pages-proxy-app/server/index.ts b/fixtures/pages-proxy-app/server/index.ts new file mode 100644 index 000000000000..e6efbc2ab5fa --- /dev/null +++ b/fixtures/pages-proxy-app/server/index.ts @@ -0,0 +1,10 @@ +import { 
createServer } from "http"; + +const server = createServer(); + +server.on("request", (req, res) => { + res.write("Host:" + req.headers.host); + res.end(); +}); + +server.listen(8791); diff --git a/fixtures/pages-proxy-app/tests/index.test.ts b/fixtures/pages-proxy-app/tests/index.test.ts new file mode 100644 index 000000000000..0657f893652f --- /dev/null +++ b/fixtures/pages-proxy-app/tests/index.test.ts @@ -0,0 +1,36 @@ +import { fork } from "node:child_process"; +import { resolve } from "node:path"; +import { fetch } from "undici"; +import { afterAll, beforeAll, describe, it } from "vitest"; +import { runWranglerPagesDev } from "../../shared/src/run-wrangler-long-lived"; +import type { ChildProcess } from "node:child_process"; + +describe("pages-proxy-app", async () => { + let ip: string, port: number, stop: (() => Promise) | undefined; + let devServer: ChildProcess; + + beforeAll(async () => { + devServer = fork(resolve(__dirname, "../dist/index.js"), { + stdio: "ignore", + }); + + debugger; + ({ ip, port, stop } = await runWranglerPagesDev( + resolve(__dirname, ".."), + undefined, + ["--port=0", "--inspector-port=0", "--proxy=8791"] + )); + }); + + afterAll(async () => { + await stop?.(); + devServer.kill(); + }); + + it("receives the correct Host header", async ({ expect }) => { + debugger; + const response = await fetch(`http://${ip}:${port}/`); + const text = await response.text(); + expect(text).toContain(`Host:${ip}:${port}`); + }); +}); diff --git a/fixtures/pages-proxy-app/tests/tsconfig.json b/fixtures/pages-proxy-app/tests/tsconfig.json new file mode 100644 index 000000000000..d2ce7f144694 --- /dev/null +++ b/fixtures/pages-proxy-app/tests/tsconfig.json @@ -0,0 +1,7 @@ +{ + "extends": "@cloudflare/workers-tsconfig/tsconfig.json", + "compilerOptions": { + "types": ["node"] + }, + "include": ["**/*.ts", "../../../node-types.d.ts"] +} diff --git a/fixtures/pages-proxy-app/tsconfig.json b/fixtures/pages-proxy-app/tsconfig.json new file mode 100644 index 000000000000..902b5311b2a8 --- /dev/null +++ b/fixtures/pages-proxy-app/tsconfig.json @@ -0,0 +1,13 @@ +{ + "include": ["server"], + "compilerOptions": { + "target": "ES2020", + "module": "CommonJS", + "lib": ["ES2020"], + "types": ["node"], + "moduleResolution": "node", + "esModuleInterop": true, + "noEmit": true, + "skipLibCheck": true + } +} diff --git a/fixtures/pages-proxy-app/turbo.json b/fixtures/pages-proxy-app/turbo.json new file mode 100644 index 000000000000..1a6c74c9def8 --- /dev/null +++ b/fixtures/pages-proxy-app/turbo.json @@ -0,0 +1,9 @@ +{ + "$schema": "http://turbo.build/schema.json", + "extends": ["//"], + "pipeline": { + "build": { + "outputs": ["dist/**"] + } + } +} diff --git a/fixtures/pages-proxy-app/vitest.config.ts b/fixtures/pages-proxy-app/vitest.config.ts new file mode 100644 index 000000000000..846cddc41995 --- /dev/null +++ b/fixtures/pages-proxy-app/vitest.config.ts @@ -0,0 +1,9 @@ +import { defineProject, mergeConfig } from "vitest/config"; +import configShared from "../../vitest.shared"; + +export default mergeConfig( + configShared, + defineProject({ + test: {}, + }) +); diff --git a/fixtures/pages-workerjs-app/tests/index.test.ts b/fixtures/pages-workerjs-app/tests/index.test.ts index aa214fb9c34b..34398e2b43be 100644 --- a/fixtures/pages-workerjs-app/tests/index.test.ts +++ b/fixtures/pages-workerjs-app/tests/index.test.ts @@ -1,5 +1,7 @@ import { execSync } from "node:child_process"; +import { rename } from "node:fs/promises"; import path, { resolve } from "node:path"; +import { setTimeout } 
from "node:timers/promises"; import { fetch } from "undici"; import { describe, it } from "vitest"; import { runWranglerPagesDev } from "../../shared/src/run-wrangler-long-lived"; @@ -60,4 +62,95 @@ describe("Pages _worker.js", () => { await stop(); } }); + + it("should not error if the worker.js file is removed while watching", async ({ + expect, + }) => { + const basePath = resolve(__dirname, ".."); + const { ip, port, getOutput, clearOutput, stop } = + await runWranglerPagesDev(resolve(__dirname, ".."), "./workerjs-test", [ + "--port=0", + "--inspector-port=0", + ]); + try { + clearOutput(); + await tryRename( + basePath, + "workerjs-test/_worker.js", + "workerjs-test/XXX_worker.js" + ); + await setTimeout(1000); + // Expect no output since the deletion of the worker should be ignored + expect(getOutput()).toBe(""); + await tryRename( + basePath, + "workerjs-test/XXX_worker.js", + "workerjs-test/_worker.js" + ); + await setTimeout(1000); + // Expect replacing the worker to now trigger a success build. + expect(getOutput()).toContain("Compiled Worker successfully"); + } finally { + await stop(); + await tryRename( + basePath, + "workerjs-test/XXX_worker.js", + "workerjs-test/_worker.js" + ); + } + }); + + it("should not error if the _routes.json file is removed while watching", async ({ + expect, + }) => { + const basePath = resolve(__dirname, ".."); + const { ip, port, getOutput, clearOutput, stop } = + await runWranglerPagesDev(resolve(__dirname, ".."), "./workerjs-test", [ + "--port=0", + "--inspector-port=0", + ]); + try { + clearOutput(); + await tryRename( + basePath, + "workerjs-test/_routes.json", + "workerjs-test/XXX_routes.json" + ); + await setTimeout(1000); + // Expect no output since the deletion of the routes file should be ignored + expect(getOutput()).toBe(""); + await tryRename( + basePath, + "workerjs-test/XXX_routes.json", + "workerjs-test/_routes.json" + ); + await setTimeout(1000); + // Expect replacing the routes file to trigger a build, although + // the routes build does not provide any output feedback to compare against, + // so we just check that nothing else is being printed. 
+ expect(getOutput()).toBe(""); + } finally { + await stop(); + await tryRename( + basePath, + "workerjs-test/XXX_routes.json", + "workerjs-test/_routes.json" + ); + } + }); + + async function tryRename( + basePath: string, + from: string, + to: string + ): Promise<void> { + try { + await rename(resolve(basePath, from), resolve(basePath, to)); + } catch (e) { + // Do nothing if the file was not found + if ((e as any).code !== "ENOENT") { + throw e; + } + } + } }); diff --git a/fixtures/pages-workerjs-app/workerjs-test/_routes.json b/fixtures/pages-workerjs-app/workerjs-test/_routes.json new file mode 100644 index 000000000000..1cf79313f314 --- /dev/null +++ b/fixtures/pages-workerjs-app/workerjs-test/_routes.json @@ -0,0 +1,6 @@ +{ + "version": 1, + "description": "", + "include": ["/*"], + "exclude": [] +} diff --git a/fixtures/python-worker/requirements.txt b/fixtures/python-worker/requirements.txt new file mode 100644 index 000000000000..6b9ac5866230 --- /dev/null +++ b/fixtures/python-worker/requirements.txt @@ -0,0 +1 @@ +bcrypt==4.0.1 \ No newline at end of file diff --git a/fixtures/python-worker/src/arith.py b/fixtures/python-worker/src/arith.py new file mode 100644 index 000000000000..0bcf862adf20 --- /dev/null +++ b/fixtures/python-worker/src/arith.py @@ -0,0 +1,2 @@ +def mul(a,b): + return a*b diff --git a/fixtures/python-worker/src/index.py b/fixtures/python-worker/src/index.py new file mode 100644 index 000000000000..9177bde348dc --- /dev/null +++ b/fixtures/python-worker/src/index.py @@ -0,0 +1,8 @@ +from js import Response +from other import add +from arith import mul +import bcrypt +def fetch(request): + password = b"super secret password" + hashed = bcrypt.hashpw(password, bcrypt.gensalt(14)) + return Response.new(f"Hi world {add(1,2)} {mul(2,3)} {hashed}") diff --git a/fixtures/python-worker/src/other.py b/fixtures/python-worker/src/other.py new file mode 100644 index 000000000000..2a99cdfa95f0 --- /dev/null +++ b/fixtures/python-worker/src/other.py @@ -0,0 +1,2 @@ +def add(a, b): + return a + b diff --git a/fixtures/python-worker/wrangler.toml b/fixtures/python-worker/wrangler.toml new file mode 100644 index 000000000000..882e41bd6089 --- /dev/null +++ b/fixtures/python-worker/wrangler.toml @@ -0,0 +1,4 @@ +name = "dep-python-worker" +main = "src/index.py" +compatibility_flags = ["experimental"] +compatibility_date = "2024-01-29" \ No newline at end of file diff --git a/fixtures/shared/src/run-wrangler-long-lived.ts b/fixtures/shared/src/run-wrangler-long-lived.ts index 3f3316208871..9d63214a8e65 100644 --- a/fixtures/shared/src/run-wrangler-long-lived.ts +++ b/fixtures/shared/src/run-wrangler-long-lived.ts @@ -16,10 +16,14 @@ export const wranglerEntryPath = path.resolve( */ export async function runWranglerPagesDev( cwd: string, - publicPath: string, + publicPath: string | undefined, options: string[] ) { - return runLongLivedWrangler(["pages", "dev", publicPath, ...options], cwd); + if (publicPath) { + return runLongLivedWrangler(["pages", "dev", publicPath, ...options], cwd); + } else { + return runLongLivedWrangler(["pages", "dev", ...options], cwd); + } } /** @@ -62,6 +66,7 @@ async function runLongLivedWrangler(command: string[], cwd: string) { chunks.push(chunk); }); const getOutput = () => Buffer.concat(chunks).toString(); + const clearOutput = () => (chunks.length = 0); const timeoutHandle = setTimeout(() => { if (settledReadyPromise) return; @@ -90,5 +95,5 @@ async function runLongLivedWrangler(command: string[], cwd: string) { } const { ip, port } = await ready; -
return { ip, port, stop, getOutput }; + return { ip, port, stop, getOutput, clearOutput }; } diff --git a/packages/create-cloudflare/.eslintrc.js b/packages/create-cloudflare/.eslintrc.js index 1e2096d54a14..1057fd4b8aa3 100644 --- a/packages/create-cloudflare/.eslintrc.js +++ b/packages/create-cloudflare/.eslintrc.js @@ -4,6 +4,7 @@ module.exports = { ignorePatterns: [ "dist", "scripts", + "e2e-tests/fixtures/*", // template files are ignored by the eslint-config-worker configuration // we do however want the c3 files to be linted "!**/templates/**/c3.ts", diff --git a/packages/create-cloudflare/CHANGELOG.md b/packages/create-cloudflare/CHANGELOG.md index f609bd6836fc..8e278e740958 100644 --- a/packages/create-cloudflare/CHANGELOG.md +++ b/packages/create-cloudflare/CHANGELOG.md @@ -1,5 +1,58 @@ # create-cloudflare +## 2.11.2 + +### Patch Changes + +- [#4935](https://github.com/cloudflare/workers-sdk/pull/4935) [`0699506d`](https://github.com/cloudflare/workers-sdk/commit/0699506d9cab929779d19ec2af9b53ccb70c0e7b) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-qwik` from `1.4.3` to `1.4.4` + +* [#4927](https://github.com/cloudflare/workers-sdk/pull/4927) [`49696ab3`](https://github.com/cloudflare/workers-sdk/commit/49696ab391d09243b54a0c32cf220fcc272871ec) Thanks [@jculvey](https://github.com/jculvey)! - feature: Add `getBindingsProxy` support to `qwik` template + + The `qwik` template now uses `getBindingsProxy` for handling requests for bound resources + in dev. This allows projects to use `vite` for dev instead of `wrangler pages dev` on built output. + +## 2.11.1 + +### Patch Changes + +- [#4881](https://github.com/cloudflare/workers-sdk/pull/4881) [`37141ba5`](https://github.com/cloudflare/workers-sdk/commit/37141ba5fe3df960fb744ba5c665c4d606a51f57) Thanks [@dependabot](https://github.com/apps/dependabot)! - C3: Bumped `create-qwik` from `1.4.2` to `1.4.3` + +* [#4892](https://github.com/cloudflare/workers-sdk/pull/4892) [`598b2c49`](https://github.com/cloudflare/workers-sdk/commit/598b2c49d78421fe793f2a7fda467f3fa68d6e8d) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `@angular/create` from `17.1.1` to `17.1.2` + +- [#4903](https://github.com/cloudflare/workers-sdk/pull/4903) [`582396a7`](https://github.com/cloudflare/workers-sdk/commit/582396a78bbeb9efd3c42dd22bb4cad6cc5fbaa7) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-remix` from `2.5.1` to `2.6.0` + +## 2.11.0 + +### Minor Changes + +- [#4850](https://github.com/cloudflare/workers-sdk/pull/4850) [`eb76082b`](https://github.com/cloudflare/workers-sdk/commit/eb76082bfc96c588cc26a2eed7c49407bb797dd5) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - feature: introduce a new optional `previewScript` to the C3 summary + + such script is to be used to locally preview the application (using wrangler) + +### Patch Changes + +- [#4855](https://github.com/cloudflare/workers-sdk/pull/4855) [`c58c253b`](https://github.com/cloudflare/workers-sdk/commit/c58c253bf5abf347c8c5a05e12561cc9d8544b95) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-docusaurus` from `3.1.0` to `3.1.1` + +* [#4856](https://github.com/cloudflare/workers-sdk/pull/4856) [`f6a707d3`](https://github.com/cloudflare/workers-sdk/commit/f6a707d37522794dc828ddd7895c038c05ab094f) Thanks [@dependabot](https://github.com/apps/dependabot)! 
- chore: Bumped `create-astro` from `4.7.1` to `4.7.2` + +- [#4857](https://github.com/cloudflare/workers-sdk/pull/4857) [`9adfeae5`](https://github.com/cloudflare/workers-sdk/commit/9adfeae5763e00075cea80cb34585598fdc28c19) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-qwik` from `1.4.1` to `1.4.2` + +* [#4858](https://github.com/cloudflare/workers-sdk/pull/4858) [`f2ca5e2e`](https://github.com/cloudflare/workers-sdk/commit/f2ca5e2e6df50f7e3977ef3a9e408bd7e11f60be) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-solid` from `0.3.10` to `0.4.10` + +- [#4870](https://github.com/cloudflare/workers-sdk/pull/4870) [`7a341949`](https://github.com/cloudflare/workers-sdk/commit/7a341949216fa6bfdb892f0c4d1f797415741856) Thanks [@dependabot](https://github.com/apps/dependabot)! - chore: Bumped `create-vue` from `3.9.1` to `3.9.2` + +* [#4863](https://github.com/cloudflare/workers-sdk/pull/4863) [`ed40cf84`](https://github.com/cloudflare/workers-sdk/commit/ed40cf849ae8f164574137ea34bb86ed88a6e168) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - chore: handle new Next.js `next.config.mjs` default files + + Since `create-next-app` `v14.1.0` the default generated config file is `next.config.mjs` + (instead of `next.config.js`), so such new default files need to be handled accordingly + +- [#4850](https://github.com/cloudflare/workers-sdk/pull/4850) [`eb76082b`](https://github.com/cloudflare/workers-sdk/commit/eb76082bfc96c588cc26a2eed7c49407bb797dd5) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - fix: update the svelteKit c3 scripts + + replace the incorrect `pages:dev` with a new `pages:preview` script + (and use the standard `dev` script as the `devScript`) + +* [#4863](https://github.com/cloudflare/workers-sdk/pull/4863) [`ed40cf84`](https://github.com/cloudflare/workers-sdk/commit/ed40cf849ae8f164574137ea34bb86ed88a6e168) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! 
- chore: Bumped `create-next-app` from `14.0.4` to `14.1.0` + ## 2.10.0 ### Minor Changes diff --git a/packages/create-cloudflare/e2e-tests/cli.test.ts b/packages/create-cloudflare/e2e-tests/cli.test.ts index 174acf2cdb61..7a26171b2450 100644 --- a/packages/create-cloudflare/e2e-tests/cli.test.ts +++ b/packages/create-cloudflare/e2e-tests/cli.test.ts @@ -11,7 +11,14 @@ import { } from "vitest"; import { version } from "../package.json"; import { frameworkToTest } from "./frameworkToTest"; -import { isQuarantineMode, keys, recreateLogFolder, runC3 } from "./helpers"; +import { + createTestLogStream, + isQuarantineMode, + keys, + recreateLogFolder, + runC3, +} from "./helpers"; +import type { WriteStream } from "fs"; import type { Suite } from "vitest"; // Note: skipIf(frameworkToTest) makes it so that all the basic C3 functionality @@ -21,13 +28,15 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( () => { const tmpDirPath = realpathSync(mkdtempSync(join(tmpdir(), "c3-tests"))); const projectPath = join(tmpDirPath, "basic-tests"); + let logStream: WriteStream; beforeAll((ctx) => { recreateLogFolder(ctx as Suite); }); - beforeEach(() => { + beforeEach((ctx) => { rmSync(projectPath, { recursive: true, force: true }); + logStream = createTestLogStream(ctx); }); afterEach(() => { @@ -36,18 +45,18 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( } }); - test("--version", async (ctx) => { - const { output } = await runC3({ ctx, argv: ["--version"] }); + test("--version", async () => { + const { output } = await runC3(["--version"], [], logStream); expect(output).toEqual(version); }); - test("--version with positionals", async (ctx) => { + test("--version with positionals", async () => { const argv = ["foo", "bar", "baz", "--version"]; - const { output } = await runC3({ ctx, argv }); + const { output } = await runC3(argv, [], logStream); expect(output).toEqual(version); }); - test("--version with flags", async (ctx) => { + test("--version with flags", async () => { const argv = [ "foo", "--type", @@ -55,17 +64,16 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( "--no-deploy", "--version", ]; - const { output } = await runC3({ ctx, argv }); + const { output } = await runC3(argv, [], logStream); expect(output).toEqual(version); }); test.skipIf(process.platform === "win32")( "Using arrow keys + enter", - async (ctx) => { - const { output } = await runC3({ - ctx, - argv: [projectPath], - promptHandlers: [ + async () => { + const { output } = await runC3( + [projectPath], + [ { matcher: /What type of application do you want to create/, input: [keys.enter], @@ -83,7 +91,8 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( input: [keys.left, keys.enter], }, ], - }); + logStream + ); expect(projectPath).toExist(); expect(output).toContain(`type "Hello World" Worker`); @@ -95,11 +104,10 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( test.skipIf(process.platform === "win32")( "Typing custom responses", - async (ctx) => { - const { output } = await runC3({ - argv: [], - ctx, - promptHandlers: [ + async () => { + const { output } = await runC3( + [], + [ { matcher: /In which directory do you want to create your application/, @@ -122,7 +130,8 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( input: ["n"], }, ], - }); + logStream + ); expect(projectPath).toExist(); expect(output).toContain(`type Example router & proxy Worker`); @@ -134,11 +143,10 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( test.skipIf(process.platform === "win32")( "Mixed 
args and interactive", - async (ctx) => { - const { output } = await runC3({ - ctx, - argv: [projectPath, "--ts", "--no-deploy"], - promptHandlers: [ + async () => { + const { output } = await runC3( + [projectPath, "--ts", "--no-deploy"], + [ { matcher: /What type of application do you want to create/, input: [keys.enter], @@ -148,7 +156,8 @@ describe.skipIf(frameworkToTest || isQuarantineMode())( input: ["n"], }, ], - }); + logStream + ); expect(projectPath).toExist(); expect(output).toContain(`type "Hello World" Worker`); diff --git a/packages/create-cloudflare/e2e-tests/fixtures/qwik/src/routes/test/index.ts b/packages/create-cloudflare/e2e-tests/fixtures/qwik/src/routes/test/index.ts new file mode 100644 index 000000000000..c31b601352fb --- /dev/null +++ b/packages/create-cloudflare/e2e-tests/fixtures/qwik/src/routes/test/index.ts @@ -0,0 +1,10 @@ +import type { RequestHandler } from "@builder.io/qwik-city"; + +export const onGet: RequestHandler = async ({ platform, json }) => { + if (!platform.env) { + json(500, "Platform object not defined"); + return; + } + + json(200, { value: platform.env["TEST"], version: 1 }); +}; diff --git a/packages/create-cloudflare/e2e-tests/fixtures/qwik/wrangler.toml b/packages/create-cloudflare/e2e-tests/fixtures/qwik/wrangler.toml new file mode 100644 index 000000000000..4679b8cbbddd --- /dev/null +++ b/packages/create-cloudflare/e2e-tests/fixtures/qwik/wrangler.toml @@ -0,0 +1,2 @@ +[vars] +TEST = "C3_TEST" diff --git a/packages/create-cloudflare/e2e-tests/frameworks.test.ts b/packages/create-cloudflare/e2e-tests/frameworks.test.ts index f8096e3a82b9..5433e81276d1 100644 --- a/packages/create-cloudflare/e2e-tests/frameworks.test.ts +++ b/packages/create-cloudflare/e2e-tests/frameworks.test.ts @@ -1,249 +1,243 @@ +import { existsSync } from "fs"; +import { cp } from "fs/promises"; import { join } from "path"; import { retry } from "helpers/command"; +import { sleep } from "helpers/common"; +import { detectPackageManager } from "helpers/packages"; import { fetch } from "undici"; -import { beforeAll, describe, expect, test } from "vitest"; +import { + afterEach, + beforeAll, + beforeEach, + describe, + expect, + test, +} from "vitest"; import { deleteProject, deleteWorker } from "../scripts/common"; import { getFrameworkMap } from "../src/templates"; import { frameworkToTest } from "./frameworkToTest"; import { + createTestLogStream, isQuarantineMode, keys, recreateLogFolder, runC3, + spawnWithLogging, testDeploymentCommitMessage, testProjectDir, + waitForExit, } from "./helpers"; import type { FrameworkMap, FrameworkName } from "../src/templates"; import type { RunnerConfig } from "./helpers"; -import type { Suite, TestContext } from "vitest"; +import type { WriteStream } from "fs"; +import type { Suite } from "vitest"; const TEST_TIMEOUT = 1000 * 60 * 5; const LONG_TIMEOUT = 1000 * 60 * 10; -type FrameworkTestConfig = Omit & { - expectResponseToContain: string; +type FrameworkTestConfig = RunnerConfig & { testCommitMessage: boolean; - timeout?: number; unsupportedPms?: string[]; unsupportedOSs?: string[]; + verifyDev?: { + route: string; + expectedText: string; + }; + verifyBuild?: { + outputDir: string; + script: string; + route: string; + expectedText: string; + }; }; -let frameworkMap: FrameworkMap; - -describe.concurrent(`E2E: Web frameworks`, () => { - // These are ordered based on speed and reliability for ease of debugging - const frameworkTests: Record = { - astro: { - expectResponseToContain: "Hello, Astronaut!", - testCommitMessage: true, - 
unsupportedOSs: ["win32"], +// These are ordered based on speed and reliability for ease of debugging +const frameworkTests: Record = { + astro: { + testCommitMessage: true, + unsupportedOSs: ["win32"], + verifyDeploy: { + route: "/", + expectedText: "Hello, Astronaut!", }, - docusaurus: { - expectResponseToContain: "Dinosaurs are cool", - unsupportedPms: ["bun"], - testCommitMessage: true, - unsupportedOSs: ["win32"], - timeout: LONG_TIMEOUT, + }, + docusaurus: { + unsupportedPms: ["bun"], + testCommitMessage: true, + unsupportedOSs: ["win32"], + timeout: LONG_TIMEOUT, + verifyDeploy: { + route: "/", + expectedText: "Dinosaurs are cool", }, - angular: { - expectResponseToContain: "Congratulations! Your app is running.", - testCommitMessage: true, - timeout: LONG_TIMEOUT, + }, + angular: { + testCommitMessage: true, + timeout: LONG_TIMEOUT, + verifyDeploy: { + route: "/", + expectedText: "Congratulations! Your app is running.", }, - gatsby: { - expectResponseToContain: "Gatsby!", - unsupportedPms: ["bun", "pnpm"], - promptHandlers: [ - { - matcher: /Would you like to use a template\?/, - input: ["n"], - }, - ], - testCommitMessage: true, - timeout: LONG_TIMEOUT, + }, + gatsby: { + unsupportedPms: ["bun", "pnpm"], + promptHandlers: [ + { + matcher: /Would you like to use a template\?/, + input: ["n"], + }, + ], + testCommitMessage: true, + timeout: LONG_TIMEOUT, + verifyDeploy: { + route: "/", + expectedText: "Gatsby!", + }, + }, + hono: { + testCommitMessage: false, + verifyDeploy: { + route: "/", + expectedText: "Hello Hono!", }, - hono: { - expectResponseToContain: "Hello Hono!", - testCommitMessage: false, + }, + qwik: { + promptHandlers: [ + { + matcher: /Yes looks good, finish update/, + input: [keys.enter], + }, + ], + testCommitMessage: true, + unsupportedOSs: ["win32"], + unsupportedPms: ["yarn"], + verifyDeploy: { + route: "/", + expectedText: "Welcome to Qwik", }, - qwik: { - expectResponseToContain: "Welcome to Qwik", - promptHandlers: [ - { - matcher: /Yes looks good, finish update/, - input: [keys.enter], - }, - ], - testCommitMessage: true, - unsupportedOSs: ["win32"], - unsupportedPms: ["yarn"], + verifyDev: { + route: "/test", + expectedText: "C3_TEST", }, - remix: { - expectResponseToContain: "Welcome to Remix", - testCommitMessage: true, - timeout: LONG_TIMEOUT, - unsupportedPms: ["yarn"], + verifyBuild: { + outputDir: "./dist", + script: "build", + route: "/test", + expectedText: "C3_TEST", }, - next: { - expectResponseToContain: "Create Next App", - promptHandlers: [ - { - matcher: /Do you want to use the next-on-pages eslint-plugin\?/, - input: ["y"], - }, - ], - testCommitMessage: true, - quarantine: true, + }, + remix: { + testCommitMessage: true, + timeout: LONG_TIMEOUT, + unsupportedPms: ["yarn"], + verifyDeploy: { + route: "/", + expectedText: "Welcome to Remix", + }, + }, + next: { + promptHandlers: [ + { + matcher: /Do you want to use the next-on-pages eslint-plugin\?/, + input: ["y"], + }, + ], + testCommitMessage: true, + quarantine: true, + verifyDeploy: { + route: "/", + expectedText: "Create Next App", }, - nuxt: { - expectResponseToContain: "Welcome to Nuxt!", - testCommitMessage: true, - timeout: LONG_TIMEOUT, + }, + nuxt: { + testCommitMessage: true, + timeout: LONG_TIMEOUT, + verifyDeploy: { + route: "/", + expectedText: "Welcome to Nuxt!", }, - react: { - expectResponseToContain: "React App", - testCommitMessage: true, - unsupportedOSs: ["win32"], - timeout: LONG_TIMEOUT, + }, + react: { + testCommitMessage: true, + unsupportedOSs: ["win32"], + 
timeout: LONG_TIMEOUT, + verifyDeploy: { + route: "/", + expectedText: "React App", }, - solid: { - expectResponseToContain: "Hello world", - promptHandlers: [ - { - matcher: /Which template do you want to use/, - input: [keys.enter], - }, - { - matcher: /Server Side Rendering/, - input: [keys.enter], - }, - { - matcher: /Use TypeScript/, - input: [keys.enter], - }, - ], - testCommitMessage: true, - timeout: LONG_TIMEOUT, - unsupportedOSs: ["win32"], + }, + solid: { + promptHandlers: [ + { + matcher: /Which template do you want to use/, + input: [keys.enter], + }, + { + matcher: /Server Side Rendering/, + input: [keys.enter], + }, + { + matcher: /Use TypeScript/, + input: [keys.enter], + }, + ], + testCommitMessage: true, + timeout: LONG_TIMEOUT, + unsupportedOSs: ["win32"], + verifyDeploy: { + route: "/", + expectedText: "Hello world", }, - svelte: { - expectResponseToContain: "SvelteKit app", - promptHandlers: [ - { - matcher: /Which Svelte app template/, - input: [keys.enter], - }, - { - matcher: /Add type checking with TypeScript/, - input: [keys.down, keys.enter], - }, - { - matcher: /Select additional options/, - input: [keys.enter], - }, - ], - testCommitMessage: true, - unsupportedOSs: ["win32"], - unsupportedPms: ["npm"], + }, + svelte: { + promptHandlers: [ + { + matcher: /Which Svelte app template/, + input: [keys.enter], + }, + { + matcher: /Add type checking with TypeScript/, + input: [keys.down, keys.enter], + }, + { + matcher: /Select additional options/, + input: [keys.enter], + }, + ], + testCommitMessage: true, + unsupportedOSs: ["win32"], + unsupportedPms: ["npm"], + verifyDeploy: { + route: "/", + expectedText: "SvelteKit app", }, - vue: { - expectResponseToContain: "Vite App", - testCommitMessage: true, - unsupportedOSs: ["win32"], + }, + vue: { + testCommitMessage: true, + unsupportedOSs: ["win32"], + verifyDeploy: { + route: "/", + expectedText: "Vite App", }, - }; + }, +}; + +describe.concurrent(`E2E: Web frameworks`, () => { + let frameworkMap: FrameworkMap; + let logStream: WriteStream; beforeAll(async (ctx) => { frameworkMap = await getFrameworkMap(); recreateLogFolder(ctx as Suite); }); - const runCli = async ( - framework: string, - projectPath: string, - { ctx, argv = [], promptHandlers = [] }: RunnerConfig - ) => { - const args = [ - projectPath, - "--type", - "webFramework", - "--framework", - framework, - "--deploy", - "--no-open", - "--no-git", - ]; - - args.push(...argv); - - const { output } = await runC3({ - ctx, - argv: args, - promptHandlers, - outputPrefix: `[${framework}]`, - }); - - // Relevant project files should have been created - expect(projectPath).toExist(); - const pkgJsonPath = join(projectPath, "package.json"); - expect(pkgJsonPath).toExist(); - - // Wrangler should be installed - const wranglerPath = join(projectPath, "node_modules/wrangler"); - expect(wranglerPath).toExist(); - - // TODO: Before the refactor introduced in https://github.com/cloudflare/workers-sdk/pull/4754 - // we used to test the packageJson scripts transformations here, try to re-implement such - // checks (might be harder given the switch to a transform function compared to the old - // object based substitution) - - return { output }; - }; - - const runCliWithDeploy = async ( - framework: string, - projectName: string, - projectPath: string, - ctx: TestContext, - testCommitMessage: boolean - ) => { - const { argv, overrides, promptHandlers, expectResponseToContain } = - frameworkTests[framework]; - - const { output } = await runCli(framework, projectPath, { - ctx, - 
overrides, - promptHandlers, - argv: [...(argv ?? [])], - }); - - // Verify deployment - const deployedUrlRe = - /deployment is ready at: (https:\/\/.+\.(pages|workers)\.dev)/; - - const match = output.match(deployedUrlRe); - if (!match || !match[1]) { - expect(false, "Couldn't find deployment url in C3 output").toBe(true); - return; - } + beforeEach(async (ctx) => { + logStream = createTestLogStream(ctx); + }); - const projectUrl = match[1]; - - await retry({ times: 5 }, async () => { - await new Promise((resolve) => setTimeout(resolve, 1000)); // wait a second - const res = await fetch(projectUrl); - const body = await res.text(); - if (!body.includes(expectResponseToContain)) { - throw new Error( - `(${framework}) Deployed page (${projectUrl}) didn't contain expected string: "${expectResponseToContain}"` - ); - } - }); - - if (testCommitMessage) { - await testDeploymentCommitMessage(projectName, framework); - } - }; + afterEach(async () => { + logStream.close(); + }); Object.keys(frameworkTests).forEach((framework) => { const { @@ -265,23 +259,69 @@ describe.concurrent(`E2E: Web frameworks`, () => { // Skip if the package manager is unsupported shouldRun &&= !unsupportedPms?.includes(process.env.TEST_PM ?? ""); + // Skip if the OS is unsupported shouldRun &&= !unsupportedOSs?.includes(process.platform); test.runIf(shouldRun)( framework, - async (ctx) => { + async () => { const { getPath, getName, clean } = testProjectDir("pages"); const projectPath = getPath(framework); const projectName = getName(framework); const frameworkConfig = frameworkMap[framework as FrameworkName]; + + const { argv, promptHandlers, verifyDeploy } = + frameworkTests[framework]; + + if (!verifyDeploy) { + expect( + true, + "A `deploy` configuration must be defined for all framework tests" + ).toBe(false); + return; + } + try { - await runCliWithDeploy( + const deploymentUrl = await runCli( framework, - projectName, projectPath, - ctx, - testCommitMessage + logStream, + { + argv: [...(argv ?? 
[])], + promptHandlers, + } + ); + + // Relevant project files should have been created + expect(projectPath).toExist(); + const pkgJsonPath = join(projectPath, "package.json"); + expect(pkgJsonPath).toExist(); + + // Wrangler should be installed + const wranglerPath = join(projectPath, "node_modules/wrangler"); + expect(wranglerPath).toExist(); + + if (testCommitMessage) { + await testDeploymentCommitMessage(projectName, framework); + } + + // Make a request to the deployed project and verify it was successful + await verifyDeployment( + `${deploymentUrl}${verifyDeploy.route}`, + verifyDeploy.expectedText ); + + // Copy over any test fixture files + const fixturePath = join(__dirname, "fixtures", framework); + if (existsSync(fixturePath)) { + await cp(fixturePath, projectPath, { + recursive: true, + force: true, + }); + } + + await verifyDevScript(framework, projectPath, logStream); + await verifyBuildScript(framework, projectPath, logStream); } finally { clean(framework); // Cleanup the project in case we need to retry it @@ -295,8 +335,158 @@ describe.concurrent(`E2E: Web frameworks`, () => { { retry: 1, timeout: timeout || TEST_TIMEOUT } ); }); - - // test.skip("Hono (wrangler defaults)", async (ctx) => { - // await runCli("hono", { ctx, argv: ["--wrangler-defaults"] }); - // }); }); + +const runCli = async ( + framework: string, + projectPath: string, + logStream: WriteStream, + { argv = [], promptHandlers = [] }: RunnerConfig +) => { + const args = [ + projectPath, + "--type", + "webFramework", + "--framework", + framework, + "--deploy", + "--no-open", + "--no-git", + ]; + + args.push(...argv); + + const { output } = await runC3(args, promptHandlers, logStream); + + const deployedUrlRe = + /deployment is ready at: (https:\/\/.+\.(pages|workers)\.dev)/; + + const match = output.match(deployedUrlRe); + if (!match || !match[1]) { + expect(false, "Couldn't find deployment url in C3 output").toBe(true); + return ""; + } + + return match[1]; +}; + +const verifyDeployment = async ( + deploymentUrl: string, + expectedText: string +) => { + await retry({ times: 5 }, async () => { + await sleep(1000); + const res = await fetch(deploymentUrl); + const body = await res.text(); + if (!body.includes(expectedText)) { + throw new Error( + `Deployed page (${deploymentUrl}) didn't contain expected string: "${expectedText}"` + ); + } + }); +}; + +const verifyDevScript = async ( + framework: string, + projectPath: string, + logStream: WriteStream +) => { + const { verifyDev } = frameworkTests[framework]; + if (!verifyDev) { + return; + } + + const frameworkMap = await getFrameworkMap(); + const template = frameworkMap[framework as FrameworkName]; + + // Run the devserver on a random port to avoid colliding with other tests + const TEST_PORT = Math.ceil(Math.random() * 1000) + 20000; + + const { name: pm } = detectPackageManager(); + const proc = spawnWithLogging( + [ + pm, + "run", + template.devScript as string, + pm === "npm" ? "--" : "", + "--port", + `${TEST_PORT}`, + ], + { + cwd: projectPath, + env: { + NODE_ENV: "development", + }, + }, + logStream + ); + + // Wait a few seconds for dev server to spin up + await sleep(4000); + + // Make a request to the specified test route + const res = await fetch(`http://localhost:${TEST_PORT}${verifyDev.route}`); + const body = await res.text(); + + // Kill the process gracefully so ports can be cleaned up + proc.kill("SIGINT"); + + // Wait for a second to allow process to exit cleanly. 
Otherwise, the port might + // end up camped and cause future runs to fail + await sleep(1000); + + expect(body).toContain(verifyDev.expectedText); +}; + +const verifyBuildScript = async ( + framework: string, + projectPath: string, + logStream: WriteStream +) => { + const { verifyBuild } = frameworkTests[framework]; + + if (!verifyBuild) { + return; + } + + const { outputDir, script, route, expectedText } = verifyBuild; + + // Run the build script + const { name: pm, npx } = detectPackageManager(); + const buildProc = spawnWithLogging( + [pm, "run", script], + { + cwd: projectPath, + }, + logStream + ); + await waitForExit(buildProc); + + // Run wrangler dev on a random port to avoid colliding with other tests + const TEST_PORT = Math.ceil(Math.random() * 1000) + 20000; + + const devProc = spawnWithLogging( + [npx, "wrangler", "pages", "dev", outputDir, "--port", `${TEST_PORT}`], + { + cwd: projectPath, + }, + logStream + ); + + // Wait a few seconds for dev server to spin up + await sleep(4000); + + // Make a request to the specified test route + const res = await fetch(`http://localhost:${TEST_PORT}${route}`); + const body = await res.text(); + + // Kill the process gracefully so ports can be cleaned up + devProc.kill("SIGINT"); + + // Wait for a second to allow process to exit cleanly. Otherwise, the port might + // end up camped and cause future runs to fail + await sleep(1000); + + // Verify expectation after killing the process so that it exits cleanly in case of failure + expect(body).toContain(expectedText); +}; diff --git a/packages/create-cloudflare/e2e-tests/helpers.ts b/packages/create-cloudflare/e2e-tests/helpers.ts index 3889b07b1e9a..1005c1b1578c 100644 --- a/packages/create-cloudflare/e2e-tests/helpers.ts +++ b/packages/create-cloudflare/e2e-tests/helpers.ts @@ -12,11 +12,14 @@ import { stripAnsi } from "@cloudflare/cli"; import { spawn } from "cross-spawn"; import { retry } from "helpers/command"; import { sleep } from "helpers/common"; -import { detectPackageManager } from "helpers/packages"; import { fetch } from "undici"; import { expect } from "vitest"; import { version } from "../package.json"; -import { quoteShellArgs } from "../src/common"; +import type { + ChildProcessWithoutNullStreams, + SpawnOptionsWithoutStdio, +} from "child_process"; +import type { WriteStream } from "fs"; import type { Suite, TestContext } from "vitest"; export const C3_E2E_PREFIX = "c3-e2e-"; @@ -31,104 +34,142 @@ export const keys = { left: "\x1b\x5b\x44", }; +const testEnv = { + ...process.env, + // The following env vars are set to ensure that package managers + // do not use the same global cache and accidentally hit race conditions. + YARN_CACHE_FOLDER: "./.yarn/cache", + YARN_ENABLE_GLOBAL_CACHE: "false", + PNPM_HOME: "./.pnpm", + npm_config_cache: "./.npm/cache", +}; + export type PromptHandler = { matcher: RegExp; input: string[]; }; export type RunnerConfig = { - overrides?: { - packageScripts?: Record; - }; promptHandlers?: PromptHandler[]; argv?: string[]; - outputPrefix?: string; quarantine?: boolean; - ctx: TestContext; + timeout?: number; + verifyDeploy?: { + route: string; + expectedText: string; + }; }; -export const runC3 = async ({ - argv = [], - promptHandlers = [], - ctx, -}: RunnerConfig) => { - const cmd = "node"; - const args = ["./dist/cli.js", ...argv]; - const proc = spawn(cmd, args, { - env: { - ...process.env, - // The following env vars are set to ensure that package managers - // do not use the same global cache and accidentally hit race conditions. 
- YARN_CACHE_FOLDER: "./.yarn/cache", - YARN_ENABLE_GLOBAL_CACHE: "false", - PNPM_HOME: "./.pnpm", - npm_config_cache: "./.npm/cache", - }, - }); +export const runC3 = async ( + argv: string[] = [], + promptHandlers: PromptHandler[] = [], + logStream: WriteStream +) => { + const cmd = ["node", "./dist/cli.js", ...argv]; + const proc = spawnWithLogging(cmd, { env: testEnv }, logStream); - promptHandlers = [...promptHandlers]; + // Clone the prompt handlers so we can consume them destructively + promptHandlers = promptHandlers && [...promptHandlers]; - const { name: pm } = detectPackageManager(); + const onData = (data: string) => { + const lines: string[] = data.toString().split("\n"); + const currentDialog = promptHandlers[0]; - const stdout: string[] = []; - const stderr: string[] = []; + lines.forEach(async (line) => { + if (currentDialog && currentDialog.matcher.test(line)) { + // Add a small sleep to avoid input race + await sleep(1000); - promptHandlers = promptHandlers && [...promptHandlers]; + currentDialog.input.forEach((keystroke) => { + proc.stdin.write(keystroke); + }); - // The .ansi extension allows for editor extensions that format ansi terminal codes - const logFilename = `${normalizeTestName(ctx)}.ansi`; - const logStream = createWriteStream( - join(getLogPath(ctx.meta.suite), logFilename) - ); + // Consume the handler once we've used it + promptHandlers.shift(); - logStream.write( - `Running C3 with command: \`${quoteShellArgs([ - cmd, - ...args, - ])}\` (using ${pm})\n\n` - ); + // If we've consumed the last prompt handler, close the input stream + // Otherwise, the process wont exit properly + if (promptHandlers[0] === undefined) { + proc.stdin.end(); + } + } + }); + }; - await new Promise((resolve, rejects) => { - proc.stdout.on("data", (data) => { - const lines: string[] = data.toString().split("\n"); - const currentDialog = promptHandlers[0]; + return waitForExit(proc, onData); +}; - lines.forEach(async (line) => { - stdout.push(line); +/** + * Spawn a child process and attach a handler that will log any output from + * `stdout` or errors from `stderror` to a dedicated log file. 
+ * + * @param args The command and arguments as an array + * @param opts Additional options to be passed to the `spawn` call + * @param logStream A write stream to the log file for the test + * @returns the child process that was created + */ +export const spawnWithLogging = ( + args: string[], + opts: SpawnOptionsWithoutStdio, + logStream: WriteStream +) => { + const [cmd, ...argv] = args; - const stripped = stripAnsi(line).trim(); - if (stripped.length > 0) { - logStream.write(`${stripped}\n`); - } + const proc = spawn(cmd, argv, { + ...opts, + env: { + ...testEnv, + ...opts.env, + }, + }); - if (currentDialog && currentDialog.matcher.test(line)) { - // Add a small sleep to avoid input race - await sleep(1000); + logStream.write(`\nRunning command: ${[cmd, ...argv].join(" ")}\n\n`); + + proc.stdout.on("data", (data) => { + const lines: string[] = data.toString().split("\n"); + + lines.forEach(async (line) => { + const stripped = stripAnsi(line).trim(); + if (stripped.length > 0) { + logStream.write(`${stripped}\n`); + } + }); + }); - currentDialog.input.forEach((keystroke) => { - proc.stdin.write(keystroke); - }); + proc.stderr.on("data", (data) => { + logStream.write(data); + }); - // Consume the handler once we've used it - promptHandlers.shift(); + return proc; +}; - // If we've consumed the last prompt handler, close the input stream - // Otherwise, the process wont exit properly - if (promptHandlers[0] === undefined) { - proc.stdin.end(); - } - } - }); +/** + * An async function that waits on a spawned process to run to completion, collecting + * any output or errors from `stdout` and `stderr`, respectively. + * + * @param proc The child process to wait for + * @param onData An optional handler to be called on `stdout.on('data')` + */ +export const waitForExit = async ( + proc: ChildProcessWithoutNullStreams, + onData?: (chunk: string) => void +) => { + const stdout: string[] = []; + const stderr: string[] = []; + + await new Promise((resolve, rejects) => { + proc.stdout.on("data", (data) => { + stdout.push(data); + if (onData) { + onData(data); + } }); proc.stderr.on("data", (data) => { - logStream.write(data); stderr.push(data); }); proc.on("close", (code) => { - logStream.close(); - if (code === 0) { resolve(null); } else { @@ -151,6 +192,14 @@ export const runC3 = async ({ }; }; +export const createTestLogStream = (ctx: TestContext) => { + // The .ansi extension allows for editor extensions that format ansi terminal codes + const fileName = `${normalizeTestName(ctx)}.ansi`; + return createWriteStream(join(getLogPath(ctx.task.suite), fileName), { + flags: "a", + }); +}; + export const recreateLogFolder = (suite: Suite) => { // Clean the old folder if exists (useful for dev) rmSync(getLogPath(suite), { @@ -172,13 +221,13 @@ const getLogPath = (suite: Suite) => { }; const normalizeTestName = (ctx: TestContext) => { - const baseName = ctx.meta.name + const baseName = ctx.task.name .toLowerCase() .replace(/\s+/g, "_") // replace any whitespace with `_` .replace(/\W/g, ""); // strip special characters // Ensure that each retry gets its own log file - const retryCount = ctx.meta.result?.retryCount ?? 0; + const retryCount = ctx.task.result?.retryCount ?? 0; const suffix = retryCount > 0 ? 
`_${retryCount}` : ""; return baseName + suffix; }; @@ -205,7 +254,7 @@ export const testProjectDir = (suite: string) => { } catch (e) { if (typeof e === "object" && e !== null && "code" in e) { const code = e.code; - if (code === "EBUSY" || code === "ENOENT") { + if (code === "EBUSY" || code === "ENOENT" || code === "ENOTEMPTY") { return; } } diff --git a/packages/create-cloudflare/e2e-tests/workers.test.ts b/packages/create-cloudflare/e2e-tests/workers.test.ts index cc63b176d5c5..63877bbbbc05 100644 --- a/packages/create-cloudflare/e2e-tests/workers.test.ts +++ b/packages/create-cloudflare/e2e-tests/workers.test.ts @@ -1,142 +1,74 @@ import { join } from "path"; import { retry } from "helpers/command"; +import { sleep } from "helpers/common"; import { readToml } from "helpers/files"; import { fetch } from "undici"; -import { beforeAll, describe, expect, test } from "vitest"; +import { beforeAll, beforeEach, describe, expect, test } from "vitest"; import { deleteWorker } from "../scripts/common"; import { frameworkToTest } from "./frameworkToTest"; import { + createTestLogStream, isQuarantineMode, recreateLogFolder, runC3, testProjectDir, } from "./helpers"; import type { RunnerConfig } from "./helpers"; -import type { Suite, TestContext } from "vitest"; +import type { WriteStream } from "fs"; +import type { Suite } from "vitest"; const TEST_TIMEOUT = 1000 * 60 * 5; -type WorkerTestConfig = Omit & { - expectResponseToContain?: string; - timeout?: number; +type WorkerTestConfig = RunnerConfig & { name?: string; template: string; }; + +const workerTemplates: WorkerTestConfig[] = [ + { + template: "hello-world", + verifyDeploy: { + route: "/", + expectedText: "Hello World!", + }, + }, + { + template: "common", + verifyDeploy: { + route: "/", + expectedText: "Try making requests to:", + }, + }, + { + template: "queues", + // Skipped for now, since C3 does not yet support resource creation + }, + { + template: "scheduled", + // Skipped for now, since it's not possible to test scheduled events on deployed Workers + }, + { + template: "openapi", + promptHandlers: [], + verifyDeploy: { + route: "/", + expectedText: "SwaggerUI", + }, + }, +]; + describe .skipIf(frameworkToTest || isQuarantineMode() || process.platform === "win32") .concurrent(`E2E: Workers templates`, () => { - const workerTemplates: WorkerTestConfig[] = [ - { - expectResponseToContain: "Hello World!", - template: "hello-world", - }, - { - template: "common", - expectResponseToContain: "Try making requests to:", - }, - { - template: "queues", - // Skipped for now, since C3 does not yet support resource creation - // expectResponseToContain: - }, - { - template: "scheduled", - // Skipped for now, since it's not possible to test scheduled events on deployed Workers - // expectResponseToContain: - }, - { - template: "openapi", - expectResponseToContain: "SwaggerUI", - promptHandlers: [], - }, - ]; + let logStream: WriteStream; beforeAll((ctx) => { recreateLogFolder(ctx as Suite); }); - const runCli = async ( - template: string, - projectPath: string, - { ctx, argv = [], promptHandlers = [] }: RunnerConfig - ) => { - const args = [projectPath, "--type", template, "--no-open", "--no-git"]; - - args.push(...argv); - - const { output } = await runC3({ - ctx, - argv: args, - promptHandlers, - outputPrefix: `[${template}]`, - }); - - // Relevant project files should have been created - expect(projectPath).toExist(); - - const gitignorePath = join(projectPath, ".gitignore"); - expect(gitignorePath).toExist(); - - const pkgJsonPath = 
join(projectPath, "package.json"); - expect(pkgJsonPath).toExist(); - - const wranglerPath = join(projectPath, "node_modules/wrangler"); - expect(wranglerPath).toExist(); - - const tomlPath = join(projectPath, "wrangler.toml"); - expect(tomlPath).toExist(); - - const config = readToml(tomlPath) as { main: string }; - - expect(join(projectPath, config.main)).toExist(); - - return { output }; - }; - - const runCliWithDeploy = async ( - template: WorkerTestConfig, - projectPath: string, - ctx: TestContext - ) => { - const { argv, overrides, promptHandlers, expectResponseToContain } = - template; - - const { output } = await runCli(template.template, projectPath, { - ctx, - overrides, - promptHandlers, - argv: [ - // Skip deployment if the test config has no response expectation - expectResponseToContain ? "--deploy" : "--no-deploy", - ...(argv ?? []), - ], - }); - - if (expectResponseToContain) { - // Verify deployment - const deployedUrlRe = - /deployment is ready at: (https:\/\/.+\.(workers)\.dev)/; - - const match = output.match(deployedUrlRe); - if (!match || !match[1]) { - expect(false, "Couldn't find deployment url in C3 output").toBe(true); - return; - } - - const projectUrl = match[1]; - - await retry({ times: 5 }, async () => { - await new Promise((resolve) => setTimeout(resolve, 1000)); // wait a second - const res = await fetch(projectUrl); - const body = await res.text(); - if (!body.includes(expectResponseToContain)) { - throw new Error( - `(${template}) Deployed page (${projectUrl}) didn't contain expected string: "${expectResponseToContain}"` - ); - } - }); - } - }; + beforeEach(async (ctx) => { + logStream = createTestLogStream(ctx); + }); workerTemplates .flatMap((template) => @@ -170,12 +102,39 @@ describe const name = template.name ?? template.template; test( name, - async (ctx) => { + async () => { const { getPath, getName, clean } = testProjectDir("workers"); const projectPath = getPath(name); const projectName = getName(name); try { - await runCliWithDeploy(template, projectPath, ctx); + const deployedUrl = await runCli( + template, + projectPath, + logStream + ); + + // Relevant project files should have been created + expect(projectPath).toExist(); + + const gitignorePath = join(projectPath, ".gitignore"); + expect(gitignorePath).toExist(); + + const pkgJsonPath = join(projectPath, "package.json"); + expect(pkgJsonPath).toExist(); + + const wranglerPath = join(projectPath, "node_modules/wrangler"); + expect(wranglerPath).toExist(); + + const tomlPath = join(projectPath, "wrangler.toml"); + expect(tomlPath).toExist(); + + const config = readToml(tomlPath) as { main: string }; + expect(join(projectPath, config.main)).toExist(); + + const { verifyDeploy } = template; + if (verifyDeploy && deployedUrl) { + await verifyDeployment(deployedUrl, verifyDeploy.expectedText); + } } finally { clean(name); await deleteWorker(projectName); @@ -185,3 +144,55 @@ describe ); }); }); + +const runCli = async ( + template: WorkerTestConfig, + projectPath: string, + logStream: WriteStream +) => { + const { argv, promptHandlers, verifyDeploy } = template; + + const args = [ + projectPath, + "--type", + template.template, + "--no-open", + "--no-git", + verifyDeploy ? "--deploy" : "--no-deploy", + ...(argv ?? 
[]), + ]; + + const { output } = await runC3(args, promptHandlers, logStream); + + if (!verifyDeploy) { + return null; + } + + // Verify deployment + const deployedUrlRe = + /deployment is ready at: (https:\/\/.+\.(workers)\.dev)/; + + const match = output.match(deployedUrlRe); + if (!match || !match[1]) { + expect(false, "Couldn't find deployment url in C3 output").toBe(true); + return; + } + + return match[1]; +}; + +const verifyDeployment = async ( + deploymentUrl: string, + expectedString: string +) => { + await retry({ times: 5 }, async () => { + await sleep(1000); + const res = await fetch(deploymentUrl); + const body = await res.text(); + if (!body.includes(expectedString)) { + throw new Error( + `(Deployed page (${deploymentUrl}) didn't contain expected string: "${expectedString}"` + ); + } + }); +}; diff --git a/packages/create-cloudflare/package.json b/packages/create-cloudflare/package.json index dae99bcb974f..35c2a4ab00cb 100644 --- a/packages/create-cloudflare/package.json +++ b/packages/create-cloudflare/package.json @@ -1,6 +1,6 @@ { "name": "create-cloudflare", - "version": "2.10.0", + "version": "2.11.2", "description": "A CLI for creating and deploying new applications to Cloudflare.", "keywords": [ "cloudflare", diff --git a/packages/create-cloudflare/src/cli.ts b/packages/create-cloudflare/src/cli.ts index a81694925331..08e087be787d 100644 --- a/packages/create-cloudflare/src/cli.ts +++ b/packages/create-cloudflare/src/cli.ts @@ -29,7 +29,8 @@ import { createProject } from "./pages"; import { copyTemplateFiles, selectTemplate, - updatePackageJson, + updatePackageName, + updatePackageScripts, } from "./templates"; import { installWorkersTypes, updateWranglerToml } from "./workers"; import type { C3Args, C3Context } from "types"; @@ -122,7 +123,7 @@ const create = async (ctx: C3Context) => { } await copyTemplateFiles(ctx); - await updatePackageJson(ctx); + await updatePackageName(ctx); chdir(ctx.project.path); await npmInstall(ctx); @@ -144,6 +145,8 @@ const configure = async (ctx: C3Context) => { await template.configure({ ...ctx }); } + await updatePackageScripts(ctx); + await offerGit(ctx); await gitCommit(ctx); diff --git a/packages/create-cloudflare/src/frameworks/package.json b/packages/create-cloudflare/src/frameworks/package.json index e0e596e15b5f..17ad2d33b88c 100644 --- a/packages/create-cloudflare/src/frameworks/package.json +++ b/packages/create-cloudflare/src/frameworks/package.json @@ -7,16 +7,16 @@ ], "dependencies": { "create-astro": "4.7.2", - "@angular/create": "17.1.1", + "@angular/create": "17.1.2", "create-docusaurus": "3.1.1", "create-hono": "0.3.2", - "create-next-app": "14.0.4", - "create-qwik": "1.4.2", + "create-next-app": "14.1.0", + "create-qwik": "1.4.4", "create-react-app": "5.0.1", - "create-remix": "2.5.1", + "create-remix": "2.6.0", "create-solid": "0.4.10", "create-svelte": "6.0.8", - "create-vue": "3.9.1", + "create-vue": "3.9.2", "gatsby": "5.13.3", "nuxi": "3.10.0" }, diff --git a/packages/create-cloudflare/src/helpers/codemod.ts b/packages/create-cloudflare/src/helpers/codemod.ts index 36a0df18a0de..d1e924e1afcd 100644 --- a/packages/create-cloudflare/src/helpers/codemod.ts +++ b/packages/create-cloudflare/src/helpers/codemod.ts @@ -6,8 +6,28 @@ import * as typescriptParser from "recast/parsers/typescript"; import { readFile, writeFile } from "./files"; import type { Program } from "esprima"; +/* + CODEMOD TIPS & TRICKS + ===================== + + More info about parsing and transforming can be found in the `recast` docs: + 
https://github.com/benjamn/recast + + `recast` uses the `ast-types` library under the hood for basic AST operations + and defining node types. If you need to manipulate or manually construct AST nodes as + part of a code mod operation, be sure to check the `ast-types` documentation: + https://github.com/benjamn/ast-types + + Last but not least, AST viewers can be extremely helpful when trying to write + a transformer: + - https://astexplorer.net/ + - https://ts-ast-viewer.com/# + +*/ + // Parse an input string as javascript and return an ast export const parseJs = (src: string) => { + src = src.trim(); try { return recast.parse(src, { parser: esprimaParser }); } catch (error) { @@ -17,6 +37,7 @@ export const parseJs = (src: string) => { // Parse an input string as typescript and return an ast export const parseTs = (src: string) => { + src = src.trim(); try { return recast.parse(src, { parser: typescriptParser }); } catch (error) { diff --git a/packages/create-cloudflare/src/templates.ts b/packages/create-cloudflare/src/templates.ts index 3c6d4a9ca27c..78d25f36d070 100644 --- a/packages/create-cloudflare/src/templates.ts +++ b/packages/create-cloudflare/src/templates.ts @@ -367,26 +367,40 @@ const downloadRemoteTemplate = async (src: string) => { } }; -export const updatePackageJson = async (ctx: C3Context) => { - const s = spinner(); - s.start("Updating `package.json`"); - +export const updatePackageName = async (ctx: C3Context) => { // Update package.json with project name const placeholderNames = ["", "TBD", ""]; const pkgJsonPath = resolve(ctx.project.path, "package.json"); - let pkgJson = readJSON(pkgJsonPath); + const pkgJson = readJSON(pkgJsonPath); - if (placeholderNames.includes(pkgJson.name)) { - pkgJson.name = ctx.project.name; + if (!placeholderNames.includes(pkgJson.name)) { + return; } - // Run any transformers defined by the template - if (ctx.template.transformPackageJson) { - const transformed = await ctx.template.transformPackageJson(pkgJson); - pkgJson = deepmerge(pkgJson, transformed); + const s = spinner(); + s.start("Updating name in `package.json`"); + + pkgJson.name = ctx.project.name; + + writeJSON(pkgJsonPath, pkgJson); + s.stop(`${brandColor("updated")} ${dim("`package.json`")}`); +}; + +export const updatePackageScripts = async (ctx: C3Context) => { + if (!ctx.template.transformPackageJson) { + return; } - // Write the finalized package.json to disk + const s = spinner(); + s.start("Updating `package.json` scripts"); + + const pkgJsonPath = resolve(ctx.project.path, "package.json"); + let pkgJson = readJSON(pkgJsonPath); + + // Run any transformers defined by the template + const transformed = await ctx.template.transformPackageJson(pkgJson); + pkgJson = deepmerge(pkgJson, transformed); + writeJSON(pkgJsonPath, pkgJson); s.stop(`${brandColor("updated")} ${dim("`package.json`")}`); }; diff --git a/packages/create-cloudflare/templates/next/c3.ts b/packages/create-cloudflare/templates/next/c3.ts index 43e766d01aa5..7dd1a29a4355 100644 --- a/packages/create-cloudflare/templates/next/c3.ts +++ b/packages/create-cloudflare/templates/next/c3.ts @@ -112,7 +112,7 @@ const configure = async (ctx: C3Context) => { await writeEslintrc(ctx); } - writeFile(`${projectPath}/next.config.js`, nextConfig); + writeFile(`${projectPath}/next.config.mjs`, nextConfig); updateStatus("Updated the next.config.js file"); writeFile(`${projectPath}/README.md`, readme); diff --git a/packages/create-cloudflare/templates/next/templates.ts b/packages/create-cloudflare/templates/next/templates.ts 
index f932240b8864..e638756acce4 100644 --- a/packages/create-cloudflare/templates/next/templates.ts +++ b/packages/create-cloudflare/templates/next/templates.ts @@ -180,30 +180,30 @@ const styles = { } as const; `; -export const nextConfig = `/** @type {import('next').NextConfig} */ -const nextConfig = {} +export const nextConfig = `import { setupDevBindings } from '@cloudflare/next-on-pages/next-dev'; -module.exports = nextConfig +/** @type {import('next').NextConfig} */ +const nextConfig = {}; // Here we use the @cloudflare/next-on-pages next-dev module to allow us to use bindings during local development // (when running the application with \`next dev\`), for more information see: -// https://github.com/dario-piotrowicz/next-on-pages/blob/8e93067/internal-packages/next-dev/README.md +// https://github.com/cloudflare/next-on-pages/blob/8e93067/internal-packages/next-dev/README.md if (process.env.NODE_ENV === 'development') { - import('@cloudflare/next-on-pages/next-dev').then(({ setupDevBindings }) => { - setupDevBindings({ - bindings: { - // Add here the Cloudflare Bindings you want to have available during local development, - // for more details on Bindings see: https://developers.cloudflare.com/pages/functions/bindings/) - // - // KV Example: - // MY_KV: { - // type: 'kv', - // id: 'xxx', - // } - } - }) - }) + await setupDevBindings({ + bindings: { + // Add here the Cloudflare Bindings you want to have available during local development, + // for more details on Bindings see: https://developers.cloudflare.com/pages/functions/bindings/) + // + // KV Example: + // MY_KV: { + // type: 'kv', + // id: 'xxx', + // } + } + }); } + +export default nextConfig; `; export const envDts = `declare global { diff --git a/packages/create-cloudflare/templates/qwik/c3.ts b/packages/create-cloudflare/templates/qwik/c3.ts index 04764717d670..0256e90dc917 100644 --- a/packages/create-cloudflare/templates/qwik/c3.ts +++ b/packages/create-cloudflare/templates/qwik/c3.ts @@ -1,9 +1,13 @@ import { endSection } from "@cloudflare/cli"; +import { brandColor } from "@cloudflare/cli/colors"; +import { spinner } from "@cloudflare/cli/interactive"; +import { parseTs, transformFile } from "helpers/codemod"; import { runCommand, runFrameworkGenerator } from "helpers/command"; -import { compatDateFlag } from "helpers/files"; +import { usesTypescript } from "helpers/files"; import { detectPackageManager } from "helpers/packages"; import { quoteShellArgs } from "../../src/common"; import type { TemplateConfig } from "../../src/templates"; +import type * as recast from "recast"; import type { C3Context } from "types"; const { npm, npx } = detectPackageManager(); @@ -12,11 +16,62 @@ const generate = async (ctx: C3Context) => { await runFrameworkGenerator(ctx, ["basic", ctx.project.name]); }; -const configure = async () => { +const configure = async (ctx: C3Context) => { // Add the pages integration const cmd = [npx, "qwik", "add", "cloudflare-pages"]; endSection(`Running ${quoteShellArgs(cmd)}`); await runCommand(cmd); + + addBindingsProxy(ctx); +}; + +const addBindingsProxy = (ctx: C3Context) => { + // Qwik only has a typescript template atm. 
+ // This check is an extra precaution + if (!usesTypescript(ctx)) { + return; + } + + const s = spinner(); + s.start("Updating `vite.config.ts`"); + + // Insert the env declaration after the last import (but before the rest of the body) + const envDeclaration = ` +let env = {}; + +if(process.env.NODE_ENV === 'development') { + const { getBindingsProxy } = await import('wrangler'); + const { bindings } = await getBindingsProxy(); + env = bindings; +} +`; + + transformFile("vite.config.ts", { + visitProgram: function (n) { + const lastImportIndex = n.node.body.findLastIndex( + (t) => t.type === "ImportDeclaration" + ); + n.get("body").insertAt(lastImportIndex + 1, envDeclaration); + + return false; + }, + }); + + // Populate the `qwikCity` plugin with the platform object containing the `env` defined above. + const platformObject = parseTs(`{ platform: { env } }`); + + transformFile("vite.config.ts", { + visitCallExpression: function (n) { + const callee = n.node.callee as recast.types.namedTypes.Identifier; + if (callee.name === "qwikCity") { + n.node.arguments = [platformObject]; + } + + this.traverse(n); + }, + }); + + s.stop(`${brandColor("updated")} \`vite.config.ts\``); }; const config: TemplateConfig = { @@ -24,12 +79,13 @@ const config: TemplateConfig = { id: "qwik", displayName: "Qwik", platform: "pages", + devScript: "dev", + deployScript: "deploy", generate, configure, transformPackageJson: async () => ({ scripts: { - "pages:dev": `wrangler pages dev ${await compatDateFlag()} -- ${npm} run dev`, - "pages:deploy": `${npm} run build && wrangler pages deploy ./dist`, + deploy: `${npm} run build && wrangler pages deploy ./dist`, }, }), }; diff --git a/packages/create-cloudflare/tsconfig.json b/packages/create-cloudflare/tsconfig.json index 465ea5cdbcab..0168b9a96931 100644 --- a/packages/create-cloudflare/tsconfig.json +++ b/packages/create-cloudflare/tsconfig.json @@ -5,6 +5,7 @@ "exclude": [ "node_modules", "dist", + "e2e-tests/fixtures/*", // exclude all template files other than the top level ones so // that we can catch `c3.ts`. 
For example, any top level files in // templates/angular/ will be included, but any directories will not diff --git a/packages/edge-preview-authenticated-proxy/package.json b/packages/edge-preview-authenticated-proxy/package.json index f0c1dd56ed3a..9137c63de23e 100644 --- a/packages/edge-preview-authenticated-proxy/package.json +++ b/packages/edge-preview-authenticated-proxy/package.json @@ -4,7 +4,7 @@ "private": true, "scripts": { "check:lint": "eslint .", - "publish": "wrangler deploy", + "deploy": "wrangler deploy", "start": "wrangler dev", "test": "vitest run", "type:tests": "tsc -p ./tests/tsconfig.json", diff --git a/packages/edge-preview-authenticated-proxy/src/index.ts b/packages/edge-preview-authenticated-proxy/src/index.ts index 828bb6d9a8bd..99f2215ec5fe 100644 --- a/packages/edge-preview-authenticated-proxy/src/index.ts +++ b/packages/edge-preview-authenticated-proxy/src/index.ts @@ -74,6 +74,19 @@ class PreviewRequestFailed extends HttpError { } } +class InvalidURL extends HttpError { + constructor(private readonly url: string) { + super("Invalid URL", 400, false); + } + get data() { + return { url: this.url }; + } +} + +function assertValidURL(maybeUrl: string) { + if (!URL.canParse(maybeUrl)) throw new InvalidURL(maybeUrl); +} + function switchRemote(url: URL, remote: string) { const workerUrl = new URL(url); const remoteUrl = new URL(remote); @@ -252,6 +265,9 @@ async function updatePreviewToken(url: URL, env: Env, ctx: ExecutionContext) { throw new TokenUpdateFailed(); } + assertValidURL(prewarmUrl); + assertValidURL(remote); + ctx.waitUntil( fetch(prewarmUrl, { method: "POST", @@ -296,6 +312,7 @@ async function handleTokenExchange(url: URL) { if (!exchangeUrl) { throw new NoExchangeUrl(); } + assertValidURL(exchangeUrl); const exchangeRes = await fetch(exchangeUrl); if (exchangeRes.status !== 200) { const exchange = new URL(exchangeUrl); diff --git a/packages/edge-preview-authenticated-proxy/tests/index.test.ts b/packages/edge-preview-authenticated-proxy/tests/index.test.ts index 2fe601286ceb..615d68dc3d7b 100644 --- a/packages/edge-preview-authenticated-proxy/tests/index.test.ts +++ b/packages/edge-preview-authenticated-proxy/tests/index.test.ts @@ -107,6 +107,16 @@ compatibility_date = "2023-01-01" ` ); }); + it("should reject invalid exchange_url", async () => { + const resp = await worker.fetch( + `https://preview.devprod.cloudflare.dev/exchange?exchange_url=not_an_exchange_url`, + { method: "POST" } + ); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"Error\\",\\"message\\":\\"Invalid URL\\"}"' + ); + }); it("should allow tokens > 4096 bytes", async () => { // 4096 is the size limit for cookies const token = randomBytes(4096).toString("hex"); @@ -179,6 +189,28 @@ compatibility_date = "2023-01-01" .split(";")[0] .split("=")[1]; }); + it("should reject invalid prewarm url", async () => { + const resp = await worker.fetch( + `https://random-data.preview.devprod.cloudflare.dev/.update-preview-token?token=TEST_TOKEN&prewarm=not_a_prewarm_url&remote=${encodeURIComponent( + `http://127.0.0.1:${remote.port}` + )}&suffix=${encodeURIComponent("/hello?world")}` + ); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"Error\\",\\"message\\":\\"Invalid URL\\"}"' + ); + }); + it("should reject invalid remote url", async () => { + const resp = await worker.fetch( + 
`https://random-data.preview.devprod.cloudflare.dev/.update-preview-token?token=TEST_TOKEN&prewarm=${encodeURIComponent( + `http://127.0.0.1:${remote.port}/prewarm` + )}&remote=not_a_remote_url&suffix=${encodeURIComponent("/hello?world")}` + ); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"Error\\",\\"message\\":\\"Invalid URL\\"}"' + ); + }); it("should convert cookie to header", async () => { const resp = await worker.fetch( diff --git a/packages/format-errors/package.json b/packages/format-errors/package.json index ec2d6da71247..6dd910dc39f9 100644 --- a/packages/format-errors/package.json +++ b/packages/format-errors/package.json @@ -4,7 +4,7 @@ "private": true, "scripts": { "check:lint": "eslint .", - "publish": "wrangler deploy", + "deploy": "wrangler deploy", "build": "wrangler build", "start": "wrangler dev" }, diff --git a/packages/format-errors/src/index.ts b/packages/format-errors/src/index.ts index d620a7d37ad3..e5dfd10a2e2a 100644 --- a/packages/format-errors/src/index.ts +++ b/packages/format-errors/src/index.ts @@ -155,8 +155,16 @@ export default { }, }, }); + + // Validate payload outside of Sentry/metrics reporting + let payload: Payload; + try { + payload = PayloadSchema.parse(await request.json()); + } catch { + return new Response("Invalid payload", { status: 400 }); + } + try { - const payload = PayloadSchema.parse(await request.json()); return handlePrettyErrorRequest(payload); } catch (e) { sentry.captureException(e); diff --git a/packages/miniflare/CHANGELOG.md b/packages/miniflare/CHANGELOG.md index a43f6d47bac2..0f637769d2a9 100644 --- a/packages/miniflare/CHANGELOG.md +++ b/packages/miniflare/CHANGELOG.md @@ -1,5 +1,53 @@ # miniflare +## 3.20240129.1 + +### Minor Changes + +- [#4905](https://github.com/cloudflare/workers-sdk/pull/4905) [`148feff6`](https://github.com/cloudflare/workers-sdk/commit/148feff60c9bf3886c0e0fd1ea98049955c27659) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - feature: add a `getCf` method to Miniflare instances + + add a new `getCf` method attached to instances of `Miniflare`. This `getCf` method returns + the `cf` object that the Miniflare instance provides to the actual workers, and it + depends on the core option of the same name + + Example: + + ```ts + import { Miniflare } from "miniflare"; + + const mf = new Miniflare({ ... }); + + const cf = await mf.getCf(); + + console.log(`country = ${cf.country} ; colo = ${cf.colo}`); // logs 'country = GB ; colo = LHR' + ``` + +## 3.20240129.0 + +### Minor Changes + +- [#4873](https://github.com/cloudflare/workers-sdk/pull/4873) [`1e424ff2`](https://github.com/cloudflare/workers-sdk/commit/1e424ff280610657e997df8290d0b39b0393c845) Thanks [@dom96](https://github.com/dom96)! - feature: implemented basic Python support + + Here is an example showing how to construct a Miniflare instance with a Python module: + + ```js + const mf = new Miniflare({ + modules: [ + { + type: "PythonModule", + path: "index", + contents: + "from js import Response;\ndef fetch(request):\n return Response.new('hello')", + }, + ], + compatibilityFlags: ["experimental"], + }); + ``` + +### Patch Changes + +- [#4874](https://github.com/cloudflare/workers-sdk/pull/4874) [`749fa3c0`](https://github.com/cloudflare/workers-sdk/commit/749fa3c05e6b9fcaa59a72f60f7936b7beaed5ad) Thanks [@mrbbot](https://github.com/mrbbot)!
- chore: bump `workerd` to [`1.20240129.0`](https://github.com/cloudflare/workerd/releases/tag/v1.20240129.0) + ## 3.20231218.4 ### Patch Changes diff --git a/packages/miniflare/README.md b/packages/miniflare/README.md index b23ee9ea6b7f..163fd1cae65b 100644 --- a/packages/miniflare/README.md +++ b/packages/miniflare/README.md @@ -406,7 +406,7 @@ parameter in module format Workers. await this.STORE.fetch(this.baseURL + key, { method: "DELETE" }); } } - + // env has the type { STORE: Fetcher, NAMESPACE?: string } export default function (env) { return new MiniKV(env); @@ -770,3 +770,7 @@ defined at the top-level. `Miniflare#dispatchFetch()` cannot be called. Additionally, calling this function will invalidate any values returned by the `Miniflare#get*()` methods, preventing them from being used. + +- `getCf(): Promise>` + + Returns the same object returned from incoming `Request`'s `cf` property. This object depends on the `cf` property from `SharedOptions`. diff --git a/packages/miniflare/package.json b/packages/miniflare/package.json index 6f2b0d7452d8..48180d7e3b79 100644 --- a/packages/miniflare/package.json +++ b/packages/miniflare/package.json @@ -1,6 +1,6 @@ { "name": "miniflare", - "version": "3.20231218.4", + "version": "3.20240129.1", "description": "Fun, full-featured, fully-local simulator for Cloudflare Workers", "keywords": [ "cloudflare", @@ -49,7 +49,7 @@ "glob-to-regexp": "^0.4.1", "stoppable": "^1.1.0", "undici": "^5.28.2", - "workerd": "1.20231218.0", + "workerd": "1.20240129.0", "ws": "^8.11.0", "youch": "^3.2.2", "zod": "^3.20.6" diff --git a/packages/miniflare/src/index.ts b/packages/miniflare/src/index.ts index 58a61a0dc374..6215a283c648 100644 --- a/packages/miniflare/src/index.ts +++ b/packages/miniflare/src/index.ts @@ -615,6 +615,8 @@ export class Miniflare { #runtimeDispatcher?: Dispatcher; #proxyClient?: ProxyClient; + #cfObject?: Record = {}; + // Path to temporary directory for use as scratch space/"in-memory" Durable // Object storage. Note this may not exist, it's up to the consumers to // create this if needed. Deleted on `dispose()`. @@ -962,6 +964,7 @@ export class Miniflare { const sharedOpts = this.#sharedOpts; sharedOpts.core.cf = await setupCf(this.#log, sharedOpts.core.cf); + this.#cfObject = sharedOpts.core.cf; const durableObjectClassNames = getDurableObjectClassNames(allWorkerOpts); const wrappedBindingNames = getWrappedBindingNames( @@ -1169,7 +1172,13 @@ export class Miniflare { ); } - return { services: servicesArray, sockets, extensions }; + const autogates = [ + // Enables Python support in workerd. + // TODO(later): remove this once this gate is removed from workerd. 
+ "workerd-autogate-builtin-wasm-modules" ]; + + return { services: servicesArray, sockets, extensions, autogates }; } async #assembleAndUpdateConfig() { @@ -1318,6 +1327,13 @@ return this.#waitForReady(); } + async getCf(): Promise<Record<string, any>> { + this.#checkDisposed(); + await this.ready; + + return JSON.parse(JSON.stringify(this.#cfObject)); + } + async getInspectorURL(): Promise<URL> { this.#checkDisposed(); await this.ready; @@ -1525,6 +1541,7 @@ // Get a `Fetcher` to that worker (NOTE: the `ProxyServer` Durable Object // shares its `env` with Miniflare's entry worker, so has access to routes) const bindingName = CoreBindings.SERVICE_USER_ROUTE_PREFIX + workerName; + const fetcher = proxyClient.env[bindingName]; if (fetcher === undefined) { // `#findAndAssertWorkerIndex()` will throw if a "worker" doesn't exist diff --git a/packages/miniflare/src/plugins/core/modules.ts b/packages/miniflare/src/plugins/core/modules.ts index 9cb043b17fa1..9bb782aad392 100644 --- a/packages/miniflare/src/plugins/core/modules.ts +++ b/packages/miniflare/src/plugins/core/modules.ts @@ -46,6 +46,8 @@ export const ModuleRuleTypeSchema = z.enum([ "Text", "Data", "CompiledWasm", + "PythonModule", + "PythonRequirement" ]); export type ModuleRuleType = z.infer<typeof ModuleRuleTypeSchema>; @@ -347,6 +349,12 @@ ${dim(modulesConfig)}`; case "CompiledWasm": this.modules.push({ name, wasm: data }); break; + case "PythonModule": + this.modules.push({ name, pythonModule: data.toString("utf-8") }); + break; + case "PythonRequirement": + this.modules.push({ name, pythonRequirement: data.toString("utf-8") }); + break; default: // `type` should've been validated against `ModuleRuleTypeSchema` const exhaustive: never = rule.type; @@ -405,6 +413,10 @@ export function convertModuleDefinition( return { name, data: contentsToArray(contents) }; case "CompiledWasm": return { name, wasm: contentsToArray(contents) }; + case "PythonModule": + return { name, pythonModule: contentsToString(contents) }; + case "PythonRequirement": + return { name, pythonRequirement: contentsToString(contents) }; default: // `type` should've been validated against `ModuleRuleTypeSchema` const exhaustive: never = def.type; @@ -425,6 +437,8 @@ function convertWorkerModule(mod: Worker_Module): ModuleDefinition { else if ("text" in m) return { path, type: "Text" }; else if ("data" in m) return { path, type: "Data" }; else if ("wasm" in m) return { path, type: "CompiledWasm" }; + else if ("pythonModule" in m) return { path, type: "PythonModule" }; + else if ("pythonRequirement" in m) return { path, type: "PythonRequirement" }; // This function is only used for building error messages including // generated modules, and these are the types we generate.
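For reference, the new `PythonModule`/`PythonRequirement` module types and the `getCf()` method added above can be exercised together through the Miniflare API. The snippet below is a minimal illustrative sketch rather than part of this patch: the constructor options mirror the 3.20240129.0 changelog example, `dispatchFetch()`/`dispose()` are pre-existing Miniflare helpers, and `PythonRequirement` entries take the same `{ type, path, contents }` shape handled by `convertModuleDefinition` above (their exact contents are not shown in this diff, so none are included here).

```ts
import { Miniflare } from "miniflare";

// A single Python module, matching the changelog example above.
// The "experimental" compatibility flag is required for Python support.
const mf = new Miniflare({
  compatibilityFlags: ["experimental"],
  modules: [
    {
      type: "PythonModule",
      path: "index",
      contents:
        "from js import Response;\ndef fetch(request):\n return Response.new('hello')",
    },
  ],
});

// Dispatch a request to the Python Worker.
const res = await mf.dispatchFetch("http://localhost/");
console.log(await res.text()); // "hello"

// getCf() awaits runtime readiness internally and returns a copy of the `cf`
// object that Workers receive, driven by the `cf` core option.
const cf = await mf.getCf();
console.log(cf.country, cf.colo);

await mf.dispose();
```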
diff --git a/packages/miniflare/src/plugins/core/proxy/client.ts b/packages/miniflare/src/plugins/core/proxy/client.ts index 670d0a5b14b3..138285a2a5af 100644 --- a/packages/miniflare/src/plugins/core/proxy/client.ts +++ b/packages/miniflare/src/plugins/core/proxy/client.ts @@ -332,6 +332,7 @@ class ProxyStubHandler implements ProxyHandler { { value: stringifiedResult, unbufferedStream }, this.revivers ); + // We get an empty stack trace if we thread the caller through here, // specifying `this.#parseAsyncResponse` is good enough though, we just // get an extra `processTicksAndRejections` entry diff --git a/packages/miniflare/src/runtime/config/workerd.capnp b/packages/miniflare/src/runtime/config/workerd.capnp index 0bd98b55515e..863de6667b62 100644 --- a/packages/miniflare/src/runtime/config/workerd.capnp +++ b/packages/miniflare/src/runtime/config/workerd.capnp @@ -34,7 +34,12 @@ # afraid to fall back to code for anything the config cannot express, as Workers are very fast # to execute! -$import "/capnp/c++.capnp".namespace("workerd::server::config"); +# Any capnp files imported here must be: +# 1. embedded into workerd-meta.capnp +# 2. added to `tryImportBulitin` in workerd.c++ (grep for '"/workerd/workerd.capnp"'). +using Cxx = import "/capnp/c++.capnp"; +$Cxx.namespace("workerd::server::config"); +$Cxx.allowCancellation; struct Config { # Top-level configuration for a workerd instance. @@ -74,6 +79,11 @@ struct Config { extensions @3 :List(Extension); # Extensions provide capabilities to all workers. Extensions are usually prepared separately # and are late-linked with the app using this config field. + + autogates @4 :List(Text); + # A list of gates which are enabled. + # These are used to gate features/changes in workerd and in our internal repo. See the equivalent + # config definition in our internal repo for more details. } # ======================================================================================== @@ -252,6 +262,15 @@ struct Worker { # (a) allows for importing Node.js-compat built-ins without the node: specifier-prefix # (b) exposes the subset of common Node.js globals such as process, Buffer, etc that # we implement in the workerd runtime. + + pythonModule @8 :Text; + # A Python module. All bundles containing this value type are converted into a JS/WASM Worker + # Bundle prior to execution. + + pythonRequirement @9 :Text; + # A Python package that is required by this bundle. The package must be supported by + # Pyodide (https://pyodide.org/en/stable/usage/packages-in-pyodide.html). All packages listed + # will be installed prior to the execution of the worker. } } @@ -485,7 +504,7 @@ struct Worker { } } - globalOutbound @6 :ServiceDesignator = (name = "internet"); + globalOutbound @6 :ServiceDesignator = "internet"; # Where should the global "fetch" go to? The default is the service called "internet", which # should usually be configured to talk to the public internet. @@ -579,6 +598,9 @@ struct Worker { # TODO(someday): Support distributing objects across a cluster. At present, objects are always # local to one instance of the runtime. 
+ + moduleFallback @13 :Text; + } struct ExternalServer { diff --git a/packages/miniflare/src/runtime/config/workerd.capnp.d.ts b/packages/miniflare/src/runtime/config/workerd.capnp.d.ts index 60d71619aeea..d16661156a5c 100644 --- a/packages/miniflare/src/runtime/config/workerd.capnp.d.ts +++ b/packages/miniflare/src/runtime/config/workerd.capnp.d.ts @@ -2,989 +2,983 @@ * This file has been automatically generated by the [capnpc-ts utility](https://github.com/jdiaz5513/capnp-ts). */ import * as capnp from "capnp-ts"; -import { Struct as __S } from "capnp-ts"; +import { Struct as __S } from 'capnp-ts'; export declare const _capnpFileId = "e6afd26682091c01"; export declare class Config extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - static _Services: capnp.ListCtor; - static _Sockets: capnp.ListCtor; - static _Extensions: capnp.ListCtor; - adoptServices(value: capnp.Orphan>): void; - disownServices(): capnp.Orphan>; - getServices(): capnp.List; - hasServices(): boolean; - initServices(length: number): capnp.List; - setServices(value: capnp.List): void; - adoptSockets(value: capnp.Orphan>): void; - disownSockets(): capnp.Orphan>; - getSockets(): capnp.List; - hasSockets(): boolean; - initSockets(length: number): capnp.List; - setSockets(value: capnp.List): void; - adoptV8Flags(value: capnp.Orphan>): void; - disownV8Flags(): capnp.Orphan>; - getV8Flags(): capnp.List; - hasV8Flags(): boolean; - initV8Flags(length: number): capnp.List; - setV8Flags(value: capnp.List): void; - adoptExtensions(value: capnp.Orphan>): void; - disownExtensions(): capnp.Orphan>; - getExtensions(): capnp.List; - hasExtensions(): boolean; - initExtensions(length: number): capnp.List; - setExtensions(value: capnp.List): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + static _Services: capnp.ListCtor; + static _Sockets: capnp.ListCtor; + static _Extensions: capnp.ListCtor; + adoptServices(value: capnp.Orphan>): void; + disownServices(): capnp.Orphan>; + getServices(): capnp.List; + hasServices(): boolean; + initServices(length: number): capnp.List; + setServices(value: capnp.List): void; + adoptSockets(value: capnp.Orphan>): void; + disownSockets(): capnp.Orphan>; + getSockets(): capnp.List; + hasSockets(): boolean; + initSockets(length: number): capnp.List; + setSockets(value: capnp.List): void; + adoptV8Flags(value: capnp.Orphan>): void; + disownV8Flags(): capnp.Orphan>; + getV8Flags(): capnp.List; + hasV8Flags(): boolean; + initV8Flags(length: number): capnp.List; + setV8Flags(value: capnp.List): void; + adoptExtensions(value: capnp.Orphan>): void; + disownExtensions(): capnp.Orphan>; + getExtensions(): capnp.List; + hasExtensions(): boolean; + initExtensions(length: number): capnp.List; + setExtensions(value: capnp.List): void; + adoptAutogates(value: capnp.Orphan>): void; + disownAutogates(): capnp.Orphan>; + getAutogates(): capnp.List; + hasAutogates(): boolean; + initAutogates(length: number): capnp.List; + setAutogates(value: capnp.List): void; + toString(): string; } export declare class Socket_Https extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - adoptOptions(value: capnp.Orphan): void; - disownOptions(): capnp.Orphan; - getOptions(): HttpOptions; - hasOptions(): boolean; - initOptions(): HttpOptions; - setOptions(value: HttpOptions): void; - adoptTlsOptions(value: capnp.Orphan): void; - disownTlsOptions(): 
capnp.Orphan; - getTlsOptions(): TlsOptions; - hasTlsOptions(): boolean; - initTlsOptions(): TlsOptions; - setTlsOptions(value: TlsOptions): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + adoptOptions(value: capnp.Orphan): void; + disownOptions(): capnp.Orphan; + getOptions(): HttpOptions; + hasOptions(): boolean; + initOptions(): HttpOptions; + setOptions(value: HttpOptions): void; + adoptTlsOptions(value: capnp.Orphan): void; + disownTlsOptions(): capnp.Orphan; + getTlsOptions(): TlsOptions; + hasTlsOptions(): boolean; + initTlsOptions(): TlsOptions; + setTlsOptions(value: TlsOptions): void; + toString(): string; } export declare enum Socket_Which { - HTTP = 0, - HTTPS = 1, + HTTP = 0, + HTTPS = 1 } export declare class Socket extends __S { - static readonly HTTP = Socket_Which.HTTP; - static readonly HTTPS = Socket_Which.HTTPS; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - getAddress(): string; - setAddress(value: string): void; - adoptHttp(value: capnp.Orphan): void; - disownHttp(): capnp.Orphan; - getHttp(): HttpOptions; - hasHttp(): boolean; - initHttp(): HttpOptions; - isHttp(): boolean; - setHttp(value: HttpOptions): void; - getHttps(): Socket_Https; - initHttps(): Socket_Https; - isHttps(): boolean; - setHttps(): void; - adoptService(value: capnp.Orphan): void; - disownService(): capnp.Orphan; - getService(): ServiceDesignator; - hasService(): boolean; - initService(): ServiceDesignator; - setService(value: ServiceDesignator): void; - toString(): string; - which(): Socket_Which; + static readonly HTTP = Socket_Which.HTTP; + static readonly HTTPS = Socket_Which.HTTPS; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + getAddress(): string; + setAddress(value: string): void; + adoptHttp(value: capnp.Orphan): void; + disownHttp(): capnp.Orphan; + getHttp(): HttpOptions; + hasHttp(): boolean; + initHttp(): HttpOptions; + isHttp(): boolean; + setHttp(value: HttpOptions): void; + getHttps(): Socket_Https; + initHttps(): Socket_Https; + isHttps(): boolean; + setHttps(): void; + adoptService(value: capnp.Orphan): void; + disownService(): capnp.Orphan; + getService(): ServiceDesignator; + hasService(): boolean; + initService(): ServiceDesignator; + setService(value: ServiceDesignator): void; + toString(): string; + which(): Socket_Which; } export declare enum Service_Which { - UNSPECIFIED = 0, - WORKER = 1, - NETWORK = 2, - EXTERNAL = 3, - DISK = 4, + UNSPECIFIED = 0, + WORKER = 1, + NETWORK = 2, + EXTERNAL = 3, + DISK = 4 } export declare class Service extends __S { - static readonly UNSPECIFIED = Service_Which.UNSPECIFIED; - static readonly WORKER = Service_Which.WORKER; - static readonly NETWORK = Service_Which.NETWORK; - static readonly EXTERNAL = Service_Which.EXTERNAL; - static readonly DISK = Service_Which.DISK; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - isUnspecified(): boolean; - setUnspecified(): void; - adoptWorker(value: capnp.Orphan): void; - disownWorker(): capnp.Orphan; - getWorker(): Worker; - hasWorker(): boolean; - initWorker(): Worker; - isWorker(): boolean; - setWorker(value: Worker): void; - adoptNetwork(value: capnp.Orphan): void; - disownNetwork(): capnp.Orphan; - getNetwork(): 
Network; - hasNetwork(): boolean; - initNetwork(): Network; - isNetwork(): boolean; - setNetwork(value: Network): void; - adoptExternal(value: capnp.Orphan): void; - disownExternal(): capnp.Orphan; - getExternal(): ExternalServer; - hasExternal(): boolean; - initExternal(): ExternalServer; - isExternal(): boolean; - setExternal(value: ExternalServer): void; - adoptDisk(value: capnp.Orphan): void; - disownDisk(): capnp.Orphan; - getDisk(): DiskDirectory; - hasDisk(): boolean; - initDisk(): DiskDirectory; - isDisk(): boolean; - setDisk(value: DiskDirectory): void; - toString(): string; - which(): Service_Which; + static readonly UNSPECIFIED = Service_Which.UNSPECIFIED; + static readonly WORKER = Service_Which.WORKER; + static readonly NETWORK = Service_Which.NETWORK; + static readonly EXTERNAL = Service_Which.EXTERNAL; + static readonly DISK = Service_Which.DISK; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + isUnspecified(): boolean; + setUnspecified(): void; + adoptWorker(value: capnp.Orphan): void; + disownWorker(): capnp.Orphan; + getWorker(): Worker; + hasWorker(): boolean; + initWorker(): Worker; + isWorker(): boolean; + setWorker(value: Worker): void; + adoptNetwork(value: capnp.Orphan): void; + disownNetwork(): capnp.Orphan; + getNetwork(): Network; + hasNetwork(): boolean; + initNetwork(): Network; + isNetwork(): boolean; + setNetwork(value: Network): void; + adoptExternal(value: capnp.Orphan): void; + disownExternal(): capnp.Orphan; + getExternal(): ExternalServer; + hasExternal(): boolean; + initExternal(): ExternalServer; + isExternal(): boolean; + setExternal(value: ExternalServer): void; + adoptDisk(value: capnp.Orphan): void; + disownDisk(): capnp.Orphan; + getDisk(): DiskDirectory; + hasDisk(): boolean; + initDisk(): DiskDirectory; + isDisk(): boolean; + setDisk(value: DiskDirectory): void; + toString(): string; + which(): Service_Which; } export declare class ServiceDesignator extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - getEntrypoint(): string; - setEntrypoint(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + getEntrypoint(): string; + setEntrypoint(value: string): void; + toString(): string; } export declare enum Worker_Module_Which { - ES_MODULE = 0, - COMMON_JS_MODULE = 1, - TEXT = 2, - DATA = 3, - WASM = 4, - JSON = 5, - NODE_JS_COMPAT_MODULE = 6, + ES_MODULE = 0, + COMMON_JS_MODULE = 1, + TEXT = 2, + DATA = 3, + WASM = 4, + JSON = 5, + NODE_JS_COMPAT_MODULE = 6, + PYTHON_MODULE = 7, + PYTHON_REQUIREMENT = 8 } export declare class Worker_Module extends __S { - static readonly ES_MODULE = Worker_Module_Which.ES_MODULE; - static readonly COMMON_JS_MODULE = Worker_Module_Which.COMMON_JS_MODULE; - static readonly TEXT = Worker_Module_Which.TEXT; - static readonly DATA = Worker_Module_Which.DATA; - static readonly WASM = Worker_Module_Which.WASM; - static readonly JSON = Worker_Module_Which.JSON; - static readonly NODE_JS_COMPAT_MODULE = - Worker_Module_Which.NODE_JS_COMPAT_MODULE; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - getEsModule(): string; - isEsModule(): boolean; - setEsModule(value: string): void; 
- getCommonJsModule(): string; - isCommonJsModule(): boolean; - setCommonJsModule(value: string): void; - getText(): string; - isText(): boolean; - setText(value: string): void; - adoptData(value: capnp.Orphan): void; - disownData(): capnp.Orphan; - getData(): capnp.Data; - hasData(): boolean; - initData(length: number): capnp.Data; - isData(): boolean; - setData(value: capnp.Data): void; - adoptWasm(value: capnp.Orphan): void; - disownWasm(): capnp.Orphan; - getWasm(): capnp.Data; - hasWasm(): boolean; - initWasm(length: number): capnp.Data; - isWasm(): boolean; - setWasm(value: capnp.Data): void; - getJson(): string; - isJson(): boolean; - setJson(value: string): void; - getNodeJsCompatModule(): string; - isNodeJsCompatModule(): boolean; - setNodeJsCompatModule(value: string): void; - toString(): string; - which(): Worker_Module_Which; + static readonly ES_MODULE = Worker_Module_Which.ES_MODULE; + static readonly COMMON_JS_MODULE = Worker_Module_Which.COMMON_JS_MODULE; + static readonly TEXT = Worker_Module_Which.TEXT; + static readonly DATA = Worker_Module_Which.DATA; + static readonly WASM = Worker_Module_Which.WASM; + static readonly JSON = Worker_Module_Which.JSON; + static readonly NODE_JS_COMPAT_MODULE = Worker_Module_Which.NODE_JS_COMPAT_MODULE; + static readonly PYTHON_MODULE = Worker_Module_Which.PYTHON_MODULE; + static readonly PYTHON_REQUIREMENT = Worker_Module_Which.PYTHON_REQUIREMENT; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + getEsModule(): string; + isEsModule(): boolean; + setEsModule(value: string): void; + getCommonJsModule(): string; + isCommonJsModule(): boolean; + setCommonJsModule(value: string): void; + getText(): string; + isText(): boolean; + setText(value: string): void; + adoptData(value: capnp.Orphan): void; + disownData(): capnp.Orphan; + getData(): capnp.Data; + hasData(): boolean; + initData(length: number): capnp.Data; + isData(): boolean; + setData(value: capnp.Data): void; + adoptWasm(value: capnp.Orphan): void; + disownWasm(): capnp.Orphan; + getWasm(): capnp.Data; + hasWasm(): boolean; + initWasm(length: number): capnp.Data; + isWasm(): boolean; + setWasm(value: capnp.Data): void; + getJson(): string; + isJson(): boolean; + setJson(value: string): void; + getNodeJsCompatModule(): string; + isNodeJsCompatModule(): boolean; + setNodeJsCompatModule(value: string): void; + getPythonModule(): string; + isPythonModule(): boolean; + setPythonModule(value: string): void; + getPythonRequirement(): string; + isPythonRequirement(): boolean; + setPythonRequirement(value: string): void; + toString(): string; + which(): Worker_Module_Which; } export declare enum Worker_Binding_Type_Which { - UNSPECIFIED = 0, - TEXT = 1, - DATA = 2, - JSON = 3, - WASM = 4, - CRYPTO_KEY = 5, - SERVICE = 6, - DURABLE_OBJECT_NAMESPACE = 7, - KV_NAMESPACE = 8, - R2BUCKET = 9, - R2ADMIN = 10, - QUEUE = 11, - ANALYTICS_ENGINE = 12, - HYPERDRIVE = 13, + UNSPECIFIED = 0, + TEXT = 1, + DATA = 2, + JSON = 3, + WASM = 4, + CRYPTO_KEY = 5, + SERVICE = 6, + DURABLE_OBJECT_NAMESPACE = 7, + KV_NAMESPACE = 8, + R2BUCKET = 9, + R2ADMIN = 10, + QUEUE = 11, + ANALYTICS_ENGINE = 12, + HYPERDRIVE = 13 } export declare class Worker_Binding_Type extends __S { - static readonly UNSPECIFIED = Worker_Binding_Type_Which.UNSPECIFIED; - static readonly TEXT = Worker_Binding_Type_Which.TEXT; - static readonly DATA = Worker_Binding_Type_Which.DATA; - static readonly JSON = Worker_Binding_Type_Which.JSON; - 
static readonly WASM = Worker_Binding_Type_Which.WASM; - static readonly CRYPTO_KEY = Worker_Binding_Type_Which.CRYPTO_KEY; - static readonly SERVICE = Worker_Binding_Type_Which.SERVICE; - static readonly DURABLE_OBJECT_NAMESPACE = - Worker_Binding_Type_Which.DURABLE_OBJECT_NAMESPACE; - static readonly KV_NAMESPACE = Worker_Binding_Type_Which.KV_NAMESPACE; - static readonly R2BUCKET = Worker_Binding_Type_Which.R2BUCKET; - static readonly R2ADMIN = Worker_Binding_Type_Which.R2ADMIN; - static readonly QUEUE = Worker_Binding_Type_Which.QUEUE; - static readonly ANALYTICS_ENGINE = Worker_Binding_Type_Which.ANALYTICS_ENGINE; - static readonly HYPERDRIVE = Worker_Binding_Type_Which.HYPERDRIVE; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - isUnspecified(): boolean; - setUnspecified(): void; - isText(): boolean; - setText(): void; - isData(): boolean; - setData(): void; - isJson(): boolean; - setJson(): void; - isWasm(): boolean; - setWasm(): void; - adoptCryptoKey( - value: capnp.Orphan> - ): void; - disownCryptoKey(): capnp.Orphan>; - getCryptoKey(): capnp.List; - hasCryptoKey(): boolean; - initCryptoKey(length: number): capnp.List; - isCryptoKey(): boolean; - setCryptoKey(value: capnp.List): void; - isService(): boolean; - setService(): void; - isDurableObjectNamespace(): boolean; - setDurableObjectNamespace(): void; - isKvNamespace(): boolean; - setKvNamespace(): void; - isR2Bucket(): boolean; - setR2Bucket(): void; - isR2Admin(): boolean; - setR2Admin(): void; - isQueue(): boolean; - setQueue(): void; - isAnalyticsEngine(): boolean; - setAnalyticsEngine(): void; - isHyperdrive(): boolean; - setHyperdrive(): void; - toString(): string; - which(): Worker_Binding_Type_Which; + static readonly UNSPECIFIED = Worker_Binding_Type_Which.UNSPECIFIED; + static readonly TEXT = Worker_Binding_Type_Which.TEXT; + static readonly DATA = Worker_Binding_Type_Which.DATA; + static readonly JSON = Worker_Binding_Type_Which.JSON; + static readonly WASM = Worker_Binding_Type_Which.WASM; + static readonly CRYPTO_KEY = Worker_Binding_Type_Which.CRYPTO_KEY; + static readonly SERVICE = Worker_Binding_Type_Which.SERVICE; + static readonly DURABLE_OBJECT_NAMESPACE = Worker_Binding_Type_Which.DURABLE_OBJECT_NAMESPACE; + static readonly KV_NAMESPACE = Worker_Binding_Type_Which.KV_NAMESPACE; + static readonly R2BUCKET = Worker_Binding_Type_Which.R2BUCKET; + static readonly R2ADMIN = Worker_Binding_Type_Which.R2ADMIN; + static readonly QUEUE = Worker_Binding_Type_Which.QUEUE; + static readonly ANALYTICS_ENGINE = Worker_Binding_Type_Which.ANALYTICS_ENGINE; + static readonly HYPERDRIVE = Worker_Binding_Type_Which.HYPERDRIVE; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + isUnspecified(): boolean; + setUnspecified(): void; + isText(): boolean; + setText(): void; + isData(): boolean; + setData(): void; + isJson(): boolean; + setJson(): void; + isWasm(): boolean; + setWasm(): void; + adoptCryptoKey(value: capnp.Orphan>): void; + disownCryptoKey(): capnp.Orphan>; + getCryptoKey(): capnp.List; + hasCryptoKey(): boolean; + initCryptoKey(length: number): capnp.List; + isCryptoKey(): boolean; + setCryptoKey(value: capnp.List): void; + isService(): boolean; + setService(): void; + isDurableObjectNamespace(): boolean; + setDurableObjectNamespace(): void; + isKvNamespace(): boolean; + setKvNamespace(): void; + isR2Bucket(): boolean; + setR2Bucket(): void; + isR2Admin(): boolean; + setR2Admin(): void; + isQueue(): boolean; + setQueue(): 
void; + isAnalyticsEngine(): boolean; + setAnalyticsEngine(): void; + isHyperdrive(): boolean; + setHyperdrive(): void; + toString(): string; + which(): Worker_Binding_Type_Which; } export declare class Worker_Binding_DurableObjectNamespaceDesignator extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getClassName(): string; - setClassName(value: string): void; - getServiceName(): string; - setServiceName(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getClassName(): string; + setClassName(value: string): void; + getServiceName(): string; + setServiceName(value: string): void; + toString(): string; } export declare enum Worker_Binding_CryptoKey_Usage { - ENCRYPT = 0, - DECRYPT = 1, - SIGN = 2, - VERIFY = 3, - DERIVE_KEY = 4, - DERIVE_BITS = 5, - WRAP_KEY = 6, - UNWRAP_KEY = 7, + ENCRYPT = 0, + DECRYPT = 1, + SIGN = 2, + VERIFY = 3, + DERIVE_KEY = 4, + DERIVE_BITS = 5, + WRAP_KEY = 6, + UNWRAP_KEY = 7 } export declare enum Worker_Binding_CryptoKey_Algorithm_Which { - NAME = 0, - JSON = 1, + NAME = 0, + JSON = 1 } export declare class Worker_Binding_CryptoKey_Algorithm extends __S { - static readonly NAME = Worker_Binding_CryptoKey_Algorithm_Which.NAME; - static readonly JSON = Worker_Binding_CryptoKey_Algorithm_Which.JSON; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - isName(): boolean; - setName(value: string): void; - getJson(): string; - isJson(): boolean; - setJson(value: string): void; - toString(): string; - which(): Worker_Binding_CryptoKey_Algorithm_Which; + static readonly NAME = Worker_Binding_CryptoKey_Algorithm_Which.NAME; + static readonly JSON = Worker_Binding_CryptoKey_Algorithm_Which.JSON; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + isName(): boolean; + setName(value: string): void; + getJson(): string; + isJson(): boolean; + setJson(value: string): void; + toString(): string; + which(): Worker_Binding_CryptoKey_Algorithm_Which; } export declare enum Worker_Binding_CryptoKey_Which { - RAW = 0, - HEX = 1, - BASE64 = 2, - PKCS8 = 3, - SPKI = 4, - JWK = 5, + RAW = 0, + HEX = 1, + BASE64 = 2, + PKCS8 = 3, + SPKI = 4, + JWK = 5 } export declare class Worker_Binding_CryptoKey extends __S { - static readonly RAW = Worker_Binding_CryptoKey_Which.RAW; - static readonly HEX = Worker_Binding_CryptoKey_Which.HEX; - static readonly BASE64 = Worker_Binding_CryptoKey_Which.BASE64; - static readonly PKCS8 = Worker_Binding_CryptoKey_Which.PKCS8; - static readonly SPKI = Worker_Binding_CryptoKey_Which.SPKI; - static readonly JWK = Worker_Binding_CryptoKey_Which.JWK; - static readonly Usage: typeof Worker_Binding_CryptoKey_Usage; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultExtractable: DataView; - }; - adoptRaw(value: capnp.Orphan): void; - disownRaw(): capnp.Orphan; - getRaw(): capnp.Data; - hasRaw(): boolean; - initRaw(length: number): capnp.Data; - isRaw(): boolean; - setRaw(value: capnp.Data): void; - getHex(): string; - isHex(): boolean; - setHex(value: string): void; - getBase64(): string; - isBase64(): boolean; - setBase64(value: string): void; - getPkcs8(): string; - isPkcs8(): boolean; - setPkcs8(value: string): void; - getSpki(): string; - isSpki(): boolean; - setSpki(value: string): void; - getJwk(): string; - isJwk(): boolean; - 
setJwk(value: string): void; - getAlgorithm(): Worker_Binding_CryptoKey_Algorithm; - initAlgorithm(): Worker_Binding_CryptoKey_Algorithm; - getExtractable(): boolean; - setExtractable(value: boolean): void; - adoptUsages( - value: capnp.Orphan> - ): void; - disownUsages(): capnp.Orphan>; - getUsages(): capnp.List; - hasUsages(): boolean; - initUsages(length: number): capnp.List; - setUsages(value: capnp.List): void; - toString(): string; - which(): Worker_Binding_CryptoKey_Which; + static readonly RAW = Worker_Binding_CryptoKey_Which.RAW; + static readonly HEX = Worker_Binding_CryptoKey_Which.HEX; + static readonly BASE64 = Worker_Binding_CryptoKey_Which.BASE64; + static readonly PKCS8 = Worker_Binding_CryptoKey_Which.PKCS8; + static readonly SPKI = Worker_Binding_CryptoKey_Which.SPKI; + static readonly JWK = Worker_Binding_CryptoKey_Which.JWK; + static readonly Usage: typeof Worker_Binding_CryptoKey_Usage; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultExtractable: DataView; + }; + adoptRaw(value: capnp.Orphan): void; + disownRaw(): capnp.Orphan; + getRaw(): capnp.Data; + hasRaw(): boolean; + initRaw(length: number): capnp.Data; + isRaw(): boolean; + setRaw(value: capnp.Data): void; + getHex(): string; + isHex(): boolean; + setHex(value: string): void; + getBase64(): string; + isBase64(): boolean; + setBase64(value: string): void; + getPkcs8(): string; + isPkcs8(): boolean; + setPkcs8(value: string): void; + getSpki(): string; + isSpki(): boolean; + setSpki(value: string): void; + getJwk(): string; + isJwk(): boolean; + setJwk(value: string): void; + getAlgorithm(): Worker_Binding_CryptoKey_Algorithm; + initAlgorithm(): Worker_Binding_CryptoKey_Algorithm; + getExtractable(): boolean; + setExtractable(value: boolean): void; + adoptUsages(value: capnp.Orphan>): void; + disownUsages(): capnp.Orphan>; + getUsages(): capnp.List; + hasUsages(): boolean; + initUsages(length: number): capnp.List; + setUsages(value: capnp.List): void; + toString(): string; + which(): Worker_Binding_CryptoKey_Which; } export declare class Worker_Binding_WrappedBinding extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultEntrypoint: string; - }; - static _InnerBindings: capnp.ListCtor; - getModuleName(): string; - setModuleName(value: string): void; - getEntrypoint(): string; - setEntrypoint(value: string): void; - adoptInnerBindings(value: capnp.Orphan>): void; - disownInnerBindings(): capnp.Orphan>; - getInnerBindings(): capnp.List; - hasInnerBindings(): boolean; - initInnerBindings(length: number): capnp.List; - setInnerBindings(value: capnp.List): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultEntrypoint: string; + }; + static _InnerBindings: capnp.ListCtor; + getModuleName(): string; + setModuleName(value: string): void; + getEntrypoint(): string; + setEntrypoint(value: string): void; + adoptInnerBindings(value: capnp.Orphan>): void; + disownInnerBindings(): capnp.Orphan>; + getInnerBindings(): capnp.List; + hasInnerBindings(): boolean; + initInnerBindings(length: number): capnp.List; + setInnerBindings(value: capnp.List): void; + toString(): string; } export declare class Worker_Binding_Parameter extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - adoptType(value: capnp.Orphan): void; - disownType(): capnp.Orphan; - getType(): Worker_Binding_Type; - hasType(): 
boolean; - initType(): Worker_Binding_Type; - setType(value: Worker_Binding_Type): void; - getOptional(): boolean; - setOptional(value: boolean): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + adoptType(value: capnp.Orphan): void; + disownType(): capnp.Orphan; + getType(): Worker_Binding_Type; + hasType(): boolean; + initType(): Worker_Binding_Type; + setType(value: Worker_Binding_Type): void; + getOptional(): boolean; + setOptional(value: boolean): void; + toString(): string; } export declare class Worker_Binding_Hyperdrive extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - adoptDesignator(value: capnp.Orphan): void; - disownDesignator(): capnp.Orphan; - getDesignator(): ServiceDesignator; - hasDesignator(): boolean; - initDesignator(): ServiceDesignator; - setDesignator(value: ServiceDesignator): void; - getDatabase(): string; - setDatabase(value: string): void; - getUser(): string; - setUser(value: string): void; - getPassword(): string; - setPassword(value: string): void; - getScheme(): string; - setScheme(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + adoptDesignator(value: capnp.Orphan): void; + disownDesignator(): capnp.Orphan; + getDesignator(): ServiceDesignator; + hasDesignator(): boolean; + initDesignator(): ServiceDesignator; + setDesignator(value: ServiceDesignator): void; + getDatabase(): string; + setDatabase(value: string): void; + getUser(): string; + setUser(value: string): void; + getPassword(): string; + setPassword(value: string): void; + getScheme(): string; + setScheme(value: string): void; + toString(): string; } export declare enum Worker_Binding_Which { - UNSPECIFIED = 0, - PARAMETER = 1, - TEXT = 2, - DATA = 3, - JSON = 4, - WASM_MODULE = 5, - CRYPTO_KEY = 6, - SERVICE = 7, - DURABLE_OBJECT_NAMESPACE = 8, - KV_NAMESPACE = 9, - R2BUCKET = 10, - R2ADMIN = 11, - WRAPPED = 12, - QUEUE = 13, - FROM_ENVIRONMENT = 14, - ANALYTICS_ENGINE = 15, - HYPERDRIVE = 16, - UNSAFE_EVAL = 17, + UNSPECIFIED = 0, + PARAMETER = 1, + TEXT = 2, + DATA = 3, + JSON = 4, + WASM_MODULE = 5, + CRYPTO_KEY = 6, + SERVICE = 7, + DURABLE_OBJECT_NAMESPACE = 8, + KV_NAMESPACE = 9, + R2BUCKET = 10, + R2ADMIN = 11, + WRAPPED = 12, + QUEUE = 13, + FROM_ENVIRONMENT = 14, + ANALYTICS_ENGINE = 15, + HYPERDRIVE = 16, + UNSAFE_EVAL = 17 } export declare class Worker_Binding extends __S { - static readonly UNSPECIFIED = Worker_Binding_Which.UNSPECIFIED; - static readonly PARAMETER = Worker_Binding_Which.PARAMETER; - static readonly TEXT = Worker_Binding_Which.TEXT; - static readonly DATA = Worker_Binding_Which.DATA; - static readonly JSON = Worker_Binding_Which.JSON; - static readonly WASM_MODULE = Worker_Binding_Which.WASM_MODULE; - static readonly CRYPTO_KEY = Worker_Binding_Which.CRYPTO_KEY; - static readonly SERVICE = Worker_Binding_Which.SERVICE; - static readonly DURABLE_OBJECT_NAMESPACE = - Worker_Binding_Which.DURABLE_OBJECT_NAMESPACE; - static readonly KV_NAMESPACE = Worker_Binding_Which.KV_NAMESPACE; - static readonly R2BUCKET = Worker_Binding_Which.R2BUCKET; - static readonly R2ADMIN = Worker_Binding_Which.R2ADMIN; - static readonly WRAPPED = Worker_Binding_Which.WRAPPED; - static readonly QUEUE = Worker_Binding_Which.QUEUE; - static readonly FROM_ENVIRONMENT = Worker_Binding_Which.FROM_ENVIRONMENT; - static readonly ANALYTICS_ENGINE = 
Worker_Binding_Which.ANALYTICS_ENGINE; - static readonly HYPERDRIVE = Worker_Binding_Which.HYPERDRIVE; - static readonly UNSAFE_EVAL = Worker_Binding_Which.UNSAFE_EVAL; - static readonly Type: typeof Worker_Binding_Type; - static readonly DurableObjectNamespaceDesignator: typeof Worker_Binding_DurableObjectNamespaceDesignator; - static readonly CryptoKey: typeof Worker_Binding_CryptoKey; - static readonly WrappedBinding: typeof Worker_Binding_WrappedBinding; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - isUnspecified(): boolean; - setUnspecified(): void; - getParameter(): Worker_Binding_Parameter; - initParameter(): Worker_Binding_Parameter; - isParameter(): boolean; - setParameter(): void; - getText(): string; - isText(): boolean; - setText(value: string): void; - adoptData(value: capnp.Orphan): void; - disownData(): capnp.Orphan; - getData(): capnp.Data; - hasData(): boolean; - initData(length: number): capnp.Data; - isData(): boolean; - setData(value: capnp.Data): void; - getJson(): string; - isJson(): boolean; - setJson(value: string): void; - adoptWasmModule(value: capnp.Orphan): void; - disownWasmModule(): capnp.Orphan; - getWasmModule(): capnp.Data; - hasWasmModule(): boolean; - initWasmModule(length: number): capnp.Data; - isWasmModule(): boolean; - setWasmModule(value: capnp.Data): void; - adoptCryptoKey(value: capnp.Orphan): void; - disownCryptoKey(): capnp.Orphan; - getCryptoKey(): Worker_Binding_CryptoKey; - hasCryptoKey(): boolean; - initCryptoKey(): Worker_Binding_CryptoKey; - isCryptoKey(): boolean; - setCryptoKey(value: Worker_Binding_CryptoKey): void; - adoptService(value: capnp.Orphan): void; - disownService(): capnp.Orphan; - getService(): ServiceDesignator; - hasService(): boolean; - initService(): ServiceDesignator; - isService(): boolean; - setService(value: ServiceDesignator): void; - adoptDurableObjectNamespace( - value: capnp.Orphan - ): void; - disownDurableObjectNamespace(): capnp.Orphan; - getDurableObjectNamespace(): Worker_Binding_DurableObjectNamespaceDesignator; - hasDurableObjectNamespace(): boolean; - initDurableObjectNamespace(): Worker_Binding_DurableObjectNamespaceDesignator; - isDurableObjectNamespace(): boolean; - setDurableObjectNamespace( - value: Worker_Binding_DurableObjectNamespaceDesignator - ): void; - adoptKvNamespace(value: capnp.Orphan): void; - disownKvNamespace(): capnp.Orphan; - getKvNamespace(): ServiceDesignator; - hasKvNamespace(): boolean; - initKvNamespace(): ServiceDesignator; - isKvNamespace(): boolean; - setKvNamespace(value: ServiceDesignator): void; - adoptR2Bucket(value: capnp.Orphan): void; - disownR2Bucket(): capnp.Orphan; - getR2Bucket(): ServiceDesignator; - hasR2Bucket(): boolean; - initR2Bucket(): ServiceDesignator; - isR2Bucket(): boolean; - setR2Bucket(value: ServiceDesignator): void; - adoptR2Admin(value: capnp.Orphan): void; - disownR2Admin(): capnp.Orphan; - getR2Admin(): ServiceDesignator; - hasR2Admin(): boolean; - initR2Admin(): ServiceDesignator; - isR2Admin(): boolean; - setR2Admin(value: ServiceDesignator): void; - adoptWrapped(value: capnp.Orphan): void; - disownWrapped(): capnp.Orphan; - getWrapped(): Worker_Binding_WrappedBinding; - hasWrapped(): boolean; - initWrapped(): Worker_Binding_WrappedBinding; - isWrapped(): boolean; - setWrapped(value: Worker_Binding_WrappedBinding): void; - adoptQueue(value: capnp.Orphan): void; - disownQueue(): capnp.Orphan; - getQueue(): ServiceDesignator; - hasQueue(): 
boolean; - initQueue(): ServiceDesignator; - isQueue(): boolean; - setQueue(value: ServiceDesignator): void; - getFromEnvironment(): string; - isFromEnvironment(): boolean; - setFromEnvironment(value: string): void; - adoptAnalyticsEngine(value: capnp.Orphan): void; - disownAnalyticsEngine(): capnp.Orphan; - getAnalyticsEngine(): ServiceDesignator; - hasAnalyticsEngine(): boolean; - initAnalyticsEngine(): ServiceDesignator; - isAnalyticsEngine(): boolean; - setAnalyticsEngine(value: ServiceDesignator): void; - getHyperdrive(): Worker_Binding_Hyperdrive; - initHyperdrive(): Worker_Binding_Hyperdrive; - isHyperdrive(): boolean; - setHyperdrive(): void; - isUnsafeEval(): boolean; - setUnsafeEval(): void; - toString(): string; - which(): Worker_Binding_Which; + static readonly UNSPECIFIED = Worker_Binding_Which.UNSPECIFIED; + static readonly PARAMETER = Worker_Binding_Which.PARAMETER; + static readonly TEXT = Worker_Binding_Which.TEXT; + static readonly DATA = Worker_Binding_Which.DATA; + static readonly JSON = Worker_Binding_Which.JSON; + static readonly WASM_MODULE = Worker_Binding_Which.WASM_MODULE; + static readonly CRYPTO_KEY = Worker_Binding_Which.CRYPTO_KEY; + static readonly SERVICE = Worker_Binding_Which.SERVICE; + static readonly DURABLE_OBJECT_NAMESPACE = Worker_Binding_Which.DURABLE_OBJECT_NAMESPACE; + static readonly KV_NAMESPACE = Worker_Binding_Which.KV_NAMESPACE; + static readonly R2BUCKET = Worker_Binding_Which.R2BUCKET; + static readonly R2ADMIN = Worker_Binding_Which.R2ADMIN; + static readonly WRAPPED = Worker_Binding_Which.WRAPPED; + static readonly QUEUE = Worker_Binding_Which.QUEUE; + static readonly FROM_ENVIRONMENT = Worker_Binding_Which.FROM_ENVIRONMENT; + static readonly ANALYTICS_ENGINE = Worker_Binding_Which.ANALYTICS_ENGINE; + static readonly HYPERDRIVE = Worker_Binding_Which.HYPERDRIVE; + static readonly UNSAFE_EVAL = Worker_Binding_Which.UNSAFE_EVAL; + static readonly Type: typeof Worker_Binding_Type; + static readonly DurableObjectNamespaceDesignator: typeof Worker_Binding_DurableObjectNamespaceDesignator; + static readonly CryptoKey: typeof Worker_Binding_CryptoKey; + static readonly WrappedBinding: typeof Worker_Binding_WrappedBinding; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + isUnspecified(): boolean; + setUnspecified(): void; + getParameter(): Worker_Binding_Parameter; + initParameter(): Worker_Binding_Parameter; + isParameter(): boolean; + setParameter(): void; + getText(): string; + isText(): boolean; + setText(value: string): void; + adoptData(value: capnp.Orphan): void; + disownData(): capnp.Orphan; + getData(): capnp.Data; + hasData(): boolean; + initData(length: number): capnp.Data; + isData(): boolean; + setData(value: capnp.Data): void; + getJson(): string; + isJson(): boolean; + setJson(value: string): void; + adoptWasmModule(value: capnp.Orphan): void; + disownWasmModule(): capnp.Orphan; + getWasmModule(): capnp.Data; + hasWasmModule(): boolean; + initWasmModule(length: number): capnp.Data; + isWasmModule(): boolean; + setWasmModule(value: capnp.Data): void; + adoptCryptoKey(value: capnp.Orphan): void; + disownCryptoKey(): capnp.Orphan; + getCryptoKey(): Worker_Binding_CryptoKey; + hasCryptoKey(): boolean; + initCryptoKey(): Worker_Binding_CryptoKey; + isCryptoKey(): boolean; + setCryptoKey(value: Worker_Binding_CryptoKey): void; + adoptService(value: capnp.Orphan): void; + disownService(): capnp.Orphan; + getService(): ServiceDesignator; + 
hasService(): boolean; + initService(): ServiceDesignator; + isService(): boolean; + setService(value: ServiceDesignator): void; + adoptDurableObjectNamespace(value: capnp.Orphan): void; + disownDurableObjectNamespace(): capnp.Orphan; + getDurableObjectNamespace(): Worker_Binding_DurableObjectNamespaceDesignator; + hasDurableObjectNamespace(): boolean; + initDurableObjectNamespace(): Worker_Binding_DurableObjectNamespaceDesignator; + isDurableObjectNamespace(): boolean; + setDurableObjectNamespace(value: Worker_Binding_DurableObjectNamespaceDesignator): void; + adoptKvNamespace(value: capnp.Orphan): void; + disownKvNamespace(): capnp.Orphan; + getKvNamespace(): ServiceDesignator; + hasKvNamespace(): boolean; + initKvNamespace(): ServiceDesignator; + isKvNamespace(): boolean; + setKvNamespace(value: ServiceDesignator): void; + adoptR2Bucket(value: capnp.Orphan): void; + disownR2Bucket(): capnp.Orphan; + getR2Bucket(): ServiceDesignator; + hasR2Bucket(): boolean; + initR2Bucket(): ServiceDesignator; + isR2Bucket(): boolean; + setR2Bucket(value: ServiceDesignator): void; + adoptR2Admin(value: capnp.Orphan): void; + disownR2Admin(): capnp.Orphan; + getR2Admin(): ServiceDesignator; + hasR2Admin(): boolean; + initR2Admin(): ServiceDesignator; + isR2Admin(): boolean; + setR2Admin(value: ServiceDesignator): void; + adoptWrapped(value: capnp.Orphan): void; + disownWrapped(): capnp.Orphan; + getWrapped(): Worker_Binding_WrappedBinding; + hasWrapped(): boolean; + initWrapped(): Worker_Binding_WrappedBinding; + isWrapped(): boolean; + setWrapped(value: Worker_Binding_WrappedBinding): void; + adoptQueue(value: capnp.Orphan): void; + disownQueue(): capnp.Orphan; + getQueue(): ServiceDesignator; + hasQueue(): boolean; + initQueue(): ServiceDesignator; + isQueue(): boolean; + setQueue(value: ServiceDesignator): void; + getFromEnvironment(): string; + isFromEnvironment(): boolean; + setFromEnvironment(value: string): void; + adoptAnalyticsEngine(value: capnp.Orphan): void; + disownAnalyticsEngine(): capnp.Orphan; + getAnalyticsEngine(): ServiceDesignator; + hasAnalyticsEngine(): boolean; + initAnalyticsEngine(): ServiceDesignator; + isAnalyticsEngine(): boolean; + setAnalyticsEngine(value: ServiceDesignator): void; + getHyperdrive(): Worker_Binding_Hyperdrive; + initHyperdrive(): Worker_Binding_Hyperdrive; + isHyperdrive(): boolean; + setHyperdrive(): void; + isUnsafeEval(): boolean; + setUnsafeEval(): void; + toString(): string; + which(): Worker_Binding_Which; } export declare enum Worker_DurableObjectNamespace_Which { - UNIQUE_KEY = 0, - EPHEMERAL_LOCAL = 1, + UNIQUE_KEY = 0, + EPHEMERAL_LOCAL = 1 } export declare class Worker_DurableObjectNamespace extends __S { - static readonly UNIQUE_KEY = Worker_DurableObjectNamespace_Which.UNIQUE_KEY; - static readonly EPHEMERAL_LOCAL = - Worker_DurableObjectNamespace_Which.EPHEMERAL_LOCAL; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getClassName(): string; - setClassName(value: string): void; - getUniqueKey(): string; - isUniqueKey(): boolean; - setUniqueKey(value: string): void; - isEphemeralLocal(): boolean; - setEphemeralLocal(): void; - getPreventEviction(): boolean; - setPreventEviction(value: boolean): void; - toString(): string; - which(): Worker_DurableObjectNamespace_Which; + static readonly UNIQUE_KEY = Worker_DurableObjectNamespace_Which.UNIQUE_KEY; + static readonly EPHEMERAL_LOCAL = Worker_DurableObjectNamespace_Which.EPHEMERAL_LOCAL; + static readonly _capnp: { + displayName: string; + id: 
string; + size: capnp.ObjectSize; + }; + getClassName(): string; + setClassName(value: string): void; + getUniqueKey(): string; + isUniqueKey(): boolean; + setUniqueKey(value: string): void; + isEphemeralLocal(): boolean; + setEphemeralLocal(): void; + getPreventEviction(): boolean; + setPreventEviction(value: boolean): void; + toString(): string; + which(): Worker_DurableObjectNamespace_Which; } export declare enum Worker_DurableObjectStorage_Which { - NONE = 0, - IN_MEMORY = 1, - LOCAL_DISK = 2, + NONE = 0, + IN_MEMORY = 1, + LOCAL_DISK = 2 } export declare class Worker_DurableObjectStorage extends __S { - static readonly NONE = Worker_DurableObjectStorage_Which.NONE; - static readonly IN_MEMORY = Worker_DurableObjectStorage_Which.IN_MEMORY; - static readonly LOCAL_DISK = Worker_DurableObjectStorage_Which.LOCAL_DISK; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - isNone(): boolean; - setNone(): void; - isInMemory(): boolean; - setInMemory(): void; - getLocalDisk(): string; - isLocalDisk(): boolean; - setLocalDisk(value: string): void; - toString(): string; - which(): Worker_DurableObjectStorage_Which; + static readonly NONE = Worker_DurableObjectStorage_Which.NONE; + static readonly IN_MEMORY = Worker_DurableObjectStorage_Which.IN_MEMORY; + static readonly LOCAL_DISK = Worker_DurableObjectStorage_Which.LOCAL_DISK; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + isNone(): boolean; + setNone(): void; + isInMemory(): boolean; + setInMemory(): void; + getLocalDisk(): string; + isLocalDisk(): boolean; + setLocalDisk(value: string): void; + toString(): string; + which(): Worker_DurableObjectStorage_Which; } export declare enum Worker_Which { - MODULES = 0, - SERVICE_WORKER_SCRIPT = 1, - INHERIT = 2, + MODULES = 0, + SERVICE_WORKER_SCRIPT = 1, + INHERIT = 2 } export declare class Worker extends __S { - static readonly MODULES = Worker_Which.MODULES; - static readonly SERVICE_WORKER_SCRIPT = Worker_Which.SERVICE_WORKER_SCRIPT; - static readonly INHERIT = Worker_Which.INHERIT; - static readonly Module: typeof Worker_Module; - static readonly Binding: typeof Worker_Binding; - static readonly DurableObjectNamespace: typeof Worker_DurableObjectNamespace; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultGlobalOutbound: capnp.Pointer; - }; - static _Modules: capnp.ListCtor; - static _Bindings: capnp.ListCtor; - static _DurableObjectNamespaces: capnp.ListCtor; - adoptModules(value: capnp.Orphan>): void; - disownModules(): capnp.Orphan>; - getModules(): capnp.List; - hasModules(): boolean; - initModules(length: number): capnp.List; - isModules(): boolean; - setModules(value: capnp.List): void; - getServiceWorkerScript(): string; - isServiceWorkerScript(): boolean; - setServiceWorkerScript(value: string): void; - getInherit(): string; - isInherit(): boolean; - setInherit(value: string): void; - getCompatibilityDate(): string; - setCompatibilityDate(value: string): void; - adoptCompatibilityFlags(value: capnp.Orphan>): void; - disownCompatibilityFlags(): capnp.Orphan>; - getCompatibilityFlags(): capnp.List; - hasCompatibilityFlags(): boolean; - initCompatibilityFlags(length: number): capnp.List; - setCompatibilityFlags(value: capnp.List): void; - adoptBindings(value: capnp.Orphan>): void; - disownBindings(): capnp.Orphan>; - getBindings(): capnp.List; - hasBindings(): boolean; - initBindings(length: number): capnp.List; - setBindings(value: capnp.List): 
void; - adoptGlobalOutbound(value: capnp.Orphan): void; - disownGlobalOutbound(): capnp.Orphan; - getGlobalOutbound(): ServiceDesignator; - hasGlobalOutbound(): boolean; - initGlobalOutbound(): ServiceDesignator; - setGlobalOutbound(value: ServiceDesignator): void; - adoptCacheApiOutbound(value: capnp.Orphan): void; - disownCacheApiOutbound(): capnp.Orphan; - getCacheApiOutbound(): ServiceDesignator; - hasCacheApiOutbound(): boolean; - initCacheApiOutbound(): ServiceDesignator; - setCacheApiOutbound(value: ServiceDesignator): void; - adoptDurableObjectNamespaces( - value: capnp.Orphan> - ): void; - disownDurableObjectNamespaces(): capnp.Orphan< - capnp.List - >; - getDurableObjectNamespaces(): capnp.List; - hasDurableObjectNamespaces(): boolean; - initDurableObjectNamespaces( - length: number - ): capnp.List; - setDurableObjectNamespaces( - value: capnp.List - ): void; - getDurableObjectUniqueKeyModifier(): string; - setDurableObjectUniqueKeyModifier(value: string): void; - getDurableObjectStorage(): Worker_DurableObjectStorage; - initDurableObjectStorage(): Worker_DurableObjectStorage; - toString(): string; - which(): Worker_Which; + static readonly MODULES = Worker_Which.MODULES; + static readonly SERVICE_WORKER_SCRIPT = Worker_Which.SERVICE_WORKER_SCRIPT; + static readonly INHERIT = Worker_Which.INHERIT; + static readonly Module: typeof Worker_Module; + static readonly Binding: typeof Worker_Binding; + static readonly DurableObjectNamespace: typeof Worker_DurableObjectNamespace; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultGlobalOutbound: capnp.Pointer; + }; + static _Modules: capnp.ListCtor; + static _Bindings: capnp.ListCtor; + static _DurableObjectNamespaces: capnp.ListCtor; + adoptModules(value: capnp.Orphan>): void; + disownModules(): capnp.Orphan>; + getModules(): capnp.List; + hasModules(): boolean; + initModules(length: number): capnp.List; + isModules(): boolean; + setModules(value: capnp.List): void; + getServiceWorkerScript(): string; + isServiceWorkerScript(): boolean; + setServiceWorkerScript(value: string): void; + getInherit(): string; + isInherit(): boolean; + setInherit(value: string): void; + getCompatibilityDate(): string; + setCompatibilityDate(value: string): void; + adoptCompatibilityFlags(value: capnp.Orphan>): void; + disownCompatibilityFlags(): capnp.Orphan>; + getCompatibilityFlags(): capnp.List; + hasCompatibilityFlags(): boolean; + initCompatibilityFlags(length: number): capnp.List; + setCompatibilityFlags(value: capnp.List): void; + adoptBindings(value: capnp.Orphan>): void; + disownBindings(): capnp.Orphan>; + getBindings(): capnp.List; + hasBindings(): boolean; + initBindings(length: number): capnp.List; + setBindings(value: capnp.List): void; + adoptGlobalOutbound(value: capnp.Orphan): void; + disownGlobalOutbound(): capnp.Orphan; + getGlobalOutbound(): ServiceDesignator; + hasGlobalOutbound(): boolean; + initGlobalOutbound(): ServiceDesignator; + setGlobalOutbound(value: ServiceDesignator): void; + adoptCacheApiOutbound(value: capnp.Orphan): void; + disownCacheApiOutbound(): capnp.Orphan; + getCacheApiOutbound(): ServiceDesignator; + hasCacheApiOutbound(): boolean; + initCacheApiOutbound(): ServiceDesignator; + setCacheApiOutbound(value: ServiceDesignator): void; + adoptDurableObjectNamespaces(value: capnp.Orphan>): void; + disownDurableObjectNamespaces(): capnp.Orphan>; + getDurableObjectNamespaces(): capnp.List; + hasDurableObjectNamespaces(): boolean; + initDurableObjectNamespaces(length: number): 
capnp.List; + setDurableObjectNamespaces(value: capnp.List): void; + getDurableObjectUniqueKeyModifier(): string; + setDurableObjectUniqueKeyModifier(value: string): void; + getDurableObjectStorage(): Worker_DurableObjectStorage; + initDurableObjectStorage(): Worker_DurableObjectStorage; + getModuleFallback(): string; + setModuleFallback(value: string): void; + toString(): string; + which(): Worker_Which; } export declare class ExternalServer_Https extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - adoptOptions(value: capnp.Orphan): void; - disownOptions(): capnp.Orphan; - getOptions(): HttpOptions; - hasOptions(): boolean; - initOptions(): HttpOptions; - setOptions(value: HttpOptions): void; - adoptTlsOptions(value: capnp.Orphan): void; - disownTlsOptions(): capnp.Orphan; - getTlsOptions(): TlsOptions; - hasTlsOptions(): boolean; - initTlsOptions(): TlsOptions; - setTlsOptions(value: TlsOptions): void; - getCertificateHost(): string; - setCertificateHost(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + adoptOptions(value: capnp.Orphan): void; + disownOptions(): capnp.Orphan; + getOptions(): HttpOptions; + hasOptions(): boolean; + initOptions(): HttpOptions; + setOptions(value: HttpOptions): void; + adoptTlsOptions(value: capnp.Orphan): void; + disownTlsOptions(): capnp.Orphan; + getTlsOptions(): TlsOptions; + hasTlsOptions(): boolean; + initTlsOptions(): TlsOptions; + setTlsOptions(value: TlsOptions): void; + getCertificateHost(): string; + setCertificateHost(value: string): void; + toString(): string; } export declare class ExternalServer_Tcp extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - adoptTlsOptions(value: capnp.Orphan): void; - disownTlsOptions(): capnp.Orphan; - getTlsOptions(): TlsOptions; - hasTlsOptions(): boolean; - initTlsOptions(): TlsOptions; - setTlsOptions(value: TlsOptions): void; - getCertificateHost(): string; - setCertificateHost(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + adoptTlsOptions(value: capnp.Orphan): void; + disownTlsOptions(): capnp.Orphan; + getTlsOptions(): TlsOptions; + hasTlsOptions(): boolean; + initTlsOptions(): TlsOptions; + setTlsOptions(value: TlsOptions): void; + getCertificateHost(): string; + setCertificateHost(value: string): void; + toString(): string; } export declare enum ExternalServer_Which { - HTTP = 0, - HTTPS = 1, - TCP = 2, + HTTP = 0, + HTTPS = 1, + TCP = 2 } export declare class ExternalServer extends __S { - static readonly HTTP = ExternalServer_Which.HTTP; - static readonly HTTPS = ExternalServer_Which.HTTPS; - static readonly TCP = ExternalServer_Which.TCP; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getAddress(): string; - setAddress(value: string): void; - adoptHttp(value: capnp.Orphan): void; - disownHttp(): capnp.Orphan; - getHttp(): HttpOptions; - hasHttp(): boolean; - initHttp(): HttpOptions; - isHttp(): boolean; - setHttp(value: HttpOptions): void; - getHttps(): ExternalServer_Https; - initHttps(): ExternalServer_Https; - isHttps(): boolean; - setHttps(): void; - getTcp(): ExternalServer_Tcp; - initTcp(): ExternalServer_Tcp; - isTcp(): boolean; - setTcp(): void; - toString(): string; - which(): ExternalServer_Which; + static readonly HTTP = 
ExternalServer_Which.HTTP; + static readonly HTTPS = ExternalServer_Which.HTTPS; + static readonly TCP = ExternalServer_Which.TCP; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getAddress(): string; + setAddress(value: string): void; + adoptHttp(value: capnp.Orphan): void; + disownHttp(): capnp.Orphan; + getHttp(): HttpOptions; + hasHttp(): boolean; + initHttp(): HttpOptions; + isHttp(): boolean; + setHttp(value: HttpOptions): void; + getHttps(): ExternalServer_Https; + initHttps(): ExternalServer_Https; + isHttps(): boolean; + setHttps(): void; + getTcp(): ExternalServer_Tcp; + initTcp(): ExternalServer_Tcp; + isTcp(): boolean; + setTcp(): void; + toString(): string; + which(): ExternalServer_Which; } export declare class Network extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultAllow: capnp.Pointer; - }; - adoptAllow(value: capnp.Orphan>): void; - disownAllow(): capnp.Orphan>; - getAllow(): capnp.List; - hasAllow(): boolean; - initAllow(length: number): capnp.List; - setAllow(value: capnp.List): void; - adoptDeny(value: capnp.Orphan>): void; - disownDeny(): capnp.Orphan>; - getDeny(): capnp.List; - hasDeny(): boolean; - initDeny(length: number): capnp.List; - setDeny(value: capnp.List): void; - adoptTlsOptions(value: capnp.Orphan): void; - disownTlsOptions(): capnp.Orphan; - getTlsOptions(): TlsOptions; - hasTlsOptions(): boolean; - initTlsOptions(): TlsOptions; - setTlsOptions(value: TlsOptions): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultAllow: capnp.Pointer; + }; + adoptAllow(value: capnp.Orphan>): void; + disownAllow(): capnp.Orphan>; + getAllow(): capnp.List; + hasAllow(): boolean; + initAllow(length: number): capnp.List; + setAllow(value: capnp.List): void; + adoptDeny(value: capnp.Orphan>): void; + disownDeny(): capnp.Orphan>; + getDeny(): capnp.List; + hasDeny(): boolean; + initDeny(length: number): capnp.List; + setDeny(value: capnp.List): void; + adoptTlsOptions(value: capnp.Orphan): void; + disownTlsOptions(): capnp.Orphan; + getTlsOptions(): TlsOptions; + hasTlsOptions(): boolean; + initTlsOptions(): TlsOptions; + setTlsOptions(value: TlsOptions): void; + toString(): string; } export declare class DiskDirectory extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultWritable: DataView; - defaultAllowDotfiles: DataView; - }; - getPath(): string; - setPath(value: string): void; - getWritable(): boolean; - setWritable(value: boolean): void; - getAllowDotfiles(): boolean; - setAllowDotfiles(value: boolean): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultWritable: DataView; + defaultAllowDotfiles: DataView; + }; + getPath(): string; + setPath(value: string): void; + getWritable(): boolean; + setWritable(value: boolean): void; + getAllowDotfiles(): boolean; + setAllowDotfiles(value: boolean): void; + toString(): string; } export declare enum HttpOptions_Style { - HOST = 0, - PROXY = 1, + HOST = 0, + PROXY = 1 } export declare class HttpOptions_Header extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getName(): string; - setName(value: string): void; - getValue(): string; - setValue(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: 
string; + size: capnp.ObjectSize; + }; + getName(): string; + setName(value: string): void; + getValue(): string; + setValue(value: string): void; + toString(): string; } export declare class HttpOptions extends __S { - static readonly Style: typeof HttpOptions_Style; - static readonly Header: typeof HttpOptions_Header; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultStyle: DataView; - }; - static _InjectRequestHeaders: capnp.ListCtor; - static _InjectResponseHeaders: capnp.ListCtor; - getStyle(): HttpOptions_Style; - setStyle(value: HttpOptions_Style): void; - getForwardedProtoHeader(): string; - setForwardedProtoHeader(value: string): void; - getCfBlobHeader(): string; - setCfBlobHeader(value: string): void; - adoptInjectRequestHeaders( - value: capnp.Orphan> - ): void; - disownInjectRequestHeaders(): capnp.Orphan>; - getInjectRequestHeaders(): capnp.List; - hasInjectRequestHeaders(): boolean; - initInjectRequestHeaders(length: number): capnp.List; - setInjectRequestHeaders(value: capnp.List): void; - adoptInjectResponseHeaders( - value: capnp.Orphan> - ): void; - disownInjectResponseHeaders(): capnp.Orphan>; - getInjectResponseHeaders(): capnp.List; - hasInjectResponseHeaders(): boolean; - initInjectResponseHeaders(length: number): capnp.List; - setInjectResponseHeaders(value: capnp.List): void; - toString(): string; + static readonly Style: typeof HttpOptions_Style; + static readonly Header: typeof HttpOptions_Header; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultStyle: DataView; + }; + static _InjectRequestHeaders: capnp.ListCtor; + static _InjectResponseHeaders: capnp.ListCtor; + getStyle(): HttpOptions_Style; + setStyle(value: HttpOptions_Style): void; + getForwardedProtoHeader(): string; + setForwardedProtoHeader(value: string): void; + getCfBlobHeader(): string; + setCfBlobHeader(value: string): void; + adoptInjectRequestHeaders(value: capnp.Orphan>): void; + disownInjectRequestHeaders(): capnp.Orphan>; + getInjectRequestHeaders(): capnp.List; + hasInjectRequestHeaders(): boolean; + initInjectRequestHeaders(length: number): capnp.List; + setInjectRequestHeaders(value: capnp.List): void; + adoptInjectResponseHeaders(value: capnp.Orphan>): void; + disownInjectResponseHeaders(): capnp.Orphan>; + getInjectResponseHeaders(): capnp.List; + hasInjectResponseHeaders(): boolean; + initInjectResponseHeaders(length: number): capnp.List; + setInjectResponseHeaders(value: capnp.List): void; + toString(): string; } export declare class TlsOptions_Keypair extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - getPrivateKey(): string; - setPrivateKey(value: string): void; - getCertificateChain(): string; - setCertificateChain(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + getPrivateKey(): string; + setPrivateKey(value: string): void; + getCertificateChain(): string; + setCertificateChain(value: string): void; + toString(): string; } export declare enum TlsOptions_Version { - GOOD_DEFAULT = 0, - SSL3 = 1, - TLS1DOT0 = 2, - TLS1DOT1 = 3, - TLS1DOT2 = 4, - TLS1DOT3 = 5, + GOOD_DEFAULT = 0, + SSL3 = 1, + TLS1DOT0 = 2, + TLS1DOT1 = 3, + TLS1DOT2 = 4, + TLS1DOT3 = 5 } export declare class TlsOptions extends __S { - static readonly Keypair: typeof TlsOptions_Keypair; - static readonly Version: typeof TlsOptions_Version; - static readonly _capnp: { - 
displayName: string; - id: string; - size: capnp.ObjectSize; - defaultRequireClientCerts: DataView; - defaultTrustBrowserCas: DataView; - defaultMinVersion: DataView; - }; - adoptKeypair(value: capnp.Orphan): void; - disownKeypair(): capnp.Orphan; - getKeypair(): TlsOptions_Keypair; - hasKeypair(): boolean; - initKeypair(): TlsOptions_Keypair; - setKeypair(value: TlsOptions_Keypair): void; - getRequireClientCerts(): boolean; - setRequireClientCerts(value: boolean): void; - getTrustBrowserCas(): boolean; - setTrustBrowserCas(value: boolean): void; - adoptTrustedCertificates(value: capnp.Orphan>): void; - disownTrustedCertificates(): capnp.Orphan>; - getTrustedCertificates(): capnp.List; - hasTrustedCertificates(): boolean; - initTrustedCertificates(length: number): capnp.List; - setTrustedCertificates(value: capnp.List): void; - getMinVersion(): TlsOptions_Version; - setMinVersion(value: TlsOptions_Version): void; - getCipherList(): string; - setCipherList(value: string): void; - toString(): string; + static readonly Keypair: typeof TlsOptions_Keypair; + static readonly Version: typeof TlsOptions_Version; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultRequireClientCerts: DataView; + defaultTrustBrowserCas: DataView; + defaultMinVersion: DataView; + }; + adoptKeypair(value: capnp.Orphan): void; + disownKeypair(): capnp.Orphan; + getKeypair(): TlsOptions_Keypair; + hasKeypair(): boolean; + initKeypair(): TlsOptions_Keypair; + setKeypair(value: TlsOptions_Keypair): void; + getRequireClientCerts(): boolean; + setRequireClientCerts(value: boolean): void; + getTrustBrowserCas(): boolean; + setTrustBrowserCas(value: boolean): void; + adoptTrustedCertificates(value: capnp.Orphan>): void; + disownTrustedCertificates(): capnp.Orphan>; + getTrustedCertificates(): capnp.List; + hasTrustedCertificates(): boolean; + initTrustedCertificates(length: number): capnp.List; + setTrustedCertificates(value: capnp.List): void; + getMinVersion(): TlsOptions_Version; + setMinVersion(value: TlsOptions_Version): void; + getCipherList(): string; + setCipherList(value: string): void; + toString(): string; } export declare class Extension_Module extends __S { - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - defaultInternal: DataView; - }; - getName(): string; - setName(value: string): void; - getInternal(): boolean; - setInternal(value: boolean): void; - getEsModule(): string; - setEsModule(value: string): void; - toString(): string; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + defaultInternal: DataView; + }; + getName(): string; + setName(value: string): void; + getInternal(): boolean; + setInternal(value: boolean): void; + getEsModule(): string; + setEsModule(value: string): void; + toString(): string; } export declare class Extension extends __S { - static readonly Module: typeof Extension_Module; - static readonly _capnp: { - displayName: string; - id: string; - size: capnp.ObjectSize; - }; - static _Modules: capnp.ListCtor; - adoptModules(value: capnp.Orphan>): void; - disownModules(): capnp.Orphan>; - getModules(): capnp.List; - hasModules(): boolean; - initModules(length: number): capnp.List; - setModules(value: capnp.List): void; - toString(): string; + static readonly Module: typeof Extension_Module; + static readonly _capnp: { + displayName: string; + id: string; + size: capnp.ObjectSize; + }; + static _Modules: capnp.ListCtor; + adoptModules(value: capnp.Orphan>): 
void; + disownModules(): capnp.Orphan>; + getModules(): capnp.List; + hasModules(): boolean; + initModules(length: number): capnp.List; + setModules(value: capnp.List): void; + toString(): string; } diff --git a/packages/miniflare/src/runtime/config/workerd.capnp.js b/packages/miniflare/src/runtime/config/workerd.capnp.js index 85497180f4c3..c1d8fa74308e 100644 --- a/packages/miniflare/src/runtime/config/workerd.capnp.js +++ b/packages/miniflare/src/runtime/config/workerd.capnp.js @@ -1,50 +1,7 @@ "use strict"; /* tslint:disable */ Object.defineProperty(exports, "__esModule", { value: true }); -exports.Extension = - exports.Extension_Module = - exports.TlsOptions = - exports.TlsOptions_Version = - exports.TlsOptions_Keypair = - exports.HttpOptions = - exports.HttpOptions_Header = - exports.HttpOptions_Style = - exports.DiskDirectory = - exports.Network = - exports.ExternalServer = - exports.ExternalServer_Which = - exports.ExternalServer_Tcp = - exports.ExternalServer_Https = - exports.Worker = - exports.Worker_Which = - exports.Worker_DurableObjectStorage = - exports.Worker_DurableObjectStorage_Which = - exports.Worker_DurableObjectNamespace = - exports.Worker_DurableObjectNamespace_Which = - exports.Worker_Binding = - exports.Worker_Binding_Which = - exports.Worker_Binding_Hyperdrive = - exports.Worker_Binding_Parameter = - exports.Worker_Binding_WrappedBinding = - exports.Worker_Binding_CryptoKey = - exports.Worker_Binding_CryptoKey_Which = - exports.Worker_Binding_CryptoKey_Algorithm = - exports.Worker_Binding_CryptoKey_Algorithm_Which = - exports.Worker_Binding_CryptoKey_Usage = - exports.Worker_Binding_DurableObjectNamespaceDesignator = - exports.Worker_Binding_Type = - exports.Worker_Binding_Type_Which = - exports.Worker_Module = - exports.Worker_Module_Which = - exports.ServiceDesignator = - exports.Service = - exports.Service_Which = - exports.Socket = - exports.Socket_Which = - exports.Socket_Https = - exports.Config = - exports._capnpFileId = - void 0; +exports.Extension = exports.Extension_Module = exports.TlsOptions = exports.TlsOptions_Version = exports.TlsOptions_Keypair = exports.HttpOptions = exports.HttpOptions_Header = exports.HttpOptions_Style = exports.DiskDirectory = exports.Network = exports.ExternalServer = exports.ExternalServer_Which = exports.ExternalServer_Tcp = exports.ExternalServer_Https = exports.Worker = exports.Worker_Which = exports.Worker_DurableObjectStorage = exports.Worker_DurableObjectStorage_Which = exports.Worker_DurableObjectNamespace = exports.Worker_DurableObjectNamespace_Which = exports.Worker_Binding = exports.Worker_Binding_Which = exports.Worker_Binding_Hyperdrive = exports.Worker_Binding_Parameter = exports.Worker_Binding_WrappedBinding = exports.Worker_Binding_CryptoKey = exports.Worker_Binding_CryptoKey_Which = exports.Worker_Binding_CryptoKey_Algorithm = exports.Worker_Binding_CryptoKey_Algorithm_Which = exports.Worker_Binding_CryptoKey_Usage = exports.Worker_Binding_DurableObjectNamespaceDesignator = exports.Worker_Binding_Type = exports.Worker_Binding_Type_Which = exports.Worker_Module = exports.Worker_Module_Which = exports.ServiceDesignator = exports.Service = exports.Service_Which = exports.Socket = exports.Socket_Which = exports.Socket_Https = exports.Config = exports._capnpFileId = void 0; /** * This file has been automatically generated by the [capnpc-ts utility](https://github.com/jdiaz5513/capnp-ts). 
*/ @@ -52,382 +9,200 @@ const capnp = require("capnp-ts"); const capnp_ts_1 = require("capnp-ts"); exports._capnpFileId = "e6afd26682091c01"; class Config extends capnp_ts_1.Struct { - adoptServices(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownServices() { - return capnp_ts_1.Struct.disown(this.getServices()); - } - getServices() { - return capnp_ts_1.Struct.getList(0, Config._Services, this); - } - hasServices() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initServices(length) { - return capnp_ts_1.Struct.initList(0, Config._Services, length, this); - } - setServices(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - adoptSockets(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownSockets() { - return capnp_ts_1.Struct.disown(this.getSockets()); - } - getSockets() { - return capnp_ts_1.Struct.getList(1, Config._Sockets, this); - } - hasSockets() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initSockets(length) { - return capnp_ts_1.Struct.initList(1, Config._Sockets, length, this); - } - setSockets(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptV8Flags(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownV8Flags() { - return capnp_ts_1.Struct.disown(this.getV8Flags()); - } - getV8Flags() { - return capnp_ts_1.Struct.getList(2, capnp.TextList, this); - } - hasV8Flags() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initV8Flags(length) { - return capnp_ts_1.Struct.initList(2, capnp.TextList, length, this); - } - setV8Flags(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - adoptExtensions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); - } - disownExtensions() { - return capnp_ts_1.Struct.disown(this.getExtensions()); - } - getExtensions() { - return capnp_ts_1.Struct.getList(3, Config._Extensions, this); - } - hasExtensions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); - } - initExtensions(length) { - return capnp_ts_1.Struct.initList(3, Config._Extensions, length, this); - } - setExtensions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); - } - toString() { - return "Config_" + super.toString(); - } + adoptServices(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); } + disownServices() { return capnp_ts_1.Struct.disown(this.getServices()); } + getServices() { return capnp_ts_1.Struct.getList(0, Config._Services, this); } + hasServices() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initServices(length) { return capnp_ts_1.Struct.initList(0, Config._Services, length, this); } + setServices(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); } + adoptSockets(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownSockets() { return capnp_ts_1.Struct.disown(this.getSockets()); } + getSockets() { return capnp_ts_1.Struct.getList(1, Config._Sockets, this); } + hasSockets() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initSockets(length) { return capnp_ts_1.Struct.initList(1, Config._Sockets, length, this); } + setSockets(value) { capnp_ts_1.Struct.copyFrom(value, 
capnp_ts_1.Struct.getPointer(1, this)); } + adoptV8Flags(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownV8Flags() { return capnp_ts_1.Struct.disown(this.getV8Flags()); } + getV8Flags() { return capnp_ts_1.Struct.getList(2, capnp.TextList, this); } + hasV8Flags() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initV8Flags(length) { return capnp_ts_1.Struct.initList(2, capnp.TextList, length, this); } + setV8Flags(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + adoptExtensions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); } + disownExtensions() { return capnp_ts_1.Struct.disown(this.getExtensions()); } + getExtensions() { return capnp_ts_1.Struct.getList(3, Config._Extensions, this); } + hasExtensions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); } + initExtensions(length) { return capnp_ts_1.Struct.initList(3, Config._Extensions, length, this); } + setExtensions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); } + adoptAutogates(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(4, this)); } + disownAutogates() { return capnp_ts_1.Struct.disown(this.getAutogates()); } + getAutogates() { return capnp_ts_1.Struct.getList(4, capnp.TextList, this); } + hasAutogates() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(4, this)); } + initAutogates(length) { return capnp_ts_1.Struct.initList(4, capnp.TextList, length, this); } + setAutogates(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(4, this)); } + toString() { return "Config_" + super.toString(); } } exports.Config = Config; -Config._capnp = { - displayName: "Config", - id: "8794486c76aaa7d6", - size: new capnp_ts_1.ObjectSize(0, 4), -}; +Config._capnp = { displayName: "Config", id: "8794486c76aaa7d6", size: new capnp_ts_1.ObjectSize(0, 5) }; class Socket_Https extends capnp_ts_1.Struct { - adoptOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownOptions() { - return capnp_ts_1.Struct.disown(this.getOptions()); - } - getOptions() { - return capnp_ts_1.Struct.getStruct(2, HttpOptions, this); - } - hasOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initOptions() { - return capnp_ts_1.Struct.initStructAt(2, HttpOptions, this); - } - setOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - adoptTlsOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); - } - disownTlsOptions() { - return capnp_ts_1.Struct.disown(this.getTlsOptions()); - } - getTlsOptions() { - return capnp_ts_1.Struct.getStruct(3, TlsOptions, this); - } - hasTlsOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); - } - initTlsOptions() { - return capnp_ts_1.Struct.initStructAt(3, TlsOptions, this); - } - setTlsOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); - } - toString() { - return "Socket_Https_" + super.toString(); - } + adoptOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownOptions() { return capnp_ts_1.Struct.disown(this.getOptions()); } + getOptions() { return capnp_ts_1.Struct.getStruct(2, HttpOptions, this); } + hasOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + 
initOptions() { return capnp_ts_1.Struct.initStructAt(2, HttpOptions, this); } + setOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + adoptTlsOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); } + disownTlsOptions() { return capnp_ts_1.Struct.disown(this.getTlsOptions()); } + getTlsOptions() { return capnp_ts_1.Struct.getStruct(3, TlsOptions, this); } + hasTlsOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); } + initTlsOptions() { return capnp_ts_1.Struct.initStructAt(3, TlsOptions, this); } + setTlsOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); } + toString() { return "Socket_Https_" + super.toString(); } } exports.Socket_Https = Socket_Https; -Socket_Https._capnp = { - displayName: "https", - id: "de123876383cbbdc", - size: new capnp_ts_1.ObjectSize(8, 5), -}; +Socket_Https._capnp = { displayName: "https", id: "de123876383cbbdc", size: new capnp_ts_1.ObjectSize(8, 5) }; var Socket_Which; (function (Socket_Which) { - Socket_Which[(Socket_Which["HTTP"] = 0)] = "HTTP"; - Socket_Which[(Socket_Which["HTTPS"] = 1)] = "HTTPS"; -})((Socket_Which = exports.Socket_Which || (exports.Socket_Which = {}))); + Socket_Which[Socket_Which["HTTP"] = 0] = "HTTP"; + Socket_Which[Socket_Which["HTTPS"] = 1] = "HTTPS"; +})(Socket_Which = exports.Socket_Which || (exports.Socket_Which = {})); class Socket extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getAddress() { - return capnp_ts_1.Struct.getText(1, this); - } - setAddress(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - adoptHttp(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownHttp() { - return capnp_ts_1.Struct.disown(this.getHttp()); - } - getHttp() { - capnp_ts_1.Struct.testWhich( - "http", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getStruct(2, HttpOptions, this); - } - hasHttp() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initHttp() { - capnp_ts_1.Struct.setUint16(0, 0, this); - return capnp_ts_1.Struct.initStructAt(2, HttpOptions, this); - } - isHttp() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setHttp(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - getHttps() { - capnp_ts_1.Struct.testWhich( - "https", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getAs(Socket_Https, this); - } - initHttps() { - capnp_ts_1.Struct.setUint16(0, 1, this); - return capnp_ts_1.Struct.getAs(Socket_Https, this); - } - isHttps() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setHttps() { - capnp_ts_1.Struct.setUint16(0, 1, this); - } - adoptService(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(4, this)); - } - disownService() { - return capnp_ts_1.Struct.disown(this.getService()); - } - getService() { - return capnp_ts_1.Struct.getStruct(4, ServiceDesignator, this); - } - hasService() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(4, this)); - } - initService() { - return capnp_ts_1.Struct.initStructAt(4, ServiceDesignator, this); - } - setService(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(4, this)); - 
} - toString() { - return "Socket_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getAddress() { return capnp_ts_1.Struct.getText(1, this); } + setAddress(value) { capnp_ts_1.Struct.setText(1, value, this); } + adoptHttp(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); + } + disownHttp() { return capnp_ts_1.Struct.disown(this.getHttp()); } + getHttp() { + capnp_ts_1.Struct.testWhich("http", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return capnp_ts_1.Struct.getStruct(2, HttpOptions, this); + } + hasHttp() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initHttp() { + capnp_ts_1.Struct.setUint16(0, 0, this); + return capnp_ts_1.Struct.initStructAt(2, HttpOptions, this); + } + isHttp() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setHttp(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); + } + getHttps() { + capnp_ts_1.Struct.testWhich("https", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getAs(Socket_Https, this); + } + initHttps() { + capnp_ts_1.Struct.setUint16(0, 1, this); + return capnp_ts_1.Struct.getAs(Socket_Https, this); + } + isHttps() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setHttps() { capnp_ts_1.Struct.setUint16(0, 1, this); } + adoptService(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(4, this)); } + disownService() { return capnp_ts_1.Struct.disown(this.getService()); } + getService() { return capnp_ts_1.Struct.getStruct(4, ServiceDesignator, this); } + hasService() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(4, this)); } + initService() { return capnp_ts_1.Struct.initStructAt(4, ServiceDesignator, this); } + setService(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(4, this)); } + toString() { return "Socket_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Socket = Socket; Socket.HTTP = Socket_Which.HTTP; Socket.HTTPS = Socket_Which.HTTPS; -Socket._capnp = { - displayName: "Socket", - id: "9a0eba45530ee79f", - size: new capnp_ts_1.ObjectSize(8, 5), -}; +Socket._capnp = { displayName: "Socket", id: "9a0eba45530ee79f", size: new capnp_ts_1.ObjectSize(8, 5) }; var Service_Which; (function (Service_Which) { - Service_Which[(Service_Which["UNSPECIFIED"] = 0)] = "UNSPECIFIED"; - Service_Which[(Service_Which["WORKER"] = 1)] = "WORKER"; - Service_Which[(Service_Which["NETWORK"] = 2)] = "NETWORK"; - Service_Which[(Service_Which["EXTERNAL"] = 3)] = "EXTERNAL"; - Service_Which[(Service_Which["DISK"] = 4)] = "DISK"; -})((Service_Which = exports.Service_Which || (exports.Service_Which = {}))); + Service_Which[Service_Which["UNSPECIFIED"] = 0] = "UNSPECIFIED"; + Service_Which[Service_Which["WORKER"] = 1] = "WORKER"; + Service_Which[Service_Which["NETWORK"] = 2] = "NETWORK"; + Service_Which[Service_Which["EXTERNAL"] = 3] = "EXTERNAL"; + Service_Which[Service_Which["DISK"] = 4] = "DISK"; +})(Service_Which = exports.Service_Which || (exports.Service_Which = {})); class Service extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - isUnspecified() { - return 
capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setUnspecified() { - capnp_ts_1.Struct.setUint16(0, 0, this); - } - adoptWorker(value) { - capnp_ts_1.Struct.setUint16(0, 1, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownWorker() { - return capnp_ts_1.Struct.disown(this.getWorker()); - } - getWorker() { - capnp_ts_1.Struct.testWhich( - "worker", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getStruct(1, Worker, this); - } - hasWorker() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initWorker() { - capnp_ts_1.Struct.setUint16(0, 1, this); - return capnp_ts_1.Struct.initStructAt(1, Worker, this); - } - isWorker() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setWorker(value) { - capnp_ts_1.Struct.setUint16(0, 1, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptNetwork(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownNetwork() { - return capnp_ts_1.Struct.disown(this.getNetwork()); - } - getNetwork() { - capnp_ts_1.Struct.testWhich( - "network", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getStruct(1, Network, this); - } - hasNetwork() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initNetwork() { - capnp_ts_1.Struct.setUint16(0, 2, this); - return capnp_ts_1.Struct.initStructAt(1, Network, this); - } - isNetwork() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setNetwork(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptExternal(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownExternal() { - return capnp_ts_1.Struct.disown(this.getExternal()); - } - getExternal() { - capnp_ts_1.Struct.testWhich( - "external", - capnp_ts_1.Struct.getUint16(0, this), - 3, - this - ); - return capnp_ts_1.Struct.getStruct(1, ExternalServer, this); - } - hasExternal() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initExternal() { - capnp_ts_1.Struct.setUint16(0, 3, this); - return capnp_ts_1.Struct.initStructAt(1, ExternalServer, this); - } - isExternal() { - return capnp_ts_1.Struct.getUint16(0, this) === 3; - } - setExternal(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptDisk(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownDisk() { - return capnp_ts_1.Struct.disown(this.getDisk()); - } - getDisk() { - capnp_ts_1.Struct.testWhich( - "disk", - capnp_ts_1.Struct.getUint16(0, this), - 4, - this - ); - return capnp_ts_1.Struct.getStruct(1, DiskDirectory, this); - } - hasDisk() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initDisk() { - capnp_ts_1.Struct.setUint16(0, 4, this); - return capnp_ts_1.Struct.initStructAt(1, DiskDirectory, this); - } - isDisk() { - return capnp_ts_1.Struct.getUint16(0, this) === 4; - } - setDisk(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - toString() { - return "Service_" + super.toString(); - } - which() { - return 
capnp_ts_1.Struct.getUint16(0, this); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + isUnspecified() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setUnspecified() { capnp_ts_1.Struct.setUint16(0, 0, this); } + adoptWorker(value) { + capnp_ts_1.Struct.setUint16(0, 1, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownWorker() { return capnp_ts_1.Struct.disown(this.getWorker()); } + getWorker() { + capnp_ts_1.Struct.testWhich("worker", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getStruct(1, Worker, this); + } + hasWorker() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initWorker() { + capnp_ts_1.Struct.setUint16(0, 1, this); + return capnp_ts_1.Struct.initStructAt(1, Worker, this); + } + isWorker() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setWorker(value) { + capnp_ts_1.Struct.setUint16(0, 1, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptNetwork(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownNetwork() { return capnp_ts_1.Struct.disown(this.getNetwork()); } + getNetwork() { + capnp_ts_1.Struct.testWhich("network", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getStruct(1, Network, this); + } + hasNetwork() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initNetwork() { + capnp_ts_1.Struct.setUint16(0, 2, this); + return capnp_ts_1.Struct.initStructAt(1, Network, this); + } + isNetwork() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setNetwork(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptExternal(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownExternal() { return capnp_ts_1.Struct.disown(this.getExternal()); } + getExternal() { + capnp_ts_1.Struct.testWhich("external", capnp_ts_1.Struct.getUint16(0, this), 3, this); + return capnp_ts_1.Struct.getStruct(1, ExternalServer, this); + } + hasExternal() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initExternal() { + capnp_ts_1.Struct.setUint16(0, 3, this); + return capnp_ts_1.Struct.initStructAt(1, ExternalServer, this); + } + isExternal() { return capnp_ts_1.Struct.getUint16(0, this) === 3; } + setExternal(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptDisk(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownDisk() { return capnp_ts_1.Struct.disown(this.getDisk()); } + getDisk() { + capnp_ts_1.Struct.testWhich("disk", capnp_ts_1.Struct.getUint16(0, this), 4, this); + return capnp_ts_1.Struct.getStruct(1, DiskDirectory, this); + } + hasDisk() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initDisk() { + capnp_ts_1.Struct.setUint16(0, 4, this); + return capnp_ts_1.Struct.initStructAt(1, DiskDirectory, this); + } + isDisk() { return capnp_ts_1.Struct.getUint16(0, this) === 4; } + setDisk(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } 
+ toString() { return "Service_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Service = Service; Service.UNSPECIFIED = Service_Which.UNSPECIFIED; @@ -435,202 +210,134 @@ Service.WORKER = Service_Which.WORKER; Service.NETWORK = Service_Which.NETWORK; Service.EXTERNAL = Service_Which.EXTERNAL; Service.DISK = Service_Which.DISK; -Service._capnp = { - displayName: "Service", - id: "e5c88e8bb7bcb6b9", - size: new capnp_ts_1.ObjectSize(8, 2), -}; +Service._capnp = { displayName: "Service", id: "e5c88e8bb7bcb6b9", size: new capnp_ts_1.ObjectSize(8, 2) }; class ServiceDesignator extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getEntrypoint() { - return capnp_ts_1.Struct.getText(1, this); - } - setEntrypoint(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "ServiceDesignator_" + super.toString(); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getEntrypoint() { return capnp_ts_1.Struct.getText(1, this); } + setEntrypoint(value) { capnp_ts_1.Struct.setText(1, value, this); } + toString() { return "ServiceDesignator_" + super.toString(); } } exports.ServiceDesignator = ServiceDesignator; -ServiceDesignator._capnp = { - displayName: "ServiceDesignator", - id: "ae8ec91cee724450", - size: new capnp_ts_1.ObjectSize(0, 2), -}; +ServiceDesignator._capnp = { displayName: "ServiceDesignator", id: "ae8ec91cee724450", size: new capnp_ts_1.ObjectSize(0, 2) }; var Worker_Module_Which; (function (Worker_Module_Which) { - Worker_Module_Which[(Worker_Module_Which["ES_MODULE"] = 0)] = "ES_MODULE"; - Worker_Module_Which[(Worker_Module_Which["COMMON_JS_MODULE"] = 1)] = - "COMMON_JS_MODULE"; - Worker_Module_Which[(Worker_Module_Which["TEXT"] = 2)] = "TEXT"; - Worker_Module_Which[(Worker_Module_Which["DATA"] = 3)] = "DATA"; - Worker_Module_Which[(Worker_Module_Which["WASM"] = 4)] = "WASM"; - Worker_Module_Which[(Worker_Module_Which["JSON"] = 5)] = "JSON"; - Worker_Module_Which[(Worker_Module_Which["NODE_JS_COMPAT_MODULE"] = 6)] = - "NODE_JS_COMPAT_MODULE"; -})( - (Worker_Module_Which = - exports.Worker_Module_Which || (exports.Worker_Module_Which = {})) -); + Worker_Module_Which[Worker_Module_Which["ES_MODULE"] = 0] = "ES_MODULE"; + Worker_Module_Which[Worker_Module_Which["COMMON_JS_MODULE"] = 1] = "COMMON_JS_MODULE"; + Worker_Module_Which[Worker_Module_Which["TEXT"] = 2] = "TEXT"; + Worker_Module_Which[Worker_Module_Which["DATA"] = 3] = "DATA"; + Worker_Module_Which[Worker_Module_Which["WASM"] = 4] = "WASM"; + Worker_Module_Which[Worker_Module_Which["JSON"] = 5] = "JSON"; + Worker_Module_Which[Worker_Module_Which["NODE_JS_COMPAT_MODULE"] = 6] = "NODE_JS_COMPAT_MODULE"; + Worker_Module_Which[Worker_Module_Which["PYTHON_MODULE"] = 7] = "PYTHON_MODULE"; + Worker_Module_Which[Worker_Module_Which["PYTHON_REQUIREMENT"] = 8] = "PYTHON_REQUIREMENT"; +})(Worker_Module_Which = exports.Worker_Module_Which || (exports.Worker_Module_Which = {})); class Worker_Module extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getEsModule() { - capnp_ts_1.Struct.testWhich( - "esModule", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isEsModule() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - 
setEsModule(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.setText(1, value, this); - } - getCommonJsModule() { - capnp_ts_1.Struct.testWhich( - "commonJsModule", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isCommonJsModule() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setCommonJsModule(value) { - capnp_ts_1.Struct.setUint16(0, 1, this); - capnp_ts_1.Struct.setText(1, value, this); - } - getText() { - capnp_ts_1.Struct.testWhich( - "text", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isText() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setText(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.setText(1, value, this); - } - adoptData(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownData() { - return capnp_ts_1.Struct.disown(this.getData()); - } - getData() { - capnp_ts_1.Struct.testWhich( - "data", - capnp_ts_1.Struct.getUint16(0, this), - 3, - this - ); - return capnp_ts_1.Struct.getData(1, this); - } - hasData() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initData(length) { - capnp_ts_1.Struct.setUint16(0, 3, this); - return capnp_ts_1.Struct.initData(1, length, this); - } - isData() { - return capnp_ts_1.Struct.getUint16(0, this) === 3; - } - setData(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptWasm(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownWasm() { - return capnp_ts_1.Struct.disown(this.getWasm()); - } - getWasm() { - capnp_ts_1.Struct.testWhich( - "wasm", - capnp_ts_1.Struct.getUint16(0, this), - 4, - this - ); - return capnp_ts_1.Struct.getData(1, this); - } - hasWasm() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initWasm(length) { - capnp_ts_1.Struct.setUint16(0, 4, this); - return capnp_ts_1.Struct.initData(1, length, this); - } - isWasm() { - return capnp_ts_1.Struct.getUint16(0, this) === 4; - } - setWasm(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getJson() { - capnp_ts_1.Struct.testWhich( - "json", - capnp_ts_1.Struct.getUint16(0, this), - 5, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isJson() { - return capnp_ts_1.Struct.getUint16(0, this) === 5; - } - setJson(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.setText(1, value, this); - } - getNodeJsCompatModule() { - capnp_ts_1.Struct.testWhich( - "nodeJsCompatModule", - capnp_ts_1.Struct.getUint16(0, this), - 6, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isNodeJsCompatModule() { - return capnp_ts_1.Struct.getUint16(0, this) === 6; - } - setNodeJsCompatModule(value) { - capnp_ts_1.Struct.setUint16(0, 6, this); - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "Worker_Module_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getEsModule() { + capnp_ts_1.Struct.testWhich("esModule", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return 
capnp_ts_1.Struct.getText(1, this); + } + isEsModule() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setEsModule(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.setText(1, value, this); + } + getCommonJsModule() { + capnp_ts_1.Struct.testWhich("commonJsModule", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getText(1, this); + } + isCommonJsModule() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setCommonJsModule(value) { + capnp_ts_1.Struct.setUint16(0, 1, this); + capnp_ts_1.Struct.setText(1, value, this); + } + getText() { + capnp_ts_1.Struct.testWhich("text", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getText(1, this); + } + isText() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setText(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.setText(1, value, this); + } + adoptData(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownData() { return capnp_ts_1.Struct.disown(this.getData()); } + getData() { + capnp_ts_1.Struct.testWhich("data", capnp_ts_1.Struct.getUint16(0, this), 3, this); + return capnp_ts_1.Struct.getData(1, this); + } + hasData() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initData(length) { + capnp_ts_1.Struct.setUint16(0, 3, this); + return capnp_ts_1.Struct.initData(1, length, this); + } + isData() { return capnp_ts_1.Struct.getUint16(0, this) === 3; } + setData(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptWasm(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownWasm() { return capnp_ts_1.Struct.disown(this.getWasm()); } + getWasm() { + capnp_ts_1.Struct.testWhich("wasm", capnp_ts_1.Struct.getUint16(0, this), 4, this); + return capnp_ts_1.Struct.getData(1, this); + } + hasWasm() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initWasm(length) { + capnp_ts_1.Struct.setUint16(0, 4, this); + return capnp_ts_1.Struct.initData(1, length, this); + } + isWasm() { return capnp_ts_1.Struct.getUint16(0, this) === 4; } + setWasm(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + getJson() { + capnp_ts_1.Struct.testWhich("json", capnp_ts_1.Struct.getUint16(0, this), 5, this); + return capnp_ts_1.Struct.getText(1, this); + } + isJson() { return capnp_ts_1.Struct.getUint16(0, this) === 5; } + setJson(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.setText(1, value, this); + } + getNodeJsCompatModule() { + capnp_ts_1.Struct.testWhich("nodeJsCompatModule", capnp_ts_1.Struct.getUint16(0, this), 6, this); + return capnp_ts_1.Struct.getText(1, this); + } + isNodeJsCompatModule() { return capnp_ts_1.Struct.getUint16(0, this) === 6; } + setNodeJsCompatModule(value) { + capnp_ts_1.Struct.setUint16(0, 6, this); + capnp_ts_1.Struct.setText(1, value, this); + } + getPythonModule() { + capnp_ts_1.Struct.testWhich("pythonModule", capnp_ts_1.Struct.getUint16(0, this), 7, this); + return capnp_ts_1.Struct.getText(1, this); + } + isPythonModule() { return capnp_ts_1.Struct.getUint16(0, this) === 7; } + setPythonModule(value) { + capnp_ts_1.Struct.setUint16(0, 7, this); + capnp_ts_1.Struct.setText(1, value, this); + } + 
getPythonRequirement() { + capnp_ts_1.Struct.testWhich("pythonRequirement", capnp_ts_1.Struct.getUint16(0, this), 8, this); + return capnp_ts_1.Struct.getText(1, this); + } + isPythonRequirement() { return capnp_ts_1.Struct.getUint16(0, this) === 8; } + setPythonRequirement(value) { + capnp_ts_1.Struct.setUint16(0, 8, this); + capnp_ts_1.Struct.setText(1, value, this); + } + toString() { return "Worker_Module_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker_Module = Worker_Module; Worker_Module.ES_MODULE = Worker_Module_Which.ES_MODULE; @@ -640,159 +347,74 @@ Worker_Module.DATA = Worker_Module_Which.DATA; Worker_Module.WASM = Worker_Module_Which.WASM; Worker_Module.JSON = Worker_Module_Which.JSON; Worker_Module.NODE_JS_COMPAT_MODULE = Worker_Module_Which.NODE_JS_COMPAT_MODULE; -Worker_Module._capnp = { - displayName: "Module", - id: "d9d87a63770a12f3", - size: new capnp_ts_1.ObjectSize(8, 2), -}; +Worker_Module.PYTHON_MODULE = Worker_Module_Which.PYTHON_MODULE; +Worker_Module.PYTHON_REQUIREMENT = Worker_Module_Which.PYTHON_REQUIREMENT; +Worker_Module._capnp = { displayName: "Module", id: "d9d87a63770a12f3", size: new capnp_ts_1.ObjectSize(8, 2) }; var Worker_Binding_Type_Which; (function (Worker_Binding_Type_Which) { - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["UNSPECIFIED"] = 0)] = - "UNSPECIFIED"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["TEXT"] = 1)] = "TEXT"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["DATA"] = 2)] = "DATA"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["JSON"] = 3)] = "JSON"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["WASM"] = 4)] = "WASM"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["CRYPTO_KEY"] = 5)] = - "CRYPTO_KEY"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["SERVICE"] = 6)] = - "SERVICE"; - Worker_Binding_Type_Which[ - (Worker_Binding_Type_Which["DURABLE_OBJECT_NAMESPACE"] = 7) - ] = "DURABLE_OBJECT_NAMESPACE"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["KV_NAMESPACE"] = 8)] = - "KV_NAMESPACE"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["R2BUCKET"] = 9)] = - "R2BUCKET"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["R2ADMIN"] = 10)] = - "R2ADMIN"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["QUEUE"] = 11)] = - "QUEUE"; - Worker_Binding_Type_Which[ - (Worker_Binding_Type_Which["ANALYTICS_ENGINE"] = 12) - ] = "ANALYTICS_ENGINE"; - Worker_Binding_Type_Which[(Worker_Binding_Type_Which["HYPERDRIVE"] = 13)] = - "HYPERDRIVE"; -})( - (Worker_Binding_Type_Which = - exports.Worker_Binding_Type_Which || - (exports.Worker_Binding_Type_Which = {})) -); + Worker_Binding_Type_Which[Worker_Binding_Type_Which["UNSPECIFIED"] = 0] = "UNSPECIFIED"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["TEXT"] = 1] = "TEXT"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["DATA"] = 2] = "DATA"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["JSON"] = 3] = "JSON"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["WASM"] = 4] = "WASM"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["CRYPTO_KEY"] = 5] = "CRYPTO_KEY"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["SERVICE"] = 6] = "SERVICE"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["DURABLE_OBJECT_NAMESPACE"] = 7] = "DURABLE_OBJECT_NAMESPACE"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["KV_NAMESPACE"] = 8] = "KV_NAMESPACE"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["R2BUCKET"] = 9] = "R2BUCKET"; 
+ Worker_Binding_Type_Which[Worker_Binding_Type_Which["R2ADMIN"] = 10] = "R2ADMIN"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["QUEUE"] = 11] = "QUEUE"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["ANALYTICS_ENGINE"] = 12] = "ANALYTICS_ENGINE"; + Worker_Binding_Type_Which[Worker_Binding_Type_Which["HYPERDRIVE"] = 13] = "HYPERDRIVE"; +})(Worker_Binding_Type_Which = exports.Worker_Binding_Type_Which || (exports.Worker_Binding_Type_Which = {})); class Worker_Binding_Type extends capnp_ts_1.Struct { - isUnspecified() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setUnspecified() { - capnp_ts_1.Struct.setUint16(0, 0, this); - } - isText() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setText() { - capnp_ts_1.Struct.setUint16(0, 1, this); - } - isData() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setData() { - capnp_ts_1.Struct.setUint16(0, 2, this); - } - isJson() { - return capnp_ts_1.Struct.getUint16(0, this) === 3; - } - setJson() { - capnp_ts_1.Struct.setUint16(0, 3, this); - } - isWasm() { - return capnp_ts_1.Struct.getUint16(0, this) === 4; - } - setWasm() { - capnp_ts_1.Struct.setUint16(0, 4, this); - } - adoptCryptoKey(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownCryptoKey() { - return capnp_ts_1.Struct.disown(this.getCryptoKey()); - } - getCryptoKey() { - capnp_ts_1.Struct.testWhich( - "cryptoKey", - capnp_ts_1.Struct.getUint16(0, this), - 5, - this - ); - return capnp_ts_1.Struct.getList(0, capnp.Uint16List, this); - } - hasCryptoKey() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initCryptoKey(length) { - capnp_ts_1.Struct.setUint16(0, 5, this); - return capnp_ts_1.Struct.initList(0, capnp.Uint16List, length, this); - } - isCryptoKey() { - return capnp_ts_1.Struct.getUint16(0, this) === 5; - } - setCryptoKey(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - isService() { - return capnp_ts_1.Struct.getUint16(0, this) === 6; - } - setService() { - capnp_ts_1.Struct.setUint16(0, 6, this); - } - isDurableObjectNamespace() { - return capnp_ts_1.Struct.getUint16(0, this) === 7; - } - setDurableObjectNamespace() { - capnp_ts_1.Struct.setUint16(0, 7, this); - } - isKvNamespace() { - return capnp_ts_1.Struct.getUint16(0, this) === 8; - } - setKvNamespace() { - capnp_ts_1.Struct.setUint16(0, 8, this); - } - isR2Bucket() { - return capnp_ts_1.Struct.getUint16(0, this) === 9; - } - setR2Bucket() { - capnp_ts_1.Struct.setUint16(0, 9, this); - } - isR2Admin() { - return capnp_ts_1.Struct.getUint16(0, this) === 10; - } - setR2Admin() { - capnp_ts_1.Struct.setUint16(0, 10, this); - } - isQueue() { - return capnp_ts_1.Struct.getUint16(0, this) === 11; - } - setQueue() { - capnp_ts_1.Struct.setUint16(0, 11, this); - } - isAnalyticsEngine() { - return capnp_ts_1.Struct.getUint16(0, this) === 12; - } - setAnalyticsEngine() { - capnp_ts_1.Struct.setUint16(0, 12, this); - } - isHyperdrive() { - return capnp_ts_1.Struct.getUint16(0, this) === 13; - } - setHyperdrive() { - capnp_ts_1.Struct.setUint16(0, 13, this); - } - toString() { - return "Worker_Binding_Type_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + isUnspecified() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setUnspecified() { capnp_ts_1.Struct.setUint16(0, 0, this); } + isText() { return 
capnp_ts_1.Struct.getUint16(0, this) === 1; } + setText() { capnp_ts_1.Struct.setUint16(0, 1, this); } + isData() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setData() { capnp_ts_1.Struct.setUint16(0, 2, this); } + isJson() { return capnp_ts_1.Struct.getUint16(0, this) === 3; } + setJson() { capnp_ts_1.Struct.setUint16(0, 3, this); } + isWasm() { return capnp_ts_1.Struct.getUint16(0, this) === 4; } + setWasm() { capnp_ts_1.Struct.setUint16(0, 4, this); } + adoptCryptoKey(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); + } + disownCryptoKey() { return capnp_ts_1.Struct.disown(this.getCryptoKey()); } + getCryptoKey() { + capnp_ts_1.Struct.testWhich("cryptoKey", capnp_ts_1.Struct.getUint16(0, this), 5, this); + return capnp_ts_1.Struct.getList(0, capnp.Uint16List, this); + } + hasCryptoKey() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initCryptoKey(length) { + capnp_ts_1.Struct.setUint16(0, 5, this); + return capnp_ts_1.Struct.initList(0, capnp.Uint16List, length, this); + } + isCryptoKey() { return capnp_ts_1.Struct.getUint16(0, this) === 5; } + setCryptoKey(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); + } + isService() { return capnp_ts_1.Struct.getUint16(0, this) === 6; } + setService() { capnp_ts_1.Struct.setUint16(0, 6, this); } + isDurableObjectNamespace() { return capnp_ts_1.Struct.getUint16(0, this) === 7; } + setDurableObjectNamespace() { capnp_ts_1.Struct.setUint16(0, 7, this); } + isKvNamespace() { return capnp_ts_1.Struct.getUint16(0, this) === 8; } + setKvNamespace() { capnp_ts_1.Struct.setUint16(0, 8, this); } + isR2Bucket() { return capnp_ts_1.Struct.getUint16(0, this) === 9; } + setR2Bucket() { capnp_ts_1.Struct.setUint16(0, 9, this); } + isR2Admin() { return capnp_ts_1.Struct.getUint16(0, this) === 10; } + setR2Admin() { capnp_ts_1.Struct.setUint16(0, 10, this); } + isQueue() { return capnp_ts_1.Struct.getUint16(0, this) === 11; } + setQueue() { capnp_ts_1.Struct.setUint16(0, 11, this); } + isAnalyticsEngine() { return capnp_ts_1.Struct.getUint16(0, this) === 12; } + setAnalyticsEngine() { capnp_ts_1.Struct.setUint16(0, 12, this); } + isHyperdrive() { return capnp_ts_1.Struct.getUint16(0, this) === 13; } + setHyperdrive() { capnp_ts_1.Struct.setUint16(0, 13, this); } + toString() { return "Worker_Binding_Type_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker_Binding_Type = Worker_Binding_Type; Worker_Binding_Type.UNSPECIFIED = Worker_Binding_Type_Which.UNSPECIFIED; @@ -802,311 +424,151 @@ Worker_Binding_Type.JSON = Worker_Binding_Type_Which.JSON; Worker_Binding_Type.WASM = Worker_Binding_Type_Which.WASM; Worker_Binding_Type.CRYPTO_KEY = Worker_Binding_Type_Which.CRYPTO_KEY; Worker_Binding_Type.SERVICE = Worker_Binding_Type_Which.SERVICE; -Worker_Binding_Type.DURABLE_OBJECT_NAMESPACE = - Worker_Binding_Type_Which.DURABLE_OBJECT_NAMESPACE; +Worker_Binding_Type.DURABLE_OBJECT_NAMESPACE = Worker_Binding_Type_Which.DURABLE_OBJECT_NAMESPACE; Worker_Binding_Type.KV_NAMESPACE = Worker_Binding_Type_Which.KV_NAMESPACE; Worker_Binding_Type.R2BUCKET = Worker_Binding_Type_Which.R2BUCKET; Worker_Binding_Type.R2ADMIN = Worker_Binding_Type_Which.R2ADMIN; Worker_Binding_Type.QUEUE = Worker_Binding_Type_Which.QUEUE; -Worker_Binding_Type.ANALYTICS_ENGINE = - Worker_Binding_Type_Which.ANALYTICS_ENGINE; +Worker_Binding_Type.ANALYTICS_ENGINE = 
Worker_Binding_Type_Which.ANALYTICS_ENGINE; Worker_Binding_Type.HYPERDRIVE = Worker_Binding_Type_Which.HYPERDRIVE; -Worker_Binding_Type._capnp = { - displayName: "Type", - id: "8906a1296519bf8a", - size: new capnp_ts_1.ObjectSize(8, 1), -}; +Worker_Binding_Type._capnp = { displayName: "Type", id: "8906a1296519bf8a", size: new capnp_ts_1.ObjectSize(8, 1) }; class Worker_Binding_DurableObjectNamespaceDesignator extends capnp_ts_1.Struct { - getClassName() { - return capnp_ts_1.Struct.getText(0, this); - } - setClassName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getServiceName() { - return capnp_ts_1.Struct.getText(1, this); - } - setServiceName(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return ( - "Worker_Binding_DurableObjectNamespaceDesignator_" + super.toString() - ); - } + getClassName() { return capnp_ts_1.Struct.getText(0, this); } + setClassName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getServiceName() { return capnp_ts_1.Struct.getText(1, this); } + setServiceName(value) { capnp_ts_1.Struct.setText(1, value, this); } + toString() { return "Worker_Binding_DurableObjectNamespaceDesignator_" + super.toString(); } } -exports.Worker_Binding_DurableObjectNamespaceDesignator = - Worker_Binding_DurableObjectNamespaceDesignator; -Worker_Binding_DurableObjectNamespaceDesignator._capnp = { - displayName: "DurableObjectNamespaceDesignator", - id: "804f144ff477aac7", - size: new capnp_ts_1.ObjectSize(0, 2), -}; +exports.Worker_Binding_DurableObjectNamespaceDesignator = Worker_Binding_DurableObjectNamespaceDesignator; +Worker_Binding_DurableObjectNamespaceDesignator._capnp = { displayName: "DurableObjectNamespaceDesignator", id: "804f144ff477aac7", size: new capnp_ts_1.ObjectSize(0, 2) }; var Worker_Binding_CryptoKey_Usage; (function (Worker_Binding_CryptoKey_Usage) { - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["ENCRYPT"] = 0) - ] = "ENCRYPT"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["DECRYPT"] = 1) - ] = "DECRYPT"; - Worker_Binding_CryptoKey_Usage[(Worker_Binding_CryptoKey_Usage["SIGN"] = 2)] = - "SIGN"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["VERIFY"] = 3) - ] = "VERIFY"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["DERIVE_KEY"] = 4) - ] = "DERIVE_KEY"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["DERIVE_BITS"] = 5) - ] = "DERIVE_BITS"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["WRAP_KEY"] = 6) - ] = "WRAP_KEY"; - Worker_Binding_CryptoKey_Usage[ - (Worker_Binding_CryptoKey_Usage["UNWRAP_KEY"] = 7) - ] = "UNWRAP_KEY"; -})( - (Worker_Binding_CryptoKey_Usage = - exports.Worker_Binding_CryptoKey_Usage || - (exports.Worker_Binding_CryptoKey_Usage = {})) -); + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["ENCRYPT"] = 0] = "ENCRYPT"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["DECRYPT"] = 1] = "DECRYPT"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["SIGN"] = 2] = "SIGN"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["VERIFY"] = 3] = "VERIFY"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["DERIVE_KEY"] = 4] = "DERIVE_KEY"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["DERIVE_BITS"] = 5] = "DERIVE_BITS"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["WRAP_KEY"] = 6] = "WRAP_KEY"; + Worker_Binding_CryptoKey_Usage[Worker_Binding_CryptoKey_Usage["UNWRAP_KEY"] = 7] 
= "UNWRAP_KEY"; +})(Worker_Binding_CryptoKey_Usage = exports.Worker_Binding_CryptoKey_Usage || (exports.Worker_Binding_CryptoKey_Usage = {})); var Worker_Binding_CryptoKey_Algorithm_Which; (function (Worker_Binding_CryptoKey_Algorithm_Which) { - Worker_Binding_CryptoKey_Algorithm_Which[ - (Worker_Binding_CryptoKey_Algorithm_Which["NAME"] = 0) - ] = "NAME"; - Worker_Binding_CryptoKey_Algorithm_Which[ - (Worker_Binding_CryptoKey_Algorithm_Which["JSON"] = 1) - ] = "JSON"; -})( - (Worker_Binding_CryptoKey_Algorithm_Which = - exports.Worker_Binding_CryptoKey_Algorithm_Which || - (exports.Worker_Binding_CryptoKey_Algorithm_Which = {})) -); + Worker_Binding_CryptoKey_Algorithm_Which[Worker_Binding_CryptoKey_Algorithm_Which["NAME"] = 0] = "NAME"; + Worker_Binding_CryptoKey_Algorithm_Which[Worker_Binding_CryptoKey_Algorithm_Which["JSON"] = 1] = "JSON"; +})(Worker_Binding_CryptoKey_Algorithm_Which = exports.Worker_Binding_CryptoKey_Algorithm_Which || (exports.Worker_Binding_CryptoKey_Algorithm_Which = {})); class Worker_Binding_CryptoKey_Algorithm extends capnp_ts_1.Struct { - getName() { - capnp_ts_1.Struct.testWhich( - "name", - capnp_ts_1.Struct.getUint16(2, this), - 0, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isName() { - return capnp_ts_1.Struct.getUint16(2, this) === 0; - } - setName(value) { - capnp_ts_1.Struct.setUint16(2, 0, this); - capnp_ts_1.Struct.setText(1, value, this); - } - getJson() { - capnp_ts_1.Struct.testWhich( - "json", - capnp_ts_1.Struct.getUint16(2, this), - 1, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isJson() { - return capnp_ts_1.Struct.getUint16(2, this) === 1; - } - setJson(value) { - capnp_ts_1.Struct.setUint16(2, 1, this); - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "Worker_Binding_CryptoKey_Algorithm_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(2, this); - } + getName() { + capnp_ts_1.Struct.testWhich("name", capnp_ts_1.Struct.getUint16(2, this), 0, this); + return capnp_ts_1.Struct.getText(1, this); + } + isName() { return capnp_ts_1.Struct.getUint16(2, this) === 0; } + setName(value) { + capnp_ts_1.Struct.setUint16(2, 0, this); + capnp_ts_1.Struct.setText(1, value, this); + } + getJson() { + capnp_ts_1.Struct.testWhich("json", capnp_ts_1.Struct.getUint16(2, this), 1, this); + return capnp_ts_1.Struct.getText(1, this); + } + isJson() { return capnp_ts_1.Struct.getUint16(2, this) === 1; } + setJson(value) { + capnp_ts_1.Struct.setUint16(2, 1, this); + capnp_ts_1.Struct.setText(1, value, this); + } + toString() { return "Worker_Binding_CryptoKey_Algorithm_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(2, this); } } exports.Worker_Binding_CryptoKey_Algorithm = Worker_Binding_CryptoKey_Algorithm; -Worker_Binding_CryptoKey_Algorithm.NAME = - Worker_Binding_CryptoKey_Algorithm_Which.NAME; -Worker_Binding_CryptoKey_Algorithm.JSON = - Worker_Binding_CryptoKey_Algorithm_Which.JSON; -Worker_Binding_CryptoKey_Algorithm._capnp = { - displayName: "algorithm", - id: "a1a040c5e00d7021", - size: new capnp_ts_1.ObjectSize(8, 3), -}; +Worker_Binding_CryptoKey_Algorithm.NAME = Worker_Binding_CryptoKey_Algorithm_Which.NAME; +Worker_Binding_CryptoKey_Algorithm.JSON = Worker_Binding_CryptoKey_Algorithm_Which.JSON; +Worker_Binding_CryptoKey_Algorithm._capnp = { displayName: "algorithm", id: "a1a040c5e00d7021", size: new capnp_ts_1.ObjectSize(8, 3) }; var Worker_Binding_CryptoKey_Which; (function (Worker_Binding_CryptoKey_Which) { - 
Worker_Binding_CryptoKey_Which[(Worker_Binding_CryptoKey_Which["RAW"] = 0)] = - "RAW"; - Worker_Binding_CryptoKey_Which[(Worker_Binding_CryptoKey_Which["HEX"] = 1)] = - "HEX"; - Worker_Binding_CryptoKey_Which[ - (Worker_Binding_CryptoKey_Which["BASE64"] = 2) - ] = "BASE64"; - Worker_Binding_CryptoKey_Which[ - (Worker_Binding_CryptoKey_Which["PKCS8"] = 3) - ] = "PKCS8"; - Worker_Binding_CryptoKey_Which[(Worker_Binding_CryptoKey_Which["SPKI"] = 4)] = - "SPKI"; - Worker_Binding_CryptoKey_Which[(Worker_Binding_CryptoKey_Which["JWK"] = 5)] = - "JWK"; -})( - (Worker_Binding_CryptoKey_Which = - exports.Worker_Binding_CryptoKey_Which || - (exports.Worker_Binding_CryptoKey_Which = {})) -); + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["RAW"] = 0] = "RAW"; + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["HEX"] = 1] = "HEX"; + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["BASE64"] = 2] = "BASE64"; + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["PKCS8"] = 3] = "PKCS8"; + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["SPKI"] = 4] = "SPKI"; + Worker_Binding_CryptoKey_Which[Worker_Binding_CryptoKey_Which["JWK"] = 5] = "JWK"; +})(Worker_Binding_CryptoKey_Which = exports.Worker_Binding_CryptoKey_Which || (exports.Worker_Binding_CryptoKey_Which = {})); class Worker_Binding_CryptoKey extends capnp_ts_1.Struct { - adoptRaw(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownRaw() { - return capnp_ts_1.Struct.disown(this.getRaw()); - } - getRaw() { - capnp_ts_1.Struct.testWhich( - "raw", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getData(0, this); - } - hasRaw() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initRaw(length) { - capnp_ts_1.Struct.setUint16(0, 0, this); - return capnp_ts_1.Struct.initData(0, length, this); - } - isRaw() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setRaw(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - getHex() { - capnp_ts_1.Struct.testWhich( - "hex", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isHex() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setHex(value) { - capnp_ts_1.Struct.setUint16(0, 1, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getBase64() { - capnp_ts_1.Struct.testWhich( - "base64", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isBase64() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setBase64(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getPkcs8() { - capnp_ts_1.Struct.testWhich( - "pkcs8", - capnp_ts_1.Struct.getUint16(0, this), - 3, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isPkcs8() { - return capnp_ts_1.Struct.getUint16(0, this) === 3; - } - setPkcs8(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getSpki() { - capnp_ts_1.Struct.testWhich( - "spki", - capnp_ts_1.Struct.getUint16(0, this), - 4, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isSpki() { - return capnp_ts_1.Struct.getUint16(0, this) === 4; - } - setSpki(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.setText(0, value, 
this); - } - getJwk() { - capnp_ts_1.Struct.testWhich( - "jwk", - capnp_ts_1.Struct.getUint16(0, this), - 5, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isJwk() { - return capnp_ts_1.Struct.getUint16(0, this) === 5; - } - setJwk(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getAlgorithm() { - return capnp_ts_1.Struct.getAs(Worker_Binding_CryptoKey_Algorithm, this); - } - initAlgorithm() { - return capnp_ts_1.Struct.getAs(Worker_Binding_CryptoKey_Algorithm, this); - } - getExtractable() { - return capnp_ts_1.Struct.getBit( - 32, - this, - Worker_Binding_CryptoKey._capnp.defaultExtractable - ); - } - setExtractable(value) { - capnp_ts_1.Struct.setBit(32, value, this); - } - adoptUsages(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownUsages() { - return capnp_ts_1.Struct.disown(this.getUsages()); - } - getUsages() { - return capnp_ts_1.Struct.getList(2, capnp.Uint16List, this); - } - hasUsages() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initUsages(length) { - return capnp_ts_1.Struct.initList(2, capnp.Uint16List, length, this); - } - setUsages(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - toString() { - return "Worker_Binding_CryptoKey_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + adoptRaw(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); + } + disownRaw() { return capnp_ts_1.Struct.disown(this.getRaw()); } + getRaw() { + capnp_ts_1.Struct.testWhich("raw", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return capnp_ts_1.Struct.getData(0, this); + } + hasRaw() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initRaw(length) { + capnp_ts_1.Struct.setUint16(0, 0, this); + return capnp_ts_1.Struct.initData(0, length, this); + } + isRaw() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setRaw(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); + } + getHex() { + capnp_ts_1.Struct.testWhich("hex", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getText(0, this); + } + isHex() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setHex(value) { + capnp_ts_1.Struct.setUint16(0, 1, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getBase64() { + capnp_ts_1.Struct.testWhich("base64", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getText(0, this); + } + isBase64() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setBase64(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getPkcs8() { + capnp_ts_1.Struct.testWhich("pkcs8", capnp_ts_1.Struct.getUint16(0, this), 3, this); + return capnp_ts_1.Struct.getText(0, this); + } + isPkcs8() { return capnp_ts_1.Struct.getUint16(0, this) === 3; } + setPkcs8(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getSpki() { + capnp_ts_1.Struct.testWhich("spki", capnp_ts_1.Struct.getUint16(0, this), 4, this); + return capnp_ts_1.Struct.getText(0, this); + } + isSpki() { return capnp_ts_1.Struct.getUint16(0, this) === 4; } + setSpki(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getJwk() { + 
capnp_ts_1.Struct.testWhich("jwk", capnp_ts_1.Struct.getUint16(0, this), 5, this); + return capnp_ts_1.Struct.getText(0, this); + } + isJwk() { return capnp_ts_1.Struct.getUint16(0, this) === 5; } + setJwk(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getAlgorithm() { return capnp_ts_1.Struct.getAs(Worker_Binding_CryptoKey_Algorithm, this); } + initAlgorithm() { return capnp_ts_1.Struct.getAs(Worker_Binding_CryptoKey_Algorithm, this); } + getExtractable() { return capnp_ts_1.Struct.getBit(32, this, Worker_Binding_CryptoKey._capnp.defaultExtractable); } + setExtractable(value) { capnp_ts_1.Struct.setBit(32, value, this); } + adoptUsages(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownUsages() { return capnp_ts_1.Struct.disown(this.getUsages()); } + getUsages() { return capnp_ts_1.Struct.getList(2, capnp.Uint16List, this); } + hasUsages() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initUsages(length) { return capnp_ts_1.Struct.initList(2, capnp.Uint16List, length, this); } + setUsages(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + toString() { return "Worker_Binding_CryptoKey_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker_Binding_CryptoKey = Worker_Binding_CryptoKey; Worker_Binding_CryptoKey.RAW = Worker_Binding_CryptoKey_Which.RAW; @@ -1116,640 +578,340 @@ Worker_Binding_CryptoKey.PKCS8 = Worker_Binding_CryptoKey_Which.PKCS8; Worker_Binding_CryptoKey.SPKI = Worker_Binding_CryptoKey_Which.SPKI; Worker_Binding_CryptoKey.JWK = Worker_Binding_CryptoKey_Which.JWK; Worker_Binding_CryptoKey.Usage = Worker_Binding_CryptoKey_Usage; -Worker_Binding_CryptoKey._capnp = { - displayName: "CryptoKey", - id: "b5e1bff0e57d6eb0", - size: new capnp_ts_1.ObjectSize(8, 3), - defaultExtractable: capnp.getBitMask(false, 0), -}; +Worker_Binding_CryptoKey._capnp = { displayName: "CryptoKey", id: "b5e1bff0e57d6eb0", size: new capnp_ts_1.ObjectSize(8, 3), defaultExtractable: capnp.getBitMask(false, 0) }; class Worker_Binding_WrappedBinding extends capnp_ts_1.Struct { - getModuleName() { - return capnp_ts_1.Struct.getText(0, this); - } - setModuleName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getEntrypoint() { - return capnp_ts_1.Struct.getText( - 1, - this, - Worker_Binding_WrappedBinding._capnp.defaultEntrypoint - ); - } - setEntrypoint(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - adoptInnerBindings(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownInnerBindings() { - return capnp_ts_1.Struct.disown(this.getInnerBindings()); - } - getInnerBindings() { - return capnp_ts_1.Struct.getList( - 2, - Worker_Binding_WrappedBinding._InnerBindings, - this - ); - } - hasInnerBindings() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initInnerBindings(length) { - return capnp_ts_1.Struct.initList( - 2, - Worker_Binding_WrappedBinding._InnerBindings, - length, - this - ); - } - setInnerBindings(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - toString() { - return "Worker_Binding_WrappedBinding_" + super.toString(); - } + getModuleName() { return capnp_ts_1.Struct.getText(0, this); } + setModuleName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getEntrypoint() { return capnp_ts_1.Struct.getText(1, this, 
Worker_Binding_WrappedBinding._capnp.defaultEntrypoint); } + setEntrypoint(value) { capnp_ts_1.Struct.setText(1, value, this); } + adoptInnerBindings(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownInnerBindings() { return capnp_ts_1.Struct.disown(this.getInnerBindings()); } + getInnerBindings() { return capnp_ts_1.Struct.getList(2, Worker_Binding_WrappedBinding._InnerBindings, this); } + hasInnerBindings() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initInnerBindings(length) { return capnp_ts_1.Struct.initList(2, Worker_Binding_WrappedBinding._InnerBindings, length, this); } + setInnerBindings(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + toString() { return "Worker_Binding_WrappedBinding_" + super.toString(); } } exports.Worker_Binding_WrappedBinding = Worker_Binding_WrappedBinding; -Worker_Binding_WrappedBinding._capnp = { - displayName: "WrappedBinding", - id: "e6f066b75f0ea113", - size: new capnp_ts_1.ObjectSize(0, 3), - defaultEntrypoint: "default", -}; +Worker_Binding_WrappedBinding._capnp = { displayName: "WrappedBinding", id: "e6f066b75f0ea113", size: new capnp_ts_1.ObjectSize(0, 3), defaultEntrypoint: "default" }; class Worker_Binding_Parameter extends capnp_ts_1.Struct { - adoptType(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownType() { - return capnp_ts_1.Struct.disown(this.getType()); - } - getType() { - return capnp_ts_1.Struct.getStruct(1, Worker_Binding_Type, this); - } - hasType() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initType() { - return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_Type, this); - } - setType(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getOptional() { - return capnp_ts_1.Struct.getBit(16, this); - } - setOptional(value) { - capnp_ts_1.Struct.setBit(16, value, this); - } - toString() { - return "Worker_Binding_Parameter_" + super.toString(); - } + adoptType(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownType() { return capnp_ts_1.Struct.disown(this.getType()); } + getType() { return capnp_ts_1.Struct.getStruct(1, Worker_Binding_Type, this); } + hasType() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initType() { return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_Type, this); } + setType(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + getOptional() { return capnp_ts_1.Struct.getBit(16, this); } + setOptional(value) { capnp_ts_1.Struct.setBit(16, value, this); } + toString() { return "Worker_Binding_Parameter_" + super.toString(); } } exports.Worker_Binding_Parameter = Worker_Binding_Parameter; -Worker_Binding_Parameter._capnp = { - displayName: "parameter", - id: "dc57e1258d26d152", - size: new capnp_ts_1.ObjectSize(8, 6), -}; +Worker_Binding_Parameter._capnp = { displayName: "parameter", id: "dc57e1258d26d152", size: new capnp_ts_1.ObjectSize(8, 6) }; class Worker_Binding_Hyperdrive extends capnp_ts_1.Struct { - adoptDesignator(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownDesignator() { - return capnp_ts_1.Struct.disown(this.getDesignator()); - } - getDesignator() { - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasDesignator() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - 
initDesignator() { - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - setDesignator(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getDatabase() { - return capnp_ts_1.Struct.getText(2, this); - } - setDatabase(value) { - capnp_ts_1.Struct.setText(2, value, this); - } - getUser() { - return capnp_ts_1.Struct.getText(3, this); - } - setUser(value) { - capnp_ts_1.Struct.setText(3, value, this); - } - getPassword() { - return capnp_ts_1.Struct.getText(4, this); - } - setPassword(value) { - capnp_ts_1.Struct.setText(4, value, this); - } - getScheme() { - return capnp_ts_1.Struct.getText(5, this); - } - setScheme(value) { - capnp_ts_1.Struct.setText(5, value, this); - } - toString() { - return "Worker_Binding_Hyperdrive_" + super.toString(); - } + adoptDesignator(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownDesignator() { return capnp_ts_1.Struct.disown(this.getDesignator()); } + getDesignator() { return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); } + hasDesignator() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initDesignator() { return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); } + setDesignator(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + getDatabase() { return capnp_ts_1.Struct.getText(2, this); } + setDatabase(value) { capnp_ts_1.Struct.setText(2, value, this); } + getUser() { return capnp_ts_1.Struct.getText(3, this); } + setUser(value) { capnp_ts_1.Struct.setText(3, value, this); } + getPassword() { return capnp_ts_1.Struct.getText(4, this); } + setPassword(value) { capnp_ts_1.Struct.setText(4, value, this); } + getScheme() { return capnp_ts_1.Struct.getText(5, this); } + setScheme(value) { capnp_ts_1.Struct.setText(5, value, this); } + toString() { return "Worker_Binding_Hyperdrive_" + super.toString(); } } exports.Worker_Binding_Hyperdrive = Worker_Binding_Hyperdrive; -Worker_Binding_Hyperdrive._capnp = { - displayName: "hyperdrive", - id: "ad6c391cd55f3134", - size: new capnp_ts_1.ObjectSize(8, 6), -}; +Worker_Binding_Hyperdrive._capnp = { displayName: "hyperdrive", id: "ad6c391cd55f3134", size: new capnp_ts_1.ObjectSize(8, 6) }; var Worker_Binding_Which; (function (Worker_Binding_Which) { - Worker_Binding_Which[(Worker_Binding_Which["UNSPECIFIED"] = 0)] = - "UNSPECIFIED"; - Worker_Binding_Which[(Worker_Binding_Which["PARAMETER"] = 1)] = "PARAMETER"; - Worker_Binding_Which[(Worker_Binding_Which["TEXT"] = 2)] = "TEXT"; - Worker_Binding_Which[(Worker_Binding_Which["DATA"] = 3)] = "DATA"; - Worker_Binding_Which[(Worker_Binding_Which["JSON"] = 4)] = "JSON"; - Worker_Binding_Which[(Worker_Binding_Which["WASM_MODULE"] = 5)] = - "WASM_MODULE"; - Worker_Binding_Which[(Worker_Binding_Which["CRYPTO_KEY"] = 6)] = "CRYPTO_KEY"; - Worker_Binding_Which[(Worker_Binding_Which["SERVICE"] = 7)] = "SERVICE"; - Worker_Binding_Which[(Worker_Binding_Which["DURABLE_OBJECT_NAMESPACE"] = 8)] = - "DURABLE_OBJECT_NAMESPACE"; - Worker_Binding_Which[(Worker_Binding_Which["KV_NAMESPACE"] = 9)] = - "KV_NAMESPACE"; - Worker_Binding_Which[(Worker_Binding_Which["R2BUCKET"] = 10)] = "R2BUCKET"; - Worker_Binding_Which[(Worker_Binding_Which["R2ADMIN"] = 11)] = "R2ADMIN"; - Worker_Binding_Which[(Worker_Binding_Which["WRAPPED"] = 12)] = "WRAPPED"; - Worker_Binding_Which[(Worker_Binding_Which["QUEUE"] = 13)] = "QUEUE"; - Worker_Binding_Which[(Worker_Binding_Which["FROM_ENVIRONMENT"] = 14)] = - 
"FROM_ENVIRONMENT"; - Worker_Binding_Which[(Worker_Binding_Which["ANALYTICS_ENGINE"] = 15)] = - "ANALYTICS_ENGINE"; - Worker_Binding_Which[(Worker_Binding_Which["HYPERDRIVE"] = 16)] = - "HYPERDRIVE"; - Worker_Binding_Which[(Worker_Binding_Which["UNSAFE_EVAL"] = 17)] = - "UNSAFE_EVAL"; -})( - (Worker_Binding_Which = - exports.Worker_Binding_Which || (exports.Worker_Binding_Which = {})) -); + Worker_Binding_Which[Worker_Binding_Which["UNSPECIFIED"] = 0] = "UNSPECIFIED"; + Worker_Binding_Which[Worker_Binding_Which["PARAMETER"] = 1] = "PARAMETER"; + Worker_Binding_Which[Worker_Binding_Which["TEXT"] = 2] = "TEXT"; + Worker_Binding_Which[Worker_Binding_Which["DATA"] = 3] = "DATA"; + Worker_Binding_Which[Worker_Binding_Which["JSON"] = 4] = "JSON"; + Worker_Binding_Which[Worker_Binding_Which["WASM_MODULE"] = 5] = "WASM_MODULE"; + Worker_Binding_Which[Worker_Binding_Which["CRYPTO_KEY"] = 6] = "CRYPTO_KEY"; + Worker_Binding_Which[Worker_Binding_Which["SERVICE"] = 7] = "SERVICE"; + Worker_Binding_Which[Worker_Binding_Which["DURABLE_OBJECT_NAMESPACE"] = 8] = "DURABLE_OBJECT_NAMESPACE"; + Worker_Binding_Which[Worker_Binding_Which["KV_NAMESPACE"] = 9] = "KV_NAMESPACE"; + Worker_Binding_Which[Worker_Binding_Which["R2BUCKET"] = 10] = "R2BUCKET"; + Worker_Binding_Which[Worker_Binding_Which["R2ADMIN"] = 11] = "R2ADMIN"; + Worker_Binding_Which[Worker_Binding_Which["WRAPPED"] = 12] = "WRAPPED"; + Worker_Binding_Which[Worker_Binding_Which["QUEUE"] = 13] = "QUEUE"; + Worker_Binding_Which[Worker_Binding_Which["FROM_ENVIRONMENT"] = 14] = "FROM_ENVIRONMENT"; + Worker_Binding_Which[Worker_Binding_Which["ANALYTICS_ENGINE"] = 15] = "ANALYTICS_ENGINE"; + Worker_Binding_Which[Worker_Binding_Which["HYPERDRIVE"] = 16] = "HYPERDRIVE"; + Worker_Binding_Which[Worker_Binding_Which["UNSAFE_EVAL"] = 17] = "UNSAFE_EVAL"; +})(Worker_Binding_Which = exports.Worker_Binding_Which || (exports.Worker_Binding_Which = {})); class Worker_Binding extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - isUnspecified() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setUnspecified() { - capnp_ts_1.Struct.setUint16(0, 0, this); - } - getParameter() { - capnp_ts_1.Struct.testWhich( - "parameter", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getAs(Worker_Binding_Parameter, this); - } - initParameter() { - capnp_ts_1.Struct.setUint16(0, 1, this); - return capnp_ts_1.Struct.getAs(Worker_Binding_Parameter, this); - } - isParameter() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setParameter() { - capnp_ts_1.Struct.setUint16(0, 1, this); - } - getText() { - capnp_ts_1.Struct.testWhich( - "text", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isText() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setText(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.setText(1, value, this); - } - adoptData(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownData() { - return capnp_ts_1.Struct.disown(this.getData()); - } - getData() { - capnp_ts_1.Struct.testWhich( - "data", - capnp_ts_1.Struct.getUint16(0, this), - 3, - this - ); - return capnp_ts_1.Struct.getData(1, this); - } - hasData() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initData(length) { - 
capnp_ts_1.Struct.setUint16(0, 3, this); - return capnp_ts_1.Struct.initData(1, length, this); - } - isData() { - return capnp_ts_1.Struct.getUint16(0, this) === 3; - } - setData(value) { - capnp_ts_1.Struct.setUint16(0, 3, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getJson() { - capnp_ts_1.Struct.testWhich( - "json", - capnp_ts_1.Struct.getUint16(0, this), - 4, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isJson() { - return capnp_ts_1.Struct.getUint16(0, this) === 4; - } - setJson(value) { - capnp_ts_1.Struct.setUint16(0, 4, this); - capnp_ts_1.Struct.setText(1, value, this); - } - adoptWasmModule(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownWasmModule() { - return capnp_ts_1.Struct.disown(this.getWasmModule()); - } - getWasmModule() { - capnp_ts_1.Struct.testWhich( - "wasmModule", - capnp_ts_1.Struct.getUint16(0, this), - 5, - this - ); - return capnp_ts_1.Struct.getData(1, this); - } - hasWasmModule() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initWasmModule(length) { - capnp_ts_1.Struct.setUint16(0, 5, this); - return capnp_ts_1.Struct.initData(1, length, this); - } - isWasmModule() { - return capnp_ts_1.Struct.getUint16(0, this) === 5; - } - setWasmModule(value) { - capnp_ts_1.Struct.setUint16(0, 5, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptCryptoKey(value) { - capnp_ts_1.Struct.setUint16(0, 6, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownCryptoKey() { - return capnp_ts_1.Struct.disown(this.getCryptoKey()); - } - getCryptoKey() { - capnp_ts_1.Struct.testWhich( - "cryptoKey", - capnp_ts_1.Struct.getUint16(0, this), - 6, - this - ); - return capnp_ts_1.Struct.getStruct(1, Worker_Binding_CryptoKey, this); - } - hasCryptoKey() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initCryptoKey() { - capnp_ts_1.Struct.setUint16(0, 6, this); - return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_CryptoKey, this); - } - isCryptoKey() { - return capnp_ts_1.Struct.getUint16(0, this) === 6; - } - setCryptoKey(value) { - capnp_ts_1.Struct.setUint16(0, 6, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptService(value) { - capnp_ts_1.Struct.setUint16(0, 7, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownService() { - return capnp_ts_1.Struct.disown(this.getService()); - } - getService() { - capnp_ts_1.Struct.testWhich( - "service", - capnp_ts_1.Struct.getUint16(0, this), - 7, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasService() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initService() { - capnp_ts_1.Struct.setUint16(0, 7, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isService() { - return capnp_ts_1.Struct.getUint16(0, this) === 7; - } - setService(value) { - capnp_ts_1.Struct.setUint16(0, 7, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptDurableObjectNamespace(value) { - capnp_ts_1.Struct.setUint16(0, 8, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownDurableObjectNamespace() { - return capnp_ts_1.Struct.disown(this.getDurableObjectNamespace()); - } - getDurableObjectNamespace() { 
- capnp_ts_1.Struct.testWhich( - "durableObjectNamespace", - capnp_ts_1.Struct.getUint16(0, this), - 8, - this - ); - return capnp_ts_1.Struct.getStruct( - 1, - Worker_Binding_DurableObjectNamespaceDesignator, - this - ); - } - hasDurableObjectNamespace() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initDurableObjectNamespace() { - capnp_ts_1.Struct.setUint16(0, 8, this); - return capnp_ts_1.Struct.initStructAt( - 1, - Worker_Binding_DurableObjectNamespaceDesignator, - this - ); - } - isDurableObjectNamespace() { - return capnp_ts_1.Struct.getUint16(0, this) === 8; - } - setDurableObjectNamespace(value) { - capnp_ts_1.Struct.setUint16(0, 8, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptKvNamespace(value) { - capnp_ts_1.Struct.setUint16(0, 9, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownKvNamespace() { - return capnp_ts_1.Struct.disown(this.getKvNamespace()); - } - getKvNamespace() { - capnp_ts_1.Struct.testWhich( - "kvNamespace", - capnp_ts_1.Struct.getUint16(0, this), - 9, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasKvNamespace() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initKvNamespace() { - capnp_ts_1.Struct.setUint16(0, 9, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isKvNamespace() { - return capnp_ts_1.Struct.getUint16(0, this) === 9; - } - setKvNamespace(value) { - capnp_ts_1.Struct.setUint16(0, 9, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptR2Bucket(value) { - capnp_ts_1.Struct.setUint16(0, 10, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownR2Bucket() { - return capnp_ts_1.Struct.disown(this.getR2Bucket()); - } - getR2Bucket() { - capnp_ts_1.Struct.testWhich( - "r2Bucket", - capnp_ts_1.Struct.getUint16(0, this), - 10, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasR2Bucket() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initR2Bucket() { - capnp_ts_1.Struct.setUint16(0, 10, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isR2Bucket() { - return capnp_ts_1.Struct.getUint16(0, this) === 10; - } - setR2Bucket(value) { - capnp_ts_1.Struct.setUint16(0, 10, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptR2Admin(value) { - capnp_ts_1.Struct.setUint16(0, 11, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownR2Admin() { - return capnp_ts_1.Struct.disown(this.getR2Admin()); - } - getR2Admin() { - capnp_ts_1.Struct.testWhich( - "r2Admin", - capnp_ts_1.Struct.getUint16(0, this), - 11, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasR2Admin() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initR2Admin() { - capnp_ts_1.Struct.setUint16(0, 11, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isR2Admin() { - return capnp_ts_1.Struct.getUint16(0, this) === 11; - } - setR2Admin(value) { - capnp_ts_1.Struct.setUint16(0, 11, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptWrapped(value) { - capnp_ts_1.Struct.setUint16(0, 12, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - 
disownWrapped() { - return capnp_ts_1.Struct.disown(this.getWrapped()); - } - getWrapped() { - capnp_ts_1.Struct.testWhich( - "wrapped", - capnp_ts_1.Struct.getUint16(0, this), - 12, - this - ); - return capnp_ts_1.Struct.getStruct(1, Worker_Binding_WrappedBinding, this); - } - hasWrapped() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initWrapped() { - capnp_ts_1.Struct.setUint16(0, 12, this); - return capnp_ts_1.Struct.initStructAt( - 1, - Worker_Binding_WrappedBinding, - this - ); - } - isWrapped() { - return capnp_ts_1.Struct.getUint16(0, this) === 12; - } - setWrapped(value) { - capnp_ts_1.Struct.setUint16(0, 12, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptQueue(value) { - capnp_ts_1.Struct.setUint16(0, 13, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownQueue() { - return capnp_ts_1.Struct.disown(this.getQueue()); - } - getQueue() { - capnp_ts_1.Struct.testWhich( - "queue", - capnp_ts_1.Struct.getUint16(0, this), - 13, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasQueue() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initQueue() { - capnp_ts_1.Struct.setUint16(0, 13, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isQueue() { - return capnp_ts_1.Struct.getUint16(0, this) === 13; - } - setQueue(value) { - capnp_ts_1.Struct.setUint16(0, 13, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getFromEnvironment() { - capnp_ts_1.Struct.testWhich( - "fromEnvironment", - capnp_ts_1.Struct.getUint16(0, this), - 14, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isFromEnvironment() { - return capnp_ts_1.Struct.getUint16(0, this) === 14; - } - setFromEnvironment(value) { - capnp_ts_1.Struct.setUint16(0, 14, this); - capnp_ts_1.Struct.setText(1, value, this); - } - adoptAnalyticsEngine(value) { - capnp_ts_1.Struct.setUint16(0, 15, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownAnalyticsEngine() { - return capnp_ts_1.Struct.disown(this.getAnalyticsEngine()); - } - getAnalyticsEngine() { - capnp_ts_1.Struct.testWhich( - "analyticsEngine", - capnp_ts_1.Struct.getUint16(0, this), - 15, - this - ); - return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); - } - hasAnalyticsEngine() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initAnalyticsEngine() { - capnp_ts_1.Struct.setUint16(0, 15, this); - return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); - } - isAnalyticsEngine() { - return capnp_ts_1.Struct.getUint16(0, this) === 15; - } - setAnalyticsEngine(value) { - capnp_ts_1.Struct.setUint16(0, 15, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getHyperdrive() { - capnp_ts_1.Struct.testWhich( - "hyperdrive", - capnp_ts_1.Struct.getUint16(0, this), - 16, - this - ); - return capnp_ts_1.Struct.getAs(Worker_Binding_Hyperdrive, this); - } - initHyperdrive() { - capnp_ts_1.Struct.setUint16(0, 16, this); - return capnp_ts_1.Struct.getAs(Worker_Binding_Hyperdrive, this); - } - isHyperdrive() { - return capnp_ts_1.Struct.getUint16(0, this) === 16; - } - setHyperdrive() { - capnp_ts_1.Struct.setUint16(0, 16, this); - } - isUnsafeEval() { - return capnp_ts_1.Struct.getUint16(0, this) === 17; - } - setUnsafeEval() { - capnp_ts_1.Struct.setUint16(0, 17, this); - } - toString() 
{ - return "Worker_Binding_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + isUnspecified() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setUnspecified() { capnp_ts_1.Struct.setUint16(0, 0, this); } + getParameter() { + capnp_ts_1.Struct.testWhich("parameter", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getAs(Worker_Binding_Parameter, this); + } + initParameter() { + capnp_ts_1.Struct.setUint16(0, 1, this); + return capnp_ts_1.Struct.getAs(Worker_Binding_Parameter, this); + } + isParameter() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setParameter() { capnp_ts_1.Struct.setUint16(0, 1, this); } + getText() { + capnp_ts_1.Struct.testWhich("text", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getText(1, this); + } + isText() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setText(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.setText(1, value, this); + } + adoptData(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownData() { return capnp_ts_1.Struct.disown(this.getData()); } + getData() { + capnp_ts_1.Struct.testWhich("data", capnp_ts_1.Struct.getUint16(0, this), 3, this); + return capnp_ts_1.Struct.getData(1, this); + } + hasData() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initData(length) { + capnp_ts_1.Struct.setUint16(0, 3, this); + return capnp_ts_1.Struct.initData(1, length, this); + } + isData() { return capnp_ts_1.Struct.getUint16(0, this) === 3; } + setData(value) { + capnp_ts_1.Struct.setUint16(0, 3, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + getJson() { + capnp_ts_1.Struct.testWhich("json", capnp_ts_1.Struct.getUint16(0, this), 4, this); + return capnp_ts_1.Struct.getText(1, this); + } + isJson() { return capnp_ts_1.Struct.getUint16(0, this) === 4; } + setJson(value) { + capnp_ts_1.Struct.setUint16(0, 4, this); + capnp_ts_1.Struct.setText(1, value, this); + } + adoptWasmModule(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownWasmModule() { return capnp_ts_1.Struct.disown(this.getWasmModule()); } + getWasmModule() { + capnp_ts_1.Struct.testWhich("wasmModule", capnp_ts_1.Struct.getUint16(0, this), 5, this); + return capnp_ts_1.Struct.getData(1, this); + } + hasWasmModule() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initWasmModule(length) { + capnp_ts_1.Struct.setUint16(0, 5, this); + return capnp_ts_1.Struct.initData(1, length, this); + } + isWasmModule() { return capnp_ts_1.Struct.getUint16(0, this) === 5; } + setWasmModule(value) { + capnp_ts_1.Struct.setUint16(0, 5, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptCryptoKey(value) { + capnp_ts_1.Struct.setUint16(0, 6, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownCryptoKey() { return capnp_ts_1.Struct.disown(this.getCryptoKey()); } + getCryptoKey() { + capnp_ts_1.Struct.testWhich("cryptoKey", capnp_ts_1.Struct.getUint16(0, this), 6, this); + return capnp_ts_1.Struct.getStruct(1, Worker_Binding_CryptoKey, this); + } + hasCryptoKey() { return 
!capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initCryptoKey() { + capnp_ts_1.Struct.setUint16(0, 6, this); + return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_CryptoKey, this); + } + isCryptoKey() { return capnp_ts_1.Struct.getUint16(0, this) === 6; } + setCryptoKey(value) { + capnp_ts_1.Struct.setUint16(0, 6, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptService(value) { + capnp_ts_1.Struct.setUint16(0, 7, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownService() { return capnp_ts_1.Struct.disown(this.getService()); } + getService() { + capnp_ts_1.Struct.testWhich("service", capnp_ts_1.Struct.getUint16(0, this), 7, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); + } + hasService() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initService() { + capnp_ts_1.Struct.setUint16(0, 7, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isService() { return capnp_ts_1.Struct.getUint16(0, this) === 7; } + setService(value) { + capnp_ts_1.Struct.setUint16(0, 7, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptDurableObjectNamespace(value) { + capnp_ts_1.Struct.setUint16(0, 8, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownDurableObjectNamespace() { return capnp_ts_1.Struct.disown(this.getDurableObjectNamespace()); } + getDurableObjectNamespace() { + capnp_ts_1.Struct.testWhich("durableObjectNamespace", capnp_ts_1.Struct.getUint16(0, this), 8, this); + return capnp_ts_1.Struct.getStruct(1, Worker_Binding_DurableObjectNamespaceDesignator, this); + } + hasDurableObjectNamespace() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initDurableObjectNamespace() { + capnp_ts_1.Struct.setUint16(0, 8, this); + return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_DurableObjectNamespaceDesignator, this); + } + isDurableObjectNamespace() { return capnp_ts_1.Struct.getUint16(0, this) === 8; } + setDurableObjectNamespace(value) { + capnp_ts_1.Struct.setUint16(0, 8, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptKvNamespace(value) { + capnp_ts_1.Struct.setUint16(0, 9, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownKvNamespace() { return capnp_ts_1.Struct.disown(this.getKvNamespace()); } + getKvNamespace() { + capnp_ts_1.Struct.testWhich("kvNamespace", capnp_ts_1.Struct.getUint16(0, this), 9, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); + } + hasKvNamespace() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initKvNamespace() { + capnp_ts_1.Struct.setUint16(0, 9, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isKvNamespace() { return capnp_ts_1.Struct.getUint16(0, this) === 9; } + setKvNamespace(value) { + capnp_ts_1.Struct.setUint16(0, 9, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptR2Bucket(value) { + capnp_ts_1.Struct.setUint16(0, 10, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownR2Bucket() { return capnp_ts_1.Struct.disown(this.getR2Bucket()); } + getR2Bucket() { + capnp_ts_1.Struct.testWhich("r2Bucket", capnp_ts_1.Struct.getUint16(0, this), 10, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, 
this); + } + hasR2Bucket() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initR2Bucket() { + capnp_ts_1.Struct.setUint16(0, 10, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isR2Bucket() { return capnp_ts_1.Struct.getUint16(0, this) === 10; } + setR2Bucket(value) { + capnp_ts_1.Struct.setUint16(0, 10, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptR2Admin(value) { + capnp_ts_1.Struct.setUint16(0, 11, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownR2Admin() { return capnp_ts_1.Struct.disown(this.getR2Admin()); } + getR2Admin() { + capnp_ts_1.Struct.testWhich("r2Admin", capnp_ts_1.Struct.getUint16(0, this), 11, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); + } + hasR2Admin() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initR2Admin() { + capnp_ts_1.Struct.setUint16(0, 11, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isR2Admin() { return capnp_ts_1.Struct.getUint16(0, this) === 11; } + setR2Admin(value) { + capnp_ts_1.Struct.setUint16(0, 11, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptWrapped(value) { + capnp_ts_1.Struct.setUint16(0, 12, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownWrapped() { return capnp_ts_1.Struct.disown(this.getWrapped()); } + getWrapped() { + capnp_ts_1.Struct.testWhich("wrapped", capnp_ts_1.Struct.getUint16(0, this), 12, this); + return capnp_ts_1.Struct.getStruct(1, Worker_Binding_WrappedBinding, this); + } + hasWrapped() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initWrapped() { + capnp_ts_1.Struct.setUint16(0, 12, this); + return capnp_ts_1.Struct.initStructAt(1, Worker_Binding_WrappedBinding, this); + } + isWrapped() { return capnp_ts_1.Struct.getUint16(0, this) === 12; } + setWrapped(value) { + capnp_ts_1.Struct.setUint16(0, 12, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + adoptQueue(value) { + capnp_ts_1.Struct.setUint16(0, 13, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownQueue() { return capnp_ts_1.Struct.disown(this.getQueue()); } + getQueue() { + capnp_ts_1.Struct.testWhich("queue", capnp_ts_1.Struct.getUint16(0, this), 13, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); + } + hasQueue() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initQueue() { + capnp_ts_1.Struct.setUint16(0, 13, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isQueue() { return capnp_ts_1.Struct.getUint16(0, this) === 13; } + setQueue(value) { + capnp_ts_1.Struct.setUint16(0, 13, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + getFromEnvironment() { + capnp_ts_1.Struct.testWhich("fromEnvironment", capnp_ts_1.Struct.getUint16(0, this), 14, this); + return capnp_ts_1.Struct.getText(1, this); + } + isFromEnvironment() { return capnp_ts_1.Struct.getUint16(0, this) === 14; } + setFromEnvironment(value) { + capnp_ts_1.Struct.setUint16(0, 14, this); + capnp_ts_1.Struct.setText(1, value, this); + } + adoptAnalyticsEngine(value) { + capnp_ts_1.Struct.setUint16(0, 15, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownAnalyticsEngine() { return 
capnp_ts_1.Struct.disown(this.getAnalyticsEngine()); } + getAnalyticsEngine() { + capnp_ts_1.Struct.testWhich("analyticsEngine", capnp_ts_1.Struct.getUint16(0, this), 15, this); + return capnp_ts_1.Struct.getStruct(1, ServiceDesignator, this); + } + hasAnalyticsEngine() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initAnalyticsEngine() { + capnp_ts_1.Struct.setUint16(0, 15, this); + return capnp_ts_1.Struct.initStructAt(1, ServiceDesignator, this); + } + isAnalyticsEngine() { return capnp_ts_1.Struct.getUint16(0, this) === 15; } + setAnalyticsEngine(value) { + capnp_ts_1.Struct.setUint16(0, 15, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + getHyperdrive() { + capnp_ts_1.Struct.testWhich("hyperdrive", capnp_ts_1.Struct.getUint16(0, this), 16, this); + return capnp_ts_1.Struct.getAs(Worker_Binding_Hyperdrive, this); + } + initHyperdrive() { + capnp_ts_1.Struct.setUint16(0, 16, this); + return capnp_ts_1.Struct.getAs(Worker_Binding_Hyperdrive, this); + } + isHyperdrive() { return capnp_ts_1.Struct.getUint16(0, this) === 16; } + setHyperdrive() { capnp_ts_1.Struct.setUint16(0, 16, this); } + isUnsafeEval() { return capnp_ts_1.Struct.getUint16(0, this) === 17; } + setUnsafeEval() { capnp_ts_1.Struct.setUint16(0, 17, this); } + toString() { return "Worker_Binding_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker_Binding = Worker_Binding; Worker_Binding.UNSPECIFIED = Worker_Binding_Which.UNSPECIFIED; @@ -1760,8 +922,7 @@ Worker_Binding.JSON = Worker_Binding_Which.JSON; Worker_Binding.WASM_MODULE = Worker_Binding_Which.WASM_MODULE; Worker_Binding.CRYPTO_KEY = Worker_Binding_Which.CRYPTO_KEY; Worker_Binding.SERVICE = Worker_Binding_Which.SERVICE; -Worker_Binding.DURABLE_OBJECT_NAMESPACE = - Worker_Binding_Which.DURABLE_OBJECT_NAMESPACE; +Worker_Binding.DURABLE_OBJECT_NAMESPACE = Worker_Binding_Which.DURABLE_OBJECT_NAMESPACE; Worker_Binding.KV_NAMESPACE = Worker_Binding_Which.KV_NAMESPACE; Worker_Binding.R2BUCKET = Worker_Binding_Which.R2BUCKET; Worker_Binding.R2ADMIN = Worker_Binding_Which.R2ADMIN; @@ -1772,337 +933,150 @@ Worker_Binding.ANALYTICS_ENGINE = Worker_Binding_Which.ANALYTICS_ENGINE; Worker_Binding.HYPERDRIVE = Worker_Binding_Which.HYPERDRIVE; Worker_Binding.UNSAFE_EVAL = Worker_Binding_Which.UNSAFE_EVAL; Worker_Binding.Type = Worker_Binding_Type; -Worker_Binding.DurableObjectNamespaceDesignator = - Worker_Binding_DurableObjectNamespaceDesignator; +Worker_Binding.DurableObjectNamespaceDesignator = Worker_Binding_DurableObjectNamespaceDesignator; Worker_Binding.CryptoKey = Worker_Binding_CryptoKey; Worker_Binding.WrappedBinding = Worker_Binding_WrappedBinding; -Worker_Binding._capnp = { - displayName: "Binding", - id: "8e7e492fd7e35f3e", - size: new capnp_ts_1.ObjectSize(8, 6), -}; +Worker_Binding._capnp = { displayName: "Binding", id: "8e7e492fd7e35f3e", size: new capnp_ts_1.ObjectSize(8, 6) }; var Worker_DurableObjectNamespace_Which; (function (Worker_DurableObjectNamespace_Which) { - Worker_DurableObjectNamespace_Which[ - (Worker_DurableObjectNamespace_Which["UNIQUE_KEY"] = 0) - ] = "UNIQUE_KEY"; - Worker_DurableObjectNamespace_Which[ - (Worker_DurableObjectNamespace_Which["EPHEMERAL_LOCAL"] = 1) - ] = "EPHEMERAL_LOCAL"; -})( - (Worker_DurableObjectNamespace_Which = - exports.Worker_DurableObjectNamespace_Which || - (exports.Worker_DurableObjectNamespace_Which = {})) -); + Worker_DurableObjectNamespace_Which[Worker_DurableObjectNamespace_Which["UNIQUE_KEY"] = 
0] = "UNIQUE_KEY"; + Worker_DurableObjectNamespace_Which[Worker_DurableObjectNamespace_Which["EPHEMERAL_LOCAL"] = 1] = "EPHEMERAL_LOCAL"; +})(Worker_DurableObjectNamespace_Which = exports.Worker_DurableObjectNamespace_Which || (exports.Worker_DurableObjectNamespace_Which = {})); class Worker_DurableObjectNamespace extends capnp_ts_1.Struct { - getClassName() { - return capnp_ts_1.Struct.getText(0, this); - } - setClassName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getUniqueKey() { - capnp_ts_1.Struct.testWhich( - "uniqueKey", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getText(1, this); - } - isUniqueKey() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setUniqueKey(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.setText(1, value, this); - } - isEphemeralLocal() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setEphemeralLocal() { - capnp_ts_1.Struct.setUint16(0, 1, this); - } - getPreventEviction() { - return capnp_ts_1.Struct.getBit(16, this); - } - setPreventEviction(value) { - capnp_ts_1.Struct.setBit(16, value, this); - } - toString() { - return "Worker_DurableObjectNamespace_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + getClassName() { return capnp_ts_1.Struct.getText(0, this); } + setClassName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getUniqueKey() { + capnp_ts_1.Struct.testWhich("uniqueKey", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return capnp_ts_1.Struct.getText(1, this); + } + isUniqueKey() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setUniqueKey(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.setText(1, value, this); + } + isEphemeralLocal() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setEphemeralLocal() { capnp_ts_1.Struct.setUint16(0, 1, this); } + getPreventEviction() { return capnp_ts_1.Struct.getBit(16, this); } + setPreventEviction(value) { capnp_ts_1.Struct.setBit(16, value, this); } + toString() { return "Worker_DurableObjectNamespace_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker_DurableObjectNamespace = Worker_DurableObjectNamespace; -Worker_DurableObjectNamespace.UNIQUE_KEY = - Worker_DurableObjectNamespace_Which.UNIQUE_KEY; -Worker_DurableObjectNamespace.EPHEMERAL_LOCAL = - Worker_DurableObjectNamespace_Which.EPHEMERAL_LOCAL; -Worker_DurableObjectNamespace._capnp = { - displayName: "DurableObjectNamespace", - id: "b429dd547d15747d", - size: new capnp_ts_1.ObjectSize(8, 2), -}; +Worker_DurableObjectNamespace.UNIQUE_KEY = Worker_DurableObjectNamespace_Which.UNIQUE_KEY; +Worker_DurableObjectNamespace.EPHEMERAL_LOCAL = Worker_DurableObjectNamespace_Which.EPHEMERAL_LOCAL; +Worker_DurableObjectNamespace._capnp = { displayName: "DurableObjectNamespace", id: "b429dd547d15747d", size: new capnp_ts_1.ObjectSize(8, 2) }; var Worker_DurableObjectStorage_Which; (function (Worker_DurableObjectStorage_Which) { - Worker_DurableObjectStorage_Which[ - (Worker_DurableObjectStorage_Which["NONE"] = 0) - ] = "NONE"; - Worker_DurableObjectStorage_Which[ - (Worker_DurableObjectStorage_Which["IN_MEMORY"] = 1) - ] = "IN_MEMORY"; - Worker_DurableObjectStorage_Which[ - (Worker_DurableObjectStorage_Which["LOCAL_DISK"] = 2) - ] = "LOCAL_DISK"; -})( - (Worker_DurableObjectStorage_Which = - exports.Worker_DurableObjectStorage_Which || - (exports.Worker_DurableObjectStorage_Which = {})) -); + 
Worker_DurableObjectStorage_Which[Worker_DurableObjectStorage_Which["NONE"] = 0] = "NONE"; + Worker_DurableObjectStorage_Which[Worker_DurableObjectStorage_Which["IN_MEMORY"] = 1] = "IN_MEMORY"; + Worker_DurableObjectStorage_Which[Worker_DurableObjectStorage_Which["LOCAL_DISK"] = 2] = "LOCAL_DISK"; +})(Worker_DurableObjectStorage_Which = exports.Worker_DurableObjectStorage_Which || (exports.Worker_DurableObjectStorage_Which = {})); class Worker_DurableObjectStorage extends capnp_ts_1.Struct { - isNone() { - return capnp_ts_1.Struct.getUint16(2, this) === 0; - } - setNone() { - capnp_ts_1.Struct.setUint16(2, 0, this); - } - isInMemory() { - return capnp_ts_1.Struct.getUint16(2, this) === 1; - } - setInMemory() { - capnp_ts_1.Struct.setUint16(2, 1, this); - } - getLocalDisk() { - capnp_ts_1.Struct.testWhich( - "localDisk", - capnp_ts_1.Struct.getUint16(2, this), - 2, - this - ); - return capnp_ts_1.Struct.getText(8, this); - } - isLocalDisk() { - return capnp_ts_1.Struct.getUint16(2, this) === 2; - } - setLocalDisk(value) { - capnp_ts_1.Struct.setUint16(2, 2, this); - capnp_ts_1.Struct.setText(8, value, this); - } - toString() { - return "Worker_DurableObjectStorage_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(2, this); - } + isNone() { return capnp_ts_1.Struct.getUint16(2, this) === 0; } + setNone() { capnp_ts_1.Struct.setUint16(2, 0, this); } + isInMemory() { return capnp_ts_1.Struct.getUint16(2, this) === 1; } + setInMemory() { capnp_ts_1.Struct.setUint16(2, 1, this); } + getLocalDisk() { + capnp_ts_1.Struct.testWhich("localDisk", capnp_ts_1.Struct.getUint16(2, this), 2, this); + return capnp_ts_1.Struct.getText(8, this); + } + isLocalDisk() { return capnp_ts_1.Struct.getUint16(2, this) === 2; } + setLocalDisk(value) { + capnp_ts_1.Struct.setUint16(2, 2, this); + capnp_ts_1.Struct.setText(8, value, this); + } + toString() { return "Worker_DurableObjectStorage_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(2, this); } } exports.Worker_DurableObjectStorage = Worker_DurableObjectStorage; Worker_DurableObjectStorage.NONE = Worker_DurableObjectStorage_Which.NONE; -Worker_DurableObjectStorage.IN_MEMORY = - Worker_DurableObjectStorage_Which.IN_MEMORY; -Worker_DurableObjectStorage.LOCAL_DISK = - Worker_DurableObjectStorage_Which.LOCAL_DISK; -Worker_DurableObjectStorage._capnp = { - displayName: "durableObjectStorage", - id: "cc72b3faa57827d4", - size: new capnp_ts_1.ObjectSize(8, 9), -}; +Worker_DurableObjectStorage.IN_MEMORY = Worker_DurableObjectStorage_Which.IN_MEMORY; +Worker_DurableObjectStorage.LOCAL_DISK = Worker_DurableObjectStorage_Which.LOCAL_DISK; +Worker_DurableObjectStorage._capnp = { displayName: "durableObjectStorage", id: "cc72b3faa57827d4", size: new capnp_ts_1.ObjectSize(8, 10) }; var Worker_Which; (function (Worker_Which) { - Worker_Which[(Worker_Which["MODULES"] = 0)] = "MODULES"; - Worker_Which[(Worker_Which["SERVICE_WORKER_SCRIPT"] = 1)] = - "SERVICE_WORKER_SCRIPT"; - Worker_Which[(Worker_Which["INHERIT"] = 2)] = "INHERIT"; -})((Worker_Which = exports.Worker_Which || (exports.Worker_Which = {}))); + Worker_Which[Worker_Which["MODULES"] = 0] = "MODULES"; + Worker_Which[Worker_Which["SERVICE_WORKER_SCRIPT"] = 1] = "SERVICE_WORKER_SCRIPT"; + Worker_Which[Worker_Which["INHERIT"] = 2] = "INHERIT"; +})(Worker_Which = exports.Worker_Which || (exports.Worker_Which = {})); class Worker extends capnp_ts_1.Struct { - adoptModules(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.adopt(value, 
capnp_ts_1.Struct.getPointer(0, this)); - } - disownModules() { - return capnp_ts_1.Struct.disown(this.getModules()); - } - getModules() { - capnp_ts_1.Struct.testWhich( - "modules", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getList(0, Worker._Modules, this); - } - hasModules() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initModules(length) { - capnp_ts_1.Struct.setUint16(0, 0, this); - return capnp_ts_1.Struct.initList(0, Worker._Modules, length, this); - } - isModules() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setModules(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - getServiceWorkerScript() { - capnp_ts_1.Struct.testWhich( - "serviceWorkerScript", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isServiceWorkerScript() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setServiceWorkerScript(value) { - capnp_ts_1.Struct.setUint16(0, 1, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getInherit() { - capnp_ts_1.Struct.testWhich( - "inherit", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getText(0, this); - } - isInherit() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setInherit(value) { - capnp_ts_1.Struct.setUint16(0, 2, this); - capnp_ts_1.Struct.setText(0, value, this); - } - getCompatibilityDate() { - return capnp_ts_1.Struct.getText(1, this); - } - setCompatibilityDate(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - adoptCompatibilityFlags(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownCompatibilityFlags() { - return capnp_ts_1.Struct.disown(this.getCompatibilityFlags()); - } - getCompatibilityFlags() { - return capnp_ts_1.Struct.getList(2, capnp.TextList, this); - } - hasCompatibilityFlags() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initCompatibilityFlags(length) { - return capnp_ts_1.Struct.initList(2, capnp.TextList, length, this); - } - setCompatibilityFlags(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - adoptBindings(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); - } - disownBindings() { - return capnp_ts_1.Struct.disown(this.getBindings()); - } - getBindings() { - return capnp_ts_1.Struct.getList(3, Worker._Bindings, this); - } - hasBindings() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); - } - initBindings(length) { - return capnp_ts_1.Struct.initList(3, Worker._Bindings, length, this); - } - setBindings(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); - } - adoptGlobalOutbound(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(4, this)); - } - disownGlobalOutbound() { - return capnp_ts_1.Struct.disown(this.getGlobalOutbound()); - } - getGlobalOutbound() { - return capnp_ts_1.Struct.getStruct( - 4, - ServiceDesignator, - this, - Worker._capnp.defaultGlobalOutbound - ); - } - hasGlobalOutbound() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(4, this)); - } - initGlobalOutbound() { - return capnp_ts_1.Struct.initStructAt(4, ServiceDesignator, this); - } - setGlobalOutbound(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(4, this)); - } - 
adoptCacheApiOutbound(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(7, this)); - } - disownCacheApiOutbound() { - return capnp_ts_1.Struct.disown(this.getCacheApiOutbound()); - } - getCacheApiOutbound() { - return capnp_ts_1.Struct.getStruct(7, ServiceDesignator, this); - } - hasCacheApiOutbound() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(7, this)); - } - initCacheApiOutbound() { - return capnp_ts_1.Struct.initStructAt(7, ServiceDesignator, this); - } - setCacheApiOutbound(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(7, this)); - } - adoptDurableObjectNamespaces(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(5, this)); - } - disownDurableObjectNamespaces() { - return capnp_ts_1.Struct.disown(this.getDurableObjectNamespaces()); - } - getDurableObjectNamespaces() { - return capnp_ts_1.Struct.getList(5, Worker._DurableObjectNamespaces, this); - } - hasDurableObjectNamespaces() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(5, this)); - } - initDurableObjectNamespaces(length) { - return capnp_ts_1.Struct.initList( - 5, - Worker._DurableObjectNamespaces, - length, - this - ); - } - setDurableObjectNamespaces(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(5, this)); - } - getDurableObjectUniqueKeyModifier() { - return capnp_ts_1.Struct.getText(6, this); - } - setDurableObjectUniqueKeyModifier(value) { - capnp_ts_1.Struct.setText(6, value, this); - } - getDurableObjectStorage() { - return capnp_ts_1.Struct.getAs(Worker_DurableObjectStorage, this); - } - initDurableObjectStorage() { - return capnp_ts_1.Struct.getAs(Worker_DurableObjectStorage, this); - } - toString() { - return "Worker_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + adoptModules(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); + } + disownModules() { return capnp_ts_1.Struct.disown(this.getModules()); } + getModules() { + capnp_ts_1.Struct.testWhich("modules", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return capnp_ts_1.Struct.getList(0, Worker._Modules, this); + } + hasModules() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initModules(length) { + capnp_ts_1.Struct.setUint16(0, 0, this); + return capnp_ts_1.Struct.initList(0, Worker._Modules, length, this); + } + isModules() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setModules(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); + } + getServiceWorkerScript() { + capnp_ts_1.Struct.testWhich("serviceWorkerScript", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getText(0, this); + } + isServiceWorkerScript() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setServiceWorkerScript(value) { + capnp_ts_1.Struct.setUint16(0, 1, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getInherit() { + capnp_ts_1.Struct.testWhich("inherit", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getText(0, this); + } + isInherit() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setInherit(value) { + capnp_ts_1.Struct.setUint16(0, 2, this); + capnp_ts_1.Struct.setText(0, value, this); + } + getCompatibilityDate() { return capnp_ts_1.Struct.getText(1, this); } + setCompatibilityDate(value) { capnp_ts_1.Struct.setText(1, value, this); } + 
adoptCompatibilityFlags(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownCompatibilityFlags() { return capnp_ts_1.Struct.disown(this.getCompatibilityFlags()); } + getCompatibilityFlags() { return capnp_ts_1.Struct.getList(2, capnp.TextList, this); } + hasCompatibilityFlags() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initCompatibilityFlags(length) { return capnp_ts_1.Struct.initList(2, capnp.TextList, length, this); } + setCompatibilityFlags(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + adoptBindings(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); } + disownBindings() { return capnp_ts_1.Struct.disown(this.getBindings()); } + getBindings() { return capnp_ts_1.Struct.getList(3, Worker._Bindings, this); } + hasBindings() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); } + initBindings(length) { return capnp_ts_1.Struct.initList(3, Worker._Bindings, length, this); } + setBindings(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); } + adoptGlobalOutbound(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(4, this)); } + disownGlobalOutbound() { return capnp_ts_1.Struct.disown(this.getGlobalOutbound()); } + getGlobalOutbound() { return capnp_ts_1.Struct.getStruct(4, ServiceDesignator, this, Worker._capnp.defaultGlobalOutbound); } + hasGlobalOutbound() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(4, this)); } + initGlobalOutbound() { return capnp_ts_1.Struct.initStructAt(4, ServiceDesignator, this); } + setGlobalOutbound(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(4, this)); } + adoptCacheApiOutbound(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(7, this)); } + disownCacheApiOutbound() { return capnp_ts_1.Struct.disown(this.getCacheApiOutbound()); } + getCacheApiOutbound() { return capnp_ts_1.Struct.getStruct(7, ServiceDesignator, this); } + hasCacheApiOutbound() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(7, this)); } + initCacheApiOutbound() { return capnp_ts_1.Struct.initStructAt(7, ServiceDesignator, this); } + setCacheApiOutbound(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(7, this)); } + adoptDurableObjectNamespaces(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(5, this)); } + disownDurableObjectNamespaces() { return capnp_ts_1.Struct.disown(this.getDurableObjectNamespaces()); } + getDurableObjectNamespaces() { return capnp_ts_1.Struct.getList(5, Worker._DurableObjectNamespaces, this); } + hasDurableObjectNamespaces() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(5, this)); } + initDurableObjectNamespaces(length) { return capnp_ts_1.Struct.initList(5, Worker._DurableObjectNamespaces, length, this); } + setDurableObjectNamespaces(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(5, this)); } + getDurableObjectUniqueKeyModifier() { return capnp_ts_1.Struct.getText(6, this); } + setDurableObjectUniqueKeyModifier(value) { capnp_ts_1.Struct.setText(6, value, this); } + getDurableObjectStorage() { return capnp_ts_1.Struct.getAs(Worker_DurableObjectStorage, this); } + initDurableObjectStorage() { return capnp_ts_1.Struct.getAs(Worker_DurableObjectStorage, this); } + getModuleFallback() { return capnp_ts_1.Struct.getText(9, this); } + setModuleFallback(value) { capnp_ts_1.Struct.setText(9, value, 
this); } + toString() { return "Worker_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.Worker = Worker; Worker.MODULES = Worker_Which.MODULES; @@ -2111,638 +1085,243 @@ Worker.INHERIT = Worker_Which.INHERIT; Worker.Module = Worker_Module; Worker.Binding = Worker_Binding; Worker.DurableObjectNamespace = Worker_DurableObjectNamespace; -Worker._capnp = { - displayName: "Worker", - id: "acfa77e88fd97d1c", - size: new capnp_ts_1.ObjectSize(8, 9), - defaultGlobalOutbound: capnp.readRawPointer( - new Uint8Array([ - 0x10, 0x05, 0x40, 0x02, 0x11, 0x05, 0x4a, 0x00, 0x00, 0xff, 0x69, 0x6e, - 0x74, 0x65, 0x72, 0x6e, 0x65, 0x74, 0x00, 0x00, 0x00, - ]).buffer - ), -}; +Worker._capnp = { displayName: "Worker", id: "acfa77e88fd97d1c", size: new capnp_ts_1.ObjectSize(8, 10), defaultGlobalOutbound: capnp.readRawPointer(new Uint8Array([0x10, 0x05, 0x40, 0x02, 0x11, 0x05, 0x4a, 0x00, 0x00, 0xff, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x65, 0x74, 0x00, 0x00, 0x00]).buffer) }; class ExternalServer_Https extends capnp_ts_1.Struct { - adoptOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownOptions() { - return capnp_ts_1.Struct.disown(this.getOptions()); - } - getOptions() { - return capnp_ts_1.Struct.getStruct(1, HttpOptions, this); - } - hasOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initOptions() { - return capnp_ts_1.Struct.initStructAt(1, HttpOptions, this); - } - setOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptTlsOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownTlsOptions() { - return capnp_ts_1.Struct.disown(this.getTlsOptions()); - } - getTlsOptions() { - return capnp_ts_1.Struct.getStruct(2, TlsOptions, this); - } - hasTlsOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initTlsOptions() { - return capnp_ts_1.Struct.initStructAt(2, TlsOptions, this); - } - setTlsOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - getCertificateHost() { - return capnp_ts_1.Struct.getText(3, this); - } - setCertificateHost(value) { - capnp_ts_1.Struct.setText(3, value, this); - } - toString() { - return "ExternalServer_Https_" + super.toString(); - } + adoptOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownOptions() { return capnp_ts_1.Struct.disown(this.getOptions()); } + getOptions() { return capnp_ts_1.Struct.getStruct(1, HttpOptions, this); } + hasOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initOptions() { return capnp_ts_1.Struct.initStructAt(1, HttpOptions, this); } + setOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + adoptTlsOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownTlsOptions() { return capnp_ts_1.Struct.disown(this.getTlsOptions()); } + getTlsOptions() { return capnp_ts_1.Struct.getStruct(2, TlsOptions, this); } + hasTlsOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initTlsOptions() { return capnp_ts_1.Struct.initStructAt(2, TlsOptions, this); } + setTlsOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + getCertificateHost() { return capnp_ts_1.Struct.getText(3, this); } + setCertificateHost(value) { 
capnp_ts_1.Struct.setText(3, value, this); } + toString() { return "ExternalServer_Https_" + super.toString(); } } exports.ExternalServer_Https = ExternalServer_Https; -ExternalServer_Https._capnp = { - displayName: "https", - id: "ac37e02afd3dc6db", - size: new capnp_ts_1.ObjectSize(8, 4), -}; +ExternalServer_Https._capnp = { displayName: "https", id: "ac37e02afd3dc6db", size: new capnp_ts_1.ObjectSize(8, 4) }; class ExternalServer_Tcp extends capnp_ts_1.Struct { - adoptTlsOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownTlsOptions() { - return capnp_ts_1.Struct.disown(this.getTlsOptions()); - } - getTlsOptions() { - return capnp_ts_1.Struct.getStruct(1, TlsOptions, this); - } - hasTlsOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initTlsOptions() { - return capnp_ts_1.Struct.initStructAt(1, TlsOptions, this); - } - setTlsOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getCertificateHost() { - return capnp_ts_1.Struct.getText(2, this); - } - setCertificateHost(value) { - capnp_ts_1.Struct.setText(2, value, this); - } - toString() { - return "ExternalServer_Tcp_" + super.toString(); - } + adoptTlsOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownTlsOptions() { return capnp_ts_1.Struct.disown(this.getTlsOptions()); } + getTlsOptions() { return capnp_ts_1.Struct.getStruct(1, TlsOptions, this); } + hasTlsOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initTlsOptions() { return capnp_ts_1.Struct.initStructAt(1, TlsOptions, this); } + setTlsOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + getCertificateHost() { return capnp_ts_1.Struct.getText(2, this); } + setCertificateHost(value) { capnp_ts_1.Struct.setText(2, value, this); } + toString() { return "ExternalServer_Tcp_" + super.toString(); } } exports.ExternalServer_Tcp = ExternalServer_Tcp; -ExternalServer_Tcp._capnp = { - displayName: "tcp", - id: "d941637df0fb39f1", - size: new capnp_ts_1.ObjectSize(8, 4), -}; +ExternalServer_Tcp._capnp = { displayName: "tcp", id: "d941637df0fb39f1", size: new capnp_ts_1.ObjectSize(8, 4) }; var ExternalServer_Which; (function (ExternalServer_Which) { - ExternalServer_Which[(ExternalServer_Which["HTTP"] = 0)] = "HTTP"; - ExternalServer_Which[(ExternalServer_Which["HTTPS"] = 1)] = "HTTPS"; - ExternalServer_Which[(ExternalServer_Which["TCP"] = 2)] = "TCP"; -})( - (ExternalServer_Which = - exports.ExternalServer_Which || (exports.ExternalServer_Which = {})) -); + ExternalServer_Which[ExternalServer_Which["HTTP"] = 0] = "HTTP"; + ExternalServer_Which[ExternalServer_Which["HTTPS"] = 1] = "HTTPS"; + ExternalServer_Which[ExternalServer_Which["TCP"] = 2] = "TCP"; +})(ExternalServer_Which = exports.ExternalServer_Which || (exports.ExternalServer_Which = {})); class ExternalServer extends capnp_ts_1.Struct { - getAddress() { - return capnp_ts_1.Struct.getText(0, this); - } - setAddress(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - adoptHttp(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownHttp() { - return capnp_ts_1.Struct.disown(this.getHttp()); - } - getHttp() { - capnp_ts_1.Struct.testWhich( - "http", - capnp_ts_1.Struct.getUint16(0, this), - 0, - this - ); - return capnp_ts_1.Struct.getStruct(1, HttpOptions, this); - } - hasHttp() { 
- return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initHttp() { - capnp_ts_1.Struct.setUint16(0, 0, this); - return capnp_ts_1.Struct.initStructAt(1, HttpOptions, this); - } - isHttp() { - return capnp_ts_1.Struct.getUint16(0, this) === 0; - } - setHttp(value) { - capnp_ts_1.Struct.setUint16(0, 0, this); - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getHttps() { - capnp_ts_1.Struct.testWhich( - "https", - capnp_ts_1.Struct.getUint16(0, this), - 1, - this - ); - return capnp_ts_1.Struct.getAs(ExternalServer_Https, this); - } - initHttps() { - capnp_ts_1.Struct.setUint16(0, 1, this); - return capnp_ts_1.Struct.getAs(ExternalServer_Https, this); - } - isHttps() { - return capnp_ts_1.Struct.getUint16(0, this) === 1; - } - setHttps() { - capnp_ts_1.Struct.setUint16(0, 1, this); - } - getTcp() { - capnp_ts_1.Struct.testWhich( - "tcp", - capnp_ts_1.Struct.getUint16(0, this), - 2, - this - ); - return capnp_ts_1.Struct.getAs(ExternalServer_Tcp, this); - } - initTcp() { - capnp_ts_1.Struct.setUint16(0, 2, this); - return capnp_ts_1.Struct.getAs(ExternalServer_Tcp, this); - } - isTcp() { - return capnp_ts_1.Struct.getUint16(0, this) === 2; - } - setTcp() { - capnp_ts_1.Struct.setUint16(0, 2, this); - } - toString() { - return "ExternalServer_" + super.toString(); - } - which() { - return capnp_ts_1.Struct.getUint16(0, this); - } + getAddress() { return capnp_ts_1.Struct.getText(0, this); } + setAddress(value) { capnp_ts_1.Struct.setText(0, value, this); } + adoptHttp(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); + } + disownHttp() { return capnp_ts_1.Struct.disown(this.getHttp()); } + getHttp() { + capnp_ts_1.Struct.testWhich("http", capnp_ts_1.Struct.getUint16(0, this), 0, this); + return capnp_ts_1.Struct.getStruct(1, HttpOptions, this); + } + hasHttp() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initHttp() { + capnp_ts_1.Struct.setUint16(0, 0, this); + return capnp_ts_1.Struct.initStructAt(1, HttpOptions, this); + } + isHttp() { return capnp_ts_1.Struct.getUint16(0, this) === 0; } + setHttp(value) { + capnp_ts_1.Struct.setUint16(0, 0, this); + capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); + } + getHttps() { + capnp_ts_1.Struct.testWhich("https", capnp_ts_1.Struct.getUint16(0, this), 1, this); + return capnp_ts_1.Struct.getAs(ExternalServer_Https, this); + } + initHttps() { + capnp_ts_1.Struct.setUint16(0, 1, this); + return capnp_ts_1.Struct.getAs(ExternalServer_Https, this); + } + isHttps() { return capnp_ts_1.Struct.getUint16(0, this) === 1; } + setHttps() { capnp_ts_1.Struct.setUint16(0, 1, this); } + getTcp() { + capnp_ts_1.Struct.testWhich("tcp", capnp_ts_1.Struct.getUint16(0, this), 2, this); + return capnp_ts_1.Struct.getAs(ExternalServer_Tcp, this); + } + initTcp() { + capnp_ts_1.Struct.setUint16(0, 2, this); + return capnp_ts_1.Struct.getAs(ExternalServer_Tcp, this); + } + isTcp() { return capnp_ts_1.Struct.getUint16(0, this) === 2; } + setTcp() { capnp_ts_1.Struct.setUint16(0, 2, this); } + toString() { return "ExternalServer_" + super.toString(); } + which() { return capnp_ts_1.Struct.getUint16(0, this); } } exports.ExternalServer = ExternalServer; ExternalServer.HTTP = ExternalServer_Which.HTTP; ExternalServer.HTTPS = ExternalServer_Which.HTTPS; ExternalServer.TCP = ExternalServer_Which.TCP; -ExternalServer._capnp = { - displayName: "ExternalServer", - id: "ff209f9aa352f5a4", - 
size: new capnp_ts_1.ObjectSize(8, 4), -}; +ExternalServer._capnp = { displayName: "ExternalServer", id: "ff209f9aa352f5a4", size: new capnp_ts_1.ObjectSize(8, 4) }; class Network extends capnp_ts_1.Struct { - adoptAllow(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownAllow() { - return capnp_ts_1.Struct.disown(this.getAllow()); - } - getAllow() { - return capnp_ts_1.Struct.getList( - 0, - capnp.TextList, - this, - Network._capnp.defaultAllow - ); - } - hasAllow() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initAllow(length) { - return capnp_ts_1.Struct.initList(0, capnp.TextList, length, this); - } - setAllow(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - adoptDeny(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownDeny() { - return capnp_ts_1.Struct.disown(this.getDeny()); - } - getDeny() { - return capnp_ts_1.Struct.getList(1, capnp.TextList, this); - } - hasDeny() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initDeny(length) { - return capnp_ts_1.Struct.initList(1, capnp.TextList, length, this); - } - setDeny(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - adoptTlsOptions(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownTlsOptions() { - return capnp_ts_1.Struct.disown(this.getTlsOptions()); - } - getTlsOptions() { - return capnp_ts_1.Struct.getStruct(2, TlsOptions, this); - } - hasTlsOptions() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initTlsOptions() { - return capnp_ts_1.Struct.initStructAt(2, TlsOptions, this); - } - setTlsOptions(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - toString() { - return "Network_" + super.toString(); - } + adoptAllow(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); } + disownAllow() { return capnp_ts_1.Struct.disown(this.getAllow()); } + getAllow() { return capnp_ts_1.Struct.getList(0, capnp.TextList, this, Network._capnp.defaultAllow); } + hasAllow() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initAllow(length) { return capnp_ts_1.Struct.initList(0, capnp.TextList, length, this); } + setAllow(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); } + adoptDeny(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownDeny() { return capnp_ts_1.Struct.disown(this.getDeny()); } + getDeny() { return capnp_ts_1.Struct.getList(1, capnp.TextList, this); } + hasDeny() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initDeny(length) { return capnp_ts_1.Struct.initList(1, capnp.TextList, length, this); } + setDeny(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + adoptTlsOptions(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownTlsOptions() { return capnp_ts_1.Struct.disown(this.getTlsOptions()); } + getTlsOptions() { return capnp_ts_1.Struct.getStruct(2, TlsOptions, this); } + hasTlsOptions() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initTlsOptions() { return capnp_ts_1.Struct.initStructAt(2, TlsOptions, this); } + setTlsOptions(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + 
toString() { return "Network_" + super.toString(); } } exports.Network = Network; -Network._capnp = { - displayName: "Network", - id: "fa42244f950c9b9c", - size: new capnp_ts_1.ObjectSize(0, 3), - defaultAllow: capnp.readRawPointer( - new Uint8Array([ - 0x10, 0x03, 0x11, 0x01, 0x0e, 0x11, 0x01, 0x3a, 0x3f, 0x70, 0x75, 0x62, - 0x6c, 0x69, 0x63, - ]).buffer - ), -}; +Network._capnp = { displayName: "Network", id: "fa42244f950c9b9c", size: new capnp_ts_1.ObjectSize(0, 3), defaultAllow: capnp.readRawPointer(new Uint8Array([0x10, 0x03, 0x11, 0x01, 0x0e, 0x11, 0x01, 0x3a, 0x3f, 0x70, 0x75, 0x62, 0x6c, 0x69, 0x63]).buffer) }; class DiskDirectory extends capnp_ts_1.Struct { - getPath() { - return capnp_ts_1.Struct.getText(0, this); - } - setPath(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getWritable() { - return capnp_ts_1.Struct.getBit( - 0, - this, - DiskDirectory._capnp.defaultWritable - ); - } - setWritable(value) { - capnp_ts_1.Struct.setBit(0, value, this); - } - getAllowDotfiles() { - return capnp_ts_1.Struct.getBit( - 1, - this, - DiskDirectory._capnp.defaultAllowDotfiles - ); - } - setAllowDotfiles(value) { - capnp_ts_1.Struct.setBit(1, value, this); - } - toString() { - return "DiskDirectory_" + super.toString(); - } + getPath() { return capnp_ts_1.Struct.getText(0, this); } + setPath(value) { capnp_ts_1.Struct.setText(0, value, this); } + getWritable() { return capnp_ts_1.Struct.getBit(0, this, DiskDirectory._capnp.defaultWritable); } + setWritable(value) { capnp_ts_1.Struct.setBit(0, value, this); } + getAllowDotfiles() { return capnp_ts_1.Struct.getBit(1, this, DiskDirectory._capnp.defaultAllowDotfiles); } + setAllowDotfiles(value) { capnp_ts_1.Struct.setBit(1, value, this); } + toString() { return "DiskDirectory_" + super.toString(); } } exports.DiskDirectory = DiskDirectory; -DiskDirectory._capnp = { - displayName: "DiskDirectory", - id: "9048ab22835f51c3", - size: new capnp_ts_1.ObjectSize(8, 1), - defaultWritable: capnp.getBitMask(false, 0), - defaultAllowDotfiles: capnp.getBitMask(false, 1), -}; +DiskDirectory._capnp = { displayName: "DiskDirectory", id: "9048ab22835f51c3", size: new capnp_ts_1.ObjectSize(8, 1), defaultWritable: capnp.getBitMask(false, 0), defaultAllowDotfiles: capnp.getBitMask(false, 1) }; var HttpOptions_Style; (function (HttpOptions_Style) { - HttpOptions_Style[(HttpOptions_Style["HOST"] = 0)] = "HOST"; - HttpOptions_Style[(HttpOptions_Style["PROXY"] = 1)] = "PROXY"; -})( - (HttpOptions_Style = - exports.HttpOptions_Style || (exports.HttpOptions_Style = {})) -); + HttpOptions_Style[HttpOptions_Style["HOST"] = 0] = "HOST"; + HttpOptions_Style[HttpOptions_Style["PROXY"] = 1] = "PROXY"; +})(HttpOptions_Style = exports.HttpOptions_Style || (exports.HttpOptions_Style = {})); class HttpOptions_Header extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getValue() { - return capnp_ts_1.Struct.getText(1, this); - } - setValue(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "HttpOptions_Header_" + super.toString(); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getValue() { return capnp_ts_1.Struct.getText(1, this); } + setValue(value) { capnp_ts_1.Struct.setText(1, value, this); } + toString() { return "HttpOptions_Header_" + super.toString(); } } exports.HttpOptions_Header = HttpOptions_Header; -HttpOptions_Header._capnp = { - 
displayName: "Header", - id: "dc0394b5a6f3417e", - size: new capnp_ts_1.ObjectSize(0, 2), -}; +HttpOptions_Header._capnp = { displayName: "Header", id: "dc0394b5a6f3417e", size: new capnp_ts_1.ObjectSize(0, 2) }; class HttpOptions extends capnp_ts_1.Struct { - getStyle() { - return capnp_ts_1.Struct.getUint16( - 0, - this, - HttpOptions._capnp.defaultStyle - ); - } - setStyle(value) { - capnp_ts_1.Struct.setUint16(0, value, this); - } - getForwardedProtoHeader() { - return capnp_ts_1.Struct.getText(0, this); - } - setForwardedProtoHeader(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getCfBlobHeader() { - return capnp_ts_1.Struct.getText(1, this); - } - setCfBlobHeader(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - adoptInjectRequestHeaders(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); - } - disownInjectRequestHeaders() { - return capnp_ts_1.Struct.disown(this.getInjectRequestHeaders()); - } - getInjectRequestHeaders() { - return capnp_ts_1.Struct.getList( - 2, - HttpOptions._InjectRequestHeaders, - this - ); - } - hasInjectRequestHeaders() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); - } - initInjectRequestHeaders(length) { - return capnp_ts_1.Struct.initList( - 2, - HttpOptions._InjectRequestHeaders, - length, - this - ); - } - setInjectRequestHeaders(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); - } - adoptInjectResponseHeaders(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); - } - disownInjectResponseHeaders() { - return capnp_ts_1.Struct.disown(this.getInjectResponseHeaders()); - } - getInjectResponseHeaders() { - return capnp_ts_1.Struct.getList( - 3, - HttpOptions._InjectResponseHeaders, - this - ); - } - hasInjectResponseHeaders() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); - } - initInjectResponseHeaders(length) { - return capnp_ts_1.Struct.initList( - 3, - HttpOptions._InjectResponseHeaders, - length, - this - ); - } - setInjectResponseHeaders(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); - } - toString() { - return "HttpOptions_" + super.toString(); - } + getStyle() { return capnp_ts_1.Struct.getUint16(0, this, HttpOptions._capnp.defaultStyle); } + setStyle(value) { capnp_ts_1.Struct.setUint16(0, value, this); } + getForwardedProtoHeader() { return capnp_ts_1.Struct.getText(0, this); } + setForwardedProtoHeader(value) { capnp_ts_1.Struct.setText(0, value, this); } + getCfBlobHeader() { return capnp_ts_1.Struct.getText(1, this); } + setCfBlobHeader(value) { capnp_ts_1.Struct.setText(1, value, this); } + adoptInjectRequestHeaders(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(2, this)); } + disownInjectRequestHeaders() { return capnp_ts_1.Struct.disown(this.getInjectRequestHeaders()); } + getInjectRequestHeaders() { return capnp_ts_1.Struct.getList(2, HttpOptions._InjectRequestHeaders, this); } + hasInjectRequestHeaders() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(2, this)); } + initInjectRequestHeaders(length) { return capnp_ts_1.Struct.initList(2, HttpOptions._InjectRequestHeaders, length, this); } + setInjectRequestHeaders(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(2, this)); } + adoptInjectResponseHeaders(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(3, this)); } + disownInjectResponseHeaders() { return 
capnp_ts_1.Struct.disown(this.getInjectResponseHeaders()); } + getInjectResponseHeaders() { return capnp_ts_1.Struct.getList(3, HttpOptions._InjectResponseHeaders, this); } + hasInjectResponseHeaders() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(3, this)); } + initInjectResponseHeaders(length) { return capnp_ts_1.Struct.initList(3, HttpOptions._InjectResponseHeaders, length, this); } + setInjectResponseHeaders(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(3, this)); } + toString() { return "HttpOptions_" + super.toString(); } } exports.HttpOptions = HttpOptions; HttpOptions.Style = HttpOptions_Style; HttpOptions.Header = HttpOptions_Header; -HttpOptions._capnp = { - displayName: "HttpOptions", - id: "aa8dc6885da78f19", - size: new capnp_ts_1.ObjectSize(8, 4), - defaultStyle: capnp.getUint16Mask(0), -}; +HttpOptions._capnp = { displayName: "HttpOptions", id: "aa8dc6885da78f19", size: new capnp_ts_1.ObjectSize(8, 4), defaultStyle: capnp.getUint16Mask(0) }; class TlsOptions_Keypair extends capnp_ts_1.Struct { - getPrivateKey() { - return capnp_ts_1.Struct.getText(0, this); - } - setPrivateKey(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getCertificateChain() { - return capnp_ts_1.Struct.getText(1, this); - } - setCertificateChain(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "TlsOptions_Keypair_" + super.toString(); - } + getPrivateKey() { return capnp_ts_1.Struct.getText(0, this); } + setPrivateKey(value) { capnp_ts_1.Struct.setText(0, value, this); } + getCertificateChain() { return capnp_ts_1.Struct.getText(1, this); } + setCertificateChain(value) { capnp_ts_1.Struct.setText(1, value, this); } + toString() { return "TlsOptions_Keypair_" + super.toString(); } } exports.TlsOptions_Keypair = TlsOptions_Keypair; -TlsOptions_Keypair._capnp = { - displayName: "Keypair", - id: "f546bf2d5d8bd13e", - size: new capnp_ts_1.ObjectSize(0, 2), -}; +TlsOptions_Keypair._capnp = { displayName: "Keypair", id: "f546bf2d5d8bd13e", size: new capnp_ts_1.ObjectSize(0, 2) }; var TlsOptions_Version; (function (TlsOptions_Version) { - TlsOptions_Version[(TlsOptions_Version["GOOD_DEFAULT"] = 0)] = "GOOD_DEFAULT"; - TlsOptions_Version[(TlsOptions_Version["SSL3"] = 1)] = "SSL3"; - TlsOptions_Version[(TlsOptions_Version["TLS1DOT0"] = 2)] = "TLS1DOT0"; - TlsOptions_Version[(TlsOptions_Version["TLS1DOT1"] = 3)] = "TLS1DOT1"; - TlsOptions_Version[(TlsOptions_Version["TLS1DOT2"] = 4)] = "TLS1DOT2"; - TlsOptions_Version[(TlsOptions_Version["TLS1DOT3"] = 5)] = "TLS1DOT3"; -})( - (TlsOptions_Version = - exports.TlsOptions_Version || (exports.TlsOptions_Version = {})) -); + TlsOptions_Version[TlsOptions_Version["GOOD_DEFAULT"] = 0] = "GOOD_DEFAULT"; + TlsOptions_Version[TlsOptions_Version["SSL3"] = 1] = "SSL3"; + TlsOptions_Version[TlsOptions_Version["TLS1DOT0"] = 2] = "TLS1DOT0"; + TlsOptions_Version[TlsOptions_Version["TLS1DOT1"] = 3] = "TLS1DOT1"; + TlsOptions_Version[TlsOptions_Version["TLS1DOT2"] = 4] = "TLS1DOT2"; + TlsOptions_Version[TlsOptions_Version["TLS1DOT3"] = 5] = "TLS1DOT3"; +})(TlsOptions_Version = exports.TlsOptions_Version || (exports.TlsOptions_Version = {})); class TlsOptions extends capnp_ts_1.Struct { - adoptKeypair(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownKeypair() { - return capnp_ts_1.Struct.disown(this.getKeypair()); - } - getKeypair() { - return capnp_ts_1.Struct.getStruct(0, TlsOptions_Keypair, this); - } - hasKeypair() { - return 
!capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initKeypair() { - return capnp_ts_1.Struct.initStructAt(0, TlsOptions_Keypair, this); - } - setKeypair(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - getRequireClientCerts() { - return capnp_ts_1.Struct.getBit( - 0, - this, - TlsOptions._capnp.defaultRequireClientCerts - ); - } - setRequireClientCerts(value) { - capnp_ts_1.Struct.setBit(0, value, this); - } - getTrustBrowserCas() { - return capnp_ts_1.Struct.getBit( - 1, - this, - TlsOptions._capnp.defaultTrustBrowserCas - ); - } - setTrustBrowserCas(value) { - capnp_ts_1.Struct.setBit(1, value, this); - } - adoptTrustedCertificates(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); - } - disownTrustedCertificates() { - return capnp_ts_1.Struct.disown(this.getTrustedCertificates()); - } - getTrustedCertificates() { - return capnp_ts_1.Struct.getList(1, capnp.TextList, this); - } - hasTrustedCertificates() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); - } - initTrustedCertificates(length) { - return capnp_ts_1.Struct.initList(1, capnp.TextList, length, this); - } - setTrustedCertificates(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); - } - getMinVersion() { - return capnp_ts_1.Struct.getUint16( - 2, - this, - TlsOptions._capnp.defaultMinVersion - ); - } - setMinVersion(value) { - capnp_ts_1.Struct.setUint16(2, value, this); - } - getCipherList() { - return capnp_ts_1.Struct.getText(2, this); - } - setCipherList(value) { - capnp_ts_1.Struct.setText(2, value, this); - } - toString() { - return "TlsOptions_" + super.toString(); - } + adoptKeypair(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); } + disownKeypair() { return capnp_ts_1.Struct.disown(this.getKeypair()); } + getKeypair() { return capnp_ts_1.Struct.getStruct(0, TlsOptions_Keypair, this); } + hasKeypair() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initKeypair() { return capnp_ts_1.Struct.initStructAt(0, TlsOptions_Keypair, this); } + setKeypair(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); } + getRequireClientCerts() { return capnp_ts_1.Struct.getBit(0, this, TlsOptions._capnp.defaultRequireClientCerts); } + setRequireClientCerts(value) { capnp_ts_1.Struct.setBit(0, value, this); } + getTrustBrowserCas() { return capnp_ts_1.Struct.getBit(1, this, TlsOptions._capnp.defaultTrustBrowserCas); } + setTrustBrowserCas(value) { capnp_ts_1.Struct.setBit(1, value, this); } + adoptTrustedCertificates(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(1, this)); } + disownTrustedCertificates() { return capnp_ts_1.Struct.disown(this.getTrustedCertificates()); } + getTrustedCertificates() { return capnp_ts_1.Struct.getList(1, capnp.TextList, this); } + hasTrustedCertificates() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(1, this)); } + initTrustedCertificates(length) { return capnp_ts_1.Struct.initList(1, capnp.TextList, length, this); } + setTrustedCertificates(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(1, this)); } + getMinVersion() { return capnp_ts_1.Struct.getUint16(2, this, TlsOptions._capnp.defaultMinVersion); } + setMinVersion(value) { capnp_ts_1.Struct.setUint16(2, value, this); } + getCipherList() { return capnp_ts_1.Struct.getText(2, this); } + setCipherList(value) { capnp_ts_1.Struct.setText(2, value, 
this); } + toString() { return "TlsOptions_" + super.toString(); } } exports.TlsOptions = TlsOptions; TlsOptions.Keypair = TlsOptions_Keypair; TlsOptions.Version = TlsOptions_Version; -TlsOptions._capnp = { - displayName: "TlsOptions", - id: "aabb3c3778ac4311", - size: new capnp_ts_1.ObjectSize(8, 3), - defaultRequireClientCerts: capnp.getBitMask(false, 0), - defaultTrustBrowserCas: capnp.getBitMask(false, 1), - defaultMinVersion: capnp.getUint16Mask(0), -}; +TlsOptions._capnp = { displayName: "TlsOptions", id: "aabb3c3778ac4311", size: new capnp_ts_1.ObjectSize(8, 3), defaultRequireClientCerts: capnp.getBitMask(false, 0), defaultTrustBrowserCas: capnp.getBitMask(false, 1), defaultMinVersion: capnp.getUint16Mask(0) }; class Extension_Module extends capnp_ts_1.Struct { - getName() { - return capnp_ts_1.Struct.getText(0, this); - } - setName(value) { - capnp_ts_1.Struct.setText(0, value, this); - } - getInternal() { - return capnp_ts_1.Struct.getBit( - 0, - this, - Extension_Module._capnp.defaultInternal - ); - } - setInternal(value) { - capnp_ts_1.Struct.setBit(0, value, this); - } - getEsModule() { - return capnp_ts_1.Struct.getText(1, this); - } - setEsModule(value) { - capnp_ts_1.Struct.setText(1, value, this); - } - toString() { - return "Extension_Module_" + super.toString(); - } + getName() { return capnp_ts_1.Struct.getText(0, this); } + setName(value) { capnp_ts_1.Struct.setText(0, value, this); } + getInternal() { return capnp_ts_1.Struct.getBit(0, this, Extension_Module._capnp.defaultInternal); } + setInternal(value) { capnp_ts_1.Struct.setBit(0, value, this); } + getEsModule() { return capnp_ts_1.Struct.getText(1, this); } + setEsModule(value) { capnp_ts_1.Struct.setText(1, value, this); } + toString() { return "Extension_Module_" + super.toString(); } } exports.Extension_Module = Extension_Module; -Extension_Module._capnp = { - displayName: "Module", - id: "d5d16e76fdedc37d", - size: new capnp_ts_1.ObjectSize(8, 2), - defaultInternal: capnp.getBitMask(false, 0), -}; +Extension_Module._capnp = { displayName: "Module", id: "d5d16e76fdedc37d", size: new capnp_ts_1.ObjectSize(8, 2), defaultInternal: capnp.getBitMask(false, 0) }; class Extension extends capnp_ts_1.Struct { - adoptModules(value) { - capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); - } - disownModules() { - return capnp_ts_1.Struct.disown(this.getModules()); - } - getModules() { - return capnp_ts_1.Struct.getList(0, Extension._Modules, this); - } - hasModules() { - return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); - } - initModules(length) { - return capnp_ts_1.Struct.initList(0, Extension._Modules, length, this); - } - setModules(value) { - capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); - } - toString() { - return "Extension_" + super.toString(); - } + adoptModules(value) { capnp_ts_1.Struct.adopt(value, capnp_ts_1.Struct.getPointer(0, this)); } + disownModules() { return capnp_ts_1.Struct.disown(this.getModules()); } + getModules() { return capnp_ts_1.Struct.getList(0, Extension._Modules, this); } + hasModules() { return !capnp_ts_1.Struct.isNull(capnp_ts_1.Struct.getPointer(0, this)); } + initModules(length) { return capnp_ts_1.Struct.initList(0, Extension._Modules, length, this); } + setModules(value) { capnp_ts_1.Struct.copyFrom(value, capnp_ts_1.Struct.getPointer(0, this)); } + toString() { return "Extension_" + super.toString(); } } exports.Extension = Extension; Extension.Module = Extension_Module; -Extension._capnp = { - displayName: 
"Extension", - id: "e390128a861973a6", - size: new capnp_ts_1.ObjectSize(0, 1), -}; +Extension._capnp = { displayName: "Extension", id: "e390128a861973a6", size: new capnp_ts_1.ObjectSize(0, 1) }; Config._Services = capnp.CompositeList(Service); Config._Sockets = capnp.CompositeList(Socket); Config._Extensions = capnp.CompositeList(Extension); -Worker_Binding_WrappedBinding._InnerBindings = - capnp.CompositeList(Worker_Binding); +Worker_Binding_WrappedBinding._InnerBindings = capnp.CompositeList(Worker_Binding); Worker._Modules = capnp.CompositeList(Worker_Module); Worker._Bindings = capnp.CompositeList(Worker_Binding); -Worker._DurableObjectNamespaces = capnp.CompositeList( - Worker_DurableObjectNamespace -); +Worker._DurableObjectNamespaces = capnp.CompositeList(Worker_DurableObjectNamespace); HttpOptions._InjectRequestHeaders = capnp.CompositeList(HttpOptions_Header); HttpOptions._InjectResponseHeaders = capnp.CompositeList(HttpOptions_Header); Extension._Modules = capnp.CompositeList(Extension_Module); diff --git a/packages/miniflare/src/runtime/config/workerd.ts b/packages/miniflare/src/runtime/config/workerd.ts index 9a5f2d48b402..90a48533e1cc 100644 --- a/packages/miniflare/src/runtime/config/workerd.ts +++ b/packages/miniflare/src/runtime/config/workerd.ts @@ -19,6 +19,7 @@ export interface Config { sockets?: Socket[]; v8Flags?: string[]; extensions?: Extension[]; + autogates?: string[]; } export type Socket = { @@ -76,6 +77,8 @@ export type Worker_Module = { | { wasm?: Uint8Array } | { json?: string } | { nodeJsCompatModule?: string } + | { pythonModule?: string } + | { pythonRequirement?: string } ); export type Worker_Binding = { diff --git a/packages/miniflare/src/workers/core/devalue.ts b/packages/miniflare/src/workers/core/devalue.ts index 2b7306a704b6..bf6f7569447b 100644 --- a/packages/miniflare/src/workers/core/devalue.ts +++ b/packages/miniflare/src/workers/core/devalue.ts @@ -309,5 +309,6 @@ export function parseWithReadableStreams( }, ...revivers, }; + return parse(stringified.value, streamRevivers); } diff --git a/packages/miniflare/test/index.spec.ts b/packages/miniflare/test/index.spec.ts index db8707620bb5..4d50a6fc04f7 100644 --- a/packages/miniflare/test/index.spec.ts +++ b/packages/miniflare/test/index.spec.ts @@ -727,6 +727,28 @@ test("Miniflare: modules in sub-directories", async (t) => { t.is(await res.text(), "123"); }); +test("Miniflare: python modules", async (t) => { + const mf = new Miniflare({ + modules: [ + { + type: "PythonModule", + path: "index", + contents: + "from test_module import add; from js import Response;\ndef fetch(request):\n return Response.new(add(2,2))", + }, + { + type: "PythonModule", + path: "test_module", + contents: `def add(a, b):\n return a + b`, + }, + ], + compatibilityFlags: ["experimental"], + }); + t.teardown(() => mf.dispose()); + const res = await mf.dispatchFetch("http://localhost"); + t.is(await res.text(), "4"); +}); + test("Miniflare: HTTPS fetches using browser CA certificates", async (t) => { const mf = new Miniflare({ modules: true, @@ -1713,3 +1735,29 @@ test("Miniflare: prohibits invalid wrapped bindings", async (t) => { } ); }); + +test("Miniflare: getCf() returns a standard cf object", async (t) => { + const mf = new Miniflare({ script: "", modules: true }); + t.teardown(() => mf.dispose()); + + const cf = await mf.getCf(); + t.like(cf, { + colo: "DFW", + city: "Austin", + regionCode: "TX", + }); +}); + +test("Miniflare: getCf() returns a user provided cf object", async (t) => { + const mf = new Miniflare({ + 
script: "", + modules: true, + cf: { + myFakeField: "test", + }, + }); + t.teardown(() => mf.dispose()); + + const cf = await mf.getCf(); + t.deepEqual(cf, { myFakeField: "test" }); +}); diff --git a/packages/pages-shared/CHANGELOG.md b/packages/pages-shared/CHANGELOG.md index a7b549f4d7fa..4b4a420219d1 100644 --- a/packages/pages-shared/CHANGELOG.md +++ b/packages/pages-shared/CHANGELOG.md @@ -1,5 +1,21 @@ # @cloudflare/pages-shared +## 0.11.11 + +### Patch Changes + +- Updated dependencies [[`148feff6`](https://github.com/cloudflare/workers-sdk/commit/148feff60c9bf3886c0e0fd1ea98049955c27659)]: + - miniflare@3.20240129.1 + +## 0.11.10 + +### Patch Changes + +- [#4819](https://github.com/cloudflare/workers-sdk/pull/4819) [`6a4cb8c6`](https://github.com/cloudflare/workers-sdk/commit/6a4cb8c6456f1dba95cae2b8bbe658f7227349f8) Thanks [@magnusdahlstrand](https://github.com/magnusdahlstrand)! - fix: Use appropriate logging levels when parsing headers and redirects in `wrangler pages dev`. + +- Updated dependencies [[`1e424ff2`](https://github.com/cloudflare/workers-sdk/commit/1e424ff280610657e997df8290d0b39b0393c845), [`749fa3c0`](https://github.com/cloudflare/workers-sdk/commit/749fa3c05e6b9fcaa59a72f60f7936b7beaed5ad)]: + - miniflare@3.20240129.0 + ## 0.11.9 ### Patch Changes diff --git a/packages/pages-shared/package.json b/packages/pages-shared/package.json index 7fa8d8556a85..e56c7226371e 100644 --- a/packages/pages-shared/package.json +++ b/packages/pages-shared/package.json @@ -1,6 +1,6 @@ { "name": "@cloudflare/pages-shared", - "version": "0.11.9", + "version": "0.11.11", "repository": { "type": "git", "url": "https://github.com/cloudflare/workers-sdk.git", diff --git a/packages/playground-preview-worker/.eslintrc.js b/packages/playground-preview-worker/.eslintrc.js new file mode 100644 index 000000000000..a478f67f57c2 --- /dev/null +++ b/packages/playground-preview-worker/.eslintrc.js @@ -0,0 +1,4 @@ +module.exports = { + root: true, + extends: ["@cloudflare/eslint-config-worker"], +}; diff --git a/packages/playground-preview-worker/package.json b/packages/playground-preview-worker/package.json index 1d1c86e1c274..6ff5ba3d21ae 100644 --- a/packages/playground-preview-worker/package.json +++ b/packages/playground-preview-worker/package.json @@ -6,17 +6,18 @@ "build-middleware": "pnpm run build-middleware:common && pnpm run build-middleware:loader", "build-middleware:common": "pnpm dlx esbuild ../wrangler/templates/middleware/common.ts --outfile=src/middleware/common.module.template", "build-middleware:loader": "pnpm dlx esbuild ../wrangler/templates/middleware/loader-modules.ts --outfile=src/middleware/loader.module.template", + "check:lint": "eslint .", "deploy": "wrangler -j deploy", "deploy:testing": "wrangler -j deploy -e testing", "start": "wrangler -j dev", - "test": "vitest run", - "test:ci": "vitest run" + "test:e2e": "vitest run" }, "dependencies": { - "hono": "^3.3.2", + "hono": "^3.12.11", "zod": "^3.22.3" }, "devDependencies": { + "@cloudflare/eslint-config-worker": "workspace:*", "@cloudflare/workers-types": "^4.20230321.0", "@types/cookie": "^0.5.1", "cookie": "^0.5.0", diff --git a/packages/playground-preview-worker/src/errors.ts b/packages/playground-preview-worker/src/errors.ts new file mode 100644 index 000000000000..edec30d65883 --- /dev/null +++ b/packages/playground-preview-worker/src/errors.ts @@ -0,0 +1,124 @@ +import type { ZodIssue } from "zod"; + +export class HttpError extends Error { + constructor( + message: string, + readonly status: number, + // Only 
report errors to sentry when they represent actionable errors + readonly reportable: boolean + ) { + super(message); + Object.setPrototypeOf(this, new.target.prototype); + } + toResponse() { + return Response.json( + { + error: this.name, + message: this.message, + data: this.data, + }, + { + status: this.status, + headers: { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Methods": "GET,PUT,POST", + }, + } + ); + } + + get data(): Record { + return {}; + } +} + +export class WorkerTimeout extends HttpError { + name = "WorkerTimeout"; + constructor() { + super("Worker timed out", 400, false); + } + + toResponse(): Response { + return new Response("Worker timed out"); + } +} + +export class ServiceWorkerNotSupported extends HttpError { + name = "ServiceWorkerNotSupported"; + constructor() { + super( + "Service Workers are not supported in the Workers Playground", + 400, + false + ); + } +} +export class ZodSchemaError extends HttpError { + name = "ZodSchemaError"; + constructor(private issues: ZodIssue[]) { + super("Something went wrong", 500, true); + } + + get data(): { issues: string } { + return { issues: JSON.stringify(this.issues) }; + } +} + +export class PreviewError extends HttpError { + name = "PreviewError"; + constructor(private error: string) { + super(error, 400, false); + } + + get data(): { error: string } { + return { error: this.error }; + } +} + +export class TokenUpdateFailed extends HttpError { + name = "TokenUpdateFailed"; + constructor() { + super("Provide valid token", 400, false); + } +} + +export class RawHttpFailed extends HttpError { + name = "RawHttpFailed"; + constructor() { + super("Provide valid token", 400, false); + } +} + +export class PreviewRequestFailed extends HttpError { + name = "PreviewRequestFailed"; + constructor(private tokenId: string | undefined, reportable: boolean) { + super("Valid token not found", 400, reportable); + } + get data(): { tokenId: string | undefined } { + return { tokenId: this.tokenId }; + } +} + +export class UploadFailed extends HttpError { + name = "UploadFailed"; + constructor() { + super("Valid token not provided", 401, false); + } +} + +export class PreviewRequestForbidden extends HttpError { + name = "PreviewRequestForbidden"; + constructor() { + super("Preview request forbidden", 403, false); + } +} + +export class BadUpload extends HttpError { + name = "BadUpload"; + constructor(message = "Invalid upload", private readonly error?: string) { + super(message, 400, false); + } + get data() { + return { error: this.error }; + } +} diff --git a/packages/playground-preview-worker/src/index.ts b/packages/playground-preview-worker/src/index.ts index 4e1e5cf9b1e9..fcc435a71fb7 100644 --- a/packages/playground-preview-worker/src/index.ts +++ b/packages/playground-preview-worker/src/index.ts @@ -1,10 +1,17 @@ import { Hono } from "hono"; import { getCookie, setCookie } from "hono/cookie"; import prom from "promjs"; -import { Toucan } from "toucan-js"; -import { ZodIssue } from "zod"; +import { + HttpError, + PreviewRequestFailed, + PreviewRequestForbidden, + RawHttpFailed, + TokenUpdateFailed, + UploadFailed, +} from "./errors"; import { handleException, setupSentry } from "./sentry"; import type { RegistryType } from "promjs"; +import type { Toucan } from "toucan-js"; const app = new Hono<{ Bindings: Env; @@ -20,118 +27,6 @@ const app = new Hono<{ const rootDomain = ROOT; const previewDomain = PREVIEW; -export class HttpError extends Error { - constructor( - message: string, - readonly status: number, - // Only 
report errors to sentry when they represent actionable errors - readonly reportable: boolean - ) { - super(message); - Object.setPrototypeOf(this, new.target.prototype); - } - toResponse() { - return Response.json( - { - error: this.name, - message: this.message, - data: this.data, - }, - { - status: this.status, - headers: { - "Access-Control-Allow-Origin": "*", - "Access-Control-Allow-Methods": "GET,PUT,POST", - }, - } - ); - } - - get data(): Record { - return {}; - } -} - -export class WorkerTimeout extends HttpError { - name = "WorkerTimeout"; - constructor() { - super("Worker timed out", 400, false); - } - - toResponse(): Response { - return new Response("Worker timed out"); - } -} - -export class ServiceWorkerNotSupported extends HttpError { - name = "ServiceWorkerNotSupported"; - constructor() { - super( - "Service Workers are not supported in the Workers Playground", - 400, - false - ); - } -} -export class ZodSchemaError extends HttpError { - name = "ZodSchemaError"; - constructor(private issues: ZodIssue[]) { - super("Something went wrong", 500, true); - } - - get data(): { issues: string } { - return { issues: JSON.stringify(this.issues) }; - } -} - -export class PreviewError extends HttpError { - name = "PreviewError"; - constructor(private error: string) { - super(error, 400, false); - } - - get data(): { error: string } { - return { error: this.error }; - } -} - -class TokenUpdateFailed extends HttpError { - name = "TokenUpdateFailed"; - constructor() { - super("Provide token", 400, false); - } -} - -class RawHttpFailed extends HttpError { - name = "RawHttpFailed"; - constructor() { - super("Provide token", 400, false); - } -} - -class PreviewRequestFailed extends HttpError { - name = "PreviewRequestFailed"; - constructor(private tokenId: string | undefined, reportable: boolean) { - super("Token not found", 400, reportable); - } - get data(): { tokenId: string | undefined } { - return { tokenId: this.tokenId }; - } -} - -class UploadFailed extends HttpError { - name = "UploadFailed"; - constructor() { - super("Token not provided", 401, false); - } -} - -class PreviewRequestForbidden extends HttpError { - name = "PreviewRequestForbidden"; - constructor() { - super("Preview request forbidden", 403, false); - } -} /** * Given a preview token, this endpoint allows for raw http calls to be inspected @@ -144,8 +39,13 @@ async function handleRawHttp(request: Request, url: URL, env: Env) { if (!token) { throw new RawHttpFailed(); } - - const userObject = env.UserSession.get(env.UserSession.idFromString(token)); + let userObjectId: DurableObjectId; + try { + userObjectId = env.UserSession.idFromString(token); + } catch { + throw new RawHttpFailed(); + } + const userObject = env.UserSession.get(userObjectId); // Delete these consumed headers so as not to bloat the request. 
// Some tokens can be quite large and may cause nginx to reject the @@ -249,15 +149,17 @@ app.get(`${rootDomain}/`, async (c) => { }); app.post(`${rootDomain}/api/worker`, async (c) => { - let userId = getCookie(c, "user"); - + const userId = getCookie(c, "user"); if (!userId) { throw new UploadFailed(); } - - const userObject = c.env.UserSession.get( - c.env.UserSession.idFromString(userId) - ); + let userObjectId: DurableObjectId; + try { + userObjectId = c.env.UserSession.idFromString(userId); + } catch { + throw new UploadFailed(); + } + const userObject = c.env.UserSession.get(userObjectId); return userObject.fetch("https://example.com", { body: c.req.body, @@ -268,15 +170,17 @@ app.post(`${rootDomain}/api/worker`, async (c) => { app.get(`${rootDomain}/api/inspector`, async (c) => { const url = new URL(c.req.url); - let userId = url.searchParams.get("user"); - + const userId = url.searchParams.get("user"); if (!userId) { throw new PreviewRequestFailed("", false); } - - const userObject = c.env.UserSession.get( - c.env.UserSession.idFromString(userId) - ); + let userObjectId: DurableObjectId; + try { + userObjectId = c.env.UserSession.idFromString(userId); + } catch { + throw new PreviewRequestFailed(userId, false); + } + const userObject = c.env.UserSession.get(userObjectId); return userObject.fetch(c.req.raw); }); @@ -308,6 +212,13 @@ app.get(`${previewDomain}/.update-preview-token`, (c) => { if (!token) { throw new TokenUpdateFailed(); } + // Validate `token` is an actual Durable Object ID + try { + c.env.UserSession.idFromString(token); + } catch { + throw new TokenUpdateFailed(); + } + setCookie(c, "token", token, { secure: true, sameSite: "None", @@ -337,14 +248,16 @@ app.all(`${previewDomain}/*`, async (c) => { return handleRawHttp(c.req.raw, url, c.env); } const token = getCookie(c, "token"); - if (!token) { throw new PreviewRequestFailed(token, false); } - - const userObject = c.env.UserSession.get( - c.env.UserSession.idFromString(token) - ); + let userObjectId: DurableObjectId; + try { + userObjectId = c.env.UserSession.idFromString(token); + } catch { + throw new PreviewRequestFailed(token, false); + } + const userObject = c.env.UserSession.get(userObjectId); const original = await userObject.fetch( url, diff --git a/packages/playground-preview-worker/src/middleware/definitions/json.module.template b/packages/playground-preview-worker/src/middleware/definitions/json.module.template index d16abbc12a7b..860385be6d6b 100644 --- a/packages/playground-preview-worker/src/middleware/definitions/json.module.template +++ b/packages/playground-preview-worker/src/middleware/definitions/json.module.template @@ -13,15 +13,22 @@ const jsonError = async (request, env, _ctx, middlewareCtx) => { } catch (e) { console.error(e); const error = reduceError(e); - return fetch('https://format-errors.devprod.cloudflare.dev', { - method: 'POST', - body: JSON.stringify({ - error, - url: request.url, - method: request.method, - headers: Object.fromEntries(request.headers.entries()) - }) - }); + try { + const errorRes = await fetch( + 'https://format-errors.devprod.cloudflare.dev', + { + method: 'POST', + body: JSON.stringify({ + error, + url: request.url, + method: request.method, + headers: Object.fromEntries(request.headers.entries()), + }), + } + ); + if (errorRes.ok) return errorRes; + } catch {} + return new Response(error.stack ?? 
error.message); } }; diff --git a/packages/playground-preview-worker/src/realish.ts b/packages/playground-preview-worker/src/realish.ts index ad794e7db5c1..7ec1f795a2a6 100644 --- a/packages/playground-preview-worker/src/realish.ts +++ b/packages/playground-preview-worker/src/realish.ts @@ -1,5 +1,5 @@ import { z } from "zod"; -import { PreviewError } from "."; +import { PreviewError } from "./errors"; const APIResponse = (resultSchema: T) => z.union([ diff --git a/packages/playground-preview-worker/src/sentry.ts b/packages/playground-preview-worker/src/sentry.ts index 2c792048187c..58a85c2c5f62 100644 --- a/packages/playground-preview-worker/src/sentry.ts +++ b/packages/playground-preview-worker/src/sentry.ts @@ -1,6 +1,6 @@ import { Toucan } from "toucan-js"; -import { z, ZodError } from "zod"; -import { HttpError, ZodSchemaError } from "."; +import { ZodError } from "zod"; +import { HttpError, ZodSchemaError } from "./errors"; export function handleException(e: unknown, sentry: Toucan): Response { console.error(e); diff --git a/packages/playground-preview-worker/src/user.do.ts b/packages/playground-preview-worker/src/user.do.ts index 4c220455b20b..2a80f4c89f64 100644 --- a/packages/playground-preview-worker/src/user.do.ts +++ b/packages/playground-preview-worker/src/user.do.ts @@ -1,20 +1,17 @@ import assert from "node:assert"; import { Buffer } from "node:buffer"; +import z from "zod"; +import { BadUpload, ServiceWorkerNotSupported, WorkerTimeout } from "./errors"; import { constructMiddleware } from "./inject-middleware"; -import { - doUpload, - RealishPreviewConfig, - setupTokens, - UploadResult, -} from "./realish"; +import { doUpload, setupTokens } from "./realish"; import { handleException, setupSentry } from "./sentry"; -import { ServiceWorkerNotSupported, WorkerTimeout } from "."; +import type { RealishPreviewConfig, UploadResult } from "./realish"; const encoder = new TextEncoder(); async function hash(text: string) { - const hash = await crypto.subtle.digest("SHA-256", encoder.encode(text)); - return Buffer.from(hash).toString("hex"); + const digest = await crypto.subtle.digest("SHA-256", encoder.encode(text)); + return Buffer.from(digest).toString("hex"); } function switchRemote(url: URL, remote: string) { @@ -26,6 +23,15 @@ function switchRemote(url: URL, remote: string) { return workerUrl; } +const UploadedMetadata = z.object({ + body_part: z.ostring(), + main_module: z.ostring(), + compatibility_date: z.ostring(), + compatibility_flags: z.array(z.string()).optional(), +}); + +type UploadedMetadata = z.infer; + /** * This Durable object coordinates operations for a specific user session. It's purpose is to * communicate with the Realish preview service on behalf of a user, without leaking more info @@ -41,7 +47,7 @@ export class UserSession { inspectorUrl: string | undefined; workerName!: string; constructor(private state: DurableObjectState, private env: Env) { - this.state.blockConcurrencyWhile(async () => { + void this.state.blockConcurrencyWhile(async () => { this.config = await this.state.storage.get( "config" ); @@ -114,17 +120,9 @@ export class UserSession { await this.state.storage.put("inspectorUrl", this.inspectorUrl); } - async fetch(request: Request) { + async handleRequest(request: Request) { const url = new URL(request.url); - // We need to construct a new Sentry instance here because throwing - // errors across a DO boundary will wipe stack information etc... 
- const sentry = setupSentry( - request, - undefined, - this.env.SENTRY_DSN, - this.env.SENTRY_ACCESS_CLIENT_ID, - this.env.SENTRY_ACCESS_CLIENT_SECRET - ); + // This is an inspector request. Forward to the correct inspector URL if (request.headers.get("Upgrade") && url.pathname === "/api/inspector") { assert(this.inspectorUrl !== undefined); @@ -158,17 +156,33 @@ export class UserSession { } return workerResponse; } + const userSession = this.state.id.toString(); - const worker = await request.formData(); - const m = worker.get("metadata"); + let worker: FormData; + try { + worker = await request.formData(); + } catch (e) { + throw new BadUpload(`Expected valid form data`, String(e)); + } - assert(m instanceof File); + const m = worker.get("metadata"); + if (!(m instanceof File)) { + throw new BadUpload("Expected metadata file to be defined"); + } - const uploadedMetadata = JSON.parse(await m.text()); + let uploadedMetadata: UploadedMetadata; + try { + uploadedMetadata = UploadedMetadata.parse(JSON.parse(await m.text())); + } catch { + throw new BadUpload("Expected metadata file to be valid"); + } - if ("body_part" in uploadedMetadata) { - return new ServiceWorkerNotSupported().toResponse(); + if ( + uploadedMetadata.body_part !== undefined || + uploadedMetadata.main_module === undefined + ) { + throw new ServiceWorkerNotSupported(); } const today = new Date(); @@ -194,9 +208,9 @@ export class UserSession { metadata.main_module = entrypoint; - for (const [path, m] of additionalModules.entries()) { - assert(m instanceof File); - worker.set(path, m); + for (const [path, additionalModule] of additionalModules.entries()) { + assert(additionalModule instanceof File); + worker.set(path, additionalModule); } worker.set( @@ -206,19 +220,33 @@ export class UserSession { }) ); - try { - await this.uploadWorker(this.workerName, worker); + await this.uploadWorker(this.workerName, worker); - assert(this.inspectorUrl !== undefined); + assert(this.inspectorUrl !== undefined); + + return Response.json({ + // Include a hash of the inspector URL so as to ensure the client will reconnect + // when the inspector URL has changed (because of an updated preview session) + inspector: `/api/inspector?user=${userSession}&h=${await hash( + this.inspectorUrl + )}`, + preview: userSession, + }); + } - return Response.json({ - // Include a hash of the inspector URL so as to ensure the client will reconnect - // when the inspector URL has changed (because of an updated preview session) - inspector: `/api/inspector?user=${userSession}&h=${await hash( - this.inspectorUrl - )}`, - preview: userSession, - }); + async fetch(request: Request) { + // We need to construct a new Sentry instance here because throwing + // errors across a DO boundary will wipe stack information etc... 
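// Note: `handleRequest()` (defined above) throws `HttpError` subclasses (see `./errors`)
// for user errors; the `catch` below hands them to `handleException()`, which builds the
// 4xx response and should only report errors flagged as `reportable` to Sentry.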
+ const sentry = setupSentry( + request, + undefined, + this.env.SENTRY_DSN, + this.env.SENTRY_ACCESS_CLIENT_ID, + this.env.SENTRY_ACCESS_CLIENT_SECRET + ); + + try { + return await this.handleRequest(request); } catch (e) { return handleException(e, sentry); } diff --git a/packages/playground-preview-worker/tests/index.test.ts b/packages/playground-preview-worker/tests/index.test.ts index 21de6d883cf6..45e043036efb 100644 --- a/packages/playground-preview-worker/tests/index.test.ts +++ b/packages/playground-preview-worker/tests/index.test.ts @@ -1,17 +1,51 @@ import { fetch } from "undici"; -import { afterAll, beforeAll, beforeEach, describe, expect, it } from "vitest"; - -function removeUUID(str: string) { - return str.replace( - /\w{8}-\w{4}-\w{4}-\w{4}-\w{12}/g, - "00000000-0000-0000-0000-000000000000" - ); -} +import { beforeAll, describe, expect, it } from "vitest"; const REMOTE = "https://playground-testing.devprod.cloudflare.dev"; const PREVIEW_REMOTE = "https://random-data.playground-testing.devprod.cloudflare.dev"; +const TEST_WORKER_BOUNDARY = "----WebKitFormBoundaryqJEYLXuUiiZQHgvf"; +const TEST_WORKER_CONTENT_TYPE = `multipart/form-data; boundary=${TEST_WORKER_BOUNDARY}`; +const TEST_WORKER = `--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="index.js"; filename="index.js" +Content-Type: application/javascript+module + +export default { + fetch(request) { + const url = new URL(request.url) + if(url.pathname === "/exchange") { + return Response.json({ + token: "TEST_TOKEN", + prewarm: "TEST_PREWARM" + }) + } + if(url.pathname === "/redirect") { + return Response.redirect("https://example.com", 302) + } + if(url.pathname === "/method") { + return new Response(request.method) + } + if(url.pathname === "/status") { + return new Response(407) + } + if(url.pathname === "/header") { + return new Response(request.headers.get("X-Custom-Header")) + } + return Response.json({ + url: request.url, + headers: [...request.headers.entries()] + }) + } +} + +--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="metadata"; filename="blob" +Content-Type: application/json + +{"compatibility_date":"2023-05-04","main_module":"index.js"} +--${TEST_WORKER_BOUNDARY}--`; + async function fetchUserToken() { return fetch(REMOTE).then( (r) => r.headers.getSetCookie()[0].split(";")[0].split("=")[1] @@ -26,16 +60,15 @@ describe("Preview Worker", () => { method: "POST", headers: { cookie: `user=${defaultUserToken}`, - "Content-Type": - "multipart/form-data; boundary=----WebKitFormBoundaryqJEYLXuUiiZQHgvf", + "Content-Type": TEST_WORKER_CONTENT_TYPE, }, - body: '------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="index.js"; filename="index.js"\nContent-Type: application/javascript+module\n\nexport default {\n fetch(request) {\n const url = new URL(request.url)\n if(url.pathname === "/exchange") {\n return Response.json({\n token: "TEST_TOKEN",\n prewarm: "TEST_PREWARM"\n })\n }\n if(url.pathname === "/redirect") {\n return Response.redirect("https://example.com", 302)\n }\n if(url.pathname === "/method") {\n return new Response(request.method)\n }\n if(url.pathname === "/status") {\n return new Response(407)\n }\n if(url.pathname === "/header") {\n return new Response(request.headers.get("X-Custom-Header"))\n }\n return Response.json({\n url: request.url,\n headers: [...request.headers.entries()]\n })\n }\n}\n\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="metadata"; filename="blob"\nContent-Type: 
application/json\n\n{"compatibility_date":"2023-05-04","main_module":"index.js"}\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf--', + body: TEST_WORKER, }).then((response) => response.json()); }); it("should be redirected with cookie", async () => { const resp = await fetch( - `${PREVIEW_REMOTE}/.update-preview-token?token=TEST_TOKEN&suffix=${encodeURIComponent( + `${PREVIEW_REMOTE}/.update-preview-token?token=${defaultUserToken}&suffix=${encodeURIComponent( "/hello?world" )}`, { @@ -52,7 +85,47 @@ describe("Preview Worker", () => { '"/hello?world"' ); expect(resp.headers.get("set-cookie") ?? "").toMatchInlineSnapshot( - '"token=TEST_TOKEN; Domain=random-data.playground-testing.devprod.cloudflare.dev; HttpOnly; Secure; SameSite=None"' + `"token=${defaultUserToken}; Domain=random-data.playground-testing.devprod.cloudflare.dev; Path=/; HttpOnly; Secure; SameSite=None"` + ); + }); + it("shouldn't be redirected with no token", async () => { + const resp = await fetch( + `${PREVIEW_REMOTE}/.update-preview-token?suffix=${encodeURIComponent( + "/hello?world" + )}`, + { + method: "GET", + redirect: "manual", + // These are forbidden headers, but undici currently allows setting them + headers: { + "Sec-Fetch-Dest": "iframe", + Referer: "https://workers.cloudflare.com/", + }, + } + ); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"TokenUpdateFailed\\",\\"message\\":\\"Provide valid token\\",\\"data\\":{}}"' + ); + }); + it("shouldn't be redirected with invalid token", async () => { + const resp = await fetch( + `${PREVIEW_REMOTE}/.update-preview-token?token=TEST_TOKEN&suffix=${encodeURIComponent( + "/hello?world" + )}`, + { + method: "GET", + redirect: "manual", + // These are forbidden headers, but undici currently allows setting them + headers: { + "Sec-Fetch-Dest": "iframe", + Referer: "https://workers.cloudflare.com/", + }, + } + ); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"TokenUpdateFailed\\",\\"message\\":\\"Provide valid token\\",\\"data\\":{}}"' ); }); @@ -117,6 +190,77 @@ describe("Preview Worker", () => { expect(await resp.text()).toMatchInlineSnapshot('"407"'); }); + it("should reject no token", async () => { + const resp = await fetch(PREVIEW_REMOTE); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"PreviewRequestFailed\\",\\"message\\":\\"Valid token not found\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid cookie header", async () => { + const resp = await fetch(PREVIEW_REMOTE, { + headers: { + cookie: "token", + }, + }); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"PreviewRequestFailed\\",\\"message\\":\\"Valid token not found\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid token", async () => { + const resp = await fetch(PREVIEW_REMOTE, { + headers: { + cookie: `token=TEST_TOKEN`, + }, + }); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"PreviewRequestFailed\\",\\"message\\":\\"Valid token not found\\",\\"data\\":{\\"tokenId\\":\\"TEST_TOKEN\\"}}"' + ); + }); + + it("should return raw HTTP response", async () => { + const resp = await fetch(`${PREVIEW_REMOTE}/header`, { + headers: { + "X-CF-Token": defaultUserToken, + "CF-Raw-HTTP": "true", + "X-Custom-Header": "custom", + }, + redirect: "manual", + }); + expect(resp.headers.get("cf-ew-raw-content-length")).toMatchInlineSnapshot( + 
'"6"' + ); + expect(await resp.text()).toMatchInlineSnapshot('"custom"'); + }); + it("should reject no token for raw HTTP response", async () => { + const resp = await fetch(`${PREVIEW_REMOTE}/header`, { + headers: { + "CF-Raw-HTTP": "true", + "X-Custom-Header": "custom", + }, + redirect: "manual", + }); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"RawHttpFailed\\",\\"message\\":\\"Provide valid token\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid token for raw HTTP response", async () => { + const resp = await fetch(`${PREVIEW_REMOTE}/header`, { + headers: { + "X-CF-Token": "TEST_TOKEN", + "CF-Raw-HTTP": "true", + "X-Custom-Header": "custom", + }, + redirect: "manual", + }); + expect(resp.status).toBe(400); + expect(await resp.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"RawHttpFailed\\",\\"message\\":\\"Provide valid token\\",\\"data\\":{}}"' + ); + }); }); describe("Upload Worker", () => { @@ -129,10 +273,9 @@ describe("Upload Worker", () => { method: "POST", headers: { cookie: `user=${defaultUserToken}`, - "Content-Type": - "multipart/form-data; boundary=----WebKitFormBoundaryqJEYLXuUiiZQHgvf", + "Content-Type": TEST_WORKER_CONTENT_TYPE, }, - body: '------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="index.js"; filename="index.js"\nContent-Type: application/javascript+module\n\nexport default {\n fetch(request) {\n const url = new URL(request.url)\n if(url.pathname === "/exchange") {\n return Response.json({\n token: "TEST_TOKEN",\n prewarm: "TEST_PREWARM"\n })\n }\n if(url.pathname === "/redirect") {\n return Response.redirect("https://example.com", 302)\n }\n if(url.pathname === "/method") {\n return new Response(request.method)\n }\n if(url.pathname === "/status") {\n return new Response(407)\n }\n if(url.pathname === "/header") {\n return new Response(request.headers.get("X-Custom-Header"))\n }\n return Response.json({\n url: request.url,\n headers: [...request.headers.entries()]\n })\n }\n}\n\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="metadata"; filename="blob"\nContent-Type: application/json\n\n{"compatibility_date":"2023-05-04","main_module":"index.js"}\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf--', + body: TEST_WORKER, }); expect(w.status).toMatchInlineSnapshot("200"); }); @@ -141,10 +284,9 @@ describe("Upload Worker", () => { method: "POST", headers: { cookie: `user=${defaultUserToken}`, - "Content-Type": - "multipart/form-data; boundary=----WebKitFormBoundaryqJEYLXuUiiZQHgvf", + "Content-Type": TEST_WORKER_CONTENT_TYPE, }, - body: '------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="index.js"; filename="index.js"\nContent-Type: application/javascript+module\n\nexport default {\n fetch(request {\n const url = new URL(request.url)\n if(url.pathname === "/exchange") {\n return Response.json({\n token: "TEST_TOKEN",\n prewarm: "TEST_PREWARM"\n })\n }\n if(url.pathname === "/redirect") {\n return Response.redirect("https://example.com", 302)\n }\n if(url.pathname === "/method") {\n return new Response(request.method)\n }\n if(url.pathname === "/status") {\n return new Response(407)\n }\n if(url.pathname === "/header") {\n return new Response(request.headers.get("X-Custom-Header"))\n }\n return Response.json({\n url: request.url,\n headers: [...request.headers.entries()]\n })\n }\n}\n\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf\nContent-Disposition: form-data; name="metadata"; filename="blob"\nContent-Type: 
application/json\n\n{"compatibility_date":"2023-05-04","main_module":"index.js"}\n------WebKitFormBoundaryqJEYLXuUiiZQHgvf--', + body: TEST_WORKER.replace("fetch(request)", "fetch(request"), }).then((response) => response.json()); expect(w).toMatchInlineSnapshot(` { @@ -160,6 +302,131 @@ describe("Upload Worker", () => { } `); }); + it("should reject no token", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: TEST_WORKER, + }); + expect(w.status).toBe(401); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"UploadFailed\\",\\"message\\":\\"Valid token not provided\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid token", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=TEST_TOKEN`, + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: TEST_WORKER, + }); + expect(w.status).toBe(401); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"UploadFailed\\",\\"message\\":\\"Valid token not provided\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid form data", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=${defaultUserToken}`, + "Content-Type": "text/plain", + }, + body: "not a form", + }); + expect(w.status).toBe(400); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"BadUpload\\",\\"message\\":\\"Expected valid form data\\",\\"data\\":{\\"error\\":\\"TypeError: Unrecognized Content-Type header value. FormData can only parse the following MIME types: multipart/form-data, application/x-www-form-urlencoded\\"}}"' + ); + }); + it("should reject missing metadata", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=${defaultUserToken}`, + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: `--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="index.js"; filename="index.js" +Content-Type: application/javascript+module + +export default { + fetch(request) { return new Response("body"); } +} + +--${TEST_WORKER_BOUNDARY}--`, + }); + expect(w.status).toBe(400); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"BadUpload\\",\\"message\\":\\"Expected metadata file to be defined\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid metadata json", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=${defaultUserToken}`, + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: `--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="metadata"; filename="blob" +Content-Type: application/json + +{"compatibility_date":"2023-05-04", +--${TEST_WORKER_BOUNDARY}--`, + }); + expect(w.status).toBe(400); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"BadUpload\\",\\"message\\":\\"Expected metadata file to be valid\\",\\"data\\":{}}"' + ); + }); + it("should reject invalid metadata", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=${defaultUserToken}`, + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: `--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="metadata"; filename="blob" +Content-Type: application/json + +{"compatibility_date":42,"main_module":"index.js"} +--${TEST_WORKER_BOUNDARY}--`, + }); + expect(w.status).toBe(400); + expect(await 
w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"BadUpload\\",\\"message\\":\\"Expected metadata file to be valid\\",\\"data\\":{}}"' + ); + }); + it("should reject service worker", async () => { + const w = await fetch(`${REMOTE}/api/worker`, { + method: "POST", + headers: { + cookie: `user=${defaultUserToken}`, + "Content-Type": TEST_WORKER_CONTENT_TYPE, + }, + body: `--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="index.js"; filename="index.js" +Content-Type: application/javascript + +addEventListener("fetch", (event) => event.respondWith(new Response("body"))); +--${TEST_WORKER_BOUNDARY} +Content-Disposition: form-data; name="metadata"; filename="blob" +Content-Type: application/json + +{"compatibility_date":"2023-05-04","body_part":"index.js"} +--${TEST_WORKER_BOUNDARY}--`, + }); + expect(w.status).toBe(400); + expect(await w.text()).toMatchInlineSnapshot( + '"{\\"error\\":\\"ServiceWorkerNotSupported\\",\\"message\\":\\"Service Workers are not supported in the Workers Playground\\",\\"data\\":{}}"' + ); + }); }); describe("Raw HTTP preview", () => { @@ -175,7 +442,6 @@ describe("Raw HTTP preview", () => { expect(resp.headers.get("Access-Control-Allow-Headers")).toBe("foo"); }); - it("should allow arbitrary methods in cross-origin requests", async () => { const resp = await fetch(PREVIEW_REMOTE, { method: "OPTIONS", diff --git a/packages/wrangler/CHANGELOG.md b/packages/wrangler/CHANGELOG.md index 748434d1fd39..472ccf032445 100644 --- a/packages/wrangler/CHANGELOG.md +++ b/packages/wrangler/CHANGELOG.md @@ -1,5 +1,242 @@ # wrangler +## 3.28.0 + +### Minor Changes + +- [#4499](https://github.com/cloudflare/workers-sdk/pull/4499) [`cf9c029b`](https://github.com/cloudflare/workers-sdk/commit/cf9c029b30e1db3a1c3f9dc4208b9c34021a8ac0) Thanks [@penalosa](https://github.com/penalosa)! - feat: Support runtime-agnostic polyfills + + Previously, Wrangler treated any imports of `node:*` modules as build-time errors (unless one of the two Node.js compatibility modes was enabled). This is sometimes overly aggressive, since those imports are often not hit at runtime (for instance, it was impossible to write a library that worked across Node.JS and Workers, using Node packages only when running in Node). Here's an example of a function that would cause Wrangler to fail to build: + + ```ts + export function randomBytes(length: number) { + if (navigator.userAgent !== "Cloudflare-Workers") { + return new Uint8Array(require("node:crypto").randomBytes(length)); + } else { + return crypto.getRandomValues(new Uint8Array(length)); + } + } + ``` + + This function _should_ work in both Workers and Node, since it gates Node-specific functionality behind a user agent check, and falls back to the built-in Workers crypto API. Instead, Wrangler detected the `node:crypto` import and failed with the following error: + + ``` + ✘ [ERROR] Could not resolve "node:crypto" + + src/randomBytes.ts:5:36: + 5 │ ... return new Uint8Array(require('node:crypto').randomBytes(length)); + ╵ ~~~~~~~~~~~~~ + + The package "node:crypto" wasn't found on the file system but is built into node. + Add "node_compat = true" to your wrangler.toml file to enable Node.js compatibility. + ``` + + This change turns that Wrangler build failure into a warning, which users can choose to ignore if they know the import of `node:*` APIs is safe (because it will never trigger at runtime, for instance): + + ``` + ▲ [WARNING] The package "node:crypto" wasn't found on the file system but is built into node. 
+
+    Your Worker may throw errors at runtime unless you enable the "nodejs_compat"
+    compatibility flag. Refer to
+    https://developers.cloudflare.com/workers/runtime-apis/nodejs/ for more details.
+    Imported from:
+     - src/randomBytes.ts
+  ```
+
+  However, in a lot of cases, it's possible to know at _build_ time whether the import is safe. This change also injects `navigator.userAgent` into `esbuild`'s bundle settings as a predefined constant, which means that `esbuild` can tree-shake away imports of `node:*` APIs that are guaranteed not to be hit at runtime, suppressing the warning entirely.
+
+* [#4926](https://github.com/cloudflare/workers-sdk/pull/4926) [`a14bd1d9`](https://github.com/cloudflare/workers-sdk/commit/a14bd1d97c5180b1fd48c2a0907424cf81d67bdb) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - feature: add a `cf` field to the `getBindingsProxy` result
+
+  Add a new `cf` field to the `getBindingsProxy` result that people can use to mock the production
+  `cf` (`IncomingRequestCfProperties`) object.
+
+  Example:
+
+  ```ts
+  const { cf } = await getBindingsProxy();
+
+  console.log(`country = ${cf.country}; colo = ${cf.colo}`);
+  ```
+
+### Patch Changes
+
+- [#4931](https://github.com/cloudflare/workers-sdk/pull/4931) [`321c7ed7`](https://github.com/cloudflare/workers-sdk/commit/321c7ed7355f64a22b0d26b2f097ba2e06e4b5e8) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - fix: make the entrypoint optional for the `types` command
+
+  Currently running `wrangler types` against a `wrangler.toml` file without a defined entrypoint (`main` value)
+  causes the command to error with the following message:
+
+  ```
+  ✘ [ERROR] Missing entry-point: The entry-point should be specified via the command line (e.g. `wrangler types path/to/script`) or the `main` config field.
+  ```
+
+  However, developers may want to generate types without the entrypoint being defined (for example when using `getBindingsProxy`), so these changes
+  make the entrypoint optional for the `types` command, assuming modules syntax if none is specified.
+
+* [#4867](https://github.com/cloudflare/workers-sdk/pull/4867) [`d637bd59`](https://github.com/cloudflare/workers-sdk/commit/d637bd59a8ea6612d59ed4b73e115287615e617d) Thanks [@RamIdeas](https://github.com/RamIdeas)! - fix: inflight requests to UserWorker which failed across reloads are now retried
+
+  Previously, when running `wrangler dev`, requests inflight during a UserWorker reload (due to config or source file changes) would fail.
+
+  Now, if those inflight requests are GET or HEAD requests, they will be reproxied against the new UserWorker. This adds to the guarantee that requests made during local development reach the latest worker.
+
+- [#4928](https://github.com/cloudflare/workers-sdk/pull/4928) [`4a735c46`](https://github.com/cloudflare/workers-sdk/commit/4a735c46fdf5752f141e0e646624f44ad6301ced) Thanks [@sdnts](https://github.com/sdnts)! - fix: Update API calls for Sippy's endpoints
+
+* [#4938](https://github.com/cloudflare/workers-sdk/pull/4938) [`75bd08ae`](https://github.com/cloudflare/workers-sdk/commit/75bd08aed0b82268fb5cf0f42cdd85d4d6d235ef) Thanks [@rozenmd](https://github.com/rozenmd)! - fix: print wrangler banner at the start of every d1 command
+
+  This PR adds a wrangler banner to the start of every D1 command (except when invoked in JSON mode).
+
+  For example:
+
+  ```
+  ⛅️ wrangler 3.27.0
+  -------------------
+  ...
+
+  ```
+
+- [#4953](https://github.com/cloudflare/workers-sdk/pull/4953) [`d96bc7dd`](https://github.com/cloudflare/workers-sdk/commit/d96bc7dd803739f1815601d707d9b6e6062436da) Thanks [@mrbbot](https://github.com/mrbbot)! - fix: allow `port` option to be specified with `unstable_dev()`
+
+  Previously, specifying a non-zero `port` when using `unstable_dev()` would try to start two servers on that `port`. This change ensures we only start the user-facing server on the specified `port`, allowing `unstable_dev()` to start up correctly.
+
+## 3.27.0
+
+### Minor Changes
+
+- [#4877](https://github.com/cloudflare/workers-sdk/pull/4877) [`3e7cd6e4`](https://github.com/cloudflare/workers-sdk/commit/3e7cd6e40816c5c6ab28163508a6ba9729c6de73) Thanks [@magnusdahlstrand](https://github.com/magnusdahlstrand)! - fix: Do not show unnecessary errors during watch rebuilds
+
+  When Pages is used in conjunction with a full stack framework, the framework
+  build will temporarily remove files that are being watched by Pages, such as
+  `_worker.js` and `_routes.json`.
+  Previously we would display errors for these changes, which added confusing and excessive messages to the Pages dev output. Now builds are skipped if a watched `_worker.js` or `_routes.json` is removed.
+
+* [#4901](https://github.com/cloudflare/workers-sdk/pull/4901) [`2469e9fa`](https://github.com/cloudflare/workers-sdk/commit/2469e9faeaaa86d70bc7e3714c515274b38a67de) Thanks [@penalosa](https://github.com/penalosa)! - feature: implemented Python support in Wrangler
+
+  Python Workers are now supported by `wrangler deploy` and `wrangler dev`.
+
+- [#4922](https://github.com/cloudflare/workers-sdk/pull/4922) [`4c7031a6`](https://github.com/cloudflare/workers-sdk/commit/4c7031a6b2ed33e38147d95922d6b15b0ad851ec) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! - feature: add a `ctx` field to the `getBindingsProxy` result
+
+  Add a new `ctx` field to the `getBindingsProxy` result that people can use to mock the production
+  `ExecutionContext` object.
+
+  Example:
+
+  ```ts
+  const { ctx } = await getBindingsProxy();
+  ctx.waitUntil(myPromise);
+  ```
+
+### Patch Changes
+
+- [#4914](https://github.com/cloudflare/workers-sdk/pull/4914) [`e61dba50`](https://github.com/cloudflare/workers-sdk/commit/e61dba503598b38d9daabe63ab71f75def1e7856) Thanks [@nora-soderlund](https://github.com/nora-soderlund)! - fix: ensure d1 validation errors render user-friendly messages
+
+* [#4907](https://github.com/cloudflare/workers-sdk/pull/4907) [`583e4451`](https://github.com/cloudflare/workers-sdk/commit/583e4451c99d916bde52e766b8a19765584303d1) Thanks [@mrbbot](https://github.com/mrbbot)! - fix: mark R2 object and bucket not found errors as unreportable
+
+  Previously, running `wrangler r2 objects {get,put}` with an object or bucket that didn't exist would ask if you wanted to report that error to Cloudflare. There's nothing we can do to fix this, so this change prevents the prompt in this case.
+
+- [#4872](https://github.com/cloudflare/workers-sdk/pull/4872) [`5ef56067`](https://github.com/cloudflare/workers-sdk/commit/5ef56067ccf8e20b34fe87455da8b798702181f1) Thanks [@rozenmd](https://github.com/rozenmd)! - fix: intercept and stringify errors thrown by d1 execute in --json mode
+
+  Prior to this PR, if a query threw an error when run in `wrangler d1 execute ... --json`, wrangler would swallow the error.
+
+  This PR returns the error as JSON.
For example, the invalid query `SELECT asdf;` now returns the following in JSON mode: + + ```json + { + "error": { + "text": "A request to the Cloudflare API (/accounts/xxxx/d1/database/xxxxxxx/query) failed.", + "notes": [ + { + "text": "no such column: asdf at offset 7 [code: 7500]" + } + ], + "kind": "error", + "name": "APIError", + "code": 7500 + } + } + ``` + +* [#4888](https://github.com/cloudflare/workers-sdk/pull/4888) [`3679bc18`](https://github.com/cloudflare/workers-sdk/commit/3679bc18b2cb849fd4023ac653c06e0a7ec2195f) Thanks [@petebacondarwin](https://github.com/petebacondarwin)! - fix: ensure that the Pages dev proxy server does not change the Host header + + Previously, when configuring `wrangler pages dev` to use a proxy to a 3rd party dev server, + the proxy would replace the Host header, resulting in problems at the dev server if it was + checking for cross-site scripting attacks. + + Now the proxy server passes through the Host header unaltered making it invisible to the + 3rd party dev server. + + Fixes #4799 + +- [#4909](https://github.com/cloudflare/workers-sdk/pull/4909) [`34b6ea1e`](https://github.com/cloudflare/workers-sdk/commit/34b6ea1ea59884daca0c0d09265feacc10a4a685) Thanks [@rozenmd](https://github.com/rozenmd)! - feat: add an experimental `insights` command to `wrangler d1` + + This PR adds a `wrangler d1 insights ` command, to let D1 users figure out which of their queries to D1 need to be optimised. + + This command defaults to fetching the top 5 queries that took the longest to run in total over the last 24 hours. + + You can also fetch the top 5 queries that consumed the most rows read over the last week, for example: + + ```bash + npx wrangler d1 insights northwind --sortBy reads --timePeriod 7d + ``` + + Or the top 5 queries that consumed the most rows written over the last month, for example: + + ```bash + npx wrangler d1 insights northwind --sortBy writes --timePeriod 31d + ``` + + Or the top 5 most frequently run queries in the last 24 hours, for example: + + ```bash + npx wrangler d1 insights northwind --sortBy count + ``` + +* [#4830](https://github.com/cloudflare/workers-sdk/pull/4830) [`48f90859`](https://github.com/cloudflare/workers-sdk/commit/48f9085981f0a4923d3ccc32596520107c4e4df8) Thanks [@Lekensteyn](https://github.com/Lekensteyn)! - fix: listen on loopback for wrangler dev port check and login + + Avoid listening on the wildcard address by default to reduce the attacker's + surface and avoid firewall prompts on macOS. + + Relates to #4430. + +- [#4907](https://github.com/cloudflare/workers-sdk/pull/4907) [`583e4451`](https://github.com/cloudflare/workers-sdk/commit/583e4451c99d916bde52e766b8a19765584303d1) Thanks [@mrbbot](https://github.com/mrbbot)! - fix: ensure `wrangler dev --log-level` flag applied to all logs + + Previously, `wrangler dev` may have ignored the `--log-level` flag for some startup logs. This change ensures the `--log-level` flag is applied immediately. + +- Updated dependencies [[`148feff6`](https://github.com/cloudflare/workers-sdk/commit/148feff60c9bf3886c0e0fd1ea98049955c27659)]: + - miniflare@3.20240129.1 + +## 3.26.0 + +### Minor Changes + +- [#4847](https://github.com/cloudflare/workers-sdk/pull/4847) [`6968e11f`](https://github.com/cloudflare/workers-sdk/commit/6968e11f3c1f4911c666501ca9654eabfe87244b) Thanks [@dario-piotrowicz](https://github.com/dario-piotrowicz)! 
- feature: expose new (no-op) `caches` field in `getBindingsProxy` result + + Add a new `caches` field to the `getBindingsProxy` result. This field provides a no-op (no operation) implementation of the runtime `caches`. + + Note: Miniflare exposes a proper `caches` mock; we want to use that one in the future, but the issues around it must be ironed out first, so for the time being a no-op will have to do. + +### Patch Changes + +- [#4860](https://github.com/cloudflare/workers-sdk/pull/4860) [`b92e5ac0`](https://github.com/cloudflare/workers-sdk/commit/b92e5ac006195d490bcf9be3b547ba0bfa33f151) Thanks [@Sibirius](https://github.com/Sibirius)! - fix: allow empty strings in secret:bulk upload + + Previously, the `secret:bulk` command would fail if any of the secrets in the secret.json file were empty strings and they already existed remotely. + +* [#4869](https://github.com/cloudflare/workers-sdk/pull/4869) [`fd084bc0`](https://github.com/cloudflare/workers-sdk/commit/fd084bc0c890458f479e756b616ed023b7142bba) Thanks [@jculvey](https://github.com/jculvey)! - feature: Expose AI bindings to `getBindingsProxy`. + + The `getBindingsProxy` utility function will now contain entries for any AI bindings specified in `wrangler.toml`. + +- [#4880](https://github.com/cloudflare/workers-sdk/pull/4880) [`65da40a1`](https://github.com/cloudflare/workers-sdk/commit/65da40a1229c4e5358553f2636282eb909ebc662) Thanks [@petebacondarwin](https://github.com/petebacondarwin)! - fix: do not attempt login during dry-run + + The "standard pricing" warning was attempting to make an API call that was causing a login attempt even when on a dry-run. + Now this warning is disabled during dry-runs. + + Fixes #4723 + +* [#4819](https://github.com/cloudflare/workers-sdk/pull/4819) [`6a4cb8c6`](https://github.com/cloudflare/workers-sdk/commit/6a4cb8c6456f1dba95cae2b8bbe658f7227349f8) Thanks [@magnusdahlstrand](https://github.com/magnusdahlstrand)! - fix: Use appropriate logging levels when parsing headers and redirects in `wrangler pages dev`. + +* Updated dependencies [[`1e424ff2`](https://github.com/cloudflare/workers-sdk/commit/1e424ff280610657e997df8290d0b39b0393c845), [`749fa3c0`](https://github.com/cloudflare/workers-sdk/commit/749fa3c05e6b9fcaa59a72f60f7936b7beaed5ad)]: + - miniflare@3.20240129.0 + ## 3.25.0 ### Minor Changes diff --git a/packages/wrangler/e2e/c3-integration.test.ts b/packages/wrangler/e2e/c3-integration.test.ts index a60dbe1d56c0..29fa335156cb 100644 --- a/packages/wrangler/e2e/c3-integration.test.ts +++ b/packages/wrangler/e2e/c3-integration.test.ts @@ -63,8 +63,7 @@ describe("c3 integration", () => { it("deploy the worker", async () => { const { stdout, stderr } = await runInWorker`$ ${WRANGLER} deploy`; expect(normalize(stdout)).toMatchInlineSnapshot(` - "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in. - Total Upload: xx KiB / gzip: xx KiB + "Total Upload: xx KiB / gzip: xx KiB Uploaded smoke-test-worker (TIMINGS) Published smoke-test-worker (TIMINGS) https://smoke-test-worker.SUBDOMAIN.workers.dev @@ -86,7 +85,7 @@ describe("c3 integration", () => { const { stdout, stderr } = await runInWorker`$$ ${WRANGLER} delete`; expect(normalize(stdout)).toMatchInlineSnapshot(` "? Are you sure you want to delete smoke-test-worker? This action cannot be undone.
- 🤖 Using default value in non-interactive context: yes + 🤖 Using fallback value in non-interactive context: yes Successfully deleted smoke-test-worker" `); expect(stderr).toMatchInlineSnapshot('""'); diff --git a/packages/wrangler/e2e/deploy.test.ts b/packages/wrangler/e2e/deploy.test.ts deleted file mode 100644 index d6495139f62f..000000000000 --- a/packages/wrangler/e2e/deploy.test.ts +++ /dev/null @@ -1,118 +0,0 @@ -import crypto from "node:crypto"; -import path from "node:path"; -import shellac from "shellac"; -import { fetch } from "undici"; -import { beforeAll, describe, expect, it } from "vitest"; -import { CLOUDFLARE_ACCOUNT_ID } from "./helpers/account-id"; -import { normalizeOutput } from "./helpers/normalize"; -import { retry } from "./helpers/retry"; -import { dedent, makeRoot, seed } from "./helpers/setup"; -import { WRANGLER } from "./helpers/wrangler-command"; - -function matchWorkersDev(stdout: string): string { - return stdout.match( - /https:\/\/smoke-test-worker-.+?\.(.+?\.workers\.dev)/ - )?.[1] as string; -} - -describe("deploy", () => { - let workerName: string; - let workerPath: string; - let workersDev: string | null = null; - let runInRoot: typeof shellac; - let runInWorker: typeof shellac; - let normalize: (str: string) => string; - - beforeAll(async () => { - const root = await makeRoot(); - runInRoot = shellac.in(root).env(process.env); - workerName = `smoke-test-worker-${crypto.randomBytes(4).toString("hex")}`; - workerPath = path.join(root, workerName); - runInWorker = shellac.in(workerPath).env(process.env); - normalize = (str) => - normalizeOutput(str, { - [workerName]: "smoke-test-worker", - [CLOUDFLARE_ACCOUNT_ID]: "CLOUDFLARE_ACCOUNT_ID", - }); - }); - - it("init worker", async () => { - const { stdout } = - await runInRoot`$ ${WRANGLER} init --yes --no-delegate-c3 ${workerName}`; - - expect(normalize(stdout)).toContain( - "To publish your Worker to the Internet, run `npm run deploy`" - ); - }); - - it("deploy worker", async () => { - const { stdout } = await runInWorker`$ ${WRANGLER} deploy`; - expect(normalize(stdout)).toMatchInlineSnapshot(` - "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in. - Total Upload: xx KiB / gzip: xx KiB - Uploaded smoke-test-worker (TIMINGS) - Published smoke-test-worker (TIMINGS) - https://smoke-test-worker.SUBDOMAIN.workers.dev - Current Deployment ID: 00000000-0000-0000-0000-000000000000" - `); - workersDev = matchWorkersDev(stdout); - - const { text } = await retry( - (s) => s.status !== 200, - async () => { - const r = await fetch(`https://${workerName}.${workersDev}`); - return { text: await r.text(), status: r.status }; - } - ); - expect(text).toMatchInlineSnapshot('"Hello World!"'); - }); - - it("modify & deploy worker", async () => { - await seed(workerPath, { - "src/index.ts": dedent` - export default { - fetch(request) { - return new Response("Updated Worker!") - } - }`, - }); - const { stdout, stderr } = await runInWorker`$ ${WRANGLER} deploy`; - expect(normalize(stdout)).toMatchInlineSnapshot(` - "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in. 
- Total Upload: xx KiB / gzip: xx KiB - Uploaded smoke-test-worker (TIMINGS) - Published smoke-test-worker (TIMINGS) - https://smoke-test-worker.SUBDOMAIN.workers.dev - Current Deployment ID: 00000000-0000-0000-0000-000000000000" - `); - expect(stderr).toMatchInlineSnapshot('""'); - workersDev = matchWorkersDev(stdout); - - const { text } = await retry( - (s) => s.status !== 200 || s.text === "Hello World!", - async () => { - const r = await fetch(`https://${workerName}.${workersDev}`); - return { text: await r.text(), status: r.status }; - } - ); - expect(text).toMatchInlineSnapshot('"Updated Worker!"'); - }); - - it("delete worker", async () => { - const { stdout, stderr } = await runInWorker`$ ${WRANGLER} delete`; - expect(normalize(stdout)).toMatchInlineSnapshot(` - "? Are you sure you want to delete smoke-test-worker? This action cannot be undone. - 🤖 Using default value in non-interactive context: yes - Successfully deleted smoke-test-worker" - `); - expect(stderr).toMatchInlineSnapshot('""'); - const { status } = await retry( - (s) => s.status === 200 || s.status === 500, - async () => { - const r = await fetch(`https://${workerName}.${workersDev}`); - return { text: await r.text(), status: r.status }; - } - ); - expect(status).toBe(404); - }); -}); diff --git a/packages/wrangler/e2e/deployments.test.ts b/packages/wrangler/e2e/deployments.test.ts index 02bc9500120c..ff7697f5bfcb 100644 --- a/packages/wrangler/e2e/deployments.test.ts +++ b/packages/wrangler/e2e/deployments.test.ts @@ -56,8 +56,7 @@ describe("deployments", () => { it("deploy worker", async () => { const { stdout } = await runInWorker`$ ${WRANGLER} deploy`; expect(normalize(stdout)).toMatchInlineSnapshot(` - "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in. - Total Upload: xx KiB / gzip: xx KiB + "Total Upload: xx KiB / gzip: xx KiB Uploaded smoke-test-worker (TIMINGS) Published smoke-test-worker (TIMINGS) https://smoke-test-worker.SUBDOMAIN.workers.dev @@ -101,8 +100,7 @@ describe("deployments", () => { }); const { stdout, stderr } = await runInWorker`$ ${WRANGLER} deploy`; expect(normalize(stdout)).toMatchInlineSnapshot(` - "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in. - Total Upload: xx KiB / gzip: xx KiB + "Total Upload: xx KiB / gzip: xx KiB Uploaded smoke-test-worker (TIMINGS) Published smoke-test-worker (TIMINGS) https://smoke-test-worker.SUBDOMAIN.workers.dev @@ -178,7 +176,7 @@ describe("deployments", () => { const { stdout, stderr } = await runInWorker`$ ${WRANGLER} delete`; expect(normalize(stdout)).toMatchInlineSnapshot(` "? Are you sure you want to delete smoke-test-worker? This action cannot be undone. 
- 🤖 Using default value in non-interactive context: yes + 🤖 Using fallback value in non-interactive context: yes Successfully deleted smoke-test-worker" `); expect(stderr).toMatchInlineSnapshot('""'); diff --git a/packages/wrangler/e2e/dev.test.ts b/packages/wrangler/e2e/dev.test.ts index ee2782d5fbc9..9b3b337df66b 100644 --- a/packages/wrangler/e2e/dev.test.ts +++ b/packages/wrangler/e2e/dev.test.ts @@ -4,15 +4,23 @@ import { existsSync } from "node:fs"; import * as nodeNet from "node:net"; import path from "node:path"; import { setTimeout } from "node:timers/promises"; -import getPort from "get-port"; import shellac from "shellac"; -import { fetch } from "undici"; +import { Agent, fetch, setGlobalDispatcher } from "undici"; import { afterEach, beforeEach, describe, expect, it } from "vitest"; import { normalizeOutput } from "./helpers/normalize"; import { retry } from "./helpers/retry"; import { dedent, makeRoot, seed } from "./helpers/setup"; import { WRANGLER } from "./helpers/wrangler-command"; +// Use `Agent` with lower timeouts so `fetch()`s inside `retry()`s don't block for a long time +setGlobalDispatcher( + new Agent({ + connectTimeout: 10_000, + headersTimeout: 10_000, + bodyTimeout: 10_000, + }) +); + type MaybePromise = T | Promise; const waitForPortToBeBound = async (port: number) => { @@ -31,11 +39,7 @@ const waitUntilOutputContains = async ( (stdout) => !stdout.includes(substring), async () => { await setTimeout(intervalMs); - return ( - normalizeOutput(session.stdout) + - "\n\n\n" + - normalizeOutput(session.stderr) - ); + return session.stdout + "\n\n\n" + session.stderr; } ); }; @@ -46,6 +50,20 @@ interface SessionData { stderr: string; } +function getPort() { + return new Promise((resolve, reject) => { + const server = nodeNet.createServer((socket) => socket.destroy()); + server.listen(0, () => { + const address = server.address(); + assert(typeof address === "object" && address !== null); + server.close((err) => { + if (err) reject(err); + else resolve(address.port); + }); + }); + }); +} + async function runDevSession( workerPath: string, flags: string, @@ -65,7 +83,11 @@ async function runDevSession( // Must use the `in` statement in the shellac script rather than `.in()` modifier on the `shellac` object // otherwise the working directory does not get picked up. + let promiseResolve: (() => void) | undefined; + const promise = new Promise((resolve) => (promiseResolve = resolve)); const bg = await shellac.env(process.env).bg` + await ${() => promise} + in ${workerPath} { exits { $ ${WRANGLER} dev ${flags} @@ -82,12 +104,18 @@ async function runDevSession( }; bg.process.stdout.on("data", (chunk) => (sessionData.stdout += chunk)); bg.process.stderr.on("data", (chunk) => (sessionData.stderr += chunk)); + // Only start `wrangler dev` once we've registered output listeners so we don't miss messages + promiseResolve?.(); await session(sessionData); return bg.promise; } finally { - if (pid) process.kill(pid); + try { + if (pid) process.kill(pid); + } catch { + // Ignore errors if we failed to kill the process (i.e. 
ESRCH if it's already terminated) + } } } @@ -240,6 +268,62 @@ describe("basic dev tests", () => { }); }); +describe("basic dev python tests", () => { + let worker: DevWorker; + + beforeEach(async () => { + worker = await makeWorker(); + await worker.seed((workerName) => ({ + "wrangler.toml": dedent` + name = "${workerName}" + main = "index.py" + compatibility_date = "2023-01-01" + compatibility_flags = ["experimental"] + `, + "index.py": dedent` + from js import Response + def fetch(request): + return Response.new('py hello world')`, + "package.json": dedent` + { + "name": "${workerName}", + "version": "0.0.0", + "private": true + } + `, + })); + }); + + it("can run and modify python worker during dev session (local)", async () => { + await worker.runDevSession("", async (session) => { + const { text } = await retry( + (s) => s.status !== 200, + async () => { + const r = await fetch(`http://127.0.0.1:${session.port}`); + return { text: await r.text(), status: r.status }; + } + ); + expect(text).toMatchInlineSnapshot('"py hello world"'); + + await worker.seed({ + "index.py": dedent` + from js import Response + def fetch(request): + return Response.new('Updated Python Worker value')`, + }); + + const { text: text2 } = await retry( + (s) => s.status !== 200 || s.text === "py hello world", + async () => { + const r = await fetch(`http://127.0.0.1:${session.port}`); + return { text: await r.text(), status: r.status }; + } + ); + expect(text2).toMatchInlineSnapshot('"Updated Python Worker value"'); + }); + }); +}); + describe("dev registry", () => { let a: DevWorker; let b: DevWorker; @@ -489,14 +573,14 @@ describe("writes debug logs to hidden file", () => { async (session) => { await waitForPortToBeBound(session.port); - await waitUntilOutputContains(session, "🐛 Writing debug logs to"); + await waitUntilOutputContains(session, "Writing logs to"); await setTimeout(1000); // wait a bit to ensure the file is written to disk } ); const filepath = finalA.stdout.match( - /🐛 Writing debug logs to "(.+\.log)"/ + /🪵 {2}Writing logs to "(.+\.log)"/ )?.[1]; assert(filepath); @@ -511,13 +595,13 @@ describe("writes debug logs to hidden file", () => { }); const filepath = finalA.stdout.match( - /🐛 Writing debug logs to "(.+\.log)"/ + /🪵 {2}Writing logs to "(.+\.log)"/ )?.[1]; expect(filepath).toBeUndefined(); }); - it("rewrites address-in-use error logs", async () => { + it.skip("rewrites address-in-use error logs", async () => { // 1. 
start worker A on a (any) port await a.runDevSession("", async (sessionA) => { const normalize = (text: string) => diff --git a/packages/wrangler/e2e/helpers/normalize.ts b/packages/wrangler/e2e/helpers/normalize.ts index f5911fd7afdb..b3582e827437 100644 --- a/packages/wrangler/e2e/helpers/normalize.ts +++ b/packages/wrangler/e2e/helpers/normalize.ts @@ -6,6 +6,7 @@ export function normalizeOutput( ): string { const functions = [ removeVersionHeader, + removeStandardPricingWarning, npmStripTimings, removeWorkersDev, removeUUID, @@ -139,10 +140,12 @@ export function normalizeTempDirs(stdout: string): string { * Debug log files are created with a timestamp, so we replace the debug log filepath timestamp with */ export function normalizeDebugLogFilepath(stdout: string): string { - return stdout.replace( - /(🐛 Writing debug logs to ".+wrangler-debug)-.+\.log/, - "$1-.log" - ); + return stdout + .replace(/🪵 {2}Writing logs to ".+\.log"/, '🪵 Writing logs to ""') + .replace( + /🪵 {2}Logs were written to ".+\.log"/, + '🪵 Logs were written to ""' + ); } /** @@ -154,3 +157,13 @@ export function squashLocalNetworkBindings(stdout: string): string { "[mf:inf] Ready on http://:\n[mf:inf] - http://:" ); } + +/** + * This may or may not be displayed depending on whether the test account has accepted standard pricing. + */ +function removeStandardPricingWarning(stdout: string): string { + return stdout.replace( + "🚧 New Workers Standard pricing is now available. Please visit the dashboard to view details and opt-in to new pricing: https://dash.cloudflare.com/CLOUDFLARE_ACCOUNT_ID/workers/standard/opt-in.", + "" + ); +} diff --git a/packages/wrangler/e2e/r2.test.ts b/packages/wrangler/e2e/r2.test.ts index 44708b7ec760..ce863b0be307 100644 --- a/packages/wrangler/e2e/r2.test.ts +++ b/packages/wrangler/e2e/r2.test.ts @@ -83,8 +83,8 @@ describe("r2", () => { If you think this is a bug then please create an issue at https://github.com/cloudflare/workers-sdk/issues/new/choose" `); expect(normalize(stderr)).toMatchInlineSnapshot(` - "X [ERROR] Failed to fetch /accounts/CLOUDFLARE_ACCOUNT_ID/r2/buckets/wrangler-smoke-test-bucket/objects/testr2 - 404: Not Found); - " + "X [ERROR] The specified key does not exist. + 🪵 Logs were written to \\"\\"" `); }); @@ -112,8 +112,8 @@ describe("r2", () => { If you think this is a bug then please create an issue at https://github.com/cloudflare/workers-sdk/issues/new/choose" `); expect(normalize(stderr)).toMatchInlineSnapshot(` - "X [ERROR] Failed to fetch /accounts/CLOUDFLARE_ACCOUNT_ID/r2/buckets/wrangler-smoke-test-bucket/objects/testr2 - 404: Not Found); - " + "X [ERROR] The specified bucket does not exist. 
+ 🪵 Logs were written to \\"\\"" `); }); }); diff --git a/packages/wrangler/package.json b/packages/wrangler/package.json index ca1bf3972010..7f2b6194bd7e 100644 --- a/packages/wrangler/package.json +++ b/packages/wrangler/package.json @@ -1,6 +1,6 @@ { "name": "wrangler", - "version": "3.25.0", + "version": "3.28.0", "description": "Command-line interface for all things Cloudflare Workers", "keywords": [ "wrangler", @@ -211,6 +211,14 @@ "optionalDependencies": { "fsevents": "~2.3.2" }, + "peerDependencies": { + "@cloudflare/workers-types": "^4.20230914.0" + }, + "peerDependenciesMeta": { + "@cloudflare/workers-types": { + "optional": true + } + }, "engines": { "node": ">=16.17.0" }, diff --git a/packages/wrangler/scripts/deps.ts b/packages/wrangler/scripts/deps.ts index 493a7c313070..22187e1754bd 100644 --- a/packages/wrangler/scripts/deps.ts +++ b/packages/wrangler/scripts/deps.ts @@ -15,6 +15,9 @@ export const EXTERNAL_DEPENDENCIES = [ "@esbuild-plugins/node-globals-polyfill", "@esbuild-plugins/node-modules-polyfill", "chokidar", + // @cloudflare/workers-types is an optional peer dependency of wrangler, so users can + // get the types by installing the package (to what version they prefer) themselves + "@cloudflare/workers-types", ]; const pathToPackageJson = path.resolve(__dirname, "..", "package.json"); diff --git a/packages/wrangler/scripts/emit-types.ts b/packages/wrangler/scripts/emit-types.ts index 8e055936f054..3651b906ddf8 100644 --- a/packages/wrangler/scripts/emit-types.ts +++ b/packages/wrangler/scripts/emit-types.ts @@ -16,29 +16,52 @@ const configObject = ExtractorConfig.loadFile(configObjectFullPath); // include the dependencies we want to bundle configObject.bundledPackages = BUNDLED_DEPENDENCIES; -const extractorConfig = ExtractorConfig.prepare({ - configObject, - configObjectFullPath, - packageJsonFullPath, - packageJson, -}); - -// Invoke API Extractor -const extractorResult = Extractor.invoke(extractorConfig, { - // Equivalent to the "--local" command-line parameter - localBuild: true, - - // Equivalent to the "--verbose" command-line parameter - showVerboseMessages: true, -}); - -if (extractorResult.succeeded) { - console.log(`API Extractor completed successfully`); - process.exitCode = 0; -} else { - console.error( - `API Extractor completed with ${extractorResult.errorCount} errors` + - ` and ${extractorResult.warningCount} warnings` - ); - process.exitCode = 1; +const pkgRoot = path.resolve(__dirname, ".."); + +// `api-extractor` doesn't know to load `index.ts` instead of `index.d.ts` when +// resolving imported types, so copy `index.ts` to `index.d.ts`, bundle types, +// then restore the original contents. We need the original `index.d.ts` for +// typing the `packages/miniflare/src/workers` directory. 
+const workersTypesExperimental = path.join( + pkgRoot, + "node_modules", + "@cloudflare", + "workers-types", + "experimental" +); +const indexTsPath = path.join(workersTypesExperimental, "index.ts"); +const indexDtsPath = path.join(workersTypesExperimental, "index.d.ts"); +const originalDtsContent = fs.readFileSync(indexDtsPath); + +fs.copyFileSync(indexTsPath, indexDtsPath); + +try { + const extractorConfig = ExtractorConfig.prepare({ + configObject, + configObjectFullPath, + packageJsonFullPath, + packageJson, + }); + + // Invoke API Extractor + const extractorResult = Extractor.invoke(extractorConfig, { + // Equivalent to the "--local" command-line parameter + localBuild: true, + + // Equivalent to the "--verbose" command-line parameter + showVerboseMessages: true, + }); + + if (extractorResult.succeeded) { + console.log(`API Extractor completed successfully`); + process.exitCode = 0; + } else { + console.error( + `API Extractor completed with ${extractorResult.errorCount} errors` + + ` and ${extractorResult.warningCount} warnings` + ); + process.exitCode = 1; + } +} finally { + fs.writeFileSync(indexDtsPath, originalDtsContent); } diff --git a/packages/wrangler/src/__tests__/d1/d1.test.ts b/packages/wrangler/src/__tests__/d1/d1.test.ts index 14396c12fe3a..a689d08411ec 100644 --- a/packages/wrangler/src/__tests__/d1/d1.test.ts +++ b/packages/wrangler/src/__tests__/d1/d1.test.ts @@ -19,6 +19,7 @@ describe("d1", () => { Commands: wrangler d1 list List D1 databases wrangler d1 info Get information about a D1 database, including the current database size and state. + wrangler d1 insights Experimental command. Get information about the queries run on a D1 database. wrangler d1 create Create D1 database wrangler d1 delete Delete D1 database wrangler d1 backup Interact with D1 Backups @@ -59,6 +60,7 @@ describe("d1", () => { Commands: wrangler d1 list List D1 databases wrangler d1 info Get information about a D1 database, including the current database size and state. + wrangler d1 insights Experimental command. Get information about the queries run on a D1 database. 
wrangler d1 create Create D1 database wrangler d1 delete Delete D1 database wrangler d1 backup Interact with D1 Backups diff --git a/packages/wrangler/src/__tests__/d1/execute.test.ts b/packages/wrangler/src/__tests__/d1/execute.test.ts index 257375d9cc44..b52e09b78d99 100644 --- a/packages/wrangler/src/__tests__/d1/execute.test.ts +++ b/packages/wrangler/src/__tests__/d1/execute.test.ts @@ -50,6 +50,19 @@ describe("execute", () => { ).rejects.toThrowError(`Error: can't use --preview with --local`); }); + it("should reject the use of --preview with --local with --json", async () => { + setIsTTY(false); + writeWranglerToml({ + d1_databases: [ + { binding: "DATABASE", database_name: "db", database_id: "xxxx" }, + ], + }); + + await expect( + runWrangler(`d1 execute db --command "select;" --local --preview --json`) + ).rejects.toThrowError(`Error: can't use --preview with --local`); + }); + it("should expect --local when using --persist-to", async () => { setIsTTY(false); writeWranglerToml({ diff --git a/packages/wrangler/src/__tests__/deploy.test.ts b/packages/wrangler/src/__tests__/deploy.test.ts index 9729c4813960..190b962a850f 100644 --- a/packages/wrangler/src/__tests__/deploy.test.ts +++ b/packages/wrangler/src/__tests__/deploy.test.ts @@ -7938,7 +7938,7 @@ export default{ }); describe("`nodejs_compat` compatibility flag", () => { - it('when absent, should error on any "external" `node:*` imports', async () => { + it('when absent, should warn on any "external" `node:*` imports', async () => { writeWranglerToml(); fs.writeFileSync( "index.js", @@ -7948,15 +7948,18 @@ export default{ export default {} ` ); - let err: esbuild.BuildFailure | undefined; - try { - await runWrangler("deploy index.js --dry-run"); // expecting this to throw, as node compatibility isn't enabled - } catch (e) { - err = e as esbuild.BuildFailure; - } - expect( - esbuild.formatMessagesSync(err?.errors ?? [], { kind: "error" }).join() - ).toMatch(/Could not resolve "node:async_hooks"/); + await runWrangler("deploy index.js --dry-run"); + + expect(std.warn).toMatchInlineSnapshot(` + "▲ [WARNING] The package \\"node:async_hooks\\" wasn't found on the file system but is built into node. + + Your Worker may throw errors at runtime unless you enable the \\"nodejs_compat\\" compatibility flag. + Refer to https://developers.cloudflare.com/workers/runtime-apis/nodejs/ for more details. 
Imported + from: + - index.js + + " + `); }); it('when present, should support any "external" `node:*` imports', async () => { @@ -8775,6 +8778,72 @@ export default{ }); }); + describe("python", () => { + it("should upload python module defined in wrangler.toml", async () => { + writeWranglerToml({ + main: "index.py", + }); + await fs.promises.writeFile( + "index.py", + "from js import Response;\ndef fetch(request):\n return Response.new('hello')" + ); + mockSubDomainRequest(); + mockUploadWorkerRequest({ + expectedMainModule: "index", + }); + + await runWrangler("deploy"); + expect( + std.out.replace( + /.wrangler\/tmp\/deploy-(.+)\/index.py/, + ".wrangler/tmp/deploy/index.py" + ) + ).toMatchInlineSnapshot(` + "┌──────────────────────────────────────┬────────┬──────────┐ + │ Name │ Type │ Size │ + ├──────────────────────────────────────┼────────┼──────────┤ + │ .wrangler/tmp/deploy/index.py │ python │ xx KiB │ + └──────────────────────────────────────┴────────┴──────────┘ + Total Upload: xx KiB / gzip: xx KiB + Uploaded test-name (TIMINGS) + Published test-name (TIMINGS) + https://test-name.test-sub-domain.workers.dev + Current Deployment ID: Galaxy-Class" + `); + }); + + it("should upload python module specified in CLI args", async () => { + writeWranglerToml(); + await fs.promises.writeFile( + "index.py", + "from js import Response;\ndef fetch(request):\n return Response.new('hello')" + ); + mockSubDomainRequest(); + mockUploadWorkerRequest({ + expectedMainModule: "index", + }); + + await runWrangler("deploy index.py"); + expect( + std.out.replace( + /.wrangler\/tmp\/deploy-(.+)\/index.py/, + ".wrangler/tmp/deploy/index.py" + ) + ).toMatchInlineSnapshot(` + "┌──────────────────────────────────────┬────────┬──────────┐ + │ Name │ Type │ Size │ + ├──────────────────────────────────────┼────────┼──────────┤ + │ .wrangler/tmp/deploy/index.py │ python │ xx KiB │ + └──────────────────────────────────────┴────────┴──────────┘ + Total Upload: xx KiB / gzip: xx KiB + Uploaded test-name (TIMINGS) + Published test-name (TIMINGS) + https://test-name.test-sub-domain.workers.dev + Current Deployment ID: Galaxy-Class" + `); + }); + }); + describe("hyperdrive", () => { it("should upload hyperdrive bindings", async () => { writeWranglerToml({ diff --git a/packages/wrangler/src/__tests__/dev.test.tsx b/packages/wrangler/src/__tests__/dev.test.tsx index 66f5d4580942..8b5d4c459a4e 100644 --- a/packages/wrangler/src/__tests__/dev.test.tsx +++ b/packages/wrangler/src/__tests__/dev.test.tsx @@ -879,12 +879,12 @@ describe("wrangler dev", () => { writeWranglerToml({ main: "index.js", dev: { - ip: "1.2.3.4", + ip: "::1", }, }); fs.writeFileSync("index.js", `export default {};`); await runWrangler("dev"); - expect((Dev as jest.Mock).mock.calls[0][0].initialIp).toEqual("1.2.3.4"); + expect((Dev as jest.Mock).mock.calls[0][0].initialIp).toEqual("::1"); expect(std.out).toMatchInlineSnapshot(`""`); expect(std.warn).toMatchInlineSnapshot(`""`); expect(std.err).toMatchInlineSnapshot(`""`); @@ -894,12 +894,14 @@ describe("wrangler dev", () => { writeWranglerToml({ main: "index.js", dev: { - ip: "1.2.3.4", + ip: "::1", }, }); fs.writeFileSync("index.js", `export default {};`); - await runWrangler("dev --ip=5.6.7.8"); - expect((Dev as jest.Mock).mock.calls[0][0].initialIp).toEqual("5.6.7.8"); + await runWrangler("dev --ip=127.0.0.1"); + expect((Dev as jest.Mock).mock.calls[0][0].initialIp).toEqual( + "127.0.0.1" + ); expect(std.out).toMatchInlineSnapshot(`""`); expect(std.warn).toMatchInlineSnapshot(`""`); 
expect(std.err).toMatchInlineSnapshot(`""`); diff --git a/packages/wrangler/src/__tests__/navigator-user-agent.test.ts b/packages/wrangler/src/__tests__/navigator-user-agent.test.ts new file mode 100644 index 000000000000..0f37527f4a27 --- /dev/null +++ b/packages/wrangler/src/__tests__/navigator-user-agent.test.ts @@ -0,0 +1,193 @@ +import assert from "node:assert"; +import { mkdir, readFile, writeFile } from "node:fs/promises"; +import path from "node:path"; +import dedent from "ts-dedent"; +import { bundleWorker } from "../deployment-bundle/bundle"; +import { noopModuleCollector } from "../deployment-bundle/module-collection"; +import { isNavigatorDefined } from "../navigator-user-agent"; +import { mockConsoleMethods } from "./helpers/mock-console"; +import { runInTempDir } from "./helpers/run-in-tmp"; + +/* + * This file contains inline comments with the word "javascript" + * This signals to a compatible editor extension that the template string + * contents should be syntax-highlighted as JavaScript. One such extension + * is zjcompt.es6-string-javascript, but there are others. + */ + +async function seedFs(files: Record): Promise { + for (const [location, contents] of Object.entries(files)) { + await mkdir(path.dirname(location), { recursive: true }); + await writeFile(location, contents); + } +} + +describe("isNavigatorDefined", () => { + test("default", () => { + expect(isNavigatorDefined(undefined)).toBe(false); + }); + + test("modern date", () => { + expect(isNavigatorDefined("2024-01-01")).toBe(true); + }); + + test("old date", () => { + expect(isNavigatorDefined("2000-01-01")).toBe(false); + }); + + test("switch date", () => { + expect(isNavigatorDefined("2022-03-21")).toBe(true); + }); + + test("before date", () => { + expect(isNavigatorDefined("2022-03-20")).toBe(false); + }); + + test("old date, but with flag", () => { + expect(isNavigatorDefined("2000-01-01", ["global_navigator"])).toBe(true); + }); + + test("old date, with disable flag", () => { + expect(isNavigatorDefined("2000-01-01", ["no_global_navigator"])).toBe( + false + ); + }); + + test("new date, but with disable flag", () => { + expect(isNavigatorDefined("2024-01-01", ["no_global_navigator"])).toBe( + false + ); + }); + + test("new date, with enable flag", () => { + expect(isNavigatorDefined("2024-01-01", ["global_navigator"])).toBe(true); + }); + + test("errors with disable and enable flags specified", () => { + try { + isNavigatorDefined("2024-01-01", [ + "no_global_navigator", + "global_navigator", + ]); + assert(false, "Unreachable"); + } catch (e) { + expect(e).toMatchInlineSnapshot( + `[AssertionError: Can't both enable and disable a flag]` + ); + } + }); +}); + +// Does bundleWorker respect the value of `defineNavigatorUserAgent`? 
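To answer the question in the comment above in mechanism terms: when `defineNavigatorUserAgent` is true, `bundleWorker` adds an esbuild `define` entry that replaces `navigator.userAgent` with the literal string `"Cloudflare-Workers"` (see the `bundle.ts` hunk later in this patch), so the `navigator.userAgent !== "Cloudflare-Workers"` guard becomes a constant and the `require("node:crypto")` fallback can be dropped as dead code. A stripped-down sketch of the same effect with esbuild alone, reusing the `src/index.js` fixture from the test below; the `external` entry stands in for the plugin behaviour that marks unresolved `node:` imports as external:

	import * as esbuild from "esbuild";

	// With navigator.userAgent pinned to a constant, esbuild can fold the
	// comparison in the fixture and drop the node:crypto branch, so the
	// output no longer references navigator.userAgent.
	await esbuild.build({
		entryPoints: ["src/index.js"],
		bundle: true,
		format: "esm",
		outfile: "dist/index.js",
		external: ["node:*"],
		define: { "navigator.userAgent": '"Cloudflare-Workers"' },
	});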
+describe("defineNavigatorUserAgent is respected", () => { + runInTempDir(); + const std = mockConsoleMethods(); + + it("defineNavigatorUserAgent = false, navigator preserved", async () => { + await seedFs({ + "src/index.js": dedent/* javascript */ ` + function randomBytes(length) { + if (navigator.userAgent !== "Cloudflare-Workers") { + return new Uint8Array(require("node:crypto").randomBytes(length)); + } else { + return crypto.getRandomValues(new Uint8Array(length)); + } + } + export default { + async fetch(request, env) { + return new Response(randomBytes(10)) + }, + }; + `, + }); + + await bundleWorker( + { + file: path.resolve("src/index.js"), + directory: process.cwd(), + format: "modules", + moduleRoot: path.dirname(path.resolve("src/index.js")), + }, + path.resolve("dist"), + { + bundle: true, + additionalModules: [], + moduleCollector: noopModuleCollector, + serveAssetsFromWorker: false, + doBindings: [], + define: {}, + checkFetch: false, + targetConsumer: "deploy", + local: true, + projectRoot: process.cwd(), + defineNavigatorUserAgent: false, + } + ); + + // Build time warning that the dynamic import of `require("node:crypto")` may not be safe + expect(std.warn).toMatchInlineSnapshot(` + "▲ [WARNING] The package \\"node:crypto\\" wasn't found on the file system but is built into node. + + Your Worker may throw errors at runtime unless you enable the \\"nodejs_compat\\" compatibility flag. + Refer to https://developers.cloudflare.com/workers/runtime-apis/nodejs/ for more details. Imported + from: + - src/index.js + + " + `); + const fileContents = await readFile("dist/index.js", "utf8"); + + // navigator.userAgent should have been preserved as-is + expect(fileContents).toContain("navigator.userAgent"); + }); + + it("defineNavigatorUserAgent = true, navigator treeshaken", async () => { + await seedFs({ + "src/index.js": dedent/* javascript */ ` + function randomBytes(length) { + if (navigator.userAgent !== "Cloudflare-Workers") { + return new Uint8Array(require("node:crypto").randomBytes(length)); + } else { + return crypto.getRandomValues(new Uint8Array(length)); + } + } + export default { + async fetch(request, env) { + return new Response(randomBytes(10)) + }, + }; + `, + }); + + await bundleWorker( + { + file: path.resolve("src/index.js"), + directory: process.cwd(), + format: "modules", + moduleRoot: path.dirname(path.resolve("src/index.js")), + }, + path.resolve("dist"), + { + bundle: true, + additionalModules: [], + moduleCollector: noopModuleCollector, + serveAssetsFromWorker: false, + doBindings: [], + define: {}, + checkFetch: false, + targetConsumer: "deploy", + local: true, + projectRoot: process.cwd(), + defineNavigatorUserAgent: true, + } + ); + + // Build time warning is suppressed, because esbuild treeshakes the relevant code path + expect(std.warn).toMatchInlineSnapshot(`""`); + + const fileContents = await readFile("dist/index.js", "utf8"); + + // navigator.userAgent should have been defined, and so should not be present in the bundle + expect(fileContents).not.toContain("navigator.userAgent"); + }); +}); diff --git a/packages/wrangler/src/__tests__/pages/functions-build.test.ts b/packages/wrangler/src/__tests__/pages/functions-build.test.ts index 5327ac0a58fe..3e3693189e1c 100644 --- a/packages/wrangler/src/__tests__/pages/functions-build.test.ts +++ b/packages/wrangler/src/__tests__/pages/functions-build.test.ts @@ -413,7 +413,7 @@ export default { ); }); - it("should error at Node.js imports when the `nodejs_compat` compatibility flag is not set", async () 
=> { + it("should warn at Node.js imports when the `nodejs_compat` compatibility flag is not set", async () => { mkdirSync("functions"); writeFileSync( "functions/hello.js", @@ -428,17 +428,18 @@ export default { ); await expect( - runWrangler(`pages functions build --outfile=public/_worker.bundle`) - ).rejects.toThrowErrorMatchingInlineSnapshot(` - "Build failed with 1 error: - hello.js:2:36: ERROR: Could not resolve \\"node:async_hooks\\"" - `); - expect(std.err).toContain( - 'The package "node:async_hooks" wasn\'t found on the file system but is built into node.' - ); - expect(std.err).toContain( - 'Add the "nodejs_compat" compatibility flag to your Pages project and make sure to prefix the module name with "node:" to enable Node.js compatibility.' + await runWrangler(`pages functions build --outfile=public/_worker.bundle`) ); + expect(std.warn).toMatchInlineSnapshot(` + "▲ [WARNING] The package \\"node:async_hooks\\" wasn't found on the file system but is built into node. + + Your Worker may throw errors at runtime unless you enable the \\"nodejs_compat\\" compatibility flag. + Refer to https://developers.cloudflare.com/workers/runtime-apis/nodejs/ for more details. Imported + from: + - hello.js + + " + `); }); it("should compile a _worker.js/ directory", async () => { diff --git a/packages/wrangler/src/__tests__/r2.test.ts b/packages/wrangler/src/__tests__/r2.test.ts index f562da7caa24..6190695c27c7 100644 --- a/packages/wrangler/src/__tests__/r2.test.ts +++ b/packages/wrangler/src/__tests__/r2.test.ts @@ -326,19 +326,25 @@ describe("r2", () => { "*/accounts/some-account-id/r2/buckets/testBucket/sippy", async (request, response, context) => { expect(await request.json()).toEqual({ - access_key: "aws-secret", - bucket: "awsBucket", - key_id: "aws-key", - provider: "AWS", - r2_access_key: "some-secret", - r2_key_id: "some-key", + source: { + provider: "aws", + region: "awsRegion", + bucket: "awsBucket", + accessKeyId: "aws-key", + secretAccessKey: "aws-secret", + }, + destination: { + provider: "r2", + accessKeyId: "some-key", + secretAccessKey: "some-secret", + }, }); return response.once(context.json(createFetchResult({}))); } ) ); await runWrangler( - "r2 bucket sippy enable testBucket --r2-key-id=some-key --r2-secret-access-key=some-secret --provider=AWS --key-id=aws-key --secret-access-key=aws-secret --bucket=awsBucket" + "r2 bucket sippy enable testBucket --r2-access-key-id=some-key --r2-secret-access-key=some-secret --provider=AWS --access-key-id=aws-key --secret-access-key=aws-secret --region=awsRegion --bucket=awsBucket" ); expect(std.out).toMatchInlineSnapshot( `"✨ Successfully enabled Sippy on the 'testBucket' bucket."` @@ -353,19 +359,24 @@ describe("r2", () => { "*/accounts/some-account-id/r2/buckets/testBucket/sippy", async (request, response, context) => { expect(await request.json()).toEqual({ - private_key: "gcs-private-key", - bucket: "gcsBucket", - client_email: "gcs-client-email", - provider: "GCS", - r2_access_key: "some-secret", - r2_key_id: "some-key", + source: { + provider: "gcs", + bucket: "gcsBucket", + clientEmail: "gcs-client-email", + privateKey: "gcs-private-key", + }, + destination: { + provider: "r2", + accessKeyId: "some-key", + secretAccessKey: "some-secret", + }, }); return response.once(context.json(createFetchResult({}))); } ) ); await runWrangler( - "r2 bucket sippy enable testBucket --r2-key-id=some-key --r2-secret-access-key=some-secret --provider=GCS --client-email=gcs-client-email --private-key=gcs-private-key --bucket=gcsBucket" + "r2 bucket 
sippy enable testBucket --r2-access-key-id=some-key --r2-secret-access-key=some-secret --provider=GCS --client-email=gcs-client-email --private-key=gcs-private-key --bucket=gcsBucket" ); expect(std.out).toMatchInlineSnapshot( `"✨ Successfully enabled Sippy on the 'testBucket' bucket."` @@ -399,12 +410,12 @@ describe("r2", () => { --provider [choices: \\"AWS\\", \\"GCS\\"] --bucket The name of the upstream bucket [string] --region (AWS provider only) The region of the upstream bucket [string] - --key-id (AWS provider only) The secret access key id for the upstream bucket [string] + --access-key-id (AWS provider only) The secret access key id for the upstream bucket [string] --secret-access-key (AWS provider only) The secret access key for the upstream bucket [string] --service-account-key-file (GCS provider only) The path to your Google Cloud service account key JSON file [string] --client-email (GCS provider only) The client email for your Google Cloud service account key [string] --private-key (GCS provider only) The private key for your Google Cloud service account key [string] - --r2-key-id The secret access key id for this R2 bucket [string] + --r2-access-key-id The secret access key id for this R2 bucket [string] --r2-secret-access-key The secret access key for this R2 bucket [string]" `); expect(std.err).toMatchInlineSnapshot(` @@ -522,7 +533,7 @@ describe("r2", () => { ); await runWrangler("r2 bucket sippy get testBucket"); expect(std.out).toMatchInlineSnapshot( - `"Sippy upstream bucket: https://storage.googleapis.com/storage/v1/b/testBucket."` + `"Sippy configuration: https://storage.googleapis.com/storage/v1/b/testBucket"` ); }); }); diff --git a/packages/wrangler/src/__tests__/type-generation.test.ts b/packages/wrangler/src/__tests__/type-generation.test.ts index 825918d54472..f5e248042f33 100644 --- a/packages/wrangler/src/__tests__/type-generation.test.ts +++ b/packages/wrangler/src/__tests__/type-generation.test.ts @@ -235,4 +235,24 @@ describe("generateTypes()", () => { " `); }); + + it("should accept a toml file without an entrypoint and fallback to the standard modules declarations", async () => { + fs.writeFileSync( + "./wrangler.toml", + TOML.stringify({ + vars: bindingsConfigMock.vars, + } as unknown as TOML.JsonMap), + "utf-8" + ); + + await runWrangler("types"); + expect(std.out).toMatchInlineSnapshot(` + "interface Env { + SOMETHING: \\"asdasdfasdf\\"; + ANOTHER: \\"thing\\"; + OBJECT_VAR: {\\"enterprise\\":\\"1701-D\\",\\"activeDuty\\":true,\\"captian\\":\\"Picard\\"}; + } + " + `); + }); }); diff --git a/packages/wrangler/src/api/integrations/bindings/executionContext.ts b/packages/wrangler/src/api/integrations/bindings/executionContext.ts new file mode 100644 index 000000000000..e940c026a8f2 --- /dev/null +++ b/packages/wrangler/src/api/integrations/bindings/executionContext.ts @@ -0,0 +1,5 @@ +export class ExecutionContext { + // eslint-disable-next-line @typescript-eslint/no-explicit-any, unused-imports/no-unused-vars + waitUntil(promise: Promise): void {} + passThroughOnException(): void {} +} diff --git a/packages/wrangler/src/api/integrations/bindings/index.ts b/packages/wrangler/src/api/integrations/bindings/index.ts index dcb510321253..aefabfa24906 100644 --- a/packages/wrangler/src/api/integrations/bindings/index.ts +++ b/packages/wrangler/src/api/integrations/bindings/index.ts @@ -5,8 +5,10 @@ import { getBoundRegisteredWorkers } from "../../../dev-registry"; import { getVarsForDev } from "../../../dev/dev-vars"; import { buildMiniflareBindingOptions } 
from "../../../dev/miniflare"; import { CacheStorage } from "./caches"; +import { ExecutionContext } from "./executionContext"; import { getServiceBindings } from "./services"; import type { Config } from "../../../config"; +import type { IncomingRequestCfProperties } from "@cloudflare/workers-types/experimental"; import type { MiniflareOptions } from "miniflare"; /** @@ -35,11 +37,22 @@ export type GetBindingsProxyOptions = { /** * Result of the `getBindingsProxy` utility */ -export type BindingsProxy> = { +export type BindingsProxy< + Bindings = Record, + CfProperties extends Record = IncomingRequestCfProperties +> = { /** * Object containing the various proxies */ bindings: Bindings; + /** + * Mock of the context object that Workers received in their request handler, all the object's methods are no-op + */ + cf: CfProperties; + /** + * Mock of the context object that Workers received in their request handler, all the object's methods are no-op + */ + ctx: ExecutionContext; /** * Caches object emulating the Workers Cache runtime API */ @@ -57,9 +70,12 @@ export type BindingsProxy> = { * @param options The various options that can tweak this function's behavior * @returns An Object containing the generated proxies alongside other related utilities */ -export async function getBindingsProxy>( +export async function getBindingsProxy< + Bindings = Record, + CfProperties extends Record = IncomingRequestCfProperties +>( options: GetBindingsProxyOptions = {} -): Promise> { +): Promise> { const rawConfig = readConfig(options.configPath, { experimentalJsonConfig: options.experimentalJsonConfig, }); @@ -83,11 +99,16 @@ export async function getBindingsProxy>( const vars = getVarsForDev(rawConfig, env); + const cf = await mf.getCf(); + deepFreeze(cf); + return { bindings: { ...vars, ...bindings, }, + cf: cf as CfProperties, + ctx: new ExecutionContext(), caches: new CacheStorage(), dispose: () => mf.dispose(), }; @@ -124,7 +145,10 @@ async function getMiniflareOptionsFromConfig( script: "", modules: true, ...bindingOptions, - serviceBindings, + serviceBindings: { + ...serviceBindings, + ...bindingOptions.serviceBindings, + }, }, externalDurableObjectWorker, ], @@ -168,3 +192,14 @@ function getMiniflarePersistOptions( d1Persist: `${persistPath}/d1`, }; } + +function deepFreeze>( + obj: T +): void { + Object.freeze(obj); + Object.entries(obj).forEach(([, prop]) => { + if (prop !== null && typeof prop === "object" && !Object.isFrozen(prop)) { + deepFreeze(prop as Record); + } + }); +} diff --git a/packages/wrangler/src/api/pages/deploy.tsx b/packages/wrangler/src/api/pages/deploy.tsx index 254adc695135..2068de864eb7 100644 --- a/packages/wrangler/src/api/pages/deploy.tsx +++ b/packages/wrangler/src/api/pages/deploy.tsx @@ -5,6 +5,7 @@ import { File, FormData } from "undici"; import { fetchResult } from "../../cfetch"; import { FatalError } from "../../errors"; import { logger } from "../../logger"; +import { isNavigatorDefined } from "../../navigator-user-agent"; import { buildFunctions } from "../../pages/buildFunctions"; import { MAX_DEPLOYMENT_ATTEMPTS } from "../../pages/constants"; import { @@ -143,6 +144,10 @@ export async function deploy({ const nodejsCompat = deploymentConfig.compatibility_flags?.includes("nodejs_compat"); + const defineNavigatorUserAgent = isNavigatorDefined( + deploymentConfig.compatibility_date, + deploymentConfig.compatibility_flags + ); /** * Evaluate if this is an Advanced Mode or Pages Functions project. 
If Advanced Mode, we'll * go ahead and upload `_worker.js` as is, but if Pages Functions, we need to attempt to build @@ -175,6 +180,7 @@ export async function deploy({ routesOutputPath, local: false, nodejsCompat, + defineNavigatorUserAgent, }); builtFunctions = readFileSync( @@ -254,6 +260,7 @@ export async function deploy({ workerJSDirectory: _workerPath, buildOutputDirectory: directory, nodejsCompat, + defineNavigatorUserAgent, }); } else if (_workerJS) { if (bundle) { @@ -270,6 +277,7 @@ export async function deploy({ watch: false, onEnd: () => {}, nodejsCompat, + defineNavigatorUserAgent, }); } else { await checkRawWorker(_workerPath, () => {}); diff --git a/packages/wrangler/src/cfetch/internal.ts b/packages/wrangler/src/cfetch/internal.ts index 796c99251763..cac73ce511a3 100644 --- a/packages/wrangler/src/cfetch/internal.ts +++ b/packages/wrangler/src/cfetch/internal.ts @@ -196,7 +196,7 @@ type ResponseWithBody = Response & { body: NonNullable }; export async function fetchR2Objects( resource: string, bodyInit: RequestInit = {} -): Promise { +): Promise { await requireLoggedIn(); const auth = requireApiToken(); const headers = cloneHeaders(bodyInit.headers); @@ -210,6 +210,8 @@ export async function fetchR2Objects( if (response.ok && response.body) { return response as ResponseWithBody; + } else if (response.status === 404) { + return null; } else { throw new Error( `Failed to fetch ${resource} - ${response.status}: ${response.statusText});` diff --git a/packages/wrangler/src/config/environment.ts b/packages/wrangler/src/config/environment.ts index f614072b2625..b7057b43cdc5 100644 --- a/packages/wrangler/src/config/environment.ts +++ b/packages/wrangler/src/config/environment.ts @@ -748,7 +748,9 @@ export type ConfigModuleRuleType = | "CommonJS" | "CompiledWasm" | "Text" - | "Data"; + | "Data" + | "PythonModule" + | "PythonRequirement"; export type TailConsumer = { /** The name of the service tail events will be forwarded to. */ diff --git a/packages/wrangler/src/config/index.ts b/packages/wrangler/src/config/index.ts index 2a016ca1ba66..99219c330d48 100644 --- a/packages/wrangler/src/config/index.ts +++ b/packages/wrangler/src/config/index.ts @@ -57,6 +57,17 @@ export function readConfig( throw new UserError(diagnostics.renderErrors()); } + const mainModule = "script" in args ? args.script : config.main; + if (typeof mainModule === "string" && mainModule.endsWith(".py")) { + // Workers with a python entrypoint should have bundling turned off, since all of Wrangler's bundling is JS/TS specific + config.no_bundle = true; + + // Workers with a python entrypoint need module rules for "*.py". 
Add one automatically as a DX nicety + if (!config.rules.some((rule) => rule.type === "PythonModule")) { + config.rules.push({ type: "PythonModule", globs: ["**/*.py"] }); + } + } + return config; } diff --git a/packages/wrangler/src/d1/create.tsx b/packages/wrangler/src/d1/create.tsx index 04b3d0128695..42cce40b28e2 100644 --- a/packages/wrangler/src/d1/create.tsx +++ b/packages/wrangler/src/d1/create.tsx @@ -1,5 +1,6 @@ import { Box, Text } from "ink"; import React from "react"; +import { printWranglerBanner } from ".."; import { fetchResult } from "../cfetch"; import { withConfig } from "../config"; import { UserError } from "../errors"; @@ -32,6 +33,7 @@ export function Options(yargs: CommonYargsArgv) { type HandlerOptions = StrictYargsOptionsToInterface; export const Handler = withConfig( async ({ name, config, location }): Promise => { + await printWranglerBanner(); const accountId = await requireAuth(config); if (location) { diff --git a/packages/wrangler/src/d1/delete.ts b/packages/wrangler/src/d1/delete.ts index d7434d551cf2..9fe936e63e40 100644 --- a/packages/wrangler/src/d1/delete.ts +++ b/packages/wrangler/src/d1/delete.ts @@ -1,3 +1,4 @@ +import { printWranglerBanner } from ".."; import { fetchResult } from "../cfetch"; import { withConfig } from "../config"; import { confirm } from "../dialogs"; @@ -24,6 +25,7 @@ export function Options(d1ListYargs: CommonYargsArgv) { type HandlerOptions = StrictYargsOptionsToInterface; export const Handler = withConfig( async ({ name, skipConfirmation, config }): Promise => { + await printWranglerBanner(); const accountId = await requireAuth(config); const db: Database = await getDatabaseByNameOrBinding( diff --git a/packages/wrangler/src/d1/execute.tsx b/packages/wrangler/src/d1/execute.tsx index 135d4aa8878c..b304ecc6cde9 100644 --- a/packages/wrangler/src/d1/execute.tsx +++ b/packages/wrangler/src/d1/execute.tsx @@ -5,11 +5,12 @@ import { Static, Text } from "ink"; import Table from "ink-table"; import { Miniflare } from "miniflare"; import React from "react"; +import { printWranglerBanner } from "../"; import { fetchResult } from "../cfetch"; import { readConfig } from "../config"; import { getLocalPersistencePath } from "../dev/get-local-persistence-path"; import { confirm } from "../dialogs"; -import { UserError } from "../errors"; +import { JsonFriendlyFatalError, UserError } from "../errors"; import { logger } from "../logger"; import { readFileSync } from "../parse"; import { readableRelative } from "../paths"; @@ -98,56 +99,70 @@ export const Handler = async (args: HandlerOptions): Promise => { // set loggerLevel to error to avoid readConfig warnings appearing in JSON output logger.loggerLevel = "error"; } + await printWranglerBanner(); const config = readConfig(args.config, args); if (file && command) return logger.error(`Error: can't provide both --command and --file.`); const isInteractive = process.stdout.isTTY; - const response: QueryResult[] | null = await executeSql({ - local, - config, - name: database, - shouldPrompt: isInteractive && !yes, - persistTo, - file, - command, - json, - preview, - batchSize, - }); + try { + const response: QueryResult[] | null = await executeSql({ + local, + config, + name: database, + shouldPrompt: isInteractive && !yes, + persistTo, + file, + command, + json, + preview, + batchSize, + }); - // Early exit if prompt rejected - if (!response) return; + // Early exit if prompt rejected + if (!response) return; - if (isInteractive && !json) { - // Render table if single result - logger.log( - 
renderToString( - <Static items={response}> - {(result) => { - // batch results - if (!Array.isArray(result)) { - const { results, query } = result; + if (isInteractive && !json) { + // Render table if single result + logger.log( + renderToString( + <Static items={response}> + {(result) => { + // batch results + if (!Array.isArray(result)) { + const { results, query } = result; - if (Array.isArray(results) && results.length > 0) { - const shortQuery = shorten(query, 48); - return ( - <> - {shortQuery ? <Text dimColor>{shortQuery}</Text> : null} - <Table data={results}></Table>
- </> - ); + if (Array.isArray(results) && results.length > 0) { + const shortQuery = shorten(query, 48); + return ( + <> + {shortQuery ? <Text dimColor>{shortQuery}</Text> : null} + <Table data={results}></Table>
+ </> + ); + } } - }} - </Static>
- ) - ); - } else { - // set loggerLevel back to what it was before to actually output the JSON in stdout - logger.loggerLevel = existingLogLevel; - logger.log(JSON.stringify(response, null, 2)); + }} + </Static>
+ ) + ); + } else { + // set loggerLevel back to what it was before to actually output the JSON in stdout + logger.loggerLevel = existingLogLevel; + logger.log(JSON.stringify(response, null, 2)); + } + } catch (error) { + if (json && error instanceof Error) { + logger.loggerLevel = existingLogLevel; + const messageToDisplay = + error.name === "APIError" ? error : { text: error.message }; + throw new JsonFriendlyFatalError( + JSON.stringify({ error: messageToDisplay }, null, 2) + ); + } else { + throw error; + } } }; diff --git a/packages/wrangler/src/d1/index.ts b/packages/wrangler/src/d1/index.ts index 9cc9cc7af110..c1b32059b8d1 100644 --- a/packages/wrangler/src/d1/index.ts +++ b/packages/wrangler/src/d1/index.ts @@ -3,6 +3,7 @@ import * as Create from "./create"; import * as Delete from "./delete"; import * as Execute from "./execute"; import * as Info from "./info"; +import * as Insights from "./insights"; import * as List from "./list"; import * as Migrations from "./migrations"; import * as TimeTravel from "./timeTravel"; @@ -19,6 +20,12 @@ export function d1(yargs: CommonYargsArgv) { Info.Options, Info.Handler ) + .command( + "insights ", + "Experimental command. Get information about the queries run on a D1 database.", + Insights.Options, + Insights.Handler + ) .command( "create ", "Create D1 database", diff --git a/packages/wrangler/src/d1/info.tsx b/packages/wrangler/src/d1/info.tsx index 2f79902f6410..cdaf76db6cbd 100644 --- a/packages/wrangler/src/d1/info.tsx +++ b/packages/wrangler/src/d1/info.tsx @@ -1,6 +1,7 @@ import Table from "ink-table"; import prettyBytes from "pretty-bytes"; import React from "react"; +import { printWranglerBanner } from ".."; import { fetchGraphqlResult } from "../cfetch"; import { withConfig } from "../config"; import { logger } from "../logger"; @@ -49,7 +50,7 @@ export const Handler = withConfig( output["database_size"] = output["file_size"]; delete output["file_size"]; } - if (result.version === "beta") { + if (result.version !== "alpha") { const today = new Date(); const yesterday = new Date(new Date(today).setDate(today.getDate() - 1)); @@ -118,7 +119,6 @@ export const Handler = withConfig( logger.log(JSON.stringify(output, null, 2)); } else { // Snip off the "uuid" property from the response and use those as the header - const entries = Object.entries(output).filter(([k, _v]) => k !== "uuid"); const data = entries.map(([k, v]) => { let value; @@ -140,6 +140,7 @@ export const Handler = withConfig( }; }); + await printWranglerBanner(); logger.log(renderToString()); } } diff --git a/packages/wrangler/src/d1/insights.ts b/packages/wrangler/src/d1/insights.ts new file mode 100644 index 000000000000..9692739bd803 --- /dev/null +++ b/packages/wrangler/src/d1/insights.ts @@ -0,0 +1,170 @@ +import { printWranglerBanner } from ".."; +import { fetchGraphqlResult } from "../cfetch"; +import { withConfig } from "../config"; +import { logger } from "../logger"; +import { requireAuth } from "../user"; +import { + d1BetaWarning, + getDatabaseByNameOrBinding, + getDatabaseInfoFromId, +} from "./utils"; +import type { + CommonYargsArgv, + StrictYargsOptionsToInterface, +} from "../yargs-types"; +import type { D1QueriesGraphQLResponse, Database } from "./types"; + +export function Options(d1ListYargs: CommonYargsArgv) { + return d1ListYargs + .positional("name", { + describe: "The name of the DB", + type: "string", + demandOption: true, + }) + .option("timePeriod", { + choices: ["1d", "7d", "31d"] as const, + describe: "Fetch data from now to the provided time 
period", + default: "1d" as const, + }) + .option("sort-type", { + choices: ["sum", "avg"] as const, + describe: "Choose the operation you want to sort insights by", + default: "sum" as const, + }) + .option("sort-by", { + choices: ["time", "reads", "writes", "count"] as const, + describe: "Choose the field you want to sort insights by", + default: "time" as const, + }) + .option("sort-direction", { + choices: ["ASC", "DESC"] as const, + describe: "Choose a sort direction", + default: "DESC" as const, + }) + .option("count", { + describe: "fetch insights about the first X queries", + type: "number", + default: 5, + }) + .option("json", { + describe: "return output as clean JSON", + type: "boolean", + default: false, + }) + .epilogue(d1BetaWarning); +} + +const cliOptionToGraphQLOption = { + time: "queryDurationMs", + reads: "rowsRead", + writes: "rowsWritten", + count: "count", +}; + +type HandlerOptions = StrictYargsOptionsToInterface; +export const Handler = withConfig( + async ({ + name, + config, + json, + count, + timePeriod, + sortType, + sortBy, + sortDirection, + }): Promise => { + const accountId = await requireAuth(config); + const db: Database = await getDatabaseByNameOrBinding( + config, + accountId, + name + ); + + const result = await getDatabaseInfoFromId(accountId, db.uuid); + + const output: Record[] = []; + + if (result.version !== "alpha") { + const convertedTimePeriod = Number(timePeriod.replace("d", "")); + const endDate = new Date(); + const startDate = new Date( + new Date(endDate).setDate(endDate.getDate() - convertedTimePeriod) + ); + const parsedSortBy = cliOptionToGraphQLOption[sortBy]; + const orderByClause = + parsedSortBy === "count" + ? `${parsedSortBy}_${sortDirection}` + : `${sortType}_${parsedSortBy}_${sortDirection}`; + const graphqlQueriesResult = + await fetchGraphqlResult({ + method: "POST", + body: JSON.stringify({ + query: `query getD1QueriesOverviewQuery($accountTag: string, $filter: ZoneWorkersRequestsFilter_InputObject) { + viewer { + accounts(filter: {accountTag: $accountTag}) { + d1QueriesAdaptiveGroups(limit: ${count}, filter: $filter, orderBy: [${orderByClause}]) { + sum { + queryDurationMs + rowsRead + rowsWritten + } + avg { + queryDurationMs + rowsRead + rowsWritten + } + count + dimensions { + query + } + } + } + } + }`, + operationName: "getD1QueriesOverviewQuery", + variables: { + accountTag: accountId, + filter: { + AND: [ + { + datetimeHour_geq: startDate.toISOString(), + datetimeHour_leq: endDate.toISOString(), + databaseId: db.uuid, + }, + ], + }, + }, + }), + headers: { + "Content-Type": "application/json", + }, + }); + + graphqlQueriesResult?.data?.viewer?.accounts[0]?.d1QueriesAdaptiveGroups?.forEach( + (row) => { + if (!row.dimensions.query) return; + output.push({ + query: row.dimensions.query, + avgRowsRead: row?.avg?.rowsRead ?? 0, + totalRowsRead: row?.sum?.rowsRead ?? 0, + avgRowsWritten: row?.avg?.rowsWritten ?? 0, + totalRowsWritten: row?.sum?.rowsWritten ?? 0, + avgDurationMs: row?.avg?.queryDurationMs ?? 0, + totalDurationMs: row?.sum?.queryDurationMs ?? 0, + numberOfTimesRun: row?.count ?? 
0, + }); + } + ); + } + + if (json) { + logger.log(JSON.stringify(output, null, 2)); + } else { + await printWranglerBanner(); + logger.log( + "-------------------\n🚧 `wrangler d1 insights` is an experimental command.\n🚧 Flags for this command, their descriptions, and output may change between wrangler versions.\n-------------------\n" + ); + logger.log(JSON.stringify(output, null, 2)); + } + } +); diff --git a/packages/wrangler/src/d1/list.tsx b/packages/wrangler/src/d1/list.tsx index f82fb26c17fa..f08db1fe53e0 100644 --- a/packages/wrangler/src/d1/list.tsx +++ b/packages/wrangler/src/d1/list.tsx @@ -1,5 +1,6 @@ import Table from "ink-table"; import React from "react"; +import { printWranglerBanner } from ".."; import { fetchResult } from "../cfetch"; import { withConfig } from "../config"; import { logger } from "../logger"; @@ -31,6 +32,7 @@ export const Handler = withConfig( if (json) { logger.log(JSON.stringify(dbs, null, 2)); } else { + await printWranglerBanner(); logger.log(renderToString(
)); } } diff --git a/packages/wrangler/src/d1/migrations/apply.tsx b/packages/wrangler/src/d1/migrations/apply.tsx index 3e0f60148cfd..f91ba6eec47d 100644 --- a/packages/wrangler/src/d1/migrations/apply.tsx +++ b/packages/wrangler/src/d1/migrations/apply.tsx @@ -4,6 +4,7 @@ import path from "path"; import { Box, Text } from "ink"; import Table from "ink-table"; import React from "react"; +import { printWranglerBanner } from "../.."; import { withConfig } from "../../config"; import { confirm } from "../../dialogs"; import { UserError } from "../../errors"; @@ -51,6 +52,7 @@ export const ApplyHandler = withConfig( preview, batchSize, }): Promise => { + await printWranglerBanner(); const databaseInfo = getDatabaseInfoFromConfig(config, database); if (!databaseInfo && !local) { throw new UserError( diff --git a/packages/wrangler/src/d1/migrations/create.tsx b/packages/wrangler/src/d1/migrations/create.tsx index 5553e327fc11..1cb9f3ee190b 100644 --- a/packages/wrangler/src/d1/migrations/create.tsx +++ b/packages/wrangler/src/d1/migrations/create.tsx @@ -2,6 +2,7 @@ import fs from "node:fs"; import path from "path"; import { Box, Text } from "ink"; import React from "react"; +import { printWranglerBanner } from "../.."; import { withConfig } from "../../config"; import { UserError } from "../../errors"; import { logger } from "../../logger"; @@ -27,6 +28,7 @@ type CreateHandlerOptions = StrictYargsOptionsToInterface; export const CreateHandler = withConfig( async ({ config, database, message }): Promise => { + await printWranglerBanner(); const databaseInfo = getDatabaseInfoFromConfig(config, database); if (!databaseInfo) { throw new UserError( diff --git a/packages/wrangler/src/d1/migrations/list.tsx b/packages/wrangler/src/d1/migrations/list.tsx index bfa3de41dee5..ea88d528c2d0 100644 --- a/packages/wrangler/src/d1/migrations/list.tsx +++ b/packages/wrangler/src/d1/migrations/list.tsx @@ -2,6 +2,7 @@ import path from "path"; import { Box, Text } from "ink"; import Table from "ink-table"; import React from "react"; +import { printWranglerBanner } from "../.."; import { withConfig } from "../../config"; import { UserError } from "../../errors"; import { logger } from "../../logger"; @@ -28,6 +29,7 @@ type ListHandlerOptions = StrictYargsOptionsToInterface; export const ListHandler = withConfig( async ({ config, database, local, persistTo, preview }): Promise => { + await printWranglerBanner(); if (!local) { await requireAuth({}); } diff --git a/packages/wrangler/src/d1/types.ts b/packages/wrangler/src/d1/types.ts index 3c40045f3a45..7d76d3c2a4fa 100644 --- a/packages/wrangler/src/d1/types.ts +++ b/packages/wrangler/src/d1/types.ts @@ -72,3 +72,35 @@ export interface D1MetricsGraphQLResponse { }; }; } + +export interface D1Queries { + avg?: { + queryDurationMs?: number; + rowsRead?: number; + rowsWritten?: number; + }; + sum?: { + queryDurationMs?: number; + rowsRead?: number; + rowsWritten?: number; + }; + count?: number; + dimensions: { + query?: string; + databaseId?: string; + date?: string; + datetime?: string; + datetimeMinute?: string; + datetimeFiveMinutes?: string; + datetimeFifteenMinutes?: string; + datetimeHour?: string; + }; +} + +export interface D1QueriesGraphQLResponse { + data: { + viewer: { + accounts: { d1QueriesAdaptiveGroups?: D1Queries[] }[]; + }; + }; +} diff --git a/packages/wrangler/src/deploy/deploy.ts b/packages/wrangler/src/deploy/deploy.ts index 41588459fdde..35995e76681d 100644 --- a/packages/wrangler/src/deploy/deploy.ts +++ b/packages/wrangler/src/deploy/deploy.ts 
@@ -26,6 +26,7 @@ import { getMigrationsToUpload } from "../durable"; import { UserError } from "../errors"; import { logger } from "../logger"; import { getMetricsUsageHeaders } from "../metrics"; +import { isNavigatorDefined } from "../navigator-user-agent"; import { APIError, ParseError } from "../parse"; import { getWranglerTmpDir } from "../paths"; import { getQueue, putConsumer } from "../queues/client"; @@ -46,7 +47,11 @@ import type { ZoneNameRoute, } from "../config/environment"; import type { Entry } from "../deployment-bundle/entry"; -import type { CfPlacement, CfWorkerInit } from "../deployment-bundle/worker"; +import type { + CfModuleType, + CfPlacement, + CfWorkerInit, +} from "../deployment-bundle/worker"; import type { PutConsumerBody } from "../queues/client"; import type { AssetPaths } from "../sites"; import type { RetrieveSourceMapFunction } from "../sourcemap"; @@ -516,6 +521,10 @@ See https://developers.cloudflare.com/workers/platform/compatibility-dates for m targetConsumer: "deploy", local: false, projectRoot: props.projectRoot, + defineNavigatorUserAgent: isNavigatorDefined( + props.compatibilityDate ?? config.compatibility_date, + props.compatibilityFlags ?? config.compatibility_flags + ), } ); @@ -532,6 +541,19 @@ See https://developers.cloudflare.com/workers/platform/compatibility-dates for m dependencies[modulePath] = { bytesInOutput }; } + // Add modules to dependencies for size warning + for (const module of modules) { + const modulePath = + module.filePath === undefined + ? module.name + : path.relative("", module.filePath); + const bytesInOutput = + typeof module.content === "string" + ? Buffer.byteLength(module.content) + : module.content.byteLength; + dependencies[modulePath] = { bytesInOutput }; + } + const content = readFileSync(resolvedEntryPointPath, { encoding: "utf-8", }); @@ -616,7 +638,7 @@ See https://developers.cloudflare.com/workers/platform/compatibility-dates for m const worker: CfWorkerInit = { name: scriptName, main: { - name: entryPointName, + name: stripPySuffix(entryPointName, bundleType), filePath: resolvedEntryPointPath, content: content, type: bundleType, @@ -719,6 +741,15 @@ See https://developers.cloudflare.com/workers/platform/compatibility-dates for m ) { err.preventReport(); + if ( + err.notes[0].text === + "binding DB of type d1 must have a valid `id` specified [code: 10021]" + ) { + throw new UserError( + "You must use a real database in the database_id configuration. You can find your databases using 'wrangler d1 list', or read how to develop locally with D1 here: https://developers.cloudflare.com/d1/configuration/local-development" + ); + } + const maybeNameToFilePath = (moduleName: string) => { // If this is a service worker, always return the entrypoint path. // Service workers can't have additional JavaScript modules. 
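`isNavigatorDefined`, imported at the top of this file (and also used by `pages/deploy.tsx` and `dev/dev.tsx` elsewhere in this patch), is the helper whose behaviour is pinned down by `navigator-user-agent.test.ts` above. A minimal sketch consistent with those tests; the shipped implementation may differ in detail:

	import assert from "node:assert";

	// Compatibility date on/after which `navigator` is defined by default
	// (the "switch date" exercised by the tests: 2022-03-21).
	const SWITCH_DATE = "2022-03-21";

	export function isNavigatorDefined(
		compatibilityDate: string | undefined,
		compatibilityFlags: string[] = []
	): boolean {
		assert(
			!(
				compatibilityFlags.includes("global_navigator") &&
				compatibilityFlags.includes("no_global_navigator")
			),
			"Can't both enable and disable a flag"
		);
		if (compatibilityFlags.includes("global_navigator")) return true;
		if (compatibilityFlags.includes("no_global_navigator")) return false;
		// ISO dates compare correctly as plain strings, so a lexicographic
		// comparison against the switch date is enough.
		return compatibilityDate !== undefined && compatibilityDate >= SWITCH_DATE;
	}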
@@ -1142,6 +1173,15 @@ function updateQueueConsumers(config: Config): Promise[] { }); } +// TODO(soon): workerd requires python modules to be named without a file extension +// We should remove this restriction +function stripPySuffix(modulePath: string, type?: CfModuleType) { + if (type === "python" && modulePath.endsWith(".py")) { + return modulePath.slice(0, -3); + } + return modulePath; +} + async function noBundleWorker( entry: Entry, rules: Rule[], @@ -1152,10 +1192,14 @@ async function noBundleWorker( await writeAdditionalModules(modules, outDir); } + const bundleType = getBundleType(entry.format, entry.file); return { - modules, + modules: modules.map((m) => ({ + ...m, + name: stripPySuffix(m.name, m.type), + })), dependencies: {} as { [path: string]: { bytesInOutput: number } }, resolvedEntryPointPath: entry.file, - bundleType: getBundleType(entry.format), + bundleType, }; } diff --git a/packages/wrangler/src/deploy/index.ts b/packages/wrangler/src/deploy/index.ts index ae21849d7a45..cae3af6dcc81 100644 --- a/packages/wrangler/src/deploy/index.ts +++ b/packages/wrangler/src/deploy/index.ts @@ -293,7 +293,9 @@ export async function deployHandler( args.siteInclude, args.siteExclude ); - await standardPricingWarning(accountId, config); + + if (!args.dryRun) await standardPricingWarning(accountId, config); + await deploy({ config, accountId, diff --git a/packages/wrangler/src/deployment-bundle/bundle-type.ts b/packages/wrangler/src/deployment-bundle/bundle-type.ts index c261debc017c..f9923d08c0c0 100644 --- a/packages/wrangler/src/deployment-bundle/bundle-type.ts +++ b/packages/wrangler/src/deployment-bundle/bundle-type.ts @@ -1,8 +1,14 @@ import type { CfModuleType, CfScriptFormat } from "./worker"; /** - * Compute the entry-point type from the bundle format. + * Compute the entry-point module type from the bundle format. */ -export function getBundleType(format: CfScriptFormat): CfModuleType { +export function getBundleType( + format: CfScriptFormat, + file?: string +): CfModuleType { + if (file && file.endsWith(".py")) { + return "python"; + } return format === "modules" ? "esm" : "commonjs"; } diff --git a/packages/wrangler/src/deployment-bundle/bundle.ts b/packages/wrangler/src/deployment-bundle/bundle.ts index 66c5b87065da..c3797f66ebb7 100644 --- a/packages/wrangler/src/deployment-bundle/bundle.ts +++ b/packages/wrangler/src/deployment-bundle/bundle.ts @@ -86,6 +86,7 @@ export type BundleOptions = { forPages?: boolean; local: boolean; projectRoot: string | undefined; + defineNavigatorUserAgent: boolean; }; /** @@ -124,6 +125,7 @@ export async function bundleWorker( forPages, local, projectRoot, + defineNavigatorUserAgent, }: BundleOptions ): Promise { // We create a temporary directory for any one-off files we @@ -312,6 +314,9 @@ export async function bundleWorker( conditions: BUILD_CONDITIONS, ...(process.env.NODE_ENV && { define: { + ...(defineNavigatorUserAgent + ? { "navigator.userAgent": `"Cloudflare-Workers"` } + : {}), // use process.env["NODE_ENV" + ""] so that esbuild doesn't replace it // when we do a build of wrangler. (re: https://github.com/cloudflare/workers-sdk/issues/1477) "process.env.NODE_ENV": `"${process.env["NODE_ENV" + ""]}"`, @@ -328,7 +333,7 @@ export async function bundleWorker( ...(legacyNodeCompat ? [NodeGlobalsPolyfills({ buffer: true }), NodeModulesPolyfills()] : []), - ...(nodejsCompat ? 
[nodejsCompatPlugin] : []), + nodejsCompatPlugin(!!nodejsCompat), cloudflareInternalPlugin, buildResultPlugin, ...(plugins || []), diff --git a/packages/wrangler/src/deployment-bundle/create-worker-upload-form.ts b/packages/wrangler/src/deployment-bundle/create-worker-upload-form.ts index 554b82dc06e8..ee5dc6c5a70b 100644 --- a/packages/wrangler/src/deployment-bundle/create-worker-upload-form.ts +++ b/packages/wrangler/src/deployment-bundle/create-worker-upload-form.ts @@ -25,6 +25,10 @@ export function toMimeType(type: CfModuleType): string { return "application/octet-stream"; case "text": return "text/plain"; + case "python": + return "text/x-python"; + case "python-requirement": + return "text/x-python-requirement"; default: throw new TypeError("Unsupported module: " + type); } diff --git a/packages/wrangler/src/deployment-bundle/esbuild-plugins/nodejs-compat.ts b/packages/wrangler/src/deployment-bundle/esbuild-plugins/nodejs-compat.ts index a7587f453713..3b6b2094f572 100644 --- a/packages/wrangler/src/deployment-bundle/esbuild-plugins/nodejs-compat.ts +++ b/packages/wrangler/src/deployment-bundle/esbuild-plugins/nodejs-compat.ts @@ -1,13 +1,76 @@ +import { relative } from "path"; +import chalk from "chalk"; +import { logger } from "../../logger"; import type { Plugin } from "esbuild"; +// Infinite loop detection +const seen = new Set(); + +// Prevent multiple warnings per package +const warnedPackaged = new Map(); + /** * An esbuild plugin that will mark any `node:...` imports as external. */ -export const nodejsCompatPlugin: Plugin = { +export const nodejsCompatPlugin: (silenceWarnings: boolean) => Plugin = ( + silenceWarnings +) => ({ name: "nodejs_compat imports plugin", setup(pluginBuild) { - pluginBuild.onResolve({ filter: /node:.*/ }, () => { - return { external: true }; + seen.clear(); + warnedPackaged.clear(); + pluginBuild.onResolve( + { filter: /node:.*/ }, + async ({ path, kind, resolveDir, ...opts }) => { + const specifier = `${path}:${kind}:${resolveDir}:${opts.importer}`; + if (seen.has(specifier)) { + return; + } + + seen.add(specifier); + // Try to resolve this import as a normal package + const result = await pluginBuild.resolve(path, { + kind, + resolveDir, + importer: opts.importer, + }); + + if (result.errors.length > 0) { + // esbuild couldn't resolve the package + // We should warn the user, but not fail the build + + if (!warnedPackaged.has(path)) { + warnedPackaged.set(path, [opts.importer]); + } else { + warnedPackaged.set(path, [ + ...warnedPackaged.get(path), + opts.importer, + ]); + } + return { external: true }; + } + // This is a normal package—don't treat it specially + return result; + } + ); + // Wait until the build finishes to log warnings, so that all files which import a package + // can be collated + pluginBuild.onEnd(() => { + if (!silenceWarnings) + warnedPackaged.forEach((importers: string[], path: string) => { + logger.warn( + `The package "${path}" wasn't found on the file system but is built into node. +Your Worker may throw errors at runtime unless you enable the "nodejs_compat" compatibility flag. Refer to https://developers.cloudflare.com/workers/runtime-apis/nodejs/ for more details. Imported from: +${importers + .map( + (i) => + ` - ${chalk.blue( + relative(pluginBuild.initialOptions.absWorkingDir ?? 
"/", i) + )}` + ) + .join("\n")}` + ); + }); }); }, -}; +}); diff --git a/packages/wrangler/src/deployment-bundle/find-additional-modules.ts b/packages/wrangler/src/deployment-bundle/find-additional-modules.ts index 5259bbff4cdd..a00e69903668 100644 --- a/packages/wrangler/src/deployment-bundle/find-additional-modules.ts +++ b/packages/wrangler/src/deployment-bundle/find-additional-modules.ts @@ -4,6 +4,7 @@ import chalk from "chalk"; import globToRegExp from "glob-to-regexp"; import { UserError } from "../errors"; import { logger } from "../logger"; +import { getBundleType } from "./bundle-type"; import { RuleTypeToModuleType } from "./module-collection"; import { parseRules } from "./rules"; import type { Rule } from "../config/environment"; @@ -49,6 +50,36 @@ export async function findAdditionalModules( name: m.name, })); + // Try to find a requirements.txt file + const isPythonEntrypoint = + getBundleType(entry.format, entry.file) === "python"; + + if (isPythonEntrypoint) { + try { + const pythonRequirements = await readFile( + path.resolve(entry.directory, "requirements.txt"), + "utf-8" + ); + + // This is incredibly naive. However, it supports common syntax for requirements.txt + for (const requirement of pythonRequirements.split("\n")) { + const packageName = requirement.match(/^[^\d\W]\w*/); + if (typeof packageName?.[0] === "string") { + modules.push({ + type: "python-requirement", + name: packageName?.[0], + content: "", + filePath: undefined, + }); + } + } + // We don't care if a requirements.txt isn't found + } catch (e) { + logger.debug( + "Python entrypoint detected, but no requirements.txt file found." + ); + } + } if (modules.length > 0) { logger.info(`Attaching additional modules:`); logger.table( @@ -56,7 +87,10 @@ export async function findAdditionalModules( return { Name: name, Type: type ?? "", - Size: `${(content.length / 1024).toFixed(2)} KiB`, + Size: + type === "python-requirement" + ? 
"" + : `${(content.length / 1024).toFixed(2)} KiB`, }; }) ); diff --git a/packages/wrangler/src/deployment-bundle/guess-worker-format.ts b/packages/wrangler/src/deployment-bundle/guess-worker-format.ts index 3efc0df4ac42..a2f81419d319 100644 --- a/packages/wrangler/src/deployment-bundle/guess-worker-format.ts +++ b/packages/wrangler/src/deployment-bundle/guess-worker-format.ts @@ -20,6 +20,17 @@ export default async function guessWorkerFormat( hint: CfScriptFormat | undefined, tsconfig?: string | undefined ): Promise { + const parsedEntryPath = path.parse(entryFile); + if (parsedEntryPath.ext == ".py") { + logger.warn( + `The entrypoint ${path.relative( + process.cwd(), + entryFile + )} defines a Python worker, support for Python workers is currently experimental.` + ); + return "modules"; + } + const result = await esbuild.build({ ...COMMON_ESBUILD_OPTIONS, entryPoints: [entryFile], diff --git a/packages/wrangler/src/deployment-bundle/module-collection.ts b/packages/wrangler/src/deployment-bundle/module-collection.ts index c8ee2905b8f2..97218dee50d2 100644 --- a/packages/wrangler/src/deployment-bundle/module-collection.ts +++ b/packages/wrangler/src/deployment-bundle/module-collection.ts @@ -32,6 +32,8 @@ export const RuleTypeToModuleType: Record = CompiledWasm: "compiled-wasm", Data: "buffer", Text: "text", + PythonModule: "python", + PythonRequirement: "python-requirement", }; export const ModuleTypeToRuleType = flipObject(RuleTypeToModuleType); diff --git a/packages/wrangler/src/deployment-bundle/worker.ts b/packages/wrangler/src/deployment-bundle/worker.ts index 3c10dfa08729..2cc6b7b506ce 100644 --- a/packages/wrangler/src/deployment-bundle/worker.ts +++ b/packages/wrangler/src/deployment-bundle/worker.ts @@ -14,7 +14,9 @@ export type CfModuleType = | "commonjs" | "compiled-wasm" | "text" - | "buffer"; + | "buffer" + | "python" + | "python-requirement"; /** * An imported module. diff --git a/packages/wrangler/src/dev.tsx b/packages/wrangler/src/dev.tsx index b6f4859d370f..62d41c73726f 100644 --- a/packages/wrangler/src/dev.tsx +++ b/packages/wrangler/src/dev.tsx @@ -626,12 +626,16 @@ export async function startApiDev(args: StartDevOptions) { }; } /** + * Get an available TCP port number. + * * Avoiding calling `getPort()` multiple times by memoizing the first result. */ -function memoizeGetPort(defaultPort?: number) { +function memoizeGetPort(defaultPort: number, host: string) { let portValue: number; return async () => { - return portValue || (portValue = await getPort({ port: defaultPort })); + // Check a specific host to avoid probing all local addresses. + portValue = portValue ?? (await getPort({ port: defaultPort, host: host })); + return portValue; }; } /** @@ -705,14 +709,16 @@ async function validateDevServerSettings( ); const { zoneId, host, routes } = await getZoneIdHostAndRoutes(args, config); - const getLocalPort = memoizeGetPort(DEFAULT_LOCAL_PORT); - const getInspectorPort = memoizeGetPort(DEFAULT_INSPECTOR_PORT); + const initialIp = args.ip || config.dev.ip; + const initialIpListenCheck = initialIp === "*" ? "0.0.0.0" : initialIp; + const getLocalPort = memoizeGetPort(DEFAULT_LOCAL_PORT, initialIpListenCheck); + const getInspectorPort = memoizeGetPort(DEFAULT_INSPECTOR_PORT, "127.0.0.1"); // Our inspector proxy server will be binding to the result of // `getInspectorPort`. If we attempted to bind workerd to the same inspector // port, we'd get a port already in use error. Therefore, generate a new port // for our runtime to bind its inspector service to. 
- const getRuntimeInspectorPort = memoizeGetPort(); + const getRuntimeInspectorPort = memoizeGetPort(0, "127.0.0.1"); if (config.services && config.services.length > 0) { logger.warn( diff --git a/packages/wrangler/src/dev/dev.tsx b/packages/wrangler/src/dev/dev.tsx index ff6a091f8714..8291ef0bc744 100644 --- a/packages/wrangler/src/dev/dev.tsx +++ b/packages/wrangler/src/dev/dev.tsx @@ -24,6 +24,7 @@ import { unregisterWorker, } from "../dev-registry"; import { logger } from "../logger"; +import { isNavigatorDefined } from "../navigator-user-agent"; import openInBrowser from "../open-in-browser"; import { getWranglerTmpDir } from "../paths"; import { openInspector } from "./inspect"; @@ -354,6 +355,10 @@ function DevSession(props: DevSessionProps) { experimentalLocal: props.experimentalLocal, projectRoot: props.projectRoot, onBundleStart, + defineNavigatorUserAgent: isNavigatorDefined( + props.compatibilityDate, + props.compatibilityFlags + ), }); useEffect(() => { if (bundle) onReloadStart(bundle); diff --git a/packages/wrangler/src/dev/miniflare.ts b/packages/wrangler/src/dev/miniflare.ts index 3fb8f577e834..54dd9c08b37d 100644 --- a/packages/wrangler/src/dev/miniflare.ts +++ b/packages/wrangler/src/dev/miniflare.ts @@ -1,6 +1,6 @@ import assert from "node:assert"; import { randomUUID } from "node:crypto"; -import { realpathSync } from "node:fs"; +import { readFileSync, realpathSync } from "node:fs"; import path from "node:path"; import { Log, LogLevel, Miniflare, Mutex, TypedEventTarget } from "miniflare"; import { AIFetcher } from "../ai/fetcher"; @@ -16,6 +16,7 @@ import type { CfDurableObject, CfHyperdrive, CfKvNamespace, + CfModuleType, CfQueue, CfR2Bucket, CfScriptFormat, @@ -173,29 +174,46 @@ function buildLog(): Log { return new WranglerLog(level, { prefix: "wrangler-UserWorker" }); } +// TODO(soon): workerd requires python modules to be named without a file extension +// We should remove this restriction +function stripPySuffix(modulePath: string, type?: CfModuleType) { + if (type === "python" && modulePath.endsWith(".py")) { + return modulePath.slice(0, -3); + } + return modulePath; +} + async function buildSourceOptions( config: ConfigBundle ): Promise { const scriptPath = realpathSync(config.bundle.path); if (config.format === "modules") { const modulesRoot = path.dirname(scriptPath); - const { entrypointSource, modules } = withSourceURLs( - scriptPath, - config.bundle.modules - ); + const { entrypointSource, modules } = + config.bundle.type === "python" + ? { + entrypointSource: readFileSync(scriptPath, "utf8"), + modules: config.bundle.modules, + } + : withSourceURLs(scriptPath, config.bundle.modules); + return { modulesRoot, + modules: [ // Entrypoint { - type: "ESModule", - path: scriptPath, + type: ModuleTypeToRuleType[config.bundle.type], + path: stripPySuffix(scriptPath, config.bundle.type), contents: entrypointSource, }, // Misc (WebAssembly, etc, ...) ...modules.map((module) => ({ type: ModuleTypeToRuleType[module.type ?? 
"esm"], - path: path.resolve(modulesRoot, module.name), + path: stripPySuffix( + path.resolve(modulesRoot, module.name), + module.type + ), contents: module.content, })), ], diff --git a/packages/wrangler/src/dev/proxy.ts b/packages/wrangler/src/dev/proxy.ts index fe9fa3f8c633..ad3207e23c6d 100644 --- a/packages/wrangler/src/dev/proxy.ts +++ b/packages/wrangler/src/dev/proxy.ts @@ -154,7 +154,7 @@ export async function startPreviewServer({ accessTokenRef, }); - await waitForPortToBeAvailable(port, { + await waitForPortToBeAvailable(port, ip, { retryPeriod: 200, timeout: 2000, abortSignal: abortController.signal, @@ -295,7 +295,7 @@ export function usePreviewServer({ return; } - waitForPortToBeAvailable(port, { + waitForPortToBeAvailable(port, ip, { retryPeriod: 200, timeout: 2000, abortSignal: abortController.signal, @@ -636,9 +636,13 @@ function createStreamHandler( */ export async function waitForPortToBeAvailable( port: number, + host: string, options: { retryPeriod: number; timeout: number; abortSignal: AbortSignal } ): Promise { return new Promise((resolve, reject) => { + if (host === "*") { + host = "0.0.0.0"; + } // eslint-disable-next-line @typescript-eslint/no-explicit-any options.abortSignal.addEventListener("abort", () => { const abortError = new Error("waitForPortToBeAvailable() aborted"); @@ -686,7 +690,7 @@ export async function waitForPortToBeAvailable( doReject(err); } }); - server.listen(port, () => + server.listen(port, host, () => terminator .terminate() .then(doResolve, () => diff --git a/packages/wrangler/src/dev/remote.tsx b/packages/wrangler/src/dev/remote.tsx index ac3d36d55f1c..7cfa0dc0b7a0 100644 --- a/packages/wrangler/src/dev/remote.tsx +++ b/packages/wrangler/src/dev/remote.tsx @@ -29,6 +29,7 @@ import type { CfWorkerContext, CfWorkerInit, } from "../deployment-bundle/worker"; +import type { ParseError } from "../parse"; import type { AssetPaths } from "../sites"; import type { ChooseAccountItem } from "../user"; import type { @@ -336,27 +337,18 @@ export function useWorker( start().catch((err) => { // we want to log the error, but not end the process // since it could recover after the developer fixes whatever's wrong - if ((err as { code: string }).code !== "ABORT_ERR") { - // instead of logging the raw API error to the user, - // give them friendly instructions - // for error 10063 (workers.dev subdomain required) - if (err.code === 10063) { - const errorMessage = - "Error: You need to register a workers.dev subdomain before running the dev command in remote mode"; - const solutionMessage = - "You can either enable local mode by pressing l, or register a workers.dev subdomain here:"; - const onboardingLink = `https://dash.cloudflare.com/${props.accountId}/workers/onboarding`; - logger.error( - `${errorMessage}\n${solutionMessage}\n${onboardingLink}` - ); - } else if (err.code === 10049) { + // instead of logging the raw API error to the user, + // give them friendly instructions + if ((err as unknown as { code: string }).code !== "ABORT_ERR") { + // code 10049 happens when the preview token expires + if (err.code === 10049) { logger.log("Preview token expired, fetching a new one"); - // code 10049 happens when the preview token expires + // since we want a new preview token when this happens, // lets increment the counter, and trigger a rerun of // the useEffect above setRestartCounter((prevCount) => prevCount + 1); - } else { + } else if (!handleUserFriendlyError(err, props.accountId)) { logger.error("Error on remote worker:", err); } } @@ -525,21 +517,15 
@@ export async function getRemotePreviewToken(props: RemoteProps) { return workerPreviewToken; } return start().catch((err) => { - if ((err as { code?: string })?.code !== "ABORT_ERR") { - // instead of logging the raw API error to the user, - // give them friendly instructions - // for error 10063 (workers.dev subdomain required) - if (err?.code === 10063) { - const errorMessage = - "Error: You need to register a workers.dev subdomain before running the dev command in remote mode"; - const solutionMessage = - "You can either enable local mode by pressing l, or register a workers.dev subdomain here:"; - const onboardingLink = `https://dash.cloudflare.com/${props.accountId}/workers/onboarding`; - logger.error(`${errorMessage}\n${solutionMessage}\n${onboardingLink}`); - } else if (err?.code === 10049) { - // code 10049 happens when the preview token expires + // we want to log the error, but not end the process + // since it could recover after the developer fixes whatever's wrong + // instead of logging the raw API error to the user, + // give them friendly instructions + if ((err as unknown as { code: string })?.code !== "ABORT_ERR") { + // code 10049 happens when the preview token expires + if (err.code === 10049) { logger.log("Preview token expired, restart server to fetch a new one"); - } else { + } else if (!handleUserFriendlyError(err, props.accountId)) { helpIfErrorIsSizeOrScriptStartup(err, props.bundle?.dependencies || {}); logger.error("Error on remote worker:", err); } @@ -684,3 +670,55 @@ function ChooseAccount(props: { ); } + +/** + * A switch for handling thrown error mappings to user friendly + * messages, does not perform any logic other than logging errors. + * @returns if the error was handled or not + */ +function handleUserFriendlyError(error: ParseError, accountId?: string) { + switch ((error as unknown as { code: number }).code) { + // code 10021 is a validation error + case 10021: { + // if it is the following message, give a more user friendly + // error, otherwise do not handle this error in this function + if ( + error.notes[0].text === + "binding DB of type d1 must have a valid `id` specified [code: 10021]" + ) { + const errorMessage = + "Error: You must use a real database in the preview_database_id configuration."; + const solutionMessage = + "You can find your databases using 'wrangler d1 list', or read how to develop locally with D1 here:"; + const documentationLink = `https://developers.cloudflare.com/d1/configuration/local-development`; + + logger.error( + `${errorMessage}\n${solutionMessage}\n${documentationLink}` + ); + + return true; + } + + return false; + } + + // for error 10063 (workers.dev subdomain required) + case 10063: { + const errorMessage = + "Error: You need to register a workers.dev subdomain before running the dev command in remote mode"; + const solutionMessage = + "You can either enable local mode by pressing l, or register a workers.dev subdomain here:"; + const onboardingLink = accountId + ? 
`https://dash.cloudflare.com/${accountId}/workers/onboarding` + : "https://dash.cloudflare.com/?to=/:account/workers/onboarding"; + + logger.error(`${errorMessage}\n${solutionMessage}\n${onboardingLink}`); + + return true; + } + + default: { + return false; + } + } +} diff --git a/packages/wrangler/src/dev/start-server.ts b/packages/wrangler/src/dev/start-server.ts index e7c9ef98f1c4..7b7ecbc6e0df 100644 --- a/packages/wrangler/src/dev/start-server.ts +++ b/packages/wrangler/src/dev/start-server.ts @@ -19,6 +19,7 @@ import { stopWorkerRegistry, } from "../dev-registry"; import { logger } from "../logger"; +import { isNavigatorDefined } from "../navigator-user-agent"; import { getWranglerTmpDir } from "../paths"; import { localPropsToConfigBundle, maybeRegisterLocalWorker } from "./local"; import { DEFAULT_WORKER_NAME, MiniflareServer } from "./miniflare"; @@ -139,6 +140,10 @@ export async function startDevServer( local: props.local, doBindings: props.bindings.durable_objects?.bindings ?? [], projectRoot: props.projectRoot, + defineNavigatorUserAgent: isNavigatorDefined( + props.compatibilityDate, + props.compatibilityFlags + ), }); if (props.local) { @@ -157,8 +162,8 @@ export async function startDevServer( compatibilityFlags: props.compatibilityFlags, bindings: props.bindings, assetPaths: props.assetPaths, - initialPort: props.initialPort, - initialIp: props.initialIp, + initialPort: undefined, // hard-code for userworker, DevEnv-ProxyWorker now uses this prop value + initialIp: "127.0.0.1", // hard-code for userworker, DevEnv-ProxyWorker now uses this prop value rules: props.rules, inspectorPort: props.inspectorPort, runtimeInspectorPort: props.runtimeInspectorPort, @@ -290,6 +295,7 @@ async function runEsbuild({ local, doBindings, projectRoot, + defineNavigatorUserAgent, }: { entry: Entry; destination: string; @@ -313,6 +319,7 @@ async function runEsbuild({ local: boolean; doBindings: DurableObjectBindings; projectRoot: string | undefined; + defineNavigatorUserAgent: boolean; }): Promise { if (noBundle) { additionalModules = dedupeModulesByName([ @@ -359,6 +366,7 @@ async function runEsbuild({ testScheduled, doBindings, projectRoot, + defineNavigatorUserAgent, }) : undefined; diff --git a/packages/wrangler/src/dev/use-esbuild.ts b/packages/wrangler/src/dev/use-esbuild.ts index 66aecb26bbdf..8c3f3f5708dd 100644 --- a/packages/wrangler/src/dev/use-esbuild.ts +++ b/packages/wrangler/src/dev/use-esbuild.ts @@ -58,6 +58,7 @@ export function useEsbuild({ experimentalLocal, projectRoot, onBundleStart, + defineNavigatorUserAgent, }: { entry: Entry; destination: string | undefined; @@ -84,6 +85,7 @@ export function useEsbuild({ experimentalLocal: boolean | undefined; projectRoot: string | undefined; onBundleStart: () => void; + defineNavigatorUserAgent: boolean; }): EsbuildBundle | undefined { const [bundle, setBundle] = useState(); const { exit } = useApp(); @@ -190,6 +192,7 @@ export function useEsbuild({ plugins: [onEnd], local, projectRoot, + defineNavigatorUserAgent, }) : undefined; @@ -213,7 +216,8 @@ export function useEsbuild({ id: 0, entry, path: bundleResult?.resolvedEntryPointPath ?? entry.file, - type: bundleResult?.bundleType ?? getBundleType(entry.format), + type: + bundleResult?.bundleType ?? getBundleType(entry.format, entry.file), modules: bundleResult ? bundleResult.modules : newAdditionalModules, dependencies: bundleResult?.dependencies ?? 
{}, sourceMapPath: bundleResult?.sourceMapPath, diff --git a/packages/wrangler/src/errors.ts b/packages/wrangler/src/errors.ts index 85892bb3b7cc..ad73449f562c 100644 --- a/packages/wrangler/src/errors.ts +++ b/packages/wrangler/src/errors.ts @@ -24,3 +24,17 @@ export class FatalError extends UserError { super(message); } } + +/** + * JsonFriendlyFatalError is used to output JSON when wrangler crashes, useful for --json mode. + * + * To use, pass stringify'd json into the constructor like so: + * ```js + * throw new JsonFriendlyFatalError(JSON.stringify({ error: messageToDisplay }); + * ``` + */ +export class JsonFriendlyFatalError extends FatalError { + constructor(message?: string, readonly code?: number) { + super(message); + } +} diff --git a/packages/wrangler/src/index.ts b/packages/wrangler/src/index.ts index 8abe7c16c4ff..b3ce1fcc063e 100644 --- a/packages/wrangler/src/index.ts +++ b/packages/wrangler/src/index.ts @@ -35,12 +35,12 @@ import { import { devHandler, devOptions } from "./dev"; import { workerNamespaceCommands } from "./dispatch-namespace"; import { docsHandler, docsOptions } from "./docs"; -import { UserError } from "./errors"; +import { JsonFriendlyFatalError, UserError } from "./errors"; import { generateHandler, generateOptions } from "./generate"; import { hyperdrive } from "./hyperdrive/index"; import { initHandler, initOptions } from "./init"; import { kvBulk, kvKey, kvNamespace } from "./kv"; -import { logBuildFailure, logger } from "./logger"; +import { logBuildFailure, logger, LOGGER_LEVELS } from "./logger"; import * as metrics from "./metrics"; import { mTlsCertificateCommands } from "./mtls-certificate/cli"; import { pages } from "./pages"; @@ -70,6 +70,7 @@ import { versionsUploadHandler, versionsUploadOptions } from "./versions"; import { whoami } from "./whoami"; import { asJson } from "./yargs-types"; import type { Config } from "./config"; +import type { LoggerLevel } from "./logger"; import type { CommonYargsArgv, CommonYargsOptions } from "./yargs-types"; import type { Arguments, CommandModule } from "yargs"; @@ -223,6 +224,11 @@ export function createCLIParser(argv: string[]) { hidden: true, }) .check((args) => { + // Update logger level, before we do any logging + if (Object.keys(LOGGER_LEVELS).includes(args.logLevel as string)) { + logger.loggerLevel = args.logLevel as LoggerLevel; + } + // Grab locally specified env params from `.env` file const loaded = loadDotEnv(".env", args.env); for (const [key, value] of Object.entries(loaded?.parsed ?? 
{})) { @@ -787,6 +793,8 @@ export async function main(argv: string[]): Promise { text: "\nIf you think this is a bug, please open an issue at: https://github.com/cloudflare/workers-sdk/issues/new/choose", }); logger.log(formatMessage(e)); + } else if (e instanceof JsonFriendlyFatalError) { + logger.log(e.message); } else if ( e instanceof Error && e.message.includes("Raw mode is not supported on") diff --git a/packages/wrangler/src/miniflare-cli/assets.ts b/packages/wrangler/src/miniflare-cli/assets.ts index 1da646944012..3731736f6471 100644 --- a/packages/wrangler/src/miniflare-cli/assets.ts +++ b/packages/wrangler/src/miniflare-cli/assets.ts @@ -1,3 +1,4 @@ +import assert from "node:assert"; import { existsSync, lstatSync, readFileSync } from "node:fs"; import { join, resolve } from "node:path"; import { createMetadataObject } from "@cloudflare/pages-shared/metadata-generator/createMetadataObject"; @@ -6,6 +7,7 @@ import { parseRedirects } from "@cloudflare/pages-shared/metadata-generator/pars import { watch } from "chokidar"; import { getType } from "mime"; import { fetch, Request, Response } from "miniflare"; +import { Dispatcher, getGlobalDispatcher } from "undici"; import { hashFile } from "../pages/hash"; import type { Logger } from "../logger"; import type { Metadata } from "@cloudflare/pages-shared/asset-server/metadata"; @@ -15,6 +17,7 @@ import type { } from "@cloudflare/pages-shared/metadata-generator/types"; import type { Request as WorkersRequest } from "@cloudflare/workers-types/experimental"; import type { RequestInit } from "miniflare"; +import type { IncomingHttpHeaders } from "undici/types/header"; export interface Options { log: Logger; @@ -38,7 +41,9 @@ export default async function generateASSETSBinding(options: Options) { proxyRequest.headers.delete("Sec-WebSocket-Accept"); proxyRequest.headers.delete("Sec-WebSocket-Key"); } - return await fetch(proxyRequest); + return await fetch(proxyRequest, { + dispatcher: new ProxyDispatcher(miniflareRequest.headers.get("Host")), + }); } catch (thrown) { options.log.error(new Error(`Could not proxy request: ${thrown}`)); @@ -63,6 +68,64 @@ export default async function generateASSETSBinding(options: Options) { }; } +/** + * An Undici custom Dispatcher that is used for the fetch requests + * of the Pages dev server proxy. + * + * Notably, the ProxyDispatcher reinstates the Host header that was removed by the + * Undici `fetch` function call. Undici removes the Host header as a security precaution, + * but this is not relevant for an internal Proxy. + * + * The ProxyDispatcher will delegate through to the current global `Dispatcher`, + * ensuring that the request is routed correctly in case a developer has changed the + * global Dispatcher by a call to `setGlobalDispatcher()`. + */ +class ProxyDispatcher extends Dispatcher { + private dispatcher: Dispatcher = getGlobalDispatcher(); + + constructor(private host: string | null) { + super(); + } + + dispatch( + options: Dispatcher.DispatchOptions, + handler: Dispatcher.DispatchHandlers + ): boolean { + if (this.host !== null) { + ProxyDispatcher.reinstateHostHeader(options.headers, this.host); + } + return this.dispatcher.dispatch(options, handler); + } + + close() { + return this.dispatcher.close(); + } + + destroy() { + return this.dispatcher.destroy(); + } + + /** + * Ensure that the request contains a Host header, which would have been deleted + * by the `fetch()` function before calling `dispatcher.dispatch()`. 
+ */ + private static reinstateHostHeader( + headers: string[] | IncomingHttpHeaders | null | undefined, + host: string + ) { + assert(headers, "Expected all proxy requests to contain headers."); + assert( + !Array.isArray(headers), + "Expected proxy request headers to be a hash object" + ); + assert( + Object.keys(headers).every((h) => h.toLowerCase() !== "host"), + "Expected Host header to have been deleted." + ); + headers["Host"] = host; + } +} + async function generateAssetsFetch( directory: string, log: Logger diff --git a/packages/wrangler/src/navigator-user-agent.ts b/packages/wrangler/src/navigator-user-agent.ts new file mode 100644 index 000000000000..b4f15a0d451a --- /dev/null +++ b/packages/wrangler/src/navigator-user-agent.ts @@ -0,0 +1,21 @@ +import assert from "node:assert"; + +export function isNavigatorDefined( + compatibility_date: string | undefined, + compatibility_flags: string[] = [] +) { + assert( + !( + compatibility_flags.includes("global_navigator") && + compatibility_flags.includes("no_global_navigator") + ), + "Can't both enable and disable a flag" + ); + if (compatibility_flags.includes("global_navigator")) { + return true; + } + if (compatibility_flags.includes("no_global_navigator")) { + return false; + } + return !!compatibility_date && compatibility_date >= "2022-03-21"; +} diff --git a/packages/wrangler/src/pages/build.ts b/packages/wrangler/src/pages/build.ts index d009890ec139..ef7eb2c103bd 100644 --- a/packages/wrangler/src/pages/build.ts +++ b/packages/wrangler/src/pages/build.ts @@ -5,6 +5,7 @@ import { writeAdditionalModules } from "../deployment-bundle/find-additional-mod import { FatalError, UserError } from "../errors"; import { logger } from "../logger"; import * as metrics from "../metrics"; +import { isNavigatorDefined } from "../navigator-user-agent"; import { buildFunctions } from "./buildFunctions"; import { EXIT_CODE_FUNCTIONS_NO_ROUTES_ERROR, @@ -126,6 +127,7 @@ export const Handler = async (args: PagesBuildArgs) => { plugin, nodejsCompat, legacyNodeCompat, + defineNavigatorUserAgent, } = validatedArgs; try { @@ -151,6 +153,7 @@ export const Handler = async (args: PagesBuildArgs) => { nodejsCompat, routesOutputPath, local: false, + defineNavigatorUserAgent, }); } catch (e) { if (e instanceof FunctionsNoRoutesError) { @@ -188,6 +191,7 @@ export const Handler = async (args: PagesBuildArgs) => { nodejsCompat, legacyNodeCompat, workerScriptPath, + defineNavigatorUserAgent, } = validatedArgs; /** @@ -200,6 +204,7 @@ export const Handler = async (args: PagesBuildArgs) => { workerJSDirectory: workerScriptPath, buildOutputDirectory, nodejsCompat, + defineNavigatorUserAgent, }); } else { /** @@ -216,6 +221,7 @@ export const Handler = async (args: PagesBuildArgs) => { sourcemap, watch, nodejsCompat, + defineNavigatorUserAgent, }); } } else { @@ -240,6 +246,7 @@ export const Handler = async (args: PagesBuildArgs) => { nodejsCompat, routesOutputPath, local: false, + defineNavigatorUserAgent, }); } catch (e) { if (e instanceof FunctionsNoRoutesError) { @@ -278,7 +285,7 @@ type WorkerBundleArgs = Omit & { buildOutputDirectory: string; legacyNodeCompat: boolean; nodejsCompat: boolean; - + defineNavigatorUserAgent: boolean; workerScriptPath: string; }; type PluginArgs = Omit< @@ -289,6 +296,7 @@ type PluginArgs = Omit< outdir: string; legacyNodeCompat: boolean; nodejsCompat: boolean; + defineNavigatorUserAgent: boolean; }; type ValidatedArgs = WorkerBundleArgs | PluginArgs; @@ -357,6 +365,10 @@ const validateArgs = (args: PagesBuildArgs): ValidatedArgs => 
{ ); } const nodejsCompat = !!args.compatibilityFlags?.includes("nodejs_compat"); + const defineNavigatorUserAgent = isNavigatorDefined( + args.compatibilityDate, + args.compatibilityFlags + ); if (legacyNodeCompat && nodejsCompat) { throw new UserError( "The `nodejs_compat` compatibility flag cannot be used in conjunction with the legacy `--node-compat` flag. If you want to use the Workers runtime Node.js compatibility features, please remove the `--node-compat` argument from your CLI command." @@ -404,5 +416,6 @@ We looked for the Functions directory (${basename( workerScriptPath, nodejsCompat, legacyNodeCompat, + defineNavigatorUserAgent, } as ValidatedArgs; }; diff --git a/packages/wrangler/src/pages/buildFunctions.ts b/packages/wrangler/src/pages/buildFunctions.ts index b30273537bf3..531cbf2f8c36 100644 --- a/packages/wrangler/src/pages/buildFunctions.ts +++ b/packages/wrangler/src/pages/buildFunctions.ts @@ -38,6 +38,7 @@ export async function buildFunctions({ getPagesTmpDir(), `./functionsRoutes-${Math.random()}.mjs` ), + defineNavigatorUserAgent, }: Partial< Pick< PagesBuildArgs, @@ -61,6 +62,7 @@ export async function buildFunctions({ // Allow `routesModule` to be fixed, so we don't create a new file in the // temporary directory each time routesModule?: string; + defineNavigatorUserAgent: boolean; }) { RUNNING_BUILDERS.forEach( (runningBuilder) => runningBuilder.stop && runningBuilder.stop() @@ -117,6 +119,7 @@ export async function buildFunctions({ legacyNodeCompat, functionsDirectory: absoluteFunctionsDirectory, local, + defineNavigatorUserAgent, }); } else { bundle = await buildWorkerFromFunctions({ @@ -133,6 +136,7 @@ export async function buildFunctions({ buildOutputDirectory, legacyNodeCompat, nodejsCompat, + defineNavigatorUserAgent, }); } diff --git a/packages/wrangler/src/pages/dev.ts b/packages/wrangler/src/pages/dev.ts index 488cd26c128e..d66ae30777a9 100644 --- a/packages/wrangler/src/pages/dev.ts +++ b/packages/wrangler/src/pages/dev.ts @@ -9,6 +9,7 @@ import { esbuildAliasExternalPlugin } from "../deployment-bundle/esbuild-plugins import { FatalError } from "../errors"; import { logger } from "../logger"; import * as metrics from "../metrics"; +import { isNavigatorDefined } from "../navigator-user-agent"; import { getBasePath } from "../paths"; import * as shellquote from "../utils/shell-quote"; import { buildFunctions } from "./buildFunctions"; @@ -304,6 +305,10 @@ export const Handler = async ({ let scriptPath = ""; const nodejsCompat = compatibilityFlags?.includes("nodejs_compat"); + const defineNavigatorUserAgent = isNavigatorDefined( + compatibilityDate, + compatibilityFlags + ); let modules: CfModule[] = []; if (usingWorkerDirectory) { @@ -312,6 +317,7 @@ export const Handler = async ({ workerJSDirectory: workerScriptPath, buildOutputDirectory: directory ?? 
".", nodejsCompat, + defineNavigatorUserAgent, }); modules = bundleResult.modules; scriptPath = bundleResult.resolvedEntryPointPath; @@ -360,6 +366,7 @@ export const Handler = async ({ sourcemap: true, watch: false, onEnd: () => scriptReadyResolve(), + defineNavigatorUserAgent, }); } catch (e: unknown) { logger.warn("Failed to bundle _worker.js.", e); @@ -371,7 +378,10 @@ export const Handler = async ({ watch([workerScriptPath], { persistent: true, ignoreInitial: true, - }).on("all", async () => { + }).on("all", async (event) => { + if (event === "unlink") { + return; + } await runBuild(); }); } else if (usingFunctions) { @@ -413,6 +423,7 @@ export const Handler = async ({ nodejsCompat, local: true, routesModule, + defineNavigatorUserAgent, }); await metrics.sendMetricsEvent("build pages functions"); }; @@ -539,8 +550,11 @@ export const Handler = async ({ watch([routesJSONPath], { persistent: true, ignoreInitial: true, - }).on("all", async () => { + }).on("all", async (event) => { try { + if (event === "unlink") { + return; + } /** * Watch for _routes.json file changes and validate file each time. * If file is valid proceed to running the build. diff --git a/packages/wrangler/src/pages/functions/buildPlugin.ts b/packages/wrangler/src/pages/functions/buildPlugin.ts index 3fafa8e94c53..4651ef7b725b 100644 --- a/packages/wrangler/src/pages/functions/buildPlugin.ts +++ b/packages/wrangler/src/pages/functions/buildPlugin.ts @@ -23,6 +23,7 @@ export function buildPluginFromFunctions({ legacyNodeCompat, functionsDirectory, local, + defineNavigatorUserAgent, }: Options) { const entry: Entry = { file: resolve(getBasePath(), "templates/pages-template-plugin.ts"), @@ -107,5 +108,6 @@ export function buildPluginFromFunctions({ forPages: true, local, projectRoot: getPagesProjectRoot(), + defineNavigatorUserAgent, }); } diff --git a/packages/wrangler/src/pages/functions/buildWorker.ts b/packages/wrangler/src/pages/functions/buildWorker.ts index 9472a4e6ecca..4beed8c91db4 100644 --- a/packages/wrangler/src/pages/functions/buildWorker.ts +++ b/packages/wrangler/src/pages/functions/buildWorker.ts @@ -31,6 +31,7 @@ export type Options = { nodejsCompat?: boolean; functionsDirectory: string; local: boolean; + defineNavigatorUserAgent: boolean; }; export function buildWorkerFromFunctions({ @@ -47,6 +48,7 @@ export function buildWorkerFromFunctions({ nodejsCompat, functionsDirectory, local, + defineNavigatorUserAgent, }: Options) { const entry: Entry = { file: resolve(getBasePath(), "templates/pages-template-worker.ts"), @@ -158,6 +160,7 @@ export function buildWorkerFromFunctions({ forPages: true, local, projectRoot: getPagesProjectRoot(), + defineNavigatorUserAgent, }); } @@ -178,6 +181,7 @@ export type RawOptions = { nodejsCompat?: boolean; local: boolean; additionalModules?: CfModule[]; + defineNavigatorUserAgent: boolean; }; /** @@ -203,6 +207,7 @@ export function buildRawWorker({ nodejsCompat, local, additionalModules = [], + defineNavigatorUserAgent, }: RawOptions) { const entry: Entry = { file: workerScriptPath, @@ -252,6 +257,7 @@ export function buildRawWorker({ forPages: true, local, projectRoot: getPagesProjectRoot(), + defineNavigatorUserAgent, }); } @@ -259,10 +265,12 @@ export async function traverseAndBuildWorkerJSDirectory({ workerJSDirectory, buildOutputDirectory, nodejsCompat, + defineNavigatorUserAgent, }: { workerJSDirectory: string; buildOutputDirectory: string; nodejsCompat?: boolean; + defineNavigatorUserAgent: boolean; }): Promise { const entrypoint = resolve(join(workerJSDirectory, 
"index.js")); @@ -297,6 +305,7 @@ export async function traverseAndBuildWorkerJSDirectory({ onEnd: () => {}, nodejsCompat, additionalModules, + defineNavigatorUserAgent, }); return { diff --git a/packages/wrangler/src/r2/helpers.ts b/packages/wrangler/src/r2/helpers.ts index 5acee6e6bafb..4dba22ed215f 100644 --- a/packages/wrangler/src/r2/helpers.ts +++ b/packages/wrangler/src/r2/helpers.ts @@ -97,7 +97,7 @@ export async function getR2Object( bucketName: string, objectName: string, jurisdiction?: string -): Promise { +): Promise { const headers: HeadersInit = {}; if (jurisdiction !== undefined) { headers["cf-r2-jurisdiction"] = jurisdiction; @@ -110,7 +110,7 @@ export async function getR2Object( } ); - return response.body; + return response === null ? null : response.body; } /** @@ -142,7 +142,7 @@ export async function putR2Object( headers["cf-r2-jurisdiction"] = jurisdiction; } - await fetchR2Objects( + const result = await fetchR2Objects( `/accounts/${accountId}/r2/buckets/${bucketName}/objects/${objectName}`, { body: object, @@ -151,6 +151,9 @@ export async function putR2Object( duplex: "half", } ); + if (result === null) { + throw new UserError("The specified bucket does not exist."); + } } /** * Delete an Object @@ -224,6 +227,18 @@ export async function usingLocalBucket( } } +type SippyConfig = { + source: + | { provider: "aws"; region: string; bucket: string } + | { provider: "gcs"; bucket: string }; + destination: { + provider: "r2"; + account: string; + bucket: string; + accessKeyId: string; + }; +}; + /** * Retreive the sippy upstream bucket for the bucket with the given name */ @@ -231,7 +246,7 @@ export async function getR2Sippy( accountId: string, bucketName: string, jurisdiction?: string -): Promise { +): Promise { const headers: HeadersInit = {}; if (jurisdiction !== undefined) { headers["cf-r2-jurisdiction"] = jurisdiction; @@ -260,26 +275,27 @@ export async function deleteR2Sippy( ); } -export type R2Credentials = { - bucket: string; - r2_key_id: string; - r2_access_key: string; -}; - -export type SippyPutConfig = R2Credentials & - ( +export type SippyPutParams = { + source: | { - provider: "AWS"; - zone: string | undefined; - key_id: string; - access_key: string; + provider: "aws"; + region: string; + bucket: string; + accessKeyId: string; + secretAccessKey: string; } | { - provider: "GCS"; - client_email: string; - private_key: string; - } - ); + provider: "gcs"; + bucket: string; + clientEmail: string; + privateKey: string; + }; + destination: { + provider: "r2"; + accessKeyId: string; + secretAccessKey: string; + }; +}; /** * Enable sippy on the bucket with the given name @@ -287,15 +303,17 @@ export type SippyPutConfig = R2Credentials & export async function putR2Sippy( accountId: string, bucketName: string, - config: SippyPutConfig, + params: SippyPutParams, jurisdiction?: string ): Promise { - const headers: HeadersInit = {}; + const headers: HeadersInit = { + "Content-Type": "application/json", + }; if (jurisdiction !== undefined) { headers["cf-r2-jurisdiction"] = jurisdiction; } return await fetchResult( `/accounts/${accountId}/r2/buckets/${bucketName}/sippy`, - { method: "PUT", body: JSON.stringify(config), headers } + { method: "PUT", body: JSON.stringify(params), headers } ); } diff --git a/packages/wrangler/src/r2/index.ts b/packages/wrangler/src/r2/index.ts index 3598d210a936..c788c8904ac3 100644 --- a/packages/wrangler/src/r2/index.ts +++ b/packages/wrangler/src/r2/index.ts @@ -138,6 +138,9 @@ export function r2(r2Yargs: CommonYargsArgv) { key, 
jurisdiction ); + if (input === null) { + throw new UserError("The specified key does not exist."); + } await stream.promises.pipeline(input, output); } if (!pipe) logger.log("Download complete."); diff --git a/packages/wrangler/src/r2/sippy.ts b/packages/wrangler/src/r2/sippy.ts index 1a04d7a83d00..233cf5dbe13c 100644 --- a/packages/wrangler/src/r2/sippy.ts +++ b/packages/wrangler/src/r2/sippy.ts @@ -9,7 +9,7 @@ import type { CommonYargsArgv, StrictYargsOptionsToInterface, } from "../yargs-types"; -import type { SippyPutConfig } from "./helpers"; +import type { SippyPutParams } from "./helpers"; const NO_SUCH_OBJECT_KEY = 10007; const SIPPY_PROVIDER_CHOICES = ["AWS", "GCS"]; @@ -38,7 +38,7 @@ export function EnableOptions(yargs: CommonYargsArgv) { description: "(AWS provider only) The region of the upstream bucket", string: true, }) - .option("key-id", { + .option("access-key-id", { description: "(AWS provider only) The secret access key id for the upstream bucket", string: true, @@ -63,7 +63,7 @@ export function EnableOptions(yargs: CommonYargsArgv) { "(GCS provider only) The private key for your Google Cloud service account key", string: true, }) - .option("r2-key-id", { + .option("r2-access-key-id", { description: "The secret access key id for this R2 bucket", string: true, }) @@ -97,14 +97,17 @@ export async function EnableHandler( throw new UserError(`Must specify ${args.provider} bucket name.`); } - if (args.provider == "AWS") { + if (args.provider === "AWS") { args.region ??= await prompt( "Enter the AWS region where your S3 bucket is located (example: us-west-2):" ); - args.keyId ??= await prompt( + if (!args.region) { + throw new UserError("Must specify an AWS Region."); + } + args.accessKeyId ??= await prompt( "Enter your AWS Access Key ID (requires read and list access):" ); - if (!args.keyId) { + if (!args.accessKeyId) { throw new UserError("Must specify an AWS Access Key ID."); } args.secretAccessKey ??= await prompt( @@ -113,7 +116,7 @@ export async function EnableHandler( if (!args.secretAccessKey) { throw new UserError("Must specify an AWS Secret Access Key."); } - } else if (args.provider == "GCS") { + } else if (args.provider === "GCS") { if ( !(args.clientEmail && args.privateKey) && !args.serviceAccountKeyFile @@ -129,10 +132,10 @@ export async function EnableHandler( } } - args.r2KeyId ??= await prompt( + args.r2AccessKeyId ??= await prompt( "Enter your R2 Access Key ID (requires read and write access):" ); - if (!args.r2KeyId) { + if (!args.r2AccessKeyId) { throw new UserError("Must specify an R2 Access Key ID."); } args.r2SecretAccessKey ??= await prompt("Enter your R2 Secret Access Key:"); @@ -141,47 +144,78 @@ export async function EnableHandler( } } - let sippyConfig = { - bucket: args.bucket ?? "", - r2_key_id: args.r2KeyId ?? "", - r2_access_key: args.r2SecretAccessKey ?? 
"", - } as SippyPutConfig; + let sippyConfig: SippyPutParams; + + if (args.provider === "AWS") { + if (!args.region) throw new UserError("Error: must provide --region."); + if (!args.bucket) throw new UserError("Error: must provide --bucket."); + if (!args.accessKeyId) + throw new UserError("Error: must provide --access-key-id."); + if (!args.secretAccessKey) + throw new UserError("Error: must provide --secret-access-key."); + if (!args.r2AccessKeyId) + throw new UserError("Error: must provide --r2-access-key-id."); + if (!args.r2SecretAccessKey) + throw new UserError("Error: must provide --r2-secret-access-key."); - if (args.provider == "AWS") { - if (!(args.keyId && args.secretAccessKey)) { - throw new UserError( - `Error: must provide --key-id and --secret-access-key.` - ); - } sippyConfig = { - ...sippyConfig, - provider: "AWS", - zone: args.region, - key_id: args.keyId, - access_key: args.secretAccessKey, + source: { + provider: "aws", + region: args.region, + bucket: args.bucket, + accessKeyId: args.accessKeyId, + secretAccessKey: args.secretAccessKey, + }, + destination: { + provider: "r2", + accessKeyId: args.r2AccessKeyId, + secretAccessKey: args.r2SecretAccessKey, + }, }; - } else if (args.provider == "GCS") { + } else if (args.provider === "GCS") { if (args.serviceAccountKeyFile) { const serviceAccount = JSON.parse( readFileSync(args.serviceAccountKeyFile) ); if ("client_email" in serviceAccount && "private_key" in serviceAccount) { - args.clientEmail = serviceAccount["client_email"]; - args.privateKey = serviceAccount["private_key"]; + args.clientEmail = serviceAccount.client_email; + args.privateKey = serviceAccount.private_key; } } - if (!(args.clientEmail && args.privateKey)) { + + if (!args.bucket) throw new UserError("Error: must provide --bucket."); + if (!args.clientEmail) throw new UserError( - `Error: must provide --service-account-key-file or --client-email and --private-key.` + "Error: must provide --service-account-key-file or --client-email." + ); + if (!args.privateKey) + throw new UserError( + "Error: must provide --service-account-key-file or --private-key." ); - } args.privateKey = args.privateKey.replace(/\\n/g, "\n"); + + if (!args.r2AccessKeyId) + throw new UserError("Error: must provide --r2-access-key-id."); + if (!args.r2SecretAccessKey) + throw new UserError("Error: must provide --r2-secret-access-key."); + sippyConfig = { - ...sippyConfig, - provider: "GCS", - client_email: args.clientEmail, - private_key: args.privateKey, + source: { + provider: "gcs", + bucket: args.bucket, + clientEmail: args.clientEmail, + privateKey: args.privateKey, + }, + destination: { + provider: "r2", + accessKeyId: args.r2AccessKeyId, + secretAccessKey: args.r2SecretAccessKey, + }, }; + } else { + throw new UserError( + "Error: unrecognized provider. Possible options are AWS & GCS." 
+ ); } await putR2Sippy(accountId, args.name, sippyConfig, args.jurisdiction); @@ -211,14 +245,14 @@ export async function GetHandler( const accountId = await requireAuth(config); try { - const sippyBucket = await getR2Sippy( + const sippyConfig = await getR2Sippy( accountId, args.name, args.jurisdiction ); - logger.log(`Sippy upstream bucket: ${sippyBucket}.`); + logger.log("Sippy configuration:", sippyConfig); } catch (e) { - if (e instanceof APIError && "code" in e && e.code == NO_SUCH_OBJECT_KEY) { + if (e instanceof APIError && "code" in e && e.code === NO_SUCH_OBJECT_KEY) { logger.log(`No Sippy configuration found for the '${args.name}' bucket.`); } else { throw e; diff --git a/packages/wrangler/src/type-generation.ts b/packages/wrangler/src/type-generation.ts index a7f958043d44..b073a3cd38fd 100644 --- a/packages/wrangler/src/type-generation.ts +++ b/packages/wrangler/src/type-generation.ts @@ -4,6 +4,7 @@ import { getEntry } from "./deployment-bundle/entry"; import { UserError } from "./errors"; import { logger } from "./logger"; import type { Config } from "./config"; +import type { CfScriptFormat } from "./deployment-bundle/worker"; // Currently includes bindings & rules for declaring modules @@ -11,7 +12,13 @@ export async function generateTypes( configToDTS: Partial, config: Config ) { - const entry = await getEntry({}, config, "types"); + const configContainsEntryPoint = + config.main !== undefined || !!config.site?.["entry-point"]; + + const entrypointFormat: CfScriptFormat = configContainsEntryPoint + ? (await getEntry({}, config, "types")).format + : "modules"; + const envTypeStructure: string[] = []; if (configToDTS.kv_namespaces) { @@ -136,7 +143,7 @@ export async function generateTypes( writeDTSFile({ envTypeStructure, modulesTypeStructure, - formatType: entry.format, + formatType: entrypointFormat, }); } @@ -147,7 +154,7 @@ function writeDTSFile({ }: { envTypeStructure: string[]; modulesTypeStructure: string[]; - formatType: "modules" | "service-worker"; + formatType: CfScriptFormat; }) { const wranglerOverrideDTSPath = findUpSync("worker-configuration.d.ts"); try { diff --git a/packages/wrangler/src/user/user.ts b/packages/wrangler/src/user/user.ts index 90cc3a2fbbf2..f03ce3dbdc41 100644 --- a/packages/wrangler/src/user/user.ts +++ b/packages/wrangler/src/user/user.ts @@ -1014,7 +1014,7 @@ export async function login( } }); - server.listen(8976); + server.listen(8976, "localhost"); }); if (props?.browser) { logger.log(`Opening a link in your default browser: ${urlToOpen}`); diff --git a/packages/wrangler/src/versions/upload.ts b/packages/wrangler/src/versions/upload.ts index c0c43af624ca..c7c3058b2ebc 100644 --- a/packages/wrangler/src/versions/upload.ts +++ b/packages/wrangler/src/versions/upload.ts @@ -23,6 +23,7 @@ import { getMigrationsToUpload } from "../durable"; import { UserError } from "../errors"; import { logger } from "../logger"; import { getMetricsUsageHeaders } from "../metrics"; +import { isNavigatorDefined } from "../navigator-user-agent"; import { ParseError } from "../parse"; import { getWranglerTmpDir } from "../paths"; import { getQueue } from "../queues/client"; @@ -294,6 +295,10 @@ See https://developers.cloudflare.com/workers/platform/compatibility-dates for m targetConsumer: "deploy", local: false, projectRoot: props.projectRoot, + defineNavigatorUserAgent: isNavigatorDefined( + props.compatibilityDate ?? config.compatibility_date, + props.compatibilityFlags ?? 
config.compatibility_flags + ), } ); diff --git a/packages/wrangler/templates/startDevWorker/ProxyWorker.ts b/packages/wrangler/templates/startDevWorker/ProxyWorker.ts index 73548a666db0..1ab8baf59ddc 100644 --- a/packages/wrangler/templates/startDevWorker/ProxyWorker.ts +++ b/packages/wrangler/templates/startDevWorker/ProxyWorker.ts @@ -1,7 +1,7 @@ -import assert from "node:assert"; import { createDeferred, DeferredPromise, + urlFromParts, } from "../../src/api/startDevWorker/utils"; import type { ProxyData, @@ -37,6 +37,7 @@ export class ProxyWorker implements DurableObject { proxyData?: ProxyData; requestQueue = new Map>(); + requestRetryQueue = new Map>(); fetch(request: Request) { if (isRequestForLiveReloadWebsocket(request)) { @@ -94,11 +95,22 @@ export class ProxyWorker implements DurableObject { return new Response(null, { status: 204 }); } + /** + * Process requests that are being retried first, then process newer requests. + * Requests that are being retried are, by definition, older than requests which haven't been processed yet. + * We don't need to be more accurate than this re ordering, since the requests are being fired off synchronously. + */ + *getOrderedQueue() { + yield* this.requestRetryQueue; + yield* this.requestQueue; + } + processQueue() { - const { proxyData } = this; // destructuring is required to keep the type-narrowing (not undefined) in the .then callback and to ensure the same proxyData is used throughout each request + const { proxyData } = this; // store proxyData at the moment this function was called if (proxyData === undefined) return; - for (const [request, deferredResponse] of this.requestQueue) { + for (const [request, deferredResponse] of this.getOrderedQueue()) { + this.requestRetryQueue.delete(request); this.requestQueue.delete(request); const userWorkerUrl = new URL(request.url); @@ -115,6 +127,8 @@ export class ProxyWorker implements DurableObject { // merge proxyData headers with the request headers for (const [key, value] of Object.entries(proxyData.headers ?? {})) { + if (value === undefined) continue; + if (key.toLowerCase() === "cookie") { const existing = request.headers.get("cookie") ?? 
""; headers.set("cookie", `${existing};${value}`); @@ -123,8 +137,7 @@ export class ProxyWorker implements DurableObject { } } - // explicitly NOT await-ing this promise, we are in a loop and want to process the whole queue quickly - // if we decide to await, we should include a timeout (~100ms) in case the user worker has long-running/parellel requests + // explicitly NOT await-ing this promise, we are in a loop and want to process the whole queue quickly + synchronously void fetch(userWorkerUrl, new Request(request, { headers })) .then((res) => { if (isHtmlResponse(res)) { @@ -137,17 +150,55 @@ export class ProxyWorker implements DurableObject { // errors here are network errors or from response post-processing // to catch only network errors, use the 2nd param of the fetch.then() - void sendMessageToProxyController(this.env, { - type: "error", - error: { - name: error.name, - message: error.message, - stack: error.stack, - cause: error.cause, - }, - }); - - deferredResponse.reject(error); + // we have crossed an async boundary, so proxyData may have changed + // if proxyData.userWorkerUrl has changed, it means there is a new downstream UserWorker + // and that this error is stale since it was for a request to the old UserWorker + // so here we construct a newUserWorkerUrl so we can compare it to the (old) userWorkerUrl + const newUserWorkerUrl = + this.proxyData && urlFromParts(this.proxyData.userWorkerUrl); + + // only report errors if the downstream proxy has NOT changed + if (userWorkerUrl.href === newUserWorkerUrl?.href) { + void sendMessageToProxyController(this.env, { + type: "error", + error: { + name: error.name, + message: error.message, + stack: error.stack, + cause: error.cause, + }, + }); + + deferredResponse.reject(error); + } + + // if the request can be retried (subset of idempotent requests which have no body), requeue it + else if (request.method === "GET" || request.method === "HEAD") { + this.requestRetryQueue.set(request, deferredResponse); + // we would only end up here if the downstream UserWorker is chang*ing* + // i.e. we are in a `pause`d state and expecting a `play` message soon + // this request will be processed (retried) when the `play` message arrives + // for that reason, we do not need to call `this.processQueue` here + // (but, also, it can't hurt to call it since it bails when + // in a `pause`d state i.e. `this.proxyData` is undefined) + } + + // if the request cannot be retried, respond with 503 Service Unavailable + // important to note, this is not an (unexpected) error -- it is an acceptable flow of local development + // it would be incorrect to retry non-idempotent requests + // and would require cloning all body streams to avoid stream reuse (which is inefficient but not out of the question in the future) + // this is a good enough UX for now since it solves the most common GET use-case + else { + deferredResponse.resolve( + new Response( + "Your worker restarted mid-request. Please try sending the request again. 
Only GET or HEAD requests are retried automatically.", + { + status: 503, + headers: { "Retry-After": "0" }, + } + ) + ); + } }); } } @@ -166,25 +217,14 @@ function isRequestForLiveReloadWebsocket(req: Request): boolean { return isWebSocketUpgrade && websocketProtocol === LIVE_RELOAD_PROTOCOL; } -async function sendMessageToProxyController( +function sendMessageToProxyController( env: Env, - message: ProxyWorkerOutgoingRequestBody, - retries = 3 + message: ProxyWorkerOutgoingRequestBody ) { - try { - await env.PROXY_CONTROLLER.fetch("http://dummy", { - method: "POST", - body: JSON.stringify(message), - }); - } catch (cause) { - if (retries > 0) { - return sendMessageToProxyController(env, message, retries - 1); - } - - // no point sending an error message if we can't send this message - - throw cause; - } + return env.PROXY_CONTROLLER.fetch("http://dummy", { + method: "POST", + body: JSON.stringify(message), + }); } function insertLiveReloadScript( diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 45c470089867..07f0a298cdf3 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -144,7 +144,7 @@ importers: specifier: ^7.0.0 version: 7.0.0 miniflare: - specifier: 3.20231218.4 + specifier: 3.20240129.1 version: link:../../packages/miniflare undici: specifier: ^5.28.2 @@ -393,6 +393,21 @@ importers: specifier: workspace:* version: link:../../packages/wrangler + fixtures/pages-proxy-app: + devDependencies: + '@cloudflare/workers-tsconfig': + specifier: workspace:* + version: link:../../packages/workers-tsconfig + miniflare: + specifier: workspace:* + version: link:../../packages/miniflare + undici: + specifier: ^5.28.2 + version: 5.28.2 + wrangler: + specifier: workspace:* + version: link:../../packages/wrangler + fixtures/pages-simple-assets: devDependencies: '@cloudflare/workers-tsconfig': @@ -805,7 +820,7 @@ importers: version: 8.49.0 eslint-config-turbo: specifier: latest - version: 1.10.15(eslint@8.49.0) + version: 1.12.3(eslint@8.49.0) eslint-plugin-import: specifier: 2.26.x version: 2.26.0(@typescript-eslint/parser@6.7.2)(eslint@8.49.0) @@ -879,8 +894,8 @@ importers: specifier: ^5.28.2 version: 5.28.2 workerd: - specifier: 1.20231218.0 - version: 1.20231218.0 + specifier: 1.20240129.0 + version: 1.20240129.0 ws: specifier: ^8.11.0 version: 8.14.2 @@ -1019,12 +1034,15 @@ importers: packages/playground-preview-worker: dependencies: hono: - specifier: ^3.3.2 - version: 3.5.6 + specifier: ^3.12.11 + version: 3.12.11 zod: specifier: ^3.22.3 version: 3.22.3 devDependencies: + '@cloudflare/eslint-config-worker': + specifier: workspace:* + version: link:../eslint-config-worker '@cloudflare/workers-types': specifier: ^4.20230321.0 version: 4.20230821.0 @@ -1379,7 +1397,7 @@ importers: version: 3.0.0 '@microsoft/api-extractor': specifier: ^7.28.3 - version: 7.28.3 + version: 7.40.0(@types/node@20.1.7) '@sentry/node': specifier: ^7.86.0 version: 7.87.0(supports-color@9.2.2) @@ -3941,8 +3959,8 @@ packages: marked: 0.3.19 dev: false - /@cloudflare/workerd-darwin-64@1.20231218.0: - resolution: {integrity: sha512-547gOmTIVmRdDy7HNAGJUPELa+fSDm2Y0OCxqAtQOz0GLTDu1vX61xYmsb2rn91+v3xW6eMttEIpbYokKjtfJA==} + /@cloudflare/workerd-darwin-64@1.20240129.0: + resolution: {integrity: sha512-DfVVB5IsQLVcWPJwV019vY3nEtU88c2Qu2ST5SQxqcGivZ52imagLRK0RHCIP8PK4piSiq90qUC6ybppUsw8eg==} engines: {node: '>=16'} cpu: [x64] os: [darwin] @@ -3950,8 +3968,8 @@ packages: dev: false optional: true - /@cloudflare/workerd-darwin-arm64@1.20231218.0: - resolution: {integrity: 
sha512-b39qrU1bKolCfmKFDAnX4vXcqzISkEUVE/V8sMBsFzxrIpNAbcUHBZAQPYmS/OHIGB94KjOVokvDi7J6UNurPw==} + /@cloudflare/workerd-darwin-arm64@1.20240129.0: + resolution: {integrity: sha512-t0q8ABkmumG1zRM/MZ/vIv/Ysx0vTAXnQAPy/JW5aeQi/tqrypXkO9/NhPc0jbF/g/hIPrWEqpDgEp3CB7Da7Q==} engines: {node: '>=16'} cpu: [arm64] os: [darwin] @@ -3959,8 +3977,8 @@ packages: dev: false optional: true - /@cloudflare/workerd-linux-64@1.20231218.0: - resolution: {integrity: sha512-dMUF1wA+0mybm6hHNOCgY/WMNMwomPPs4I7vvYCgwHSkch0Q2Wb7TnxQZSt8d1PK/myibaBwadrlIxpjxmpz3w==} + /@cloudflare/workerd-linux-64@1.20240129.0: + resolution: {integrity: sha512-sFV1uobHgDI+6CKBS/ZshQvOvajgwl6BtiYaH4PSFSpvXTmRx+A9bcug+6BnD+V4WgwxTiEO2iR97E1XuwDAVw==} engines: {node: '>=16'} cpu: [x64] os: [linux] @@ -3968,8 +3986,8 @@ packages: dev: false optional: true - /@cloudflare/workerd-linux-arm64@1.20231218.0: - resolution: {integrity: sha512-2s5uc8IHt0QmWyKxAr1Fy+4b8Xy0b/oUtlPnm5MrKi2gDRlZzR7JvxENPJCpCnYENydS8lzvkMiAFECPBccmyQ==} + /@cloudflare/workerd-linux-arm64@1.20240129.0: + resolution: {integrity: sha512-O7q7htHaFRp8PgTqNJx1/fYc3+LnvAo6kWWB9a14C5OWak6AAZk42PNpKPx+DXTmGvI+8S1+futBGUeJ8NPDXg==} engines: {node: '>=16'} cpu: [arm64] os: [linux] @@ -3977,8 +3995,8 @@ packages: dev: false optional: true - /@cloudflare/workerd-windows-64@1.20231218.0: - resolution: {integrity: sha512-oN5hz6TXUDB5YKUN5N3QWAv6cYz9JjTZ9g16HVyoegVFEL6/zXU3tV19MBX2IvlE11ab/mRogEv9KXVIrHfKmA==} + /@cloudflare/workerd-windows-64@1.20240129.0: + resolution: {integrity: sha512-YqGno0XSqqqkDmNoGEX6M8kJlI2lEfWntbTPVtHaZlaXVR9sWfoD7TEno0NKC95cXFz+ioyFLbgbOdnfWwmVAA==} engines: {node: '>=16'} cpu: [x64] os: [win32] @@ -5510,14 +5528,6 @@ packages: read-yaml-file: 1.1.0 dev: false - /@microsoft/api-extractor-model@7.21.0: - resolution: {integrity: sha512-NN4mXzoQWTuzznIcnLWeV6tGyn6Os9frDK6M/mmTXZ73vUYOvSWoKQ5SYzyzP7HF3YtvTmr1Rs+DsBb0HRx7WQ==} - dependencies: - '@microsoft/tsdoc': 0.14.1 - '@microsoft/tsdoc-config': 0.16.1 - '@rushstack/node-core-library': 3.49.0 - dev: true - /@microsoft/api-extractor-model@7.28.2(@types/node@18.16.10): resolution: {integrity: sha512-vkojrM2fo3q4n4oPh4uUZdjJ2DxQ2+RnDQL/xhTWSRUNPF6P4QyrvY357HBxbnltKcYu+nNNolVqc6TIGQ73Ig==} dependencies: @@ -5528,22 +5538,14 @@ packages: - '@types/node' dev: true - /@microsoft/api-extractor@7.28.3: - resolution: {integrity: sha512-lkDHPyln8MNEy1QHjmGwedRquclGKU0qL0gHplfnHuSTXSoNQ86UYaPmhG77/GiNehXzGNKMYSIfTsuoQb69jA==} - hasBin: true + /@microsoft/api-extractor-model@7.28.8(@types/node@20.1.7): + resolution: {integrity: sha512-/q6ds8XQVqs4Tq0/HueFiMk0wwJH8RaXHm+Z7XJ9ffeZ+6/oQUh6E0++uVfNoMD0JmZvLTV8++UgQ4dXMRQFWA==} dependencies: - '@microsoft/api-extractor-model': 7.21.0 - '@microsoft/tsdoc': 0.14.1 + '@microsoft/tsdoc': 0.14.2 '@microsoft/tsdoc-config': 0.16.1 - '@rushstack/node-core-library': 3.49.0 - '@rushstack/rig-package': 0.3.13 - '@rushstack/ts-command-line': 4.12.1 - colors: 1.2.5 - lodash: 4.17.21 - resolve: 1.17.0 - semver: 7.3.8 - source-map: 0.6.1 - typescript: 4.6.4 + '@rushstack/node-core-library': 3.65.0(@types/node@20.1.7) + transitivePeerDependencies: + - '@types/node' dev: true /@microsoft/api-extractor@7.38.2(@types/node@18.16.10): @@ -5566,6 +5568,26 @@ packages: - '@types/node' dev: true + /@microsoft/api-extractor@7.40.0(@types/node@20.1.7): + resolution: {integrity: sha512-U4yTHabfut6WuYUnSM2+FWUsNIJ+w8ZfQGqZWLjH5I/MZvCyDBFyPDIhZAnndd4Vd3pwl4eSBpeMDe8etkCxpA==} + hasBin: true + dependencies: + '@microsoft/api-extractor-model': 7.28.8(@types/node@20.1.7) + '@microsoft/tsdoc': 0.14.2 + 
'@microsoft/tsdoc-config': 0.16.1 + '@rushstack/node-core-library': 3.65.0(@types/node@20.1.7) + '@rushstack/rig-package': 0.5.1 + '@rushstack/ts-command-line': 4.17.1 + colors: 1.2.5 + lodash: 4.17.21 + resolve: 1.22.8 + semver: 7.5.4 + source-map: 0.6.1 + typescript: 5.3.3 + transitivePeerDependencies: + - '@types/node' + dev: true + /@microsoft/tsdoc-config@0.16.1: resolution: {integrity: sha512-2RqkwiD4uN6MLnHFljqBlZIXlt/SaUT6cuogU1w2ARw4nKuuppSmR0+s+NC+7kXBQykd9zzu0P4HtBpZT5zBpQ==} dependencies: @@ -6207,61 +6229,49 @@ packages: resolution: {integrity: sha512-sXo/qW2/pAcmT43VoRKOJbDOfV3cYpq3szSVfIThQXNt+E4DfKj361vaAt3c88U5tPUxzEswam7GW48PJqtKAg==} dev: true - /@rushstack/node-core-library@3.49.0: - resolution: {integrity: sha512-yBJRzGgUNFwulVrwwBARhbGaHsxVMjsZ9JwU1uSBbqPYCdac+t2HYdzi4f4q/Zpgb0eNbwYj2yxgHYpJORNEaw==} + /@rushstack/node-core-library@3.61.0(@types/node@18.16.10): + resolution: {integrity: sha512-tdOjdErme+/YOu4gPed3sFS72GhtWCgNV9oDsHDnoLY5oDfwjKUc9Z+JOZZ37uAxcm/OCahDHfuu2ugqrfWAVQ==} + peerDependencies: + '@types/node': '*' + peerDependenciesMeta: + '@types/node': + optional: true dependencies: - '@types/node': 12.20.24 + '@types/node': 18.16.10 colors: 1.2.5 fs-extra: 7.0.1 import-lazy: 4.0.0 jju: 1.4.0 - resolve: 1.17.0 - semver: 7.3.8 - timsort: 0.3.0 + resolve: 1.22.2 + semver: 7.5.4 z-schema: 5.0.3 dev: true - /@rushstack/node-core-library@3.61.0(@types/node@18.16.10): - resolution: {integrity: sha512-tdOjdErme+/YOu4gPed3sFS72GhtWCgNV9oDsHDnoLY5oDfwjKUc9Z+JOZZ37uAxcm/OCahDHfuu2ugqrfWAVQ==} + /@rushstack/node-core-library@3.65.0(@types/node@20.1.7): + resolution: {integrity: sha512-4AistGV/26JjSMrBuCc0bh13ayQ5mZo/SpnJjETkmkoKNaqIQpZdWr/T04Sa3DLBc4U2e61cx5ZpDzvTVCo+pQ==} peerDependencies: '@types/node': '*' peerDependenciesMeta: '@types/node': optional: true dependencies: - '@types/node': 18.16.10 + '@types/node': 20.1.7 colors: 1.2.5 fs-extra: 7.0.1 import-lazy: 4.0.0 jju: 1.4.0 - resolve: 1.22.2 + resolve: 1.22.8 semver: 7.5.4 z-schema: 5.0.3 dev: true - /@rushstack/rig-package@0.3.13: - resolution: {integrity: sha512-4/2+yyA/uDl7LQvtYtFs1AkhSWuaIGEKhP9/KK2nNARqOVc5eCXmu1vyOqr5mPvNq7sHoIR+sG84vFbaKYGaDA==} - dependencies: - resolve: 1.17.0 - strip-json-comments: 3.1.1 - dev: true - /@rushstack/rig-package@0.5.1: resolution: {integrity: sha512-pXRYSe29TjRw7rqxD4WS3HN/sRSbfr+tJs4a9uuaSIBAITbUggygdhuG0VrO0EO+QqH91GhYMN4S6KRtOEmGVA==} dependencies: - resolve: 1.22.2 + resolve: 1.22.8 strip-json-comments: 3.1.1 dev: true - /@rushstack/ts-command-line@4.12.1: - resolution: {integrity: sha512-S1Nev6h/kNnamhHeGdp30WgxZTA+B76SJ/P721ctP7DrnC+rrjAc6h/R80I4V0cA2QuEEcMdVOQCtK2BTjsOiQ==} - dependencies: - '@types/argparse': 1.0.38 - argparse: 1.0.10 - colors: 1.2.5 - string-argv: 0.3.1 - dev: true - /@rushstack/ts-command-line@4.17.1: resolution: {integrity: sha512-2jweO1O57BYP5qdBGl6apJLB+aRIn5ccIRTPDyULh0KMwVzFqWtw6IZWt1qtUoZD/pD2RNkIOosH6Cq45rIYeg==} dependencies: @@ -6851,10 +6861,6 @@ packages: resolution: {integrity: sha512-Tfx3TU/PBK8vW/BG1TK793EHlVpGnoHUj+DGxOwNOYwZiueLeu7FgksvDdpEyFSw4+AKKiEuiMm8EGUHUR4o6g==} dev: false - /@types/node@12.20.24: - resolution: {integrity: sha512-yxDeaQIAJlMav7fH5AQqPH1u8YIuhYJXYBzxaQ4PifsU0GDO38MSdmEDeRlIxrKbC6NbEaaEHDanWb+y30U8SQ==} - dev: true - /@types/node@12.20.47: resolution: {integrity: sha512-BzcaRsnFuznzOItW1WpQrDHM7plAa7GIDMZ6b5pnMbkqEtM/6WCOhvZar39oeMQP79gwvFUWjjptE7/KGcNqFg==} dev: false @@ -10703,13 +10709,13 @@ packages: eslint: 8.49.0 dev: true - /eslint-config-turbo@1.10.15(eslint@8.49.0): - resolution: {integrity: 
sha512-76mpx2x818JZE26euen14utYcFDxOahZ9NaWA+6Xa4pY2ezVKVschuOxS96EQz3o3ZRSmcgBOapw/gHbN+EKxQ==} + /eslint-config-turbo@1.12.3(eslint@8.49.0): + resolution: {integrity: sha512-Q46MEOiNJpJWC3Et5/YEuIYYhbOieS04yZwQOinO2hpZw3folEXV+hbwVo8M+ap/q8gtpjIWiRMZ1A4QxmhEqQ==} peerDependencies: eslint: '>6.6.0' dependencies: eslint: 8.49.0 - eslint-plugin-turbo: 1.10.15(eslint@8.49.0) + eslint-plugin-turbo: 1.12.3(eslint@8.49.0) dev: false /eslint-import-resolver-node@0.3.7: @@ -11136,8 +11142,8 @@ packages: - typescript dev: true - /eslint-plugin-turbo@1.10.15(eslint@8.49.0): - resolution: {integrity: sha512-Tv4QSKV/U56qGcTqS/UgOvb9HcKFmWOQcVh3HEaj7of94lfaENgfrtK48E2CckQf7amhKs1i+imhCsNCKjkQyA==} + /eslint-plugin-turbo@1.12.3(eslint@8.49.0): + resolution: {integrity: sha512-7hEyxa+oP898EFNoxVenHlH8jtBwV1hbbIkdQWgqDcB0EmVNGVEZkYRo5Hm6BuMAjR433B+NISBJdj0bQo4/Lg==} peerDependencies: eslint: '>6.6.0' dependencies: @@ -12476,6 +12482,11 @@ packages: resolution: {integrity: sha512-Yu+q/XWr2fFQ11tHxPq4p4EiNkb2y+lAacJNhAdRXVfRIcDH6gi7htWFnnlIzvqHMHoWeIsfXlNAjZInpAOJDA==} dev: true + /hono@3.12.11: + resolution: {integrity: sha512-LSpxVgIMR3UzyFiXZaPvqBUGqyOKG0LMZqgMn2RXz9f+YAdkHSfFQQX0dtU72fPm5GnEMh5AYXs0ek5NYgMOmA==} + engines: {node: '>=16.0.0'} + dev: false + /hono@3.5.6: resolution: {integrity: sha512-ycTOpIZJ6yLbjzoE+ojsesC7G7ZXfGSoCIDyvqmzlHc5Mk4Aj48Ed9R5g7gw3v7rOkS81pjcYIvWef/karq1iA==} engines: {node: '>=16.0.0'} @@ -17202,12 +17213,6 @@ packages: resolution: {integrity: sha512-X2UW6Nw3n/aMgDVy+0rSqgHlv39WZAlZrXCdnbyEiKm17DSqHX4MmQMaST3FbeWR5FTuRcUwYAziZajji0Y7mg==} engines: {node: '>=10'} - /resolve@1.17.0: - resolution: {integrity: sha512-ic+7JYiV8Vi2yzQGFWOkiZD5Z9z7O2Zhm9XMaTxdJExKasieFCr+yXZ/WmXsckHiKl12ar0y6XiXDx3m4RHn1w==} - dependencies: - path-parse: 1.0.7 - dev: true - /resolve@1.19.0: resolution: {integrity: sha512-rArEXAgsBG4UgRGcynxWIWKFvh/XZCcS8UJdHhwy91zwAvCZIbcs+vAbflgBnNjYMs/i/i+/Ux6IZhML1yPvxg==} dependencies: @@ -17491,14 +17496,6 @@ packages: resolution: {integrity: sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==} hasBin: true - /semver@7.3.8: - resolution: {integrity: sha512-NB1ctGL5rlHrPJtFDVIVzTyQylMLu9N9VICA6HSFJo8MCGVTMW6gfpicwKmmK/dAjTOrqu5l63JJOpDSrAis3A==} - engines: {node: '>=10'} - hasBin: true - dependencies: - lru-cache: 6.0.0 - dev: true - /semver@7.5.1: resolution: {integrity: sha512-Wvss5ivl8TMRZXXESstBA4uR5iXgEN/VC5/sOcuXdVLzcdkz4HWetIoRfG5gb5X+ij/G9rw9YoGn3QoQ8OCSpw==} engines: {node: '>=10'} @@ -18325,10 +18322,6 @@ packages: resolution: {integrity: sha512-a7wPxPdVlQL7lqvitHGGRsofhdwtkoSXPGATFuSOA2i1ZNQEPLrGnj68vOp2sOJTCFAQVXPeNMX/GctBaO9L2w==} dev: true - /timsort@0.3.0: - resolution: {integrity: sha512-qsdtZH+vMoCARQtyod4imc2nIJwg9Cc7lPRrw9CzF8ZKR0khdr8+2nX80PBhET3tcyTtJDxAffGh2rXH4tyU8A==} - dev: true - /tinybench@2.5.0: resolution: {integrity: sha512-kRwSG8Zx4tjF9ZiyH4bhaebu+EDz1BOx9hOigYHlUW4xxI/wKIUQUqo018UlU4ar6ATPBsaMrdbKZ+tmPdohFA==} dev: false @@ -18817,12 +18810,6 @@ packages: hasBin: true dev: true - /typescript@4.6.4: - resolution: {integrity: sha512-9ia/jWHIEbo49HfjrLGfKbZSuWo9iTMwXO+Ca3pRsSpbsMbc7/IU8NKdCZVRRBafVPGnoJeFL76ZOAA84I9fEg==} - engines: {node: '>=4.2.0'} - hasBin: true - dev: true - /typescript@4.9.5: resolution: {integrity: sha512-1FXk9E2Hm+QzZQ7z+McJiHL4NW1F2EzMu9Nq9i3zAaGqibafqYwCVU6WyWAuyQRRzOlxou8xZSyXLEN8oKj24g==} engines: {node: '>=4.2.0'} @@ -18840,6 +18827,12 @@ packages: hasBin: true dev: true + /typescript@5.3.3: + resolution: {integrity: 
sha512-pXWcraxM0uxAS+tN0AG/BF2TyqmHO014Z070UsJ+pFvYuRSq8KH8DmWpnbXe0pEPDHXZV3FcAbJkijJ5oNEnWw==} + engines: {node: '>=14.17'} + hasBin: true + dev: true + /ufo@1.3.0: resolution: {integrity: sha512-bRn3CsoojyNStCZe0BG0Mt4Nr/4KF+rhFlnNXybgqt5pXHNFRlqinSoQaTrGyzE4X8aHplSb+TorH+COin9Yxw==} @@ -19561,17 +19554,17 @@ packages: resolution: {integrity: sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==} dev: true - /workerd@1.20231218.0: - resolution: {integrity: sha512-AGIsDvqCrcwhoA9kb1hxOhVAe53/xJeaGZxL4FbYI9FvO17DZwrnqGq+6eqItJ6Cfw1ZLmf3BM+QdMWaL2bFWQ==} + /workerd@1.20240129.0: + resolution: {integrity: sha512-t4pnsmjjk/u+GdVDgH2M1AFmJaBUABshYK/vT/HNrAXsHSwN6VR8Yqw0JQ845OokO34VLkuUtYQYyxHHKpdtsw==} engines: {node: '>=16'} hasBin: true requiresBuild: true optionalDependencies: - '@cloudflare/workerd-darwin-64': 1.20231218.0 - '@cloudflare/workerd-darwin-arm64': 1.20231218.0 - '@cloudflare/workerd-linux-64': 1.20231218.0 - '@cloudflare/workerd-linux-arm64': 1.20231218.0 - '@cloudflare/workerd-windows-64': 1.20231218.0 + '@cloudflare/workerd-darwin-64': 1.20240129.0 + '@cloudflare/workerd-darwin-arm64': 1.20240129.0 + '@cloudflare/workerd-linux-64': 1.20240129.0 + '@cloudflare/workerd-linux-arm64': 1.20240129.0 + '@cloudflare/workerd-windows-64': 1.20240129.0 dev: false /wrap-ansi@6.2.0: diff --git a/templates/stream/webrtc/package.json b/templates/stream/webrtc/package.json index d2de7e1a8f54..8491d0e6db21 100644 --- a/templates/stream/webrtc/package.json +++ b/templates/stream/webrtc/package.json @@ -4,8 +4,8 @@ "private": true, "scripts": { "build-ts": "tsc && cp dist/WHIPClient.js src/WHIPClient.js && cp dist/WHEPClient.js src/WHEPClient.js && cp dist/negotiateConnectionWithClientOffer.js src/negotiateConnectionWithClientOffer.js", - "dev": "WRANGLER_SEND_METRICS=false wrangler pages dev --local ./src", - "start-stackblitz": "WRANGLER_SEND_METRICS=false wrangler pages dev --local ./src" + "dev": "WRANGLER_SEND_METRICS=false wrangler pages dev --local --port 1180 ./src", + "start-stackblitz": "WRANGLER_SEND_METRICS=false wrangler pages dev --local --port 1180 ./src" }, "devDependencies": { "typescript": "^4.5.4", diff --git a/templates/stream/webrtc/src/index.html b/templates/stream/webrtc/src/index.html index 1915aebe1f4a..10a8b5e3d470 100644 --- a/templates/stream/webrtc/src/index.html +++ b/templates/stream/webrtc/src/index.html @@ -1,5 +1,21 @@ + +

[index.html hunk bodies lost in extraction: the HTML tags and +/- diff markers were stripped, leaving only the text fragments "Playing video using WHEP" (first hunk, @@ -1,5 +1,21 @@) and "(remote content)" (second hunk, @@ -28,12 +43,11 @@); the original markup changes are not recoverable here.]