
google cloudrun updated their limits on maxscale based on memory and cpu count #1779

Closed
fgregg opened this issue Aug 10, 2022 · 13 comments
fgregg (Contributor) commented Aug 10, 2022

If you don't set an explicit limit on container scaling, Google defaults to 100.

Google recently updated its limits on container scaling, such that if you set up Datasette to use more memory or CPU, you need to set the maxScale argument to much less than 100.

It would be nice if datasette publish could do this math for you and set the right maxScale.

Log of a failing publish run:

ERROR: (gcloud.run.deploy) spec.template.spec.containers[0].resources.limits.cpu: Invalid value specified for cpu. For the specified value, maxScale may not exceed 15.
Consider running your workload in a region with greater capacity, decreasing your requested cpu-per-instance, or requesting an increase in quota for this region if you are seeing sustained usage near this limit, see https://cloud.google.com/run/quotas. Your project may gain access to further scaling by adding billing information to your account.
Traceback (most recent call last):
  File "/home/runner/.local/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/runner/.local/lib/python3.8/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/runner/.local/lib/python3.8/site-packages/datasette/publish/cloudrun.py", line 160, in cloudrun
    check_call(
  File "/usr/lib/python3.8/subprocess.py", line 364, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/labordata/datasette warehouse --memory 8Gi --cpu 2' returned non-zero exit status 1.
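The error above hints at how that math could work: with --cpu 2 the limit was 15, which implies a budget of roughly 30 CPUs across all instances. A minimal sketch of the calculation datasette publish could do, assuming that inferred budget (it is not a documented Cloud Run limit, and real limits vary by region, quota, and billing status):

```python
# Hypothetical sketch: choose a maxScale that stays under the cap hinted
# at by the gcloud error above. With --cpu 2 the error said maxScale may
# not exceed 15, implying roughly a 30-CPU budget. That budget is an
# assumption inferred from this single error message.
ASSUMED_CPU_BUDGET = 30

def suggested_max_scale(cpu: int, default: int = 100) -> int:
    """Return the default maxScale, capped by the assumed CPU budget."""
    return min(default, ASSUMED_CPU_BUDGET // cpu)

print(suggested_max_scale(2))  # 15, matching the error message above
```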
fgregg (Contributor, Author) commented Aug 10, 2022

Maybe a simpler solution is to set maxScale to something like 2, since Datasette is not set up to make use of container scaling anyway?

simonw added this to the Datasette 0.62 milestone Aug 14, 2022
simonw (Owner) commented Aug 14, 2022

I'm going to default maxScale to 2 but provide an extra command-line option for datasette publish cloudrun that lets people set it to something higher if they want to.

simonw (Owner) commented Aug 14, 2022

Actually, I disagree that Datasette isn't set up to make use of container scaling: part of the idea behind the Baked Data pattern is that you can scale to handle effectively unlimited traffic by running multiple copies of your application, each with its own duplicate copy of the database.

So I'm going to default maxScale to 10 and still let people customize it.

simonw (Owner) commented Aug 14, 2022

Here's the relevant part of the datasette publish command from that failed Actions workflow:

  datasette publish cloudrun f7.db nlrb.db opdr.db old_nlrb.db voluntary_recognitions.db work_stoppages.db lm20.db chips.db \
    --memory 8Gi \
    --cpu 2

simonw (Owner) commented Aug 14, 2022

I tried duplicating this error locally, but the following command succeeded when I expected it to fail:

datasette publish cloudrun fixtures.db --memory 8Gi --cpu 2 --service issue-1779

simonw (Owner) commented Aug 14, 2022

Maybe I need to upgrade:

% gcloud --version
Google Cloud SDK 378.0.0
alpha 2022.03.18
bq 2.0.74
core 2022.03.18
gsutil 5.8
Updates are available for some Google Cloud CLI components.  To install them,
please run:
  $ gcloud components update
% gcloud components update
Beginning update. This process may take several minutes.


Your current Google Cloud CLI version is: 378.0.0
You will be upgraded to version: 397.0.0

┌─────────────────────────────────────────────────────────────────────────────┐
│                      These components will be updated.                      │
├─────────────────────────────────────────────────────┬────────────┬──────────┤
│                         Name                        │  Version   │   Size   │
├─────────────────────────────────────────────────────┼────────────┼──────────┤
│ BigQuery Command Line Tool                          │     2.0.75 │  1.6 MiB │
│ BigQuery Command Line Tool (Platform Specific)      │     2.0.75 │  < 1 MiB │
│ Cloud Storage Command Line Tool                     │       5.11 │ 15.5 MiB │
│ Cloud Storage Command Line Tool (Platform Specific) │       5.11 │  < 1 MiB │
│ Google Cloud CLI Core Libraries                     │ 2022.08.05 │ 24.3 MiB │
│ Google Cloud CLI Core Libraries (Platform Specific) │ 2022.08.05 │  < 1 MiB │
│ anthoscli                                           │     0.2.28 │ 48.0 MiB │
│ gcloud Alpha Commands                               │ 2022.08.05 │  < 1 MiB │
│ gcloud cli dependencies                             │ 2022.07.29 │ 11.2 MiB │
└─────────────────────────────────────────────────────┴────────────┴──────────┘

A lot has changed since your last upgrade.  For the latest full release notes,
please visit:
  https://cloud.google.com/sdk/release_notes
% gcloud --version        
Google Cloud SDK 397.0.0
alpha 2022.08.05
bq 2.0.75
core 2022.08.05
gsutil 5.11

simonw (Owner) commented Aug 14, 2022

datasette publish cloudrun fixtures.db --memory 8Gi --cpu 2 --service issue-1779 still works.

simonw (Owner) commented Aug 14, 2022

Just spotted this in the failing Actions workflow:

gcloud config set run/region us-central1

I tried that locally too, but the deploy still succeeds.

simonw (Owner) commented Aug 14, 2022

Just tried this instead, and it still worked and deployed OK:

datasette publish cloudrun fixtures.db --memory 16Gi --cpu 4 --service issue-1779

@fgregg I'm afraid I'm not able to replicate your deployment failure.

simonw (Owner) commented Aug 14, 2022

(I deleted my issue-1779 project using the UI at https://console.cloud.google.com/run?project=datasette-222320)

simonw (Owner) commented Aug 14, 2022

Here's the start of the man page for gcloud run deploy:

NAME
    gcloud run deploy - deploy a container to Cloud Run

SYNOPSIS
    gcloud run deploy [[SERVICE] --namespace=NAMESPACE] [--args=[ARG,...]]
        [--async] [--command=[COMMAND,...]] [--concurrency=CONCURRENCY]
        [--cpu=CPU] [--ingress=INGRESS; default="all"]
        [--max-instances=MAX_INSTANCES] [--memory=MEMORY]
        [--min-instances=MIN_INSTANCES]

I'm going to expose --max-instances and --min-instances as extra options to datasette publish cloudrun.
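A rough sketch (not the actual Datasette implementation) of how those options might be threaded through to the gcloud invocation; the function and parameter names here are hypothetical, while the gcloud flag names come from the man page above:

```python
# Sketch only: builds the argument list that datasette publish cloudrun
# might hand to subprocess.check_call(). Flags are appended only when the
# corresponding option was supplied, so the gcloud defaults still apply.
from typing import List, Optional

def build_deploy_args(
    service: str,
    image: str,
    memory: Optional[str] = None,
    cpu: Optional[int] = None,
    min_instances: Optional[int] = None,
    max_instances: Optional[int] = None,
) -> List[str]:
    args = [
        "gcloud", "run", "deploy", "--allow-unauthenticated",
        "--platform=managed", "--image", image, service,
    ]
    if memory:
        args += ["--memory", memory]
    if cpu:
        args += ["--cpu", str(cpu)]
    if min_instances:
        args += ["--min-instances", str(min_instances)]
    if max_instances:
        args += ["--max-instances", str(max_instances)]
    return args
```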

simonw (Owner) commented Aug 14, 2022

Tested that with:

datasette publish cloudrun fixtures.db --service issue-1779 --min-instances 2 --max-instances 4

[screenshot]

fgregg (Contributor, Author) commented Aug 14, 2022

Thanks @simonw!
