
Ingestion fails due to 413 Client Error: Payload Too Large for url #11904

Open
rospe opened this issue Nov 20, 2024 · 0 comments
Labels
bug Bug report ingestion PR or Issue related to the ingestion of metadata

Comments


rospe commented Nov 20, 2024

Describe the bug
Various custom and Snowflake ingestion runs fail while trying to send data to the GMS server.
This appears to have started with v0.14.1, presumably because of the default sink mode ASYNC_BATCH that was activated in that release.
We have already set client_max_body_size: "100m" in our nginx config and nginx.ingress.kubernetes.io/proxy-body-size: 200m in the frontend ingress config.
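For reference, the two settings mentioned above might look like the following. This is a hedged sketch, not the reporter's actual files: the surrounding structure (nginx server block, Ingress metadata) is assumed, and only the quoted values are taken from the report.

```yaml
# Kubernetes Ingress for the DataHub frontend (sketch):
# the annotation below raises the ingress-nginx request body limit to 200m.
metadata:
  annotations:
    nginx.ingress.kubernetes.io/proxy-body-size: "200m"

# Separately, in a plain nginx config (not YAML), the equivalent
# directive inside the server/location block would be:
#   client_max_body_size 100m;
```

Note that both limits must be large enough; the request passes through whichever proxy enforces the smaller one first.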

log message from ingestion (using acryldata/datahub-ingestion:v0.14.1)
{'error': 'Unable to emit metadata to DataHub GMS', 'info': {'message': '413 Client Error: Payload Too Large for url: '...

To Reproduce
Start our (previously working) Snowflake ingestion with version v0.14.1, using datahub-rest as the sink.

Expected behavior
Request sizes should not exceed our configured upper size limit.

Additional context
Setting the sink config to mode: ASYNC is a workaround.
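The workaround could be applied in the recipe's sink section as sketched below. The server URL and recipe layout are assumptions for illustration; only type: datahub-rest and mode: ASYNC come from the report.

```yaml
# Sketch of the workaround in a DataHub ingestion recipe (sink section only).
sink:
  type: datahub-rest
  config:
    server: http://datahub-gms:8080  # hypothetical GMS endpoint; adjust to yours
    mode: ASYNC                      # instead of the v0.14.1 default ASYNC_BATCH
```

With mode: ASYNC each metadata change proposal is emitted individually, avoiding the large batched payloads that trip the 413 limit, at some cost in throughput.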

@rospe rospe added the bug Bug report label Nov 20, 2024
@RyanHolstien RyanHolstien added the ingestion PR or Issue related to the ingestion of metadata label Nov 20, 2024