[datadog] http output POST size (gzip) #1187
thanks for reporting the case. Internally, Fluent Bit uses the msgpack format (a binary JSON-like encoding) to serialize records. With your configuration, the job of the out_http plugin is to convert that msgpack into a JSON payload (likely bigger than the msgpack). Note that Fluent Bit generates msgpack chunks of no more than 2MB, but it looks like I found a case where a chunk could be greater. I am troubleshooting now.
hmm, anyway, it looks like Datadog has size limits for its HTTP API:
ref: https://docs.datadoghq.com/api/?lang=bash#get-list-of-active-metrics We will fix the chunk size problem, and I think adding gzip compression to the HTTP output will help too.
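As a rough sanity check of how much gzip could help here, a stdlib-only sketch (the record shape and count are invented for illustration; real log data will compress differently):

```python
import gzip
import json

# Fabricate a repetitive JSON log payload, then measure how much gzip
# shrinks it. Log data is usually highly repetitive and compresses well.
records = [{"ts": i, "level": "info", "msg": "GET /health 200"} for i in range(50000)]
payload = json.dumps(records).encode("utf-8")
compressed = gzip.compress(payload)

print(f"raw: {len(payload)} bytes, gzipped: {len(compressed)} bytes")
```

With repetitive records like these, the compressed payload lands far below the raw size, which is why gzip alone can often bring a ~2MB POST under the API limit.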
We are also seeing 413 errors when sending from Fluent Bit via the http plugin (format: json) to an Apache server.
To test, we disabled SSL and can see that the POST requests are 2.2MB in size.
I work with Datadog. It does indeed seem that the problem might be the size of the payload. As our documentation says, we limit payload size to 2MB. I will take a further look into what we can do here.
It'd be great if the output size was configurable.
@nvrmnd |
FYI folks, I've just merged gzip support for the out_http plugin; it will land in the v1.3 release (~Aug). The new option, compress gzip, only guarantees that the data is compressed, but your payloads will likely end up under 2MB. Datadog team, are you planning to increase this limit anyway? (cc: @irabinovitch)
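Assuming the option name mentioned above, an out_http section using it might look like this (host, port, and URI are placeholders, not a real endpoint):

```
[OUTPUT]
    Name      http
    Match     *
    Host      my-endpoint.example.com
    Port      443
    URI       /v1/input
    Format    json
    Compress  gzip
```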
Hello @edsiper, the Datadog teams are indeed working on increasing those limits. Regarding compression, the HTTP API does not currently support compressed logs, but that is planned as well and should be available in 2019.
Hello @edsiper and everyone, I wanted to let you know that the Datadog HTTP API for submitting logs has been updated to receive batches:
Do you believe that would fit the Fluent Bit batch size?
Yes, it should. Since Datadog has contributed a new out_datadog plugin, I am closing this ticket. People should move to the official plugin upon the v1.3 release.
Does bbba49f solve this problem? Is there any documentation for this new option? Cheers. |
Yeah. Sending more than 5MB of logs is kinda stupid, but I'm still seeing 413 errors. I tried tag 1.3.0 on my pod. Is there any way to identify the request responsible for this (5MB+ payload)? As a fallback I will check the new output!
Hey @semoac, currently compression is not supported by the Datadog HTTP API, so just use the default.
@edsiper do you know if there is any plan to define a payload max size in Fluent Bit to have better control over this?
@irabinovitch @NBParis One last thing: I'm using [...]. Cheers.
@semoac thanks for the confirmation. Your changes might have been linked with the update of the maximum batch size on our end, which could explain why the switch helped.
Do you remember when you did the migration?
@NBParis the migration was done on Sept 30, but I noticed this on Oct 1.
Bug Report
Describe the bug
I'm trying to push logs to a Datadog service using the http output plugin from a tail input. Everything works well while I add a few records at a time to a log file. But when I start with a file that already has a few thousand records, it looks like the http output tries to push them all in one POST, making the service respond with a 413 HTTP status error (payload too large).
To Reproduce
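A minimal configuration sketch of the kind of setup described (paths, host, and port are assumptions for illustration, not the reporter's actual config):

```
[INPUT]
    Name    tail
    Path    /var/log/app/*.log

[OUTPUT]
    Name    http
    Match   *
    Host    http-intake.logs.datadoghq.com
    Port    443
    tls     On
    Format  json
```

Starting Fluent Bit against a log file that already contains a few thousand lines should trigger a single large POST.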
Expected behavior
Maybe there is a way to limit the POST size for every http request somehow?
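The requested behavior can be sketched client-side in Python (the function and its byte limit are illustrative assumptions, not part of Fluent Bit):

```python
import json

# Hypothetical sketch: split records into JSON batches whose serialized
# size stays under a byte limit, which is the behavior being requested
# for the http output plugin.
def batch_records(records, max_bytes=2 * 1024 * 1024):
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for rec in records:
        enc = json.dumps(rec, separators=(",", ":")).encode("utf-8")
        extra = len(enc) + (1 if current else 0)  # +1 for the "," separator
        if current and size + extra > max_bytes:
            # Flush the current batch and start a new one with this record.
            batches.append(current)
            current, size = [], 2
            extra = len(enc)
        current.append(rec)
        size += extra
    if current:
        batches.append(current)
    return batches
```

Note that a single record larger than the limit would still produce an oversized batch; a real implementation would need to reject, split, or truncate such records.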
Your Environment