possible denial of service attack due to unlimited errors CVE-2023-49559 #3118
I came here with the same question. It seems that it should be the responsibility of gqlparser.
I'm open to a PR here in gqlgen that would add a configurable token limit, but leave the default as 0 (unlimited).
I'm taking a quick look at this, and it doesn't seem that gqlparser exposes a way to make it configurable, unless I'm missing something. I'm not sure what the best approach to expose this would be. One solution would be to add functional options to `ParseQuery`:

```go
func ParseQuery(source *Source, options ...parserOption) { ... }

func WithTokenLimit(limit int) parserOption { ... }

// you would call ParseQuery like this
doc, err := parser.ParseQuery(&ast.Source{Input: query}, WithTokenLimit(2000))
```
🤦 Ugh, you are entirely correct @xaviergodart. See github.com/vektah/gqlparser#304 and https://github.com/vektah/gqlparser/releases/tag/v2.5.15. I don't want to break backwards compatibility for any of the existing consumers of the gqlparser library, so people can opt in to the new behavior using the new `ParseQueryWithLimit` function.
Anyway, @xaviergodart or @PookieTek, please submit a PR here that adds a configurable token limit to gqlgen. The default should be 0 (unlimited).
* Use ParseQueryWithLimit and add parserTokenLimit to executor
* add parser token limit test
* remove failing test
* move default token limit to const

Co-authored-by: Xavier Godart <[email protected]>
Fixed in #3136
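For anyone landing here later, this is a minimal sketch of what opting in might look like on the gqlgen side once #3136 is in a release. The setter name `SetParserTokenLimit`, the limit value, and the `graph` import path are assumptions for illustration; check the godoc of the gqlgen version you run for the exact API. The default of 0 keeps the old, unlimited behavior.

```go
package main

import (
	"log"
	"net/http"

	"github.com/99designs/gqlgen/graphql/handler"
	"github.com/99designs/gqlgen/graphql/handler/transport"

	"example.com/myapp/graph" // your generated package; path is illustrative
)

func main() {
	srv := handler.New(graph.NewExecutableSchema(graph.Config{Resolvers: &graph.Resolver{}}))
	srv.AddTransport(transport.POST{})

	// Assumed setter from #3136: cap the number of tokens the parser will
	// consume per operation. 0 (the default) means unlimited, i.e. the
	// pre-fix behavior.
	srv.SetParserTokenLimit(15000)

	http.Handle("/query", srv)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```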
By running the query described below against our gqlgen-based servers, we can reliably get them OOM-killed in our k8s cluster.
The query is taken from https://bessey.dev/blog/2024/05/24/why-im-over-graphql/ and the suggested solution is to limit the number of errors handled.
We could improve the situation a bit by limiting the actual number of errors returned in the response. However, the issue still persists, because we can only trim the list before returning the response; we cannot prevent it from growing during parsing.
If I send enough directives, any server will crash. We should expose a limit on the number of tokens without breaking the existing behavior (unlimited tokens).
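For illustration, the attack boils down to flooding the parser with tokens, typically by repeating a (nonexistent) directive thousands of times; each repetition costs tokens to lex and, later, produces another validation error. The sketch below only generates a payload of that shape; the field and directive names are arbitrary, and this is not the exact query from the blog post:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Build an operation of the form: query { __typename @a @a @a ... }.
	// Every "@a" adds more tokens for the lexer/parser to chew through and,
	// once validation runs, another "unknown directive" error, so memory
	// use grows with the size of the request body.
	const repetitions = 100_000
	payload := `{"query":"query { __typename ` + strings.Repeat("@a ", repetitions) + `}"}`
	fmt.Printf("request body is %d bytes\n", len(payload))
	// POSTing a body like this to the GraphQL endpoint is the shape of
	// request that triggers the memory blow-up on an unlimited parser.
}
```

A parser-level token limit rejects such a payload after a bounded amount of work, before the error list or the AST can grow.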
versions

* `go run github.com/99designs/gqlgen version`: v0.17.40
* `go version`: go1.22.3