Limits on locals() logging do not prevent runaway traversal #329
Comments
If I understand correctly, the idea is to apply the limits from the locals config dictionary during capture of the locals objects rather than later.
I thought so too (looking at the options defined here: Lines 201 to 213 in e767ba4) …
Yes. The limits currently seem to be used to censor objects going out to logs, but … I also couldn't see where … So, questions for you: …
Any update on this?
Thanks for the nudge, @nicku33. There are no other naming standards to follow. I prefer maxdepth, assuming maxlevel isn't already being used for this. (When I reviewed, I didn't see it being used for anything.) Overall I agree with the motivation and the plan.
We have been talking about this internally, and it is being moved up in priority. Since the last activity on this issue, we have released … Still to be done is to limit the collection of data instead of trying to shorten it after collecting it all.
Currently the values defined in DEFAULT_LOCALS_SIZES don't prevent traversal of large collections. The Shortener only applies after the whole object graph has been copied, as far as I can see. In our case (data transformation), it is not uncommon to have massive lists and dictionaries with 100k+ objects.
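For a sense of why that matters at this scale, here is a rough, self-contained illustration (plain Python, not pyrollbar's actual code path) of shorten-after-copy versus limit-at-capture; maxlist = 10 is just a stand-in for whatever limit is configured:

```python
import copy

# Illustrative only: a large local variable, like the 100k+ element
# collections described above.
big_list = list(range(100_000))

# Shorten-after-copy: the whole object is copied first...
copied = copy.deepcopy(big_list)      # O(n) time and memory up front
shortened = copied[:10]               # ...and only then trimmed to the limit

# Limit-at-capture: only the first few items are ever touched.
maxlist = 10                          # stand-in for the configured limit
limited = [item for _, item in zip(range(maxlist), big_list)]

assert shortened == limited           # same output, very different cost
```

Both produce the same ten items, but the first pays for all 100k elements in time and memory before any limit applies.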
In traverse.py, the generators … traverse and create a copy of the collection. This can itself be very time consuming as well as memory intensive, neither of which one wants in a logging library. Memory use in particular should be practically bounded.
In addition, there is no limit on recursion depth other than the circular-reference checks. So, for example, a highly imbalanced tree implemented with dict() that has degraded into a linked list will be traversed in full.
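As a concrete illustration of that degenerate case (constructed here by hand, not taken from the library), a recursive walk with no depth cap has to visit every level and can even exhaust the interpreter's recursion limit long before the circular-reference check matters:

```python
import copy
import sys

# Illustrative only: a "tree" of nested dicts that has degraded into a
# linked list, 50,000 levels deep.
node = {"value": 0, "child": None}
root = node
for i in range(1, 50_000):
    node["child"] = {"value": i, "child": None}
    node = node["child"]

# Any traversal without a depth limit must visit all 50,000 levels.
# A recursive one (copy.deepcopy here, standing in for any recursive
# traversal) blows past the default recursion limit.
try:
    copy.deepcopy(root)
except RecursionError:
    print("exceeded sys.getrecursionlimit() =", sys.getrecursionlimit())
```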
I'd like to apply the iteration limits defined in the configuration to the generators above, maybe by making a
def limited_enumerate(iterable, stop_limit)
to plug in where enumerate is used (see the sketch below). As well, I'd like to add a recursion count that respects an optional limit.
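A minimal sketch of how that could fit together, assuming limit names in the style of DEFAULT_LOCALS_SIZES: limited_enumerate is the helper proposed above, while traverse_limited, its keyword names, and the depth placeholder are hypothetical stand-ins rather than pyrollbar's actual traverse.py code:

```python
def limited_enumerate(iterable, stop_limit):
    """Yield (index, item) pairs, stopping after stop_limit items so huge
    collections are never fully iterated or copied."""
    for i, item in enumerate(iterable):
        if i >= stop_limit:
            break
        yield i, item


def traverse_limited(obj, maxlist=10, maxdict=10, maxdepth=5, _depth=0):
    """Sketch of a capture-time traversal that honors size and depth limits.
    The limit names mirror the DEFAULT_LOCALS_SIZES style but are assumptions."""
    if _depth >= maxdepth:
        return "<max depth reached>"   # illustrative placeholder value
    if isinstance(obj, dict):
        return {
            k: traverse_limited(v, maxlist, maxdict, maxdepth, _depth + 1)
            for _, (k, v) in limited_enumerate(obj.items(), maxdict)
        }
    if isinstance(obj, (list, tuple)):
        return [
            traverse_limited(v, maxlist, maxdict, maxdepth, _depth + 1)
            for _, v in limited_enumerate(obj, maxlist)
        ]
    return obj
```

The idea would be for limited_enumerate to replace the bare enumerate calls in traverse.py and for a depth counter to be threaded through the existing recursion, so both memory use and traversal time are bounded during capture rather than trimmed by the Shortener afterwards.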
Thoughts? Let me know if I've erred, as this is all based on inspection; I haven't written a test.