Cache current process object to avoid performance hit #5597
Conversation
Did you test it, and did it work as expected?
The value of Process.WorkingSet64 will not update unless Process.Refresh() is called. This method retrieves the latest process information, so it's necessary to call _currentProcess.Refresh() before accessing WorkingSet64.
We need to be wise about the refresh. Refresh() literally disposes all handles to the process, and on the first subsequent call the Process object reacquires the handle and is able to get the new values. At that point, getting a new process or refreshing the existing one has the same cost, so from that perspective we can't do much, as far as I understand. As long as the extension respects the sampling rate, I think the best option would be better defaults that are not so aggressive. Refresh source code: https://source.dot.net/#System.Diagnostics.Process/System/Diagnostics/Process.cs,173ad2a094ae7507,references
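For illustration, here is a minimal sketch of the cached-process pattern being discussed; the `CachedProcessMemoryReader` wrapper is hypothetical and not the PR's actual code:

```csharp
using System.Diagnostics;

internal sealed class CachedProcessMemoryReader
{
    // Cache the Process object once instead of calling Process.GetCurrentProcess()
    // on every sample.
    private readonly Process _currentProcess = Process.GetCurrentProcess();

    public long GetWorkingSetBytes()
    {
        // Without Refresh(), WorkingSet64 keeps returning the value captured when
        // the process information was first populated. Refresh() discards the cached
        // state, which is why it costs about as much as creating a new Process object.
        _currentProcess.Refresh();
        return _currentProcess.WorkingSet64;
    }
}
```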
That's why I am asking whether it works without the Refresh() or not :) If it does, we can probably get away with it. If it does not (e.g. the WorkingSet value is not getting updated), we can't do anything but use Refresh().
Refresh() vs. new Process() could potentially improve performance by 0.1%. Since …
@haipz have you had a chance to test if it works as proposed?
What about using Environment.WorkingSet? It would seem like the best bet, but we would have to make sure it maps properly and that the perf is better. The working set does match/map (with a low discrepancy). We would still need to measure perf @haipz
I did a quick BenchmarkDotNet run, and if we were to replace p.WorkingSet64 with Environment.WorkingSet we would see a big win.
My last benchmark had a minor issue where I was getting the process and refreshing it at the same time, but the results still show a big improvement when using Environment.WorkingSet. Also, @evgenyfedorov2 @haipz: @Yun-Ting has done a Refresh() vs. GetProcess benchmark here, and there are no gains from using one or the other: open-telemetry/opentelemetry-dotnet-contrib#717
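For context, a comparison along these lines could be written roughly as follows; this is a sketch of the kind of BenchmarkDotNet setup referenced above, not the actual benchmark code or results from the linked comments:

```csharp
using System;
using System.Diagnostics;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class WorkingSetBenchmarks
{
    private readonly Process _cachedProcess = Process.GetCurrentProcess();

    [Benchmark(Baseline = true)]
    public long RefreshThenWorkingSet64()
    {
        // Refresh is needed so WorkingSet64 reflects the current value.
        _cachedProcess.Refresh();
        return _cachedProcess.WorkingSet64;
    }

    [Benchmark]
    public long NewProcessWorkingSet64()
    {
        // Creating a fresh Process object costs roughly the same as Refresh().
        using var p = Process.GetCurrentProcess();
        return p.WorkingSet64;
    }

    [Benchmark]
    public long EnvironmentWorkingSet() => Environment.WorkingSet;
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<WorkingSetBenchmarks>();
}
```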
@danespinosa Awesome work. @evgenyfedorov2 What do you think?
Thanks, sounds good to me!
* Read the working set from Environment in ProcessInfo, since it has better performance.
* Add a unit test for ProcessInfo.
* Remove the OSSkipCondition tag from the ProcessInfo unit test, since it's cross-platform.
* Use Environment.WorkingSet in GetMemoryUsageInBytes.
GetCurrentProcessMemoryUsage will be called by src/Libraries/Microsoft.Extensions.Diagnostics.ResourceMonitoring/Windows/WindowsContainerSnapshotProvider.cs from the GetSnapshot method, which is triggered every 1 second by default. It will also be called by _ = meter.CreateObservableGauge(name: ResourceUtilizationInstruments.ProcessMemoryUtilization, observeValue: () => MemoryPercentage(() => _processInfo.GetCurrentProcessMemoryUsage())); with a default interval of 5 seconds.
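A minimal sketch of the change described above, assuming a ProcessInfo-style helper; the class and method names here are illustrative, not the repository's exact implementation:

```csharp
using System;

internal static class ProcessInfoSketch
{
    // Called from the snapshot provider (roughly every 1 s by default) and from the
    // ProcessMemoryUtilization observable gauge (roughly every 5 s by default), so the
    // cheap Environment.WorkingSet read avoids the per-call Process/Refresh() cost.
    public static ulong GetCurrentProcessMemoryUsage() => (ulong)Environment.WorkingSet;
}
```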
Related issue: #5588
Refer to another similar fix in Otel: open-telemetry/opentelemetry-dotnet-contrib#2286