
Upgrade SDK to 6.0.100-rc.1.21411.13 #57143

Merged: 12 commits into main, Aug 12, 2021

Conversation

@ViktorHofer (Member)

Both the AzDO and the Core-Eng teams believe that the issue is on our side and was caused by a thread pool regression. The working assumption is that we need to update to a newer SDK that contains the fix for the thread pool hang.

Pros:

  • AzDO and Core-Eng believe that this will mitigate the AzDO feed restore issues.

Cons:

  • We will upgrade to an unsigned SDK build. Arcade and other repos already did the same to work around the issue.
  • That SDK build isn't officially released and won't be until RC 1 ships. This means that developers need to install that build via the nightly channel (from https://github.com/dotnet/installer) if they want to use their globally installed SDK in combination with dotnet/runtime.
  • Even though this is a breaking change, we can't wait for the next monthly infrastructure rollout.

@ViktorHofer added the NO-MERGE label (The PR is not ready for merge yet; see discussion for detailed reasons) Aug 10, 2021
@stephentoub (Member)

Both the AzDO and the Core-Eng teams believe that the issue is on our side and was caused by a thread pool regression.

Meaning #56346?

If that's true, code somewhere is doing sync-over-async blocking waiting for a task. Do they know where? Can we fix that as well?
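
A generic sketch of the sync-over-async pattern in question (hypothetical names; the thread never pinpoints the actual offending code):

    using System.Net.Http;
    using System.Threading.Tasks;

    class SyncOverAsyncSketch
    {
        static readonly HttpClient s_http = new HttpClient();

        // Anti-pattern: block a thread pool thread until the task completes.
        // Under load, each blocked thread forces the pool to inject another,
        // and a regression in that injection logic is what the SDK update
        // taken here is believed to fix.
        static string GetSync(string url) =>
            s_http.GetStringAsync(url).GetAwaiter().GetResult();

        // Preferred shape, where the call stack permits it: stay async end to end.
        static Task<string> GetAsync(string url) => s_http.GetStringAsync(url);
    }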

@dotnet-issue-labeler

I couldn't figure out the best area label to add to this PR. If you have write-permissions please help me learn by adding exactly one area label.

@ghost commented Aug 10, 2021

Tagging subscribers to this area: @dotnet/runtime-infrastructure
See info in area-owners.md if you want to be subscribed.

Issue Details (duplicate of the PR description above)

Author: ViktorHofer
Assignees: -
Labels: NO-MERGE, area-Infrastructure
Milestone: -

@ViktorHofer (Member, Author)

Meaning #56346?

Correct.

If that's true, code somewhere is doing sync-over-async blocking waiting for a task. Do they know where? Can we fix that as well?

@MattGal were you the one who was in contact with AzDO about the issue?

@jkoritzinsky (Member)

My guess would be that the sync-over-async is somewhere near the border of MSBuild and NuGet, since MSBuild tasks can't be async.
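
A minimal sketch of why that boundary forces blocking (illustrative only, not NuGet's actual code; the task name and URL are made up):

    using Microsoft.Build.Utilities; // Task base class; ITask.Execute() is synchronous
    using System.Net.Http;

    // Because MSBuild calls Execute() synchronously, any async I/O underneath
    // (e.g. an HTTP-based restore) ends up blocked on from a thread pool thread.
    public class RestoreLikeTask : Task
    {
        public override bool Execute()
        {
            using var http = new HttpClient();
            string index = http.GetStringAsync("https://example.invalid/v3/index.json")
                               .GetAwaiter().GetResult(); // sync-over-async
            return !string.IsNullOrEmpty(index);
        }
    }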

@MattGal (Member) commented Aug 10, 2021

@MattGal were you the one who was in contact with AzDO about the issue?

Yes, and "normal" types of failure (HTTP 429, 503, etc.) are likely still worth pursuing with them, since they'll have telemetry.

After much investigation though, it seemed like the connections weren't even getting through, and after a lot of chatting with @wfurt, a change from @kouvel seemed relevant. Empirically, ASP.NET has seen significant improvement from taking the new SDK as well.

(Edit: I think if you want to follow up with a partner team, it's the NuGet client, not Azure DevOps, for their usage of .NET APIs.)

@ViktorHofer (Member, Author) commented Aug 10, 2021

cc @johnterickson (AzDO packaging)

EDIT: ok, it seems like this isn't AzDO-related and is most likely a client issue on NuGet's side. Sorry for the cc, John.

@ViktorHofer (Member, Author)

.dotnet/sdk/6.0.100-rc.1.21379.2/Sdks/Microsoft.NET.Sdk.Razor/targets/Microsoft.NET.Sdk.Razor.StaticWebAssets.targets(425,5): error : (NETCORE_ENGINEERING_TELEMETRY=Build) System.IO.DirectoryNotFoundException: Could not find a part of the path '/__w/1/s/artifacts/obj/mono/BrowserDebugHost/Release/net6.0/StaticWebAssets.build.json'.
at Microsoft.Win32.SafeHandles.SafeFileHandle.Open(String path, OpenFlags flags, Int32 mode) in System.Private.CoreLib.dll:token 0x60000d5+0x0
at Microsoft.Win32.SafeHandles.SafeFileHandle.Open(String fullPath, FileMode mode, FileAccess access, FileShare share, FileOptions options, Int64 preallocationSize) in System.Private.CoreLib.dll:token 0x60000d9+0x31
at System.IO.File.WriteAllBytes(String path, Byte[] bytes) in System.Private.CoreLib.dll:token 0x6005ccc+0x2b
at Microsoft.AspNetCore.Razor.Tasks.GenerateStaticWebAssetsManifest.PersistManifest(StaticWebAssetsManifest manifest) in Microsoft.NET.Sdk.Razor.Tasks.dll:token 0x600018a+0x3b
at Microsoft.AspNetCore.Razor.Tasks.GenerateStaticWebAssetsManifest.Execute() in Microsoft.NET.Sdk.Razor.Tasks.dll:token 0x6000187+0xeb

@steveisok @akoeplinger who would be the best person to look into this issue in the wasm legs?
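
For context, the failure above is File.WriteAllBytes throwing because the manifest's parent directory doesn't exist; WriteAllBytes never creates missing directories. A defensive sketch of the pattern, with hypothetical names (the actual fix landed in the SDK, see the review thread below):

    using System.IO;

    // Hypothetical sketch: create the output directory before persisting,
    // since File.WriteAllBytes throws if any part of the path is missing.
    static void PersistManifest(string manifestPath, byte[] manifestBytes)
    {
        string directory = Path.GetDirectoryName(manifestPath);
        if (!string.IsNullOrEmpty(directory))
            Directory.CreateDirectory(directory); // no-op when it already exists
        File.WriteAllBytes(manifestPath, manifestBytes);
    }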

@steveisok (Member)

@steveisok @akoeplinger who would be the best person to look into this issue in the wasm legs?

/cc @lewing @radical

@ViktorHofer (Member, Author) commented Aug 10, 2021

If that's true, code somewhere is doing sync-over-async blocking waiting for a task. Do they know where? Can we fix that as well?

@stephentoub we would need to approach the product team that we believe owns the faulty code. If we believe that's NuGet, we should figure out how to approach them. One way I could imagine would be to help them review their code to find such patterns.

cc @SteveMCarroll @danmoseley

@MattGal (Member) commented Aug 10, 2021

If that's true, code somewhere is doing sync-over-async blocking waiting for a task. Do they know where? Can we fix that as well?

@stephentoub we would need to approach the product team that we believe owns the faulty code. If we believe that's NuGet, we should figure out how to approach them. One way I could imagine would be to help them review their code to find such patterns.

cc @SteveMCarroll @danmoseley

I think you may be looking to engage with the last few git-blame folks from https://github.com/NuGet/NuGet.Client/blob/0ae80f5495c8d6fad2ec6e2c708727675ff08fb6/src/NuGet.Core/NuGet.Protocol/HttpSource/HttpSource.cs#L321-L350, though that file is very infrequently updated.

@lewing (Member) commented Aug 10, 2021

.dotnet/sdk/6.0.100-rc.1.21379.2/Sdks/Microsoft.NET.Sdk.Razor/targets/Microsoft.NET.Sdk.Razor.StaticWebAssets.targets(425,5): error : (NETCORE_ENGINEERING_TELEMETRY=Build) System.IO.DirectoryNotFoundException: Could not find a part of the path '/__w/1/s/artifacts/obj/mono/BrowserDebugHost/Release/net6.0/StaticWebAssets.build.json'.
at Microsoft.Win32.SafeHandles.SafeFileHandle.Open(String path, OpenFlags flags, Int32 mode) in System.Private.CoreLib.dll:token 0x60000d5+0x0
at Microsoft.Win32.SafeHandles.SafeFileHandle.Open(String fullPath, FileMode mode, FileAccess access, FileShare share, FileOptions options, Int64 preallocationSize) in System.Private.CoreLib.dll:token 0x60000d9+0x31
at System.IO.File.WriteAllBytes(String path, Byte[] bytes) in System.Private.CoreLib.dll:token 0x6005ccc+0x2b
at Microsoft.AspNetCore.Razor.Tasks.GenerateStaticWebAssetsManifest.PersistManifest(StaticWebAssetsManifest manifest) in Microsoft.NET.Sdk.Razor.Tasks.dll:token 0x600018a+0x3b
at Microsoft.AspNetCore.Razor.Tasks.GenerateStaticWebAssetsManifest.Execute() in Microsoft.NET.Sdk.Razor.Tasks.dll:token 0x6000187+0xeb

@ViktorHofer the issue is #56974 (comment); I'll pull the changes I added there into this PR.

@stephentoub (Member)

Do we have a dump from any of the processes that hung?

@lewing (Member) commented Aug 10, 2021

@lambdageek why are browser AOT/EAT lanes failing on hotreload here?

@lambdageek (Member) commented Aug 10, 2021

@lewing Did something change about how wasm tests are built & executed in RC1?

Two possibilities:

  1. Does this line still work:

    <WasmXHarnessMonoArgs>--setenv=DOTNET_MODIFIABLE_ASSEMBLIES=debug</WasmXHarnessMonoArgs>

  2. Do the AOT/EAT tests enable the hot reload runtime component?

Inline review thread on a project file change:

@@ -5,6 +5,7 @@
<AspNetCoreHostingModel>InProcess</AspNetCoreHostingModel>
<OutputType>Exe</OutputType>
<EnableDefaultCompileItems>false</EnableDefaultCompileItems>
<StaticWebAssetsEnabled>false</StaticWebAssetsEnabled>
(review comment, Member) What is this?

(review comment, Member) @lewing thanks for the heads up. It's fixed in dotnet/sdk#19482

(review comment, Member) If it's fixed in the SDK do we need to remove these now?

(review comment) @lewing (Member), Aug 14, 2021: yes, I'll revert the workarounds

@radical (Member) commented Aug 11, 2021

@lewing Did something change about how wasm tests are built & executed in RC1?

Two possibilities:

  1. Does this line still work:
    <WasmXHarnessMonoArgs>--setenv=DOTNET_MODIFIABLE_ASSEMBLIES=debug</WasmXHarnessMonoArgs>

It does seem to be passed to the app:

[20:11:48] info: console.info: Arguments: --setenv=DOTNET_MODIFIABLE_ASSEMBLIES=debug,--run,WasmTestRunner.dll,System.Runtime.Loader.Tests.dll,-notrait,category=IgnoreForCI,-notrait,category=OuterLoop,-notrait,category=failing

There was a recent setenv-related fix, but that should be fine, AFAICS.

  2. Do the AOT/EAT tests enable the hot reload runtime component?

This?

  <ItemGroup Condition="'$(Configuration)' == 'Debug' and '@(_MonoComponent->Count())' == 0">
    <_MonoComponent Include="hot_reload;debugger" />
  </ItemGroup>

@lambdageek (Member)

  2. Do the AOT/EAT tests enable the hot reload runtime component?

This?

  <ItemGroup Condition="'$(Configuration)' == 'Debug' and '@(_MonoComponent->Count())' == 0">
    <_MonoComponent Include="hot_reload;debugger" />
  </ItemGroup>

Ah, but the EAT/AOT tests look like they're Release configuration.

I set Optimize=false and EmitDebugInformation=true for the hot reload test assemblies so that they're still modifiable (but the main test assembly is built normally):

<!-- to call AssemblyExtensions.ApplyUpdate we need Optimize=false, EmitDebugInformation=true in all configurations -->
<Optimize>false</Optimize>
<EmitDebugInformation>true</EmitDebugInformation>

maybe we should set _MonoComponent explicitly for this test assembly?
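
A sketch of what setting it explicitly might look like, mirroring the conditioned ItemGroup quoted above but without the Debug-only condition; _MonoComponent is an internal item, so treat this as illustrative rather than a confirmed change:

  <ItemGroup Condition="'@(_MonoComponent->Count())' == 0">
    <!-- force the hot reload (and debugger) components on regardless of Configuration -->
    <_MonoComponent Include="hot_reload;debugger" />
  </ItemGroup>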

@radical (Member) commented Aug 11, 2021

Ah, but the EAT/AOT tests look like they're Release configuration.

that's the same for the regular tests too.

I set Optimize=false and EmitDebugInformation=true for the hot reload test assemblies so that they're still modifiable (but the main test assembly is built normally):

<!-- to call AssemblyExtensions.ApplyUpdate we need Optimize=false, EmitDebugInformation=true in all configurations -->
<Optimize>false</Optimize>
<EmitDebugInformation>true</EmitDebugInformation>

maybe we should set _MonoComponent explicitly for this test assembly?

Are runtime components used with these tests at all? IIUC, they are available only through workload packs. And looking at the binlog, I don't see any RuntimeConfigManifest.targets in there.

Either way, what changed in this PR that is causing the failure?

Also, (not necessarily in this PR) do we need to add some public item to allow setting/overriding these components?

@ViktorHofer (Member, Author)

As the rest of the stack has already moved onto the RC1 SDK, we want to get this in ASAP. Otherwise we will be stuck with outdated versions of our dependencies, i.e. hotreload-utils (#56974 (comment)).

@lambdageek (Member)

I think we should just delete the IsSupported2 test. It's calling MetadataUpdater.IsSupported, which the IL linker rewrites to false when trimming is enabled. The test is effectively tautological: if MetadataUpdater.IsSupported returns true, we expect it to return true.
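
A hypothetical reconstruction of the tautological shape being deleted (not the verbatim test):

    using System.Reflection.Metadata;
    using Xunit;

    public class MetadataUpdaterTests
    {
        [Fact]
        public void IsSupported2()
        {
            // With trimming enabled, the IL linker rewrites
            // MetadataUpdater.IsSupported to a constant false, so this body is
            // unreachable; when IsSupported does return true, the assertion
            // merely restates the condition that guarded it.
            if (MetadataUpdater.IsSupported)
            {
                Assert.True(MetadataUpdater.IsSupported);
            }
        }
    }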

@lewing (Member) commented Aug 11, 2021

I've removed IsSupported2; let's see how things look now.

@lewing (Member) commented Aug 11, 2021

I'm not sure what is causing the coreclr failure, but the tests seem to be choking on some questionable command line + argument parsing.

cc @danmoseley

@danmoseley reopened this Aug 12, 2021
@josalem (Contributor) commented Aug 12, 2021

About to push an update to the test that should handle this (I had been testing on Linux, but it appears this only fails on Windows). That did point out that the logic in SaveManagedCommandLine, or perhaps GetCommandLine in the PAL, is behaving differently between Windows and non-Windows:
https://github.com/dotnet/runtime/blob/main/src/coreclr/vm/ceeload.cpp#L11365-L11411

On non-Windows, this is saving only corerun app.dll, but on Windows, it appears to be saving corerun <all the property arguments> app.dll. I don't think this is a problem directly, but the difference in behavior is odd.

I'll push the update in a sec.
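
A hypothetical helper illustrating one way a test could tolerate both shapes, assuming the managed entry assembly is always the final token (the name is made up for illustration):

    // Non-Windows saves "corerun app.dll"; Windows saves
    // "corerun <property arguments> app.dll". The last token is the one
    // stable invariant, so key the assertion off that.
    static string GetEntryAssemblyFromSavedCommandLine(string savedCommandLine)
    {
        string[] tokens = savedCommandLine.Split(' ', System.StringSplitOptions.RemoveEmptyEntries);
        return tokens[tokens.Length - 1];
    }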

@danmoseley (Member)

After that, @agocke, we still have AppHostUsedWithSymbolicLinks.Put_app_directory_behind_symlink_and_use_dotnet_run. Could you please ask someone to take a look at that or disable it?

Ignoring the xunit crash, that might be it. I see a JIT failure with no log: we will see if it happens again.

@danmoseley (Member)

I didn't pull down the dump for the JSON crash, which seems sporadic. If I have time later today I will try.

@danmoseley (Member)

@dotnet/dnceng what could cause there to be only a single line in the console output? (Console log: 'JIT.jit64.hfa' from job 76fdf20e-5995-47a2-a5d4-656da646e9e1 workitem 9bdf57b5-ba46-4640-b90d-73652d96f040 (windows.10.arm64v8.open) executed on machine DDARM64-146)

Seems a JIT test failed but there is no info.

https://dev.azure.com/dnceng/public/_build/results?buildId=1290308&view=ms.vss-test-web.build-test-results-tab&runId=38085910&resultId=103042&paneView=dotnet-dnceng.dnceng-build-release-tasks.helix-test-information-tab
https://helixre8s23ayyeko0k025g8.blob.core.windows.net/dotnet-runtime-refs-pull-57143-merge-76fdf20e599547a2a5/JIT.jit64.hfa/1/console.2cf9b544.log?sv=2019-07-07&se=2021-09-01T16%3A19%3A55Z&sr=c&sp=rl&sig=EySUK0YGxErdTG%2Fj2fRKRzKmAE%2BhwwIQJxMoFQNgma0%3D

@MattGal (Member) commented Aug 12, 2021

@dotnet/dnceng what could cause there to be only a single line in the console output? (Console log: 'JIT.jit64.hfa' from job 76fdf20e-5995-47a2-a5d4-656da646e9e1 workitem 9bdf57b5-ba46-4640-b90d-73652d96f040 (windows.10.arm64v8.open) executed on machine DDARM64-146)

Seems a JIT test failed but there is no info.

https://dev.azure.com/dnceng/public/_build/results?buildId=1290308&view=ms.vss-test-web.build-test-results-tab&runId=38085910&resultId=103042&paneView=dotnet-dnceng.dnceng-build-release-tasks.helix-test-information-tab
https://helixre8s23ayyeko0k025g8.blob.core.windows.net/dotnet-runtime-refs-pull-57143-merge-76fdf20e599547a2a5/JIT.jit64.hfa/1/console.2cf9b544.log?sv=2019-07-07&se=2021-09-01T16%3A19%3A55Z&sr=c&sp=rl&sig=EySUK0YGxErdTG%2Fj2fRKRzKmAE%2BhwwIQJxMoFQNgma0%3D

Taking a peek

@MattGal (Member) commented Aug 12, 2021

@danmoseley I am trying to get on the machine to directly check, but I think what happened is that the taskkill.exe call on line 1 of execute.cmd crashed with -1073741502 (STATUS_DLL_INIT_FAILED) faster than it could record stdout, which would indicate a funky machine.

I'll continue to investigate, but I'd also add that you should now be able to safely remove the taskkill invocations entirely, as we now have reliable leaked process cleanup in the Helix client.

@danmoseley (Member)

@agocke here's more context for the failing test showing the similar passing tests.

    <collection total="13" passed="10" failed="1" skipped="2" name="Test collection for Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" time="26.125">
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_framework_dependent_app_behind_symlink" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_framework_dependent_app_behind_symlink" time="0" result="Skip">
        <reason><![CDATA[Currently failing in OSX with \"No such file or directory\" when running Command.Create. CI failing to use stat on symbolic links on Linux (permission denied).]]></reason>
      </test>
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_app_directory_behind_symlink_and_use_dotnet_run" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Put_app_directory_behind_symlink_and_use_dotnet_run" time="5.8878282" result="Fail">
        <failure exception-type="Xunit.Sdk.XunitException">
          <message><![CDATA[Expected command to pass but it did not.\"\\nFile Name: /Users/runner/work/1/s/.dotnet/dotnet\\nArguments: run\\nExit Code: 150\\nStdOut:\\n\\nStdErr:\\nIt was not possible to find any compatible framework version\\nThe framework 'Microsoft.NETCore.App', version '6.0.0-rc.1.21411.2' was not found.\\n  - The following frameworks were found:\\n      2.1.0 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.1 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.2 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.3 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.4 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.5 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.6 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.7 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.8 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.9 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.10 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.11 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.12 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.13 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.14 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.15 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.16 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.17 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.18 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.19 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.20 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.21 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.22 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.23 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.24 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.25 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.26 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.27 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      2.1.28 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.0 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.1 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.2 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.3 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.4 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.5 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.6 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.7 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.8 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.9 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.10 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.11 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.12 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.13 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.14 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.15 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      
3.1.16 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      3.1.17 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.0 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.1 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.2 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.3 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.4 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.5 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.6 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.7 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n      5.0.8 at [/Users/runner/.dotnet/shared/Microsoft.NETCore.App]\\n\\nYou can resolve the problem by installing the specified framework and/or SDK.\\n\\nThe specified framework can be found at:\\n  - https://aka.ms/dotnet-core-applaunch?framework=Microsoft.NETCore.App&framework_version=6.0.0-rc.1.21411.2&arch=x64&rid=osx.10.15-x64\\n\\n\"]]></message>
          <stack-trace><![CDATA[   at FluentAssertions.Execution.XUnit2TestFramework.Throw(String message)
   at FluentAssertions.Execution.TestFrameworkProvider.Throw(String message)
   at FluentAssertions.Execution.DefaultAssertionStrategy.HandleFailure(String message)
   at FluentAssertions.Execution.AssertionScope.FailWith(String message, Object[] args)
   at Microsoft.DotNet.CoreSetup.Test.CommandResultAssertions.Pass() in /_/src/installer/tests/TestUtils/Assertions/CommandResultAssertions.cs:line 30
   at Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_app_directory_behind_symlink_and_use_dotnet_run() in /_/src/installer/tests/Microsoft.NET.HostModel.Tests/Microsoft.NET.HostModel.AppHost.Tests/AppHostUsedWithSymbolicLinks.cs:line 207]]></stack-trace>
        </failure>
      </test>
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_symlink(symlinkRelativePath: \&quot;a/b/SymlinkToApphost\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_symlink" time="2.1038309" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_symlink(symlinkRelativePath: \&quot;a/SymlinkToApphost\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_symlink" time="1.7897959" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_dotnet_behind_symlink" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Put_dotnet_behind_symlink" time="2.0057718" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_framework_dependent_app_with_runtime_behind_symlink" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_framework_dependent_app_with_runtime_behind_symlink" time="0" result="Skip">
        <reason><![CDATA[Currently failing in OSX with \"No such file or directory\" when running Command.Create. CI failing to use stat on symbolic links on Linux (permission denied).]]></reason>
      </test>
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_satellite_assembly_behind_symlink" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Put_satellite_assembly_behind_symlink" time="2.5457662" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_app_directory_behind_symlink" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Put_app_directory_behind_symlink" time="2.2338867" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Put_app_directory_behind_symlink_and_use_dotnet" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Put_app_directory_behind_symlink_and_use_dotnet" time="2.0881602" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_transitive_symlinks(firstSymlinkRelativePath: \&quot;a/b/FirstSymlink\&quot;, secondSymlinkRelativePath: \&quot;c/d/SecondSymlink\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_transitive_symlinks" time="2.1247919" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_transitive_symlinks(firstSymlinkRelativePath: \&quot;a/b/FirstSymlink\&quot;, secondSymlinkRelativePath: \&quot;c/SecondSymlink\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_transitive_symlinks" time="1.9710878" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_transitive_symlinks(firstSymlinkRelativePath: \&quot;a/FirstSymlink\&quot;, secondSymlinkRelativePath: \&quot;c/d/SecondSymlink\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_transitive_symlinks" time="1.7784765" result="Pass" />
      <test name="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks.Run_apphost_behind_transitive_symlinks(firstSymlinkRelativePath: \&quot;a/FirstSymlink\&quot;, secondSymlinkRelativePath: \&quot;c/SecondSymlink\&quot;)" type="Microsoft.NET.HostModel.Tests.AppHostUsedWithSymbolicLinks" method="Run_apphost_behind_transitive_symlinks" time="1.5951218" result="Pass" />
    </collection>

@agocke (Member) commented Aug 12, 2021

I'll take a look -- I didn't see that failure the first time

@ViktorHofer (Member, Author)

Without wanting to put too much pressure on us, Arcade's test retry feature is currently blocked on getting this in. So ideally we get this in today so that we can update our dependencies later today or tomorrow.

@agocke (Member) commented Aug 12, 2021

I've removed the test -- it's dependent on SDK-specific functionality (dotnet run), so I don't think it should be in the runtime at all. I think the cause of that error is infra, specifically version mismatches in the live-build layout vs. the installer tests. I think it's a low-value test that we shouldn't block on.

@danmoseley (Member)

@dotnet/jit-contrib could one of you comment on this? #57143 (comment)
I cannot find where "execute.cmd" exists or is created.

More importantly, are you comfortable with us merging with that failure, or should we resolve it first? We are pushing on this SDK ingestion because it should improve infra reliability.

@MattGal (Member) commented Aug 12, 2021

@dotnet/jit-contrib could one of you comment on this? #57143 (comment)
I cannot find where "execute.cmd" exists or is created.

More importantly, are you comfortable with us merging with that failure, or should we resolve it first? We are pushing on this SDK ingestion because it should improve infra reliability.

Execute.cmd is auto-generated by the Arcade SDK. Check out HelixPre/PostCommands in your projects.

e.g.
src\coreclr\scripts\superpmi.proj
src\libraries\sendtohelixhelp.proj
src\tests\Common\helixpublishwitharcade.proj
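
For illustration, HelixPreCommands in those Arcade-based projects is a semicolon-delimited property of shell commands run before each work item; a sketch of where a taskkill invocation would live (the exact command here is hypothetical, not the lines from dotnet/runtime):

  <PropertyGroup>
    <!-- append to whatever the Helix SDK has already set -->
    <HelixPreCommands>$(HelixPreCommands);taskkill.exe /f /im corerun.exe</HelixPreCommands>
  </PropertyGroup>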

@ViktorHofer (Member, Author) commented Aug 12, 2021

Right, https://github.com/dotnet/runtime/search?q=taskkill lists the ones that Matt just posted.

More importantly, are you comfortable with us merging with that failure, or should we resolve it first? We are pushing on this SDK ingestion because it should improve infra reliability.

Sounds like that was a faulty machine state and likely isn't related to this change.

@MattGal (Member) commented Aug 12, 2021

Right, https://github.com/dotnet/runtime/search?q=taskkill lists the ones that Matt just posted.

More importantly, are you comfortable with us merging with that failure, or should we resolve it first? We are pushing on this SDK ingestion because it should improve infra reliability.

Sounds like that was a faulty machine state and likely isn't related to this change.

While it does still seem like a weird crash happened, whatever it was didn't reproduce with the same payload on the same machine (I really wanted to make sure since I'm pushing this update). It does use the updated SDK for test execution, but I saw no evidence this is caused by the update.

@ViktorHofer (Member, Author)

I see a couple of Installer issues just popped up. @agocke can you please take a look?

@agocke (Member) commented Aug 12, 2021

@ViktorHofer These weren't in the previous runs. Did something change?

@AndyAyersMS (Member)

@dotnet/jit-contrib could one of you comment on this? #57143 (comment)
I cannot find where "execute.cmd" exists or is created.

More importantly, are you comfortable with us merging with that failure, or should we resolve it first? We are pushing on this SDK ingestion because it should improve infra reliability.

Looks like the tests passed and the crash happened later?

I'd be fine merging with this.

@agocke merged commit 7d12f75 into main Aug 12, 2021
@agocke deleted the ViktorHofer-sdkupgrade-rc1 branch Aug 12, 2021, 22:09
@danmoseley (Member)

Thanks, folks, for the team effort merging this; let's see whether it helps CI stability.

@ViktorHofer (Member, Author)

Amazing 👋👋👋

Labels: area-Infrastructure, NO-MERGE (The PR is not ready for merge yet; see discussion for detailed reasons)
Projects: None yet