
[main] Source code updates from dotnet/dotnet #5473


Open
dotnet-maestro[bot] wants to merge 43 commits into main

Conversation

dotnet-maestro[bot]
Contributor

dotnet-maestro bot commented May 1, 2025

Note

This is a codeflow update. It may contain both source code changes from the VMR and dependency updates.

This pull request brings the following source code changes:

From https://github.com/dotnet/dotnet

Updated Dependencies

dotnet-maestro bot requested a review from a team as a code owner May 2, 2025 02:08
@ViktorHofer
Member

@mikem8361 @tommcdon @hoyosjs can you please help us diagnose the CI failures? This brings in a new runtime, so that could be the cause.

@hoyosjs
Member

hoyosjs commented May 3, 2025

@ViktorHofer - I haven't taken a look yet, but the VMR builds are lacking exports on the DAC. All debugger scenarios (tests and this repo) are on the ground.

@ViktorHofer
Member

We would appreciate guidance on what needs to be done, especially if this also impacts P4, which ships from the VMR. cc @mmitche @jkoritzinsky

@hoyosjs
Member

hoyosjs commented May 5, 2025

@ViktorHofer - P4 is OK. For main, this is fixed by dotnet/runtime#115309.

@akoeplinger
Member

@hoyosjs can you please take another look? The fix has flown here now, but there are still some test failures.

@ViktorHofer
Member

ViktorHofer commented May 8, 2025

No, the runtime fix hasn't flown here yet. I verified that earlier today.

@akoeplinger
Member

Ah, you're right, sorry, my bad.

@akoeplinger
Member

OK, it flowed now and all jobs except Windows x86 are green. @hoyosjs can you please take a look at that one? :)

@mikem8361

There is something wrong with x86 stack walking in the latest .NET 10 builds. We need to investigate further.

@hoyosjs
Member

hoyosjs commented May 9, 2025

The fix is a mix of dotnet/runtime#115405 and dotnet/runtime#115391 - mostly the former.

@ViktorHofer
Member

@hoyosjs @mikem8361 the latest dependency update includes https://github.com/dotnet/runtime/commits/e67e997094b65c8ed7289b00f2304288d8f75a12 but there are still test failures (this time on macOS x64). PTAL

akoeplinger and others added 5 commits May 19, 2025 17:52
Updated Dependencies:
Microsoft.DotNet.Arcade.Sdk, Microsoft.DotNet.CodeAnalysis (Version 10.0.0-beta.25266.103 -> 10.0.0-beta.25229.109)
Microsoft.NET.Sdk (Version 10.0.100-preview.5.25266.103 -> 10.0.100-preview.5.25229.109)
Microsoft.AspNetCore.App.Ref.Internal, Microsoft.AspNetCore.App.Ref, Microsoft.NETCore.App.Runtime.win-x64, VS.Redist.Common.NetCore.SharedFramework.x64.10.0 (Version 10.0.0-preview.5.25266.103 -> 10.0.0-preview.5.25229.109)
[[ commit created by automation ]]
@akoeplinger
Member

@hoyosjs it looks like tests are timing out now

@hoyosjs
Member

hoyosjs commented May 21, 2025

I noticed yesterday - it's SOS on the bleeding edge, but I haven't had time to look. I'll try to prioritize this.

dotnet-maestro bot added 3 commits May 22, 2025 02:08
[[ commit created by automation ]]
Updated Dependencies:
Microsoft.DotNet.Arcade.Sdk, Microsoft.DotNet.CodeAnalysis (Version 10.0.0-beta.25269.109 -> 10.0.0-beta.25229.109)
Microsoft.NET.Sdk (Version 10.0.100-preview.5.25269.109 -> 10.0.100-preview.5.25229.109)
Microsoft.AspNetCore.App.Ref.Internal, Microsoft.AspNetCore.App.Ref, Microsoft.NETCore.Platforms (Version 10.0.0-preview.5.25269.109 -> 10.0.0-preview.5.25229.109)
@hoyosjs

This comment was marked as outdated.

dotnet-maestro bot added 2 commits May 23, 2025 02:07
[[ commit created by automation ]]
@hoyosjs
Member

hoyosjs commented May 23, 2025

This is actually looking like a case where we don't get runtime startup:

0000004980D7D960 00007FFF883C6AE2 coreclr!Thread::DoAppropriateWaitWorker + 374 at D:\a\_work\1\s\src\coreclr\vm\threads.cpp:3467
0000004980D7DA20 00007FFF883C692F coreclr!Thread::DoAppropriateWait + 127 at D:\a\_work\1\s\src\coreclr\vm\threads.cpp:3182
0000004980D7DAA0 00007FFF883C6850 coreclr!WaitHandleNative::CorWaitOneNative + 176 at D:\a\_work\1\s\src\coreclr\vm\comwaithandle.cpp:31
0000004980D7DB18                  [HelperMethodFrame: 0000004980d7db18] System.Private.CoreLib.dll!System.Threading.WaitHandle.WaitOneCore(IntPtr, Int32)
0000004980D7DC20 00007fff6c441004 System.Private.CoreLib.dll!System.Threading.WaitHandle.WaitOneNoCheck(Int32) + 100 [/_/src/libraries/System.Private.CoreLib/src/System/Threading/WaitHandle.cs @ 128]
0000004980D7DC80 00007fff28909a4c DbgShim.UnitTests.dll!Microsoft.Diagnostics.DbgShimTests.TestRegisterForRuntimeStartup(Microsoft.Diagnostics.DebuggeeInfo, Int32) + 1580 [E:\repos\diagnostics\src\tests\DbgShim.UnitTests\DbgShimTests.cs @ 401]
0000004980D7DED0 00007fff288f4229 DbgShim.UnitTests.dll!Microsoft.Diagnostics.DbgShimTests+c+<b__10_0>d.MoveNext() + 457 [E:\repos\diagnostics\src\tests\DbgShim.UnitTests\DbgShimTests.cs @ 73]
0000004980D7DFA0 00007fff6c772a5a System.Private.CoreLib.dll!System.Runtime.CompilerServices.AsyncMethodBuilderCore.Start[[System.__Canon, System.Private.CoreLib]](System.__Canon ByRef) + 90 [/_/src/libraries/System.Private.CoreLib/src/System/Runtime/CompilerServices/AsyncMethodBuilderCore.cs @ 38]
0000004980D7E010 00007fff288f4040 System.Private.CoreLib.dll!System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[[System.Int32, System.Private.CoreLib]].Start[[System.__Canon, System.Private.CoreLib]](System.__Canon ByRef) + 96 [/_/src/libraries/System.Private.CoreLib/src/System/Runtime/CompilerServices/AsyncTaskMethodBuilderT.cs @ 35]
0000004980D7E050 00007fff288f3f4d DbgShim.UnitTests.dll!Microsoft.Diagnostics.DbgShimTests+c.b__10_0(System.String) + 205
0000004980D7E0C0 00007FFF88469F43 coreclr!CallDescrWorkerInternal + 131 at D:\a\_work\1\s\src\coreclr\vm\amd64\CallDescrWorkerAMD64.asm:100
0000004980D7E100 00007FFF883D24BA coreclr!RuntimeMethodHandle::InvokeMethod + 890 at D:\a\_work\1\s\src\coreclr\vm\reflectioninvocation.cpp:739
0000004980D7E2F8                  [HelperMethodFrame_PROTECTOBJ: 0000004980d7e2f8] System.Private.CoreLib.dll!System.RuntimeMethodHandle.InvokeMethod(System.Object, Void**, System.Signature, Boolean)
0000004980D7E430 00007fff6c4c3a02 System.Private.CoreLib.dll!System.Reflection.MethodBaseInvoker.InvokeDirectByRefWithFewArgs(System.Object, System.Span`1<System.Object>, System.Reflection.BindingFlags) + 162 [/_/src/libraries/System.Private.CoreLib/src/System/Reflection/MethodBaseInvoker.cs @ 178]
0000004980D7E4B0 00007fff6c4c339a System.Private.CoreLib.dll!System.Reflection.MethodBaseInvoker.InvokeWithOneArg(System.Object, System.Reflection.BindingFlags, System.Reflection.Binder, System.Object[], System.Globalization.CultureInfo) + 602 [/_/src/libraries/System.Private.CoreLib/src/System/Reflection/MethodBaseInvoker.cs @ 104]
0000004980D7E590 00007fff6c4c2953 System.Private.CoreLib.dll!System.Reflection.MethodBase.Invoke(System.Object, System.Object[]) + 35 [/_/src/libraries/System.Private.CoreLib/src/System/Reflection/MethodBase.cs @ 56]
0000004980D7E5D0 00007fff288f38e4 Microsoft.DotNet.RemoteExecutor.dll!Microsoft.DotNet.RemoteExecutor.Program.Main(System.String[]) + 692 [/_/src/Microsoft.DotNet.RemoteExecutor/src/Program.cs @ 57]
0000004980D7E760 00007FFF88469F43 coreclr!CallDescrWorkerInternal + 131 at D:\a\_work\1\s\src\coreclr\vm\amd64\CallDescrWorkerAMD64.asm:100

-<MicrosoftNETCorePlatformsVersion>10.0.0-preview.5.25229.109</MicrosoftNETCorePlatformsVersion>
-<MicrosoftNETCoreAppRefVersion>10.0.0-preview.5.25229.109</MicrosoftNETCoreAppRefVersion>
+<MicrosoftNETCorePlatformsVersion>10.0.0-preview.6.25272.109</MicrosoftNETCorePlatformsVersion>
+<MicrosoftNETCoreAppRefVersion>10.0.0-preview.5.25266.103</MicrosoftNETCoreAppRefVersion>

@akoeplinger @ViktorHofer the root cause seems to come from this. 103 is not a netcoreapp ref version; the XML has the right one. Any idea what happened here?


This was a regression from #5486.


@mmitche - I assume this is what was intended:
platform -> non-stable version
ref -> stable version

For my learning: why is an arch-specific asset not recommended for this?


#5486 missed adding the entry for Microsoft.NETCore.App.Ref to Version.Details.xml, so the version wasn't getting updated and you got a mix of versions.

For my learning: why is an arch-specific asset not recommended for this?

The problem is that when we build e.g. the win-x86 vertical, the Microsoft.NETCore.App.Runtime.win-x64 package isn't built, so MicrosoftNETCoreAppRuntimewinx64Version would not be defined. Right now we have to manually inject properties like these with the correct value, and we'd like to stop doing that. A sketch of the missing Version.Details.xml entry is below.
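
For reference, a minimal sketch of the kind of Version.Details.xml entry that was missing; the Version, Uri, and Sha values here are illustrative placeholders, not the actual ones:

<ProductDependencies>
  <Dependency Name="Microsoft.NETCore.App.Ref" Version="10.0.0-preview.5.25229.109">
    <Uri>https://github.com/dotnet/dotnet</Uri>
    <Sha>0000000000000000000000000000000000000000</Sha>
  </Dependency>
</ProductDependencies>

Without an entry like this, dependency flow has nothing to record for the package, so the corresponding version property goes stale while the others move forward - exactly the .103/.109 mix seen in the diff above.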

@hoyosjs
Member

hoyosjs commented May 23, 2025

This one is a DAC issue most likely:

OS Thread Id: 0x3e62 (1)
        Child SP               IP Call Site
00007FFDB36B4108 00007F5320BC8302 libc.so.6!__strchr_evex + 34 at sysdeps/x86_64/multiarch/strchr-evex.S:78
00007FFDB36B4110 00007F531A4F930B libmscordaccore.so!CoreLibBinder::LookupClassLocal(BinderClassID) + 123 at /__w/1/s/src/runtime/src/coreclr/vm/binder.cpp:71
00007FFDB36B41B0 00007F531A4F979C libmscordaccore.so!CoreLibBinder::LookupFieldLocal(BinderFieldID) + 156 at /__w/1/s/src/runtime/src/coreclr/inc/daccess.h:1247
00007FFDB36B41B0 00007F531A4F973F libmscordaccore.so!CoreLibBinder::LookupFieldLocal(BinderFieldID) + 63 at /__w/1/s/src/runtime/src/coreclr/inc/daccess.h:0
00007FFDB36B41F0 00007F531A5889F1 libmscordaccore.so!ClrDataAccess::GetObjectComWrappersData(unsigned long, unsigned long*, unsigned int, unsigned long*, unsigned int*) + 369 at /__w/1/s/src/runtime/src/coreclr/vm/binder.h:0
00007FFDB36B41F0 00007F531A58897A libmscordaccore.so!ClrDataAccess::GetObjectComWrappersData(unsigned long, unsigned long*, unsigned int, unsigned long*, unsigned int*) + 250 at /__w/1/s/src/runtime/src/coreclr/inc/daccess.h:1550
00007FFDB36B42B0 00007F531A7337CA libsos.so!PrintObj(unsigned long, int) + 1546 at /__w/1/s/src/SOS/Strike/strike.cpp:1517
00007FFDB36B4750 00007F531A736EE4 libsos.so!DumpObj + 1348 at /__w/1/s/src/SOS/Strike/strike.cpp:2127
00007FFDB36B4A70 00007F52A360CA23
00007FFDB36B4AC8                  [InlinedCallFrame: 00007ffdb36b4ac8]
00007FFDB36B4AC8                  [InlinedCallFrame: 00007ffdb36b4ac8]
00007FFDB36B4AB0 00007F52A360CA23 Microsoft.Diagnostics.Runtime.dll!ILStubClass.IL_STUB_PInvoke(IntPtr, System.String) + 243
00007FFDB36B4B60 00007F52A360C7FE SOS.Hosting.dll!SOS.Hosting.SOSLibrary.ExecuteCommand(IntPtr, System.String, System.String) + 318 [/__w/1/s/src/SOS/SOS.Hosting/SOSLibrary.cs @ 197]
00007FFDB36B4C20 00007F52A360C6A4 SOS.Hosting.dll!SOS.Hosting.SOSHost.ExecuteCommand(System.String, System.String) + 68 [/__w/1/s/src/SOS/SOS.Hosting/SOSHost.cs @ 102]
00007FFDB36B4C50 00007F52A360C4DA SOS.Hosting.dll!SOS.Hosting.SOSCommandBase.Invoke() + 538 [/__w/1/s/src/SOS/SOS.Hosting/Commands/SOSCommand.cs @ 94]

The object itself is OK - it has a good method table.
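
One way to check that independently of the crashing DumpObj path, assuming an SOS session against the same target (the address here is a placeholder), is to dump the method table directly:

dumpmt -md 00007f52a3600000

If the method table resolves cleanly, the heap data is fine and the failure is in the DAC's CoreLibBinder lookup visible in the stack above.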

hoyosjs added a commit that referenced this pull request May 23, 2025
#5473 surfaced a test resiliency issue.

Per test, we wait for a process to start and send a notification. If the process exits quickly, we'll wait up to 5 minutes per test trying to see the message. This change treats a process exit as an error path eagerly instead of waiting out the timeout; a sketch of the pattern follows.
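
A minimal sketch of the pattern, with hypothetical names (WaitForStartupOrExit, startupSignaled, debuggee) standing in for the test's actual helpers:

using System;
using System.Diagnostics;
using System.Threading;

static class StartupWait
{
    // Wait for either the runtime-startup notification or debuggee exit,
    // whichever comes first, instead of blocking on the notification alone.
    public static bool WaitForStartupOrExit(ManualResetEvent startupSignaled, Process debuggee, TimeSpan timeout)
    {
        var processExited = new ManualResetEvent(false);
        debuggee.EnableRaisingEvents = true;
        debuggee.Exited += (s, e) => processExited.Set();
        if (debuggee.HasExited)
        {
            processExited.Set(); // the process may already be gone
        }

        int signaled = WaitHandle.WaitAny(new WaitHandle[] { startupSignaled, processExited }, timeout);
        return signaled switch
        {
            0 => true, // startup notification arrived
            1 => throw new InvalidOperationException("Debuggee exited before runtime startup"),
            _ => throw new TimeoutException("Timed out waiting for runtime startup"),
        };
    }
}

Failing fast on exit turns what used to be a 5-minute timeout per test into an immediate, diagnosable error.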
dotnet-maestro bot added 6 commits May 24, 2025 02:05
Updated Dependencies:
Microsoft.DotNet.Arcade.Sdk, Microsoft.DotNet.CodeAnalysis (Version 10.0.0-beta.25272.109 -> 10.0.0-beta.25229.109)
Microsoft.NET.Sdk (Version 10.0.100-preview.6.25272.109 -> 10.0.100-preview.5.25229.109)
Microsoft.AspNetCore.App.Ref.Internal, Microsoft.AspNetCore.App.Ref, Microsoft.NETCore.Platforms (Version 10.0.0-preview.6.25272.109 -> 10.0.0-preview.5.25229.109)
Microsoft.CodeAnalysis, Microsoft.CodeAnalysis.CSharp (Version 5.0.0-1.25272.109 -> 4.11.0-2.24271.11)
Microsoft.CodeAnalysis.Analyzers (Version 5.0.0-1.25272.109 -> 3.12.0-beta1.24605.1)
Microsoft.CodeAnalysis.NetAnalyzers (Version 10.0.0-preview.25272.109 -> 10.0.0-preview.24605.1)
[[ commit created by automation ]]
[[ commit created by automation ]]
@akoeplinger
Member

@hoyosjs looks like only the SOS.ConcurrentDictionaries test is failing now. Can we disable that one temporarily to unblock this dependency flow?

@hoyosjs
Member

hoyosjs commented May 27, 2025

Yeah, I've been looking but it might take a bit.
