[Bug] Azure Execute hangs for ~25 minutes and then fails with generic node error. #2864
Comments
I just posted this in the other bug: we are really having a problem with this. It's bad enough that I'm getting pressure to pull the tool; about 15% of our pipelines fail because of this error. The problem case has expanded to include pipelines that had already worked correctly, meaning we did the empty commit mentioned above and got it to work, and then on the next PR it failed. Any ideas here? It really seems like Node is either hanging because it doesn't like something in the repo, or, what I really think is happening, GitVersion is failing for some reason and Node isn't picking up the exit code or logs, possibly because the action always parses the output as JSON. If there is an error, is the return here able to capture and report it? I know from testing locally that if GitVersion generates an error, the screen output is not JSON; it's only JSON when things go correctly.
We did some more work on this. We eliminated the Azure plugin and are running GitVersion as a CLI application now, and we think we found the problem. The problem is with multiple source branches. If GitVersion recognizes multiple source branches, it gets stuck in an infinite loop. Here's a snip from the 1.8 GB log file:
This repeats over and over until Node times out, and then I'm guessing Azure doesn't know what to do with a 1+ GB log file. I changed my config file from this:
to this:
which seems to have cleared the error and got us running again. But the bug is that GitVersion should have protections in it to prevent it from getting locked in infinite loops. Maybe a global configuration for a maximum number of levels to traverse?
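For illustration only (the reporter's actual configuration isn't shown above, and these branch names are made up): a GitVersion.yml change of this shape, narrowing a branch's `source-branches` list from several candidates down to one, matches the kind of edit described.

```yaml
# Hypothetical "before": several source-branch candidates for feature branches.
branches:
  feature:
    source-branches: ['develop', 'main', 'release']
```

```yaml
# Hypothetical "after": a single, unambiguous source branch.
branches:
  feature:
    source-branches: ['develop']
```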
Thanks for figuring this one out, @scphantm!
Agreed.
As long as there is protection from infinity and we can inject a warning explaining that infinity was avoided, perhaps pointing to documentation explaining what you just discovered and how to circumvent it with configuration, I would like to avoid configurability if possible. A PR implementing infinity protection would be highly appreciated! 🙏🏼
I looked at the code before and kinda got lost. Where's the recursion at?
I submitted a patch, but @asbjornu is going to have to refactor it into the current code base, as I'm not able to run .NET 6 in any of my environments yet.
Do you have a minimal repro? You said you were finally able to get the issue to occur locally.
I do not, just my production repo. I have been trying to simulate it, but I've spent enough time on this. The recursion happens in the method I put the recursion check on: it tries to inherit the version until it comes down to returning two branches to search.
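As a rough sketch of the "infinity protection" being discussed (illustrative C# only, not GitVersion's actual code or the submitted patch; every name and the depth cap here are hypothetical), the idea is to cap how deep the inheritance walk may recurse and emit a warning when the cap is hit instead of looping forever:

```csharp
using System;
using System.Collections.Generic;

static class InheritanceGuardDemo
{
    // Hypothetical cap on how many inheritance levels to follow before bailing out.
    private const int MaxInheritanceDepth = 50;

    // Follows a branch's source branch looking for a concrete version, but stops
    // and warns once the depth cap is reached instead of recursing indefinitely.
    static string ResolveVersion(string branch,
                                 IReadOnlyDictionary<string, string> sourceBranchOf,
                                 int depth = 0)
    {
        if (depth > MaxInheritanceDepth)
        {
            Console.Error.WriteLine(
                $"Warning: stopped resolving '{branch}' after {MaxInheritanceDepth} levels; " +
                "the source-branches configuration may be cyclic or too broad.");
            return "0.1.0"; // hypothetical fallback version
        }

        // Base case: a branch with no configured source yields a concrete version.
        if (!sourceBranchOf.TryGetValue(branch, out var parent))
            return "1.0.0";

        return ResolveVersion(parent, sourceBranchOf, depth + 1);
    }

    static void Main()
    {
        // Two branches that each point at the other as a source: without the
        // depth cap, this walk would never terminate.
        var cyclic = new Dictionary<string, string>
        {
            ["feature/a"] = "develop",
            ["develop"] = "feature/a",
        };

        Console.WriteLine(ResolveVersion("feature/a", cyclic));
    }
}
```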
🎉 This issue has been resolved in version 5.8.3 🎉 Your GitReleaseManager bot 📦🚀 |
Describe the bug
I reported this at GitTools/actions#511 but haven't gotten any response; my logs and other details are over there.
Here's the basic issue: when we are seeding a repo, the Azure pipeline will hang on the GitVersion Execute task. We don't know why, but when it happens, it hangs for about 30 minutes and then fails with a generic Node error. What fixes it is simply creating a new PR with just a whitespace commit.
The bug is twofold:
1 - Why is it failing?
2 - Why is it failing in such a way that we can't debug it?
Expected Behavior
The GitVersion Execute task completes without hanging.
Actual Behavior
It hangs on the Execute task until Node or Azure times out.
Possible Fix
Putting in a whitespace commit makes it work again.
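For reference, the same effect can likely be achieved without touching any files by pushing an empty commit to re-trigger the pipeline (assuming the pipeline triggers on pushes or PR updates):

```sh
# Create a commit with no file changes to re-trigger the pipeline.
git commit --allow-empty -m "Re-trigger pipeline"
git push
```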
Steps to Reproduce
When we are seeding our repos, the initial commits cause the Execute task to hang. Simply adding a blank whitespace commit clears the error.
Context
We have hundreds of repos and pipelines. This is killing us.
Your Environment
Azure Pipelines