
Partial Sync or re-indexing possible? #472

Open
xannasavin opened this issue Jul 14, 2024 · 8 comments

Comments

@xannasavin

Bug Description
I have a large WordPress site with a WooCommerce shop.
Recently the file-state JSON appeared to be corrupted, since deployments kept failing (this happens from time to time).
As usual, I simply deleted the file so everything would be re-indexed.
However, the workflow now always runs into the time limit and fails, since there seem to be too many files.

Is it possible to re-index in another way?
Or is it possible to exclude folders and then re-include them one after another, so the index becomes complete again?
I'm not sure whether excluded folders get wiped from the index.

🎉 Deploy
The job running on runner GitHub Actions 20 has exceeded the maximum execution time of 360 minutes.

🎉 Deploy
The operation was canceled.

@xannasavin
Author

I tried to exclude the plugins folder, but for some reason it is still taken into account...
There are 36,728 files. If the workflow is canceled (manually or by timeout), no ftp-deploy-sync-state.json is written, so it starts from scratch every time.
Any idea?
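For what it's worth, a common reason an excluded folder is "still taken into account" is the glob syntax: the action's `exclude` input matches glob patterns against paths relative to `local-dir`, so a bare folder name usually won't match. A minimal sketch of a workflow step (server details and the exact action version are placeholders; verify the input names against the action's README for the version you use):

```yaml
# Hypothetical workflow step; adjust paths and credentials to your setup.
- name: Deploy over FTP
  uses: SamKirkland/FTP-Deploy-Action@v4.3.5
  with:
    server: ftp.example.com
    username: ${{ secrets.FTP_USERNAME }}
    password: ${{ secrets.FTP_PASSWORD }}
    local-dir: ./
    # Globs match paths relative to local-dir; a bare folder name
    # will NOT match -- use **/ patterns like the ones below.
    exclude: |
      **/.git*
      **/.git*/**
      **/node_modules/**
      wp-content/plugins/**
```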

@SHJordan

SHJordan commented Sep 3, 2024

I'm actually trying to figure out a way to do the same. Since my Laravel project is big when it comes to node modules and vendor files, it would be great if there were a way to write the ftp-deploy-sync-state.json after, let's say, every 10 files/folders created on the remote. Or even better, check whether those folders already exist on the server before re-creating them. @SamKirkland

@garrettw

garrettw commented Oct 13, 2024

I would like to see this too. Progressively writing to the JSON file would let us pick up where we left off if a long deployment fails partway through. Or, on failure, detect it and make sure the current progress is written to the JSON file before exiting.

Edit: Duplicate of #341
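The "write progress as you go" idea being requested here can be sketched in a few lines. This is not the action's actual code, just an illustration of checkpointing the sync state every N uploads (the file name matches the one the action uses; the record shape and the `upload` callback are assumptions for illustration):

```python
import json
from pathlib import Path

STATE_FILE = Path("ftp-deploy-sync-state.json")  # name used by the action
FLUSH_EVERY = 10  # checkpoint interval; tune to taste

def flush_state(uploaded: dict) -> None:
    """Atomically write the current progress to disk."""
    tmp = STATE_FILE.with_suffix(".tmp")
    tmp.write_text(json.dumps(uploaded, indent=2))
    tmp.replace(STATE_FILE)  # rename is atomic on POSIX filesystems

def deploy(files: list, upload) -> None:
    """Upload files, checkpointing the state every FLUSH_EVERY files.

    If the run dies partway through, the last checkpoint survives and a
    re-run could skip everything already recorded in STATE_FILE.
    """
    uploaded = {}
    for i, name in enumerate(files, start=1):
        upload(name)               # the actual FTP transfer (not shown)
        uploaded[name] = "done"
        if i % FLUSH_EVERY == 0:
            flush_state(uploaded)
    flush_state(uploaded)          # final write on success
```

A re-run would then load `STATE_FILE` first and filter out files already marked done, which is exactly the resume behavior the timeout problem above needs.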

@cirosayagueslaso

Hey guys! I had the same problem and I have a workaround that may help someone. I'm leaving a script, a txt and a readme that let you avoid re-uploading the entire project when it's already uploaded but not synced.

I had 2 projects and it worked in both, so I hope it works fine for you as well and saves you some time and resources.

Hope it helps!

sync-file-creator.zip

@SHJordan

> Hey guys! I had the same problem and I have a workaround that may help someone. I'm leaving a script, a txt and a readme that let you avoid re-uploading the entire project when it's already uploaded but not synced.
>
> I had 2 projects and it worked in both, so I hope it works fine for you as well and saves you some time and resources.
>
> Hope it helps!
>
> sync-file-creator.zip

Can you elaborate on it?

@cirosayagueslaso

> > Hey guys! I had the same problem and I have a workaround that may help someone. I'm leaving a script, a txt and a readme that let you avoid re-uploading the entire project when it's already uploaded but not synced.
> > I had 2 projects and it worked in both, so I hope it works fine for you as well and saves you some time and resources.
> > Hope it helps!
> > sync-file-creator.zip
>
> Can you elaborate on it?

Yes, it's a script which generates a sync-state JSON file to be uploaded to the server, so the ftp-deploy action knows the files are already there. It has a readme with detailed steps (including running the script, the CD actions, and uploading the json file). Please let me know if that readme is not clear. Unfortunately I haven't had enough time to create a repo or make a PR in this one, but that would be the best, I know.
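The idea of generating the sync state locally can be sketched roughly as below. Important caveat: the field names, hash algorithm, and overall schema here are assumptions for illustration only; before trusting this on a real deployment, compare the output against a `ftp-deploy-sync-state.json` that the action itself generated, and match that schema exactly.

```python
import hashlib
import json
import time
from pathlib import Path

def build_sync_state(root: str) -> dict:
    """Build a sync-state-like record for every file/folder under root.

    NOTE: field names and the hash algorithm are assumptions -- verify
    them against a state file the action actually produced.
    """
    base = Path(root)
    records = []
    for path in sorted(base.rglob("*")):
        rel = path.relative_to(base).as_posix()
        if path.is_dir():
            records.append({"type": "folder", "name": rel + "/"})
        else:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            records.append({
                "type": "file",
                "name": rel,
                "size": path.stat().st_size,
                "hash": digest,
            })
    return {
        "description": "generated locally; upload next to your site root",
        "version": "1.0.0",
        "generatedTime": int(time.time() * 1000),
        "data": records,
    }

# Example: json.dump(build_sync_state("./public"), open("ftp-deploy-sync-state.json", "w"))
```

With a matching state file uploaded to the server, the action's comparison step should see the existing files as already synced instead of re-uploading everything.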

@xannasavin
Author

If I understood correctly, the script needs to be uploaded to the target system, right?
Unfortunately I don't have shell access there; it's just cheap hosting.
It would be great if @SamKirkland could add a function that writes updates to the sync state more often during deployment.

@cirosayagueslaso

> If I understood correctly, the script needs to be uploaded to the target system, right?
> Unfortunately I don't have shell access there; it's just cheap hosting.
> It would be great if @SamKirkland could add a function that writes updates to the sync state more often during deployment.

No, not the script; just the resulting json should be uploaded. That's what the FTP action uses to compare the repo's files with the files on the target system.
