`clean-exclude` also prevents files from syncing #672
This option will prevent them from being copied; it's meant for constant directories and files that almost never change, such as …
I was hoping to clean most deleted files (e.g. static files like images that get removed), but keep old versions of hashed build files that could still be referenced by users who have an old HTML page cached that points to a prior hashed file.
Sorry for the late reply; it's been a pretty busy week for me with work. That makes sense, so an ideal fix would be to allow files that are marked as excluded from the clean job to still be committed in the deployment job? We do something like this here to allow pass-through of modifications to the CNAME file, so I don't see why the same check couldn't be made for all items in the exclusion list. Do you mind sharing a sample of the workflow you expected to work, so I can base my test workflow on it? I'd be happy to add this as a feature.
I'm not sure if that same pattern is going to work for my use case, but maybe we can figure something out! My use case is building a Single Page App (I currently happen to be using the create-react-app defaults for the Webpack settings).

The build script creates hashed files in `./build/static`. This serves the purpose of breaking CSS and JS caches for anyone who gets the new index file, forcing them to fetch the newly built files. However, after a deploy, some people might still have an old `index.html` cached pointing to the older build files. For this purpose I'd like to copy all new `./build/static` files while keeping the old hashed ones around.

I'd like anything outside of `./build/static` to get auto-deleted as usual via `clean` when those files are removed, though.

Does the way I described my workflow make sense? Thanks!
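A minimal sketch of the workflow this use case seems to expect, assuming the action's `folder`, `clean`, and `clean-exclude` inputs; the version tags, build commands, and paths are illustrative rather than taken from the thread:

```yaml
name: Deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build
      - name: Deploy to GitHub Pages
        uses: JamesIves/github-pages-deploy-action@v4
        with:
          folder: build   # create-react-app output
          clean: true     # remove deleted files from the branch...
          # ...but keep old hashed assets around on the deployment branch
          clean-exclude: |
            static/**
```

The expectation here is that `static/**` would still be synced when new hashed files appear, while the old ones on the deployment branch are preserved; the behavior reported below is that such patterns are excluded from the copy as well.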
Makes sense. I've been trying to think of a good way to resolve this, but I'm struggling to come up with a solution that would work natively with `rsync`. From what I can tell, for this to work properly the deployment action would need context from the previous deployment, so it knows which files to sync back into the current deployment branch and which not to. I think such a thing could be achieved with workflow artifacts, as `rsync` prints a report of all the synced files, but I'd have to do some more research on that.

The other options I've explored are keeping the previous versions of the hashed files around as part of a Webpack configuration, and removing files based on age, but that's not something I'm comfortable adding to this project, as it's highly dependent on the project and probably better suited to a follow-up workflow step. If you (or anyone else reading this) are more familiar with `rsync` and have any ideas on how this could be easily resolved, I'd love suggestions, because I think this would be a great thing to make available.
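A sketch of how the workflow-artifact idea might look, assuming the standard `actions/upload-artifact` action; the `rsync` invocation and paths are illustrative:

```yaml
    steps:
      - name: Sync and record what changed
        # -i (itemize-changes) prints one line per transferred/deleted file
        run: rsync -ai --delete build/ deploy/ | tee rsync-report.txt
      - name: Keep the report for a later run to consult
        uses: actions/upload-artifact@v4
        with:
          name: rsync-report
          path: rsync-report.txt
```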
I've been coming back to this issue from time to time, and I think I might have a proposed solution. The entire handling of this might be out of scope for this action, but with a few accommodations a setup along these lines might allow it to work.
This would of course require some time to explore further, but I feel like it's a pretty reasonable workflow lifecycle. Ideally I'd like to keep the aim of this project at simply moving files around, and I think it already has all the necessary tools to make something like this work without fundamentally changing the scope. With these changes, all the action would need to do is read from a list of files when processing `clean-exclude`. Let me know what you think!
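One possible shape for that lifecycle, sketched under stated assumptions (deployment branch `gh-pages`, hashed assets under `static/`, build output in `build/`; none of these names come from the thread): restore the previously deployed hashed files into the fresh build before the deploy step runs, so the action can keep cleaning normally.

```sh
#!/usr/bin/env bash
set -euo pipefail

# Fetch the current state of the deployment branch (assumed: gh-pages).
git fetch origin gh-pages --depth=1

# Extract the previously deployed hashed assets into a scratch directory.
mkdir -p prev
git archive origin/gh-pages static/ | tar -x -C prev

# Merge them into the new build without overwriting freshly built files
# (-n / no-clobber keeps the new version wherever a name collides).
mkdir -p build/static
cp -Rn prev/static/. build/static/
```

With the old files merged back into the build output, the deploy step could run with plain `clean: true` and no special casing; pruning genuinely dead assets would then be a separate follow-up concern, as suggested above.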
Thanks for taking so much time to think about this. That approach sounds very promising, and it should fulfill this use case!
When I specify directories in `clean-exclude`, it also stops copying new files that match the pattern.

Am I misunderstanding what `clean-exclude` is for? I would expect it to still transfer new files added to the source and just not delete files from the destination that are no longer in the source; similar to `clean: false`, but only in certain directories.
It looks like this is because `clean-exclude` passes these patterns as `--exclude` to `rsync`, which makes `rsync` ignore them completely.
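For illustration, the `rsync` semantics in question (paths hypothetical): `--exclude` hides a pattern from the transfer entirely, whereas a receiver-side protect filter only shields matching destination files from `--delete`:

```sh
# --exclude makes rsync ignore static/ on both sides: new hashed files
# are never transferred, and old ones are never deleted.
rsync -a --delete --exclude='static/' build/ deploy/

# A protect ("P") filter still transfers static/ normally, but matching
# files already on the destination survive --delete.
rsync -a --delete --filter='P static/**' build/ deploy/
```

If that reading is right, translating `clean-exclude` entries into protect filters instead of plain `--exclude` flags might give the behavior described above, though that would be a change to the action rather than anything confirmed in this thread.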