ANONYMOUS wrote:
> Initially my approach did do a lot of unnecessary work, but I fixed it by having a growing array of all the files that have been synced already, so that when scanning other directories, it's not re-synced.
I can see that the outcomes should be the same: you make the file copies, and then extend a dynamic data structure to remember what has been performed. In contrast, the sample solution builds its dynamic data structure first, to define what it *will* copy, and then performs all the activities recorded in that structure.
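For concreteness, here's a minimal Python sketch of the copy-as-you-go ordering as I understand it from your description (my own reconstruction, not your actual code). The `synced` set prevents a file from being copied twice, but nothing prevents the scan from descending into directories the sync itself just created:

```python
import os
import shutil

def sync_as_you_go(src: str, dst: str, synced: set | None = None) -> None:
    """Copy files immediately, remembering what has already been synced."""
    if synced is None:
        synced = set()
    os.makedirs(dst, exist_ok=True)
    for entry in os.scandir(src):
        target = os.path.join(dst, entry.name)
        if entry.is_dir():
            # Nothing stops us descending into a directory this very
            # sync created: if dst lives inside src, the recursion
            # never terminates.
            sync_as_you_go(entry.path, target, synced)
        elif entry.path not in synced:
            shutil.copy2(entry.path, target)
            synced.add(entry.path)
```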
> And yes it did lead to an infinite sequence of directories, but only when syncing a directory with its child. We can't (yet) see what the sample solution does in such a situation, right? Because it treats each individual zipped file as its own top-level dir.
The sample solution successfully syncs directories and their (embedded) child directories recursively *because* it works out everything to copy before it starts copying, so it never goes on to 'process' the newly created directories.
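Here's a sketch of that plan-then-execute ordering (an illustration of the principle, not the sample solution's actual code). Because the source tree is snapshotted into a plan before any filesystem writes happen, directories created during the copy phase can never feed back into the scan:

```python
import os
import shutil

def sync_planned(src: str, dst: str) -> None:
    """Sync src into dst: record every action first, then perform them."""
    plan = []  # ("mkdir", target) or ("copy", source, target) tuples

    # Phase 1: walk the source and record what to do -- no copying yet.
    for dirpath, _dirnames, filenames in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        plan.append(("mkdir", target_dir))
        for name in filenames:
            plan.append(("copy", os.path.join(dirpath, name),
                         os.path.join(target_dir, name)))

    # Phase 2: execute the plan. Directories created here were not part
    # of the phase-1 walk, so they are never re-processed.
    for action in plan:
        if action[0] == "mkdir":
            os.makedirs(action[1], exist_ok=True)
        else:
            shutil.copy2(action[1], action[2])
```

Even if `dst` sits inside `src`, phase 1 finishes before phase 2 makes a single directory, so the worst case is that a pre-existing `dst` gets snapshotted like any other subtree; the output can never grow the input mid-scan.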