This is an operating-system-agnostic question. Like many people, I use a NAS for storing my data and spinning hard disks for my backups. I have a solid backup strategy (three extra copies, one of them at a different location, encrypted, etc.).
Now my question is whether or not to archive (zip, 7z, or tar.gz) folders that contain many (!) files.
The thing is: copying MANY small files takes much longer than copying one bigger file of the same total size, because of the per-file overhead. That's the case on nearly all filesystems.
Let's say I have a project with 30,000 files in it; the backup process would be faster if the folder were compressed into a 7z or zip file. Additionally, I personally feel that archive formats give me a little more order (that's just a subjective thing).
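To make the workflow concrete, here is a rough sketch of what I mean by packing a folder before the backup copy (the folder and file names are made up; in reality it would be the 30,000-file project):

```python
import shutil
import tempfile
from pathlib import Path

# Made-up demo data: a folder full of small files, standing in for
# the real project with ~30,000 files.
work = Path(tempfile.mkdtemp())
project = work / "my_project"
project.mkdir()
for i in range(100):
    (project / f"file_{i}.txt").write_text("some project data")

# Pack the whole folder into one .tar.gz, so the backup job copies
# a single large file instead of thousands of small ones.
archive = shutil.make_archive(
    base_name=str(work / "my_project"),  # produces my_project.tar.gz
    format="gztar",
    root_dir=work,
    base_dir="my_project",
)
print(archive)
```

The same idea works with `7z a my_project.7z my_project/` or `zip -r`; the point is just that the NAS then sees one file per project.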
So, normally I would definitely prefer packing my things into archive formats.
But I have this notion that one big file would be more “fragile” in case of hardware errors or simple filesystem errors: if the file header (or some other part of the archive file) got corrupted, the whole archive could be irreparable.
On the other hand, if I hadn't used an archive format, perhaps a larger share of my files would still be readable.
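If I do go with archives, I assume one partial mitigation would be to verify each archive right after it is written, before trusting it as a backup copy. A rough Python sketch of such a check (the function name is my own, and it only detects corruption, it doesn't repair anything):

```python
import tarfile


def archive_is_readable(path: str) -> bool:
    """Read every byte of every member of a .tar.gz archive.

    gzip verifies its CRC as the stream is consumed, so silent
    corruption anywhere in the file should surface as an exception.
    """
    try:
        with tarfile.open(path) as tf:
            for member in tf:
                if member.isfile():
                    f = tf.extractfile(member)
                    while f.read(1 << 16):  # drain the member
                        pass
        return True
    except (tarfile.TarError, OSError, EOFError):
        return False
```

Of course this only tells me the archive is broken; with loose files I might instead lose just the corrupted file and keep the rest.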
I know this is a “worst case” / “what if” kind of question… but it's also a very fundamental one for me:
Should I use archive formats, or just copy the files as they are (which would also be more in line with the KISS principle)?
I know you could say that I keep multiple copies of my files anyway… but again, it's also a general question…
What do you think, or what would you recommend?