Comment Re:Issue is not limited to MS Store (Score 4, Interesting) 88
The difference is that Unix has two-phase deletes, while Windows doesn't; it inherited its delete semantics from DOS and a bit from VMS.
Two-phase delete refers to how a file is removed: when you delete a file on Unix, its directory entry is removed and the file is marked as deleted, but the actual file blocks are not deallocated until the last open reference to the file is closed, at which point the blocks are reclaimed. This can leave orphaned inodes if a delete was scheduled but never completed (e.g., power loss, sudden reboot without unmounting). This is why Linux filesystems have a "lost+found" directory - inodes whose directory entries were removed but which were never reclaimed get relinked there when the filesystem is checked.
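You can see the two phases directly from a script. This is a minimal POSIX-only sketch (it won't behave this way on Windows); the filename "scratch.tmp" is just an arbitrary example:

```python
import os

# Create and open a file, then unlink it while the descriptor is still open.
path = "scratch.tmp"
fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
os.unlink(path)                      # phase 1: directory entry removed...
assert not os.path.exists(path)      # ...the name is gone from the filesystem
os.write(fd, b"still usable")        # ...but the inode lives while fd is open
os.lseek(fd, 0, os.SEEK_SET)
data = os.read(fd, 32)               # reads back what we just wrote
os.close(fd)                         # phase 2: last reference closed, blocks reclaimed
```

If the machine lost power between the unlink and the close, the inode would still be allocated with no name pointing at it, which is exactly what fsck sweeps into lost+found.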
On DOS, deletes are blocked if a file is in use. For a single-tasking OS this is perfectly reasonable, since the last reference to a file should already be gone by the time you delete it. Of course, with the introduction of multitasking and networking, things became more complex (MS-DOS's SHARE.EXE was a program used to manage shared files). Windows inherited these semantics, which causes problems because you cannot delete a file that's in use.
This is also why there are tricks on Unix where you create a file, delete it immediately, and then use it as a temporary scratch file that goes away on its own when you're done, without littering the filesystem with stray files. You can't do that on Windows.
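Python's standard library exposes this trick directly: on POSIX platforms, tempfile.TemporaryFile unlinks the file as soon as it is created (or avoids giving it a name at all), so the scratch space vanishes when the handle is closed:

```python
import tempfile

# An anonymous scratch file: no directory entry survives, so there is
# nothing to clean up afterwards.
with tempfile.TemporaryFile() as f:
    f.write(b"scratch data")
    f.seek(0)
    content = f.read()
# Leaving the with-block closes the last reference; the blocks are reclaimed.
```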
When you do an update, you may be replacing files that are in use, e.g., system libraries. On Unix you can do this - programs already running continue to use the old library on disk, whose directory entry was removed, while newly started programs pick up the updated library. However, this can lead to issues of its own: you can now have a library mismatch, where one program is using the old version of the library while another is using the newer one, so you need to restart the programs using it. Sometimes this leads to bigger problems, like file corruption, if the file format changed and programs are running with a mix of the old and new libraries.
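The update pattern described above can be sketched in a few lines. This is a POSIX-only illustration using ordinary data files standing in for a shared library (the name "libfoo.so" is hypothetical); an already-open handle plays the role of a running program:

```python
import os

# Install "version 1" of the library.
with open("libfoo.so", "wb") as f:
    f.write(b"version 1")

old = open("libfoo.so", "rb")            # a program already running holds it open

# The updater writes the new version alongside, then atomically renames
# it over the old name - the classic write-new-then-rename update.
with open("libfoo.so.new", "wb") as f:
    f.write(b"version 2")
os.replace("libfoo.so.new", "libfoo.so")

new = open("libfoo.so", "rb")            # a freshly started program opens the new file
old_bytes = old.read()                   # old reference still sees the old inode
new_bytes = new.read()                   # new opens see the replacement

old.close()                              # last reference to the old inode: reclaimed
new.close()
os.unlink("libfoo.so")                   # cleanup for the example
```

The mismatch risk is visible right here: until the "running program" is restarted, the two handles are reading two different versions of the same library.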
So a reboot is generally safest unless you are sure nothing was using the file. Windows needs it more: since it can't update files that are in use, the system performs the scheduled updates and deletes early in the boot process, while on Unix a reboot mainly serves to ensure everyone is using the right version of a library.