Removing a large number of files in python

I am trying to delete all files found in a directory. The accepted answer to "Delete the contents of a folder in Python" suggests getting a list of all the files and calling unlink on each one in a loop.

Suppose I have thousands of files on a network share, and I want to tie up the directory for as short a time as possible.

How much more efficient would it be to delete them all with a shell command, for example rm -f /path/*, or with shutil.rmtree or something similar?

2 answers

If you really want to delete the entire directory tree, shutil.rmtree should be faster than calling os.remove (which is the same function as os.unlink) on each file. It also lets you supply a callback function to handle errors.
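A minimal sketch of that callback usage, assuming a scratch directory built on the spot so the example is self-contained (shutil.rmtree's onerror parameter takes a function that receives the failed call, the path, and the exception info; newer Python versions also offer onexc):

 import os
 import shutil
 import tempfile

 # Build a scratch directory with a few files so the example runs end to end.
 target = tempfile.mkdtemp()
 for i in range(3):
     open(os.path.join(target, "file%d.txt" % i), "w").close()

 def on_rm_error(func, path, exc_info):
     # Error callback: try to clear a read-only bit, then retry the failed call.
     os.chmod(path, 0o700)
     func(path)

 shutil.rmtree(target, onerror=on_rm_error)
 print(os.path.exists(target))  # False: the whole tree is gone
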

The suggestion in @nmichaels's comment is also good: you can os.rename the directory, create a new empty one in its place, and then run shutil.rmtree on the original, renamed directory.
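That rename-then-delete trick can be sketched as follows; the directory and its contents here are hypothetical stand-ins, created just so the snippet runs (os.rename is a single fast operation, so the original path is back in service almost immediately while the slow rmtree happens on the renamed copy):

 import os
 import shutil
 import tempfile
 import uuid

 # Hypothetical directory to clear, populated here so the sketch is runnable.
 workdir = tempfile.mkdtemp()
 for i in range(3):
     open(os.path.join(workdir, "old%d.dat" % i), "w").close()

 # 1. Rename the full directory aside; this is one near-instant operation.
 doomed = workdir + "." + uuid.uuid4().hex
 os.rename(workdir, doomed)

 # 2. Recreate an empty directory at the original path right away.
 os.makedirs(workdir)

 # 3. Delete the renamed tree at leisure; the live path is already empty.
 shutil.rmtree(doomed)

 print(os.listdir(workdir))  # []

Note that os.rename only works this cheaply when source and destination are on the same filesystem, which is the case for a directory renamed in place.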


I tried this solution and it seems to work well:

 import os

 while os.path.exists(file_to_delete):
     os.remove(file_to_delete)
