It's rather small: 371 directories with an average of 10 files in each directory. Some channels contain other auxiliary devices.
This is really just a comment, but your numbers look quite high. I ran the test below using essentially the same recursive method you are using, and my times are much lower, even while building the string output.
public void RecurseTest(DirectoryInfo dirInfo, StringBuilder sb, int depth)
{
    _dirCounter++;
    if (depth > _maxDepth)
        _maxDepth = depth;

    // GetFileSystemInfos returns files and subdirectories in a single call.
    var array = dirInfo.GetFileSystemInfos();
    foreach (var item in array)
    {
        sb.Append(item.FullName);
        if (item is DirectoryInfo)
        {
            sb.Append(" (D)");
            sb.AppendLine();
            // Recurse into the subdirectory, tracking the depth.
            RecurseTest(item as DirectoryInfo, sb, depth + 1);
        }
        else
        {
            _fileCounter++;
        }
        sb.AppendLine();
    }
}
I ran the above code against several different directory trees. On my machine, the second call to scan a tree was usually faster because of caching by the runtime or the file system. Note that there is nothing special about this machine; it's just a one-year-old development workstation.
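For reference, a minimal harness along these lines will produce output in the format shown below; the class name, the Main wiring, and the starting depth of 1 are my assumptions, not the exact test rig:

using System;
using System.Diagnostics;
using System.IO;
using System.Text;

class RecurseTimer
{
    private int _dirCounter;
    private int _fileCounter;
    private int _maxDepth;

    // RecurseTest exactly as shown above.
    public void RecurseTest(DirectoryInfo dirInfo, StringBuilder sb, int depth)
    {
        _dirCounter++;
        if (depth > _maxDepth)
            _maxDepth = depth;

        var array = dirInfo.GetFileSystemInfos();
        foreach (var item in array)
        {
            sb.Append(item.FullName);
            if (item is DirectoryInfo)
            {
                sb.Append(" (D)");
                sb.AppendLine();
                RecurseTest(item as DirectoryInfo, sb, depth + 1);
            }
            else
            {
                _fileCounter++;
            }
            sb.AppendLine();
        }
    }

    public void Run(string root)
    {
        var sb = new StringBuilder();
        var sw = Stopwatch.StartNew();   // time only the traversal itself
        RecurseTest(new DirectoryInfo(root), sb, 1);
        sw.Stop();
        Console.WriteLine("Dirs = {0}, files = {1}, max depth = {2}",
            _dirCounter, _fileCounter, _maxDepth);
        Console.WriteLine("Time taken = {0} milliseconds", sw.ElapsedMilliseconds);
    }

    static void Main(string[] args)
    {
        new RecurseTimer().Run(args[0]);
    }
}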
// cached call
Dirs = 150, files = 420, max depth = 5
Time taken = 53 milliseconds
// cached call
Dirs = 1117, files = 9076, max depth = 11
Time taken = 433 milliseconds
// first call
Dirs = 1052, files = 5903, max depth = 12
Time taken = 11921 milliseconds
// first call
Dirs = 793, files = 10748, max depth = 10
Time taken = 5433 milliseconds (2nd run 363 milliseconds)
Concerned that I was not retrieving the creation and last-modified dates, I modified the code to output these as well (see the sketch below); the resulting timings follow.
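A sketch of that change, assuming the dates are simply appended through FileSystemInfo's CreationTime and LastWriteTime properties (the exact output format is my assumption):

public void RecurseTest(DirectoryInfo dirInfo, StringBuilder sb, int depth)
{
    _dirCounter++;
    if (depth > _maxDepth)
        _maxDepth = depth;

    foreach (var item in dirInfo.GetFileSystemInfos())
    {
        sb.Append(item.FullName);
        // Append the timestamps; both properties are available on
        // FileSystemInfo, so they work for files and directories alike.
        sb.Append(' ').Append(item.CreationTime);
        sb.Append(' ').Append(item.LastWriteTime);
        if (item is DirectoryInfo)
        {
            sb.Append(" (D)");
            sb.AppendLine();
            RecurseTest(item as DirectoryInfo, sb, depth + 1);
        }
        else
        {
            _fileCounter++;
        }
        sb.AppendLine();
    }
}

Presumably reading and formatting the two DateTime values for every entry accounts for the roughly doubled times below.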
// now grabbing last update and creation time.
Dirs = 150, files = 420, max depth = 5
Time taken = 103 milliseconds (2nd run 93 milliseconds)
Dirs = 1117, files = 9076, max depth = 11
Time taken = 992 milliseconds (2nd run 984 milliseconds)
Dirs = 793, files = 10748, max depth = 10
Time taken = 1382 milliseconds (2nd run 735 milliseconds)
Dirs = 1052, files = 5903, max depth = 12
Time taken = 936 milliseconds (2nd run 595 milliseconds)
Note: the System.Diagnostics.Stopwatch class was used for the timing measurements.