Tcl deep recursive file search, search for files with *.c extension

I'm using this old answer about searching for files in Tcl: https://stackoverflow.com/a/4646263

First, here is what I'm doing right now, using this function (credit to Jackson):

 # findFiles
 # basedir - the directory to start looking in
 # pattern - A pattern, as defined by the glob command, that the files must match
 proc findFiles { basedir pattern } {

     # Fix the directory name: this ensures the directory name is in the
     # native format for the platform and contains a final directory separator
     set basedir [string trimright [file join [file normalize $basedir] { }]]
     set fileList {}

     # Look in the current directory for matching files; -type {f r}
     # means only readable normal files are looked at, and -nocomplain stops
     # an error being thrown if the returned list is empty
     foreach fileName [glob -nocomplain -type {f r} -path $basedir $pattern] {
         lappend fileList $fileName
     }

     # Now look for any subdirectories in the current directory
     foreach dirName [glob -nocomplain -type {d r} -path $basedir *] {
         # Recursively call the routine on the subdirectory and append any
         # new files to the results
         set subDirList [findFiles $dirName $pattern]
         if { [llength $subDirList] > 0 } {
             foreach subDirFile $subDirList {
                 lappend fileList $subDirFile
             }
         }
     }
     return $fileList
 }

And calling the following command:

 findFiles some_dir_name *.c 

Current result:

 bad option "normalize": must be atime, attributes, channels, copy, delete, dirname, executable, exists, extension, isdirectory, isfile, join, lstat, mtime, mkdir, nativename, owned, pathtype, readable, readlink, rename, rootname, size, split, stat, tail, type, volumes, or writable 
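For reference, here is a quick way to check which Tcl version the interpreter is running (this snippet is an addition for illustration; as the answers below note, [file normalize] needs Tcl 8.4 or later):

 # Print the running version and test whether it is at least 8.4.
 puts "Tcl version: [info patchlevel]"
 if {[package vsatisfies [info tclversion] 8.4]} {
     puts "file normalize is available"
 } else {
     puts "this interpreter is too old for file normalize"
 }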

Now, if we run:

 glob *.c 

We get a lot of files, but they are all in the current directory.

The goal is to get ALL files in ALL subfolders on the machine, with their paths. Can anyone help?

What I really want is to find the directory with the highest number of *.c files. But if I could list all the files with their paths, I could count how many files are in each directory and pick the one with the highest count.
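For illustration, here is a sketch of that counting step, assuming a working findFiles (see the answers below) and the same some_dir_name starting directory; the array-based tally is my own addition:

 # Sketch: tally matching files per directory, then report the directory
 # with the most matches. Assumes findFiles returns a flat list of paths.
 array set counts {}
 foreach f [findFiles some_dir_name *.c] {
     set dir [file dirname $f]
     if {[info exists counts($dir)]} {
         incr counts($dir)
     } else {
         set counts($dir) 1
     }
 }
 set bestDir ""
 set bestCount -1
 foreach {dir n} [array get counts] {
     if {$n > $bestCount} {
         set bestDir $dir
         set bestCount $n
     }
 }
 puts "$bestDir has $bestCount matching files"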

3 answers

You are using an old version of Tcl. [file normalize] was introduced in Tcl 8.4, around 2002. Upgrade if you can.

If you cannot upgrade, you can still use glob: call it once for files only, and then walk the directories yourself. See the -types option of glob.

Here is a demo:

 proc on_visit {path} {
     puts $path
 }

 proc visit {base glob func} {
     # glob -directory returns paths that already include $base,
     # so pass them to the callback as-is
     foreach f [glob -nocomplain -types f -directory $base $glob] {
         if {[catch {eval $func [list $f]} err]} {
             puts stderr "error: $err"
         }
     }
     # recurse into each subdirectory
     foreach d [glob -nocomplain -types d -directory $base *] {
         visit $d $glob $func
     }
 }

 proc main {base} {
     visit $base *.c [list on_visit]
 }

 main [lindex $argv 0]
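To tie this back to the counting goal in the question, the callback can tally directories instead of printing each path; this variant is a sketch of mine, not part of the original demo:

 # Sketch: a visitor that counts matches per directory.
 array set counts {}
 proc count_visit {path} {
     global counts
     set dir [file dirname $path]
     if {[info exists counts($dir)]} {
         incr counts($dir)
     } else {
         set counts($dir) 1
     }
 }
 # visit (defined above) invokes the callback once per matching file
 visit [lindex $argv 0] *.c [list count_visit]
 foreach {dir n} [array get counts] {
     puts "$dir : $n"
 }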

I would use ::fileutil::traverse to do this.

Something like:

 package require fileutil::traverse

 proc check_path {path} {
     string equal [file extension $path] ".c"
 }

 # the traverse object needs a starting directory; [pwd] is assumed here
 set obj [::fileutil::traverse %AUTO% [pwd] -filter check_path]

 array set pathes {}
 $obj foreach file {
     if {[info exists pathes([file dirname $file])]} {
         incr pathes([file dirname $file])
     } else {
         set pathes([file dirname $file]) 1
     }
 }

 # print the per-directory counts
 foreach {name value} [array get pathes] {
     puts "$name : $value"
 }
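The loop above only prints the counts; picking out the directory with the highest count takes one more small pass (this extra step is a sketch, not part of the original snippet):

 # Sketch: scan the tallied counts for the directory with the most .c files.
 set bestDir ""
 set bestCount -1
 foreach {name value} [array get pathes] {
     if {$value > $bestCount} {
         set bestDir $name
         set bestCount $value
     }
 }
 puts "most .c files: $bestDir ($bestCount)"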

To quickly match files one directory level down (Tcl's glob does not treat ** as recursive, so this is equivalent to */*.c), use:

 glob **/*.c 

If you want to search recursively, use:

 proc ::findFiles { baseDir pattern } {
     set dirs [glob -nocomplain -type d [file join $baseDir *]]
     set files {}
     foreach dir $dirs {
         lappend files {*}[findFiles $dir $pattern]
     }
     lappend files {*}[glob -nocomplain -type f [file join $baseDir $pattern]]
     return $files
 }

 # $basepath must be set to the starting directory before this call
 puts [join [findFiles $basepath "*.tcl"] \n]
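Note that the {*} expansion syntax requires Tcl 8.5. Since the question mentions an older interpreter, here is a sketch of a pre-8.5 variant; the name findFilesOld is made up for illustration:

 # Pre-8.5 variant: {*} is unavailable, so sublists are appended
 # element by element instead of being expanded in place.
 proc findFilesOld { baseDir pattern } {
     set files {}
     foreach dir [glob -nocomplain -type d [file join $baseDir *]] {
         foreach f [findFilesOld $dir $pattern] {
             lappend files $f
         }
     }
     foreach f [glob -nocomplain -type f [file join $baseDir $pattern]] {
         lappend files $f
     }
     return $files
 }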
