How can I read multiple files from multiple directories in R for processing?

I am running a simulation study and need to process and save the results from a large number of text files. The data are organized as shown below: there are several subdirectories, and inside each subdirectory I need to process, and produce separate results for, 1000 data files. This is very easy to do in SAS using macros, but I am new to R and cannot figure out how to do it. The directory layout I am working with is shown below.

DATA Folder-> DC1 -> DC1R1.txt ... DC1R1000.txt
              DC2 -> DC2R1.txt ... DC2R1000.txt

Any help would be greatly appreciated!

+5
4 answers

I'm not right next to a computer with R at the moment, but have a look at the help for the file-related functions:

dir (also available as list.files) lists the files and directories under a path, file.info() tells you whether an entry is a file or a directory, and file.path() builds path names for you.

basename and dirname are also useful for taking paths apart.

Combining these should be enough to walk the directory tree.
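For example, here is a rough sketch of how those functions fit together (the folder name DATA_Folder and the file name are placeholders taken from the question, not tested code):

top <- "DATA_Folder"                     # substitute your own DATA folder path

# dir()/list.files() give the names under 'top'; file.path() builds full paths
entries <- file.path(top, dir(top))

# file.info()$isdir flags which entries are subdirectories (DC1, DC2, ...)
subdirs <- entries[file.info(entries)$isdir]

# basename()/dirname() split a path into its file and directory parts
f <- file.path(subdirs[1], "DC1R1.txt")
basename(f)   # "DC1R1.txt"
dirname(f)    # "DATA_Folder/DC1"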

EDIT: here's a worked example:

# Make a function to process each file
processFile <- function(f) {
  df <- read.csv(f)
  # ...and do stuff...
  file.info(f)$size # dummy result
}

# Find all .csv files
files <- dir("/foo/bar/", recursive=TRUE, full.names=TRUE, pattern="\\.csv$")

# Apply the function to all files.
result <- sapply(files, processFile)
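If, as in the question, you need separate results per subdirectory, one possibility (a sketch, not part of the original answer) is to split the per-file results by the directory each file came from:

# Group the results by subdirectory so DC1, DC2, ... can be summarised separately
by_dir <- split(result, dirname(files))

by_dir is then a named list with one element per subdirectory.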
+7

You may be looking for list.files(recursive = TRUE). Set your working directory to the DATA folder; with recursive = TRUE it will list the files in all of the subdirectories as well.
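A minimal sketch of that approach (the path is a placeholder, and read.table() is only a guess at the right reader for your files):

# Assumes the working directory is the DATA folder
setwd("DATA_Folder")

# All .txt files in DC1, DC2, ..., as paths relative to the DATA folder
txt_files <- list.files(pattern = "\\.txt$", recursive = TRUE)

# Read each file into a list, keeping the relative path as the name
all_data <- lapply(txt_files, read.table, header = TRUE)
names(all_data) <- txt_files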

+3

filenames <- list.files("path/to/files", recursive=TRUE) will give you a vector of all the file names under that folder, which you can then loop over.
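Note that without full.names = TRUE the returned names are relative to the folder you listed, so a processing loop might look like this (a sketch; the reader function is a guess):

dir_path  <- "path/to/files"            # placeholder from the line above
filenames <- list.files(dir_path, recursive = TRUE)

for (fn in filenames) {
  # rebuild the full path before reading
  dat <- read.table(file.path(dir_path, fn), header = TRUE)
  # ... process 'dat' and save the result for this file ...
}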

+2

You can use the Perl function glob() to get a list of files and pass them to R using, for example, the RSPerl interface.
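As an aside (not part of this answer), base R's Sys.glob() does the same shell-style wildcard expansion without going through Perl; a one-line sketch with a placeholder pattern:

files <- Sys.glob("DATA_Folder/DC*/DC*R*.txt")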

0
