I have a question about reading files into R. I have a folder with many files that I would like to combine into one dataframe in R to work with. What is the most efficient way to read in a large amount of data (over 1000 files)? My code is below; it has been running for a day now and still hasn't read in all the files.
data <- data.frame()
for (file in files) {
  path <- paste0("Data/", file, ".RData")
  if (file.exists(path)) {
    load(path)  # each file contains a data frame named file_data
    data <- dplyr::bind_rows(data, file_data)
  }
}
You can list all the files, read them into a list, and bind them once at the end. Growing a data frame with bind_rows inside a loop copies all the rows accumulated so far on every iteration, which is why your version is so slow.
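For example (a minimal sketch; like your code, it assumes every .RData file contains a single data frame named file_data):

read_one <- function(p) {
  load(p, envir = environment())  # loads file_data into this function's scope
  file_data
}

paths <- paste0("Data/", files, ".RData")
data_list <- lapply(paths[file.exists(paths)], read_one)
data <- dplyr::bind_rows(data_list)  # a single bind instead of 1000+ incremental ones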
A better solution than using mget is defining a new.env() to load the files into and then reading the objects into a list:
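Something like this (a sketch; load_rdata is just an illustrative name, and it assumes each file stores exactly one object, so you don't even need to know its name):

load_rdata <- function(path) {
  e <- new.env()
  load(path, envir = e)      # objects land in e, not in the global environment
  get(ls(e)[1], envir = e)   # return the first (here: only) object in the file
}

paths <- paste0("Data/", files, ".RData")
data_list <- lapply(paths[file.exists(paths)], load_rdata)
data <- dplyr::bind_rows(data_list)

Loading into a fresh environment per file also means the objects can never clobber each other or anything in your workspace.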