Channel: Recent Questions - Stack Overflow

Processing very large list of dataframes by chunks in R


Let's say I have a list of 1000 dataframes:

1000_dataframe_list

And because of memory problems with spread(), I need to process this list in chunks. To process the whole list, I'm using:

out_df <- do.call(rbind.data.frame, 1000_dataframe_list)
data_wide <- spread(out_df, Sample, expression)

My idea is to process the list in chunks, then merge all the chunks. As you can see, data_wide is basically a matrix whose first column holds the rownames, but I don't think this matters very much.

How can I process 1000_dataframe_list so that I can divide it into any number of chunks I need (10, 100, 200), using that number as an input parameter?
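One possible sketch of the split-process-merge idea described above, assuming each data frame holds (key, Sample, expression) rows as in the question. The function name spread_in_chunks and the placeholder df_list are hypothetical; note that 1000_dataframe_list is not actually a valid R name, since identifiers cannot start with a digit:

```r
library(tidyr)  # assumes tidyr is installed, for spread()

# Hypothetical helper: split a list of data frames into n_chunks pieces,
# spread() each piece separately, then merge the wide results.
spread_in_chunks <- function(df_list, n_chunks) {
  # Assign each list element a chunk index; ceiling() handles uneven splits
  idx <- ceiling(seq_along(df_list) / (length(df_list) / n_chunks))
  chunks <- split(df_list, idx)

  wide_chunks <- lapply(chunks, function(chunk) {
    long <- do.call(rbind.data.frame, chunk)
    spread(long, Sample, expression)
  })

  # Merge the wide pieces back together on their shared key column(s)
  Reduce(function(x, y) merge(x, y, all = TRUE), wide_chunks)
}
```

This only lowers the peak memory of each spread() call; the final merged result is still built in full, so it must fit in memory at the end.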





