R Error: Cannot Allocate Vector of Size N GB (2 Examples) | How to Increase the Memory Limit
This tutorial shows how to increase or decrease the memory limit in R.
The tutorial contains two examples for the modification of memory limits.
So let’s take a look at some R code in action:
The Starting Point: Running a Memory-Intensive Function with Default Specifications
Let’s assume we want to create a huge vector of normally distributed random values using the rnorm function in R. Then we could use the following R code:
x <- rnorm(4000000000)    # Trying to run rnorm function
# Error: cannot allocate vector of size 29.8 Gb
Unfortunately, the RStudio console returns the error message: “cannot allocate vector of size 29.8 Gb”.
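As a side note, the size shown in the error message can be reproduced by hand: each numeric (double) value in R occupies 8 bytes, so 4 billion values require about 32 billion bytes. A minimal sketch of this calculation:

4000000000 * 8 / 1024^3    # Memory needed (in GiB) for 4 billion double values
# [1] 29.80232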
Even though there is no general solution to this problem, I’ll show in two examples how you might be able to fix this issue. So keep on reading!
Example 1: Garbage Collection Using gc() Function in R
The first possible solution for this problem is provided by the gc function. The gc function causes a garbage collection and prints memory usage statistics.
Let’s do this in R:
gc()    # Apply gc() function
#           used (Mb) gc trigger (Mb) max used (Mb)
# Ncells  531230 28.4    1156495 61.8  1156495 61.8
# Vcells  988751  7.6    8388608 64.0  1761543 13.5
Have a look at the previous output of the RStudio console. It shows some memory usage statistics that might be helpful when evaluating your RAM usage.
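If these statistics reveal that large objects are still kept in your workspace, removing them before running gc can actually free up RAM. Here is a minimal sketch of this idea (my_big_object is a hypothetical name for a large object you no longer need):

rm(my_big_object)    # Delete the (hypothetical) large object from the workspace
gc()                 # Trigger a garbage collection so the memory can be reused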
Perhaps the call of the gc function solves our memory problem:
x <- rnorm(4000000000)    # Trying to run rnorm function
# Error: cannot allocate vector of size 29.8 Gb
Unfortunately, this is not the case in this example. So let’s try something else…
Example 2: Increase Memory Limit Using memory.limit() Function
Another solution for the error message “cannot allocate vector of size X Gb” can be to increase the memory limit that is available to R. Note that the memory.limit function is only available on Windows and is no longer functional from R 4.2.0 onwards. First, let’s check the memory limit that is currently specified in R. For this, we simply have to call the memory.limit function as shown below:
memory.limit()    # Check currently set limit
# [1] 16267
The RStudio console shows that our current memory limit is 16267 Mb (i.e. approximately 16 GB).
We can also use the memory.limit function to increase (or decrease) memory limits in R. Let’s increase our memory limit to 35000 Mb:
memory.limit(size = 35000)    # Increase limit
# [1] 35000
Now, we can run the rnorm function again:
x <- rnorm(4000000000)    # Successfully running rnorm function
Nice, this time it worked (even though it took a very long time to compute)!
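Please keep in mind that memory.limit only ever worked on Windows and has been a non-functional stub since R 4.2.0. The following rough sketch (based on that assumption) therefore only calls the function where it is supported; on macOS and Linux, the available memory is managed by the operating system instead:

if (.Platform$OS.type == "windows" && getRversion() < "4.2.0") {
  memory.limit(size = 35000)    # Raise the memory limit to 35000 Mb
} else {
  message("memory.limit() is not available; memory is managed by the OS")
}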
Video, Further Resources & Summary
Do you want to learn more about adjusting memory limits? Then you may want to have a look at the following video on my YouTube channel. In the video, I explain the R syntax of this tutorial.
The YouTube video will be added soon.
Note that the examples shown in this tutorial are only two out of many possible solutions for the error message “cannot allocate vector of size X Gb”. Have a look at this thread on Stack Overflow to find additional tips and detailed instructions.
Furthermore, you may want to have a look at the other tutorials on Statistics Globe.
- Clear RStudio Console
- Remove All Objects But One from Workspace in R
- Difference between rm() & rm(list=ls())
- Clean Up Memory in R
- Determine Memory Usage of Data Objects
- What’s the Difference Between the rm & gc Functions?
- R Programming Tutorials
In summary: At this point you should know how to increase or decrease the memory limit in R programming. If you have additional questions, please let me know in the comments. Besides that, don’t forget to subscribe to my email newsletter in order to get updates on the newest tutorials.
10 Comments
Please, it is still giving me a pop-up of “cannot allocate vector of size 53.1 Gb”.
What should I do?
Hey Pamela,
It’s impossible for me to help you without further information. Could you tell me some more details?
Regards,
Joachim
Hello,
I have the same problem as Pamela. I’m still new to analyzing data in this way, so I’ll start with some detail and can provide more if needed. I’m working through a large clustering analysis with hclust() using a matrix of 24000 x 24000. Initially, I’d used the code above to increase the memory limit to 15000, which enabled me to calculate the adjacency in an earlier step, but that limit generated a warning of “memory overload” when I reached the hclust step. I increased the memory limit to 40000 and then to 60000, but continue to get “Error: cannot allocate vector of size 2.2 Gb” (2.2 for both 40000 and 60000). I’ve used the collect.garbage() function to try to free up memory, but that hasn’t worked either.
Hi Hank,
Thanks for the comment and the detailed explanations.
Is there a way to clean up your data to decrease its size before analyzing it?
For example:
– Remove rows and columns that contain only NA values: https://statisticsglobe.com/r-remove-data-frame-rows-with-some-or-all-na
– Remove duplicate rows and columns from your data: https://statisticsglobe.com/remove-duplicated-rows-from-data-frame-in-r
– Remove rows or columns that do not contribute any value to the analysis.
Generally speaking: I assume that at some size R will simply not be able to handle your data anymore. For that reason, I’d try to decrease the size of the data as much as possible before trying to increase memory limits etc.
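As a rough sketch of this kind of clean-up (assuming your data is stored in a numeric matrix called my_matrix), you could drop rows and columns that contain only NA values before clustering:

keep_rows <- rowSums(!is.na(my_matrix)) > 0    # Rows with at least one non-NA value
keep_cols <- colSums(!is.na(my_matrix)) > 0    # Columns with at least one non-NA value
my_matrix <- my_matrix[keep_rows, keep_cols]   # Keep only the informative rows and columns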
I hope that helps!
Joachim
Thank you so much, this worked perfectly for me.
That’s great to hear, thank you for the nice comment!
I would also suggest you sample the data as suggested in
https://stackoverflow.com/questions/40989003/hclust-in-r-on-large-datasets
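For illustration, a minimal sketch of that sampling idea (my_matrix and the sample size of 2000 are hypothetical): cluster a random subset of rows instead of the full data set.

set.seed(1)                                           # Make the sampling reproducible
sample_rows <- sample(nrow(my_matrix), size = 2000)   # Randomly select 2000 rows
hc <- hclust(dist(my_matrix[sample_rows, ]))          # Cluster only the sampled rows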
Hey Ross,
Thank you for the additional resource!
Regards
Joachim
Hi Joachim
I tried to apply the example you gave, but I am still facing the problem. I did all the steps correctly, but the problem is still there.
Hey Omar,
Unfortunately, this error message can have many different causes, so it’s not possible to give a general solution for all cases.
You may have a look at this thread on Stack Overflow for more possible solutions.
I hope that helps!
Joachim