R Error: Cannot Allocate Vector of Size N GB (2 Examples) | How to Increase the Memory Limit


This tutorial shows how to increase or decrease the memory limit in R.

The tutorial contains two examples for the modification of memory limits.

So let’s take a look at some R codes in action:


The Starting Point: Running Memory Intense Function with Default Specifications

Let’s assume we want to create a huge vector of randomly distributed values using the rnorm function in R. Then we could use the following R code:

x <- rnorm(4000000000)            # Trying to run rnorm function
# Error: cannot allocate vector of size 29.8 Gb

Unfortunately, the RStudio console returns the error message: “cannot allocate vector of size 29.8 Gb”.

Even though there is no general solution to this problem, I’ll show in two examples how you might be able to fix this issue. So keep on reading!
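Before trying to fix the error, it helps to understand where the number 29.8 Gb comes from. Each element of a numeric (double) vector in R occupies 8 bytes, so we can compute the memory the requested vector would need:

```r
n_elements <- 4000000000                # number of values we asked rnorm for
bytes_needed <- n_elements * 8          # numeric vectors store 8 bytes per element
gib_needed <- bytes_needed / 2^30       # convert bytes to gibibytes
gib_needed                              # approximately 29.8, matching the error message
```

If this number exceeds the RAM (plus configured limit) available to your R session, the allocation will fail no matter which function triggers it.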


Example 1: Garbage Collection Using gc() Function in R

The first possible solution for this problem is provided by the gc function. The gc function causes a garbage collection and prints memory usage statistics.

Let’s do this in R:

gc()                              # Apply gc() function
#          used (Mb) gc trigger (Mb) max used (Mb)
# Ncells 531230 28.4    1156495 61.8  1156495 61.8
# Vcells 988751  7.6    8388608 64.0  1761543 13.5

Have a look at the previous output of the RStudio console. It shows some memory usage statistics that might be helpful to evaluate your RAM usage.
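Keep in mind that gc() can only reclaim memory from objects that are no longer referenced. If your workspace still holds large objects you no longer need, removing them explicitly with rm() before calling gc() can free considerably more memory. A minimal sketch (the object name big_matrix is just a hypothetical placeholder for a large object in your own workspace):

```r
big_matrix <- matrix(rnorm(1e6), nrow = 1000)  # hypothetical large object
rm(big_matrix)                                 # drop the reference to the object
invisible(gc())                                # garbage collection can now reclaim its memory
```

After rm(), the memory is eligible for collection; the next gc() (which R also runs automatically when needed) returns it to the memory pool.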

Perhaps, the call of the gc function solves our memory problems:

x <- rnorm(4000000000)            # Trying to run rnorm function
# Error: cannot allocate vector of size 29.8 Gb

Unfortunately, this is not the case in this example. So let’s try something else…


Example 2: Increase Memory Limit Using memory.limit() Function

Another possible solution for the error message “cannot allocate vector of size X Gb” is to increase the memory limit available to R. Note that the memory.limit function is only available on Windows, and as of R 4.2.0 it is defunct, i.e. the limit can no longer be adjusted this way in recent R versions. First, let’s check the memory limit that is currently specified in R. For this, we simply have to call the memory.limit function as shown below:

memory.limit()                    # Check currently set limit 
# [1] 16267

The RStudio console shows that our current memory limit is 16267 megabytes (i.e. roughly 16 GB).

We can also use the memory.limit function to increase (or decrease) the memory limit in R. Let’s increase our memory limit to 35000 megabytes:

memory.limit(size = 35000)        # Increase limit
# [1] 35000

Now, we can run the rnorm function again:

x <- rnorm(4000000000)            # Successfully running rnorm function

Nice, this time it worked (even though it took a very long time to compute)!
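Once a large object has been created, we can check how much RAM it actually occupies using the object.size function, and release the memory again with rm() and gc() when we are done. The sketch below uses a much smaller vector (one million values) so that it runs quickly:

```r
x_small <- rnorm(1000000)                   # smaller example vector
print(object.size(x_small), units = "MB")   # roughly 7.6 MB (1e6 elements * 8 bytes)
rm(x_small)                                 # remove the object ...
invisible(gc())                             # ... and let R reclaim its memory
```

Checking object sizes this way is a good habit before attempting allocations near your machine’s memory limit.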


Video, Further Resources & Summary

Do you want to learn more about adjusting memory limits? Then you may want to have a look at the following video of my YouTube channel. In the video, I’m explaining the R syntax of this tutorial.


The YouTube video will be added soon.


Note that the examples shown in this tutorial are only two out of many possible solutions for the error message “cannot allocate vector of size X Gb”. Have a look at this thread on Stack Overflow to find additional tips and detailed instructions.

Furthermore, you may want to have a look at the other tutorials on Statistics Globe.

In summary: At this point you should know how to increase or decrease the memory limit in R programming. If you have additional questions, please let me know in the comments. Besides that, don’t forget to subscribe to my email newsletter in order to get updates on the newest tutorials.



10 Comments

    what should I do

  • Hello,

    I have the same problem as Pamela. I’m still new to analyzing data in this way, so I’ll start with some detail, and can provide more if needed. I’m working through a large clustering analysis with hclust() using a matrix of 24000 x 24000. Initially, I’d used the code above to increase the memory limit to 15000 which enabled me to calculate the adjacency in an earlier step, but that limit generated a warning of “memory overload” when I reached the hclust step. I increased the memory limit to 40000 and then to 60000, but continue to get “Error: cannot allocate vector of size 2.2 Gb” (2.2 for both 40000 and 60000). I’ve used the collect.garbage() function to try to free up memory, but that hasn’t worked either.

  • Thank you so much, this worked perfectly for me.

  • I would also suggest you sample the data as suggested in

  • Hi Joachim
I tried to apply the example you gave, but I am still facing the problem. I did all the steps correctly, but the problem is still there.

    • Hey Omar,

      Unfortunately, this error message can have many different causes, so it’s not possible to give a general solution for all cases.

      You may have a look at this thread on Stack Overflow for more possible solutions.

      I hope that helps!


