How to Clear Memory in R

Gustavo du Mortier Feb 02, 2024

After working with R scripts for a while, the R environment may accumulate a lot of in-memory data, taking up a significant share of the computer’s resources. This can end with R refusing to run code because it cannot allocate any more memory. Even restarting the R environment does not always free that memory.

The command rm(list=ls()) is expected to release the memory used by all objects, but what it really does is remove the references to the memory chunks those objects occupy. The problem is that those memory chunks are not immediately freed up for use by new tasks.
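As a minimal sketch, clearing the workspace looks like the following; the objects disappear from the environment, but the memory they occupied is only reclaimed once the garbage collector runs.

# Remove every object from the global environment
rm(list = ls())

# The objects are gone, but the memory they used may not be returned
# to the operating system until the garbage collector runs
ls()

Output:

character(0)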

Clear Memory in R With the gc Function

The garbage collection process runs automatically and periodically within R, but sometimes it doesn’t run at the precise moment when you need a lot of memory for some big data operation. In such a situation, it could be useful to call the gc() function to clear the memory in R.

The main purpose of gc() is to show a report about memory usage. As a side effect, calling gc() triggers the garbage collection process, clearing memory. Therefore, as the gc documentation notes, it is a good idea to call gc() after a large object has been removed, since this prompts R to return the memory it no longer uses.
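For example, a common pattern (a sketch, using an arbitrary 10-million-element vector) is to remove a large object and call gc() right afterward; the memory report it prints is a useful side benefit.

# Create a large object (about 80 MB of doubles), then remove it
big_vector <- rnorm(1e7)
rm(big_vector)

# Trigger garbage collection immediately and print the memory report
gc()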

You can also insert a call to gc() within a loop to run the garbage collection process repeatedly, as in the sketch below. Just take into account that garbage collection takes time, on the order of 100 ms per call, so it is not recommended inside a tight loop or a loop with a large number of iterations.
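This sketch runs gc() only every tenth iteration instead of on every pass; the chunk-processing step is just a stand-in for your own memory-heavy work.

results <- numeric(50)
for (i in 1:50) {
  # Stand-in for a memory-heavy step; replace with your own processing
  chunk <- rnorm(1e6)
  results[i] <- mean(chunk)
  rm(chunk)

  # Collect garbage only every 10 iterations to limit the roughly
  # 100 ms overhead of each gc() call
  if (i %% 10 == 0) {
    gc()
  }
}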

If you use Windows, you can call the memory.size() function with the max = TRUE parameter to show the maximum amount of memory R has obtained from the operating system, or with max = FALSE to show how much memory is in use at the moment. The R documentation explains other aspects of the memory.size() function.
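On Windows, the two calls look like this. Note that memory.size() only works in Windows builds of R, and in recent releases (R 4.2.0 and later) it is deprecated and simply returns Inf, so the examples in this article assume an older version.

# Windows only: memory currently in use by the R session, in MB
memory.size(max = FALSE)

# Windows only: maximum amount of memory obtained from the OS so far, in MB
memory.size(max = TRUE)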

To show how memory consumption varies when you create and destroy big data objects, we can run the following code on Windows. It creates a sample vector with 100,000 elements, removes the vector, and finally performs a garbage collection. Between each instruction, we check how much memory is in use by calling the memory.size function with the max = FALSE parameter.

memory.size(max = FALSE)

Output:

[1] 32.6

This value of 32.6 MB is the baseline amount of memory used by the R session before we create anything.

my_data <- sample(1:100000)
memory.size(max = FALSE)

Output:

[1] 33.75

We can see that the sample data consumes approximately 1 MB of memory, since the used memory reported by the memory.size function increased by about 1 MB.

rm(my_data)
memory.size(max = FALSE)

Output:

[1] 33.75

Even though we removed the object, the memory still appears to be occupied.

gc()

Output:

         used (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells 315785 16.9     617200  33.0    617200  33.0
Vcells 633077  4.9   25730861 196.4 110503932 843.1

memory.size(max = FALSE)

Output:

[1] 32.6

The amounts of memory will be different on each computer, but you can see that the amount of used memory reported by memory.size remains the same after running the rm(my_data) instruction, and that it only returns to the initial value of 32.6 MB after the gc() instruction is executed and the memory is effectively freed up.
