Forcing garbage collection to run in R with the gc() command

  • Periodically I program sloppily. OK, I program sloppily all the time, but sometimes that catches up with me in the form of out-of-memory errors. I start exercising a little discipline by deleting objects with the rm() command, and things get better. I see mixed messages online about whether I should explicitly call gc() after deleting large data objects: some say that R will run gc() before it ever returns a memory error, while others say that manually forcing a collection is a good idea.
    Should I run gc() after deleting large objects in order to ensure maximum memory availability?
      August 3, 2020 4:45 PM IST
  • From the help page on gc:
    A call of 'gc' causes a garbage collection to take place. This will also take place automatically without user intervention, and the primary purpose of calling 'gc' is for the report on memory usage.
    However, it can be useful to call 'gc' after a large object has been removed, as this may prompt R to return memory to the operating system.
    So it can be useful, but mostly you shouldn't have to. My personal opinion is that it's code of last resort: you shouldn't litter your code with gc() statements as a matter of course, but if your machine keeps falling over, and you've tried everything else, then it might be helpful.
    By everything else, I mean things like
    1. Writing functions rather than raw scripts, so variables go out of scope.
    2. Emptying your workspace when you go from one problem to another, unrelated one.
    3. Discarding data/variables that you aren't interested in. (I frequently receive spreadsheets with dozens of uninteresting columns.)
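    A hypothetical sketch of those tips (the file path and column names are made up; the pattern is what matters):

    ```r
    ## Keep big temporaries inside a function so they go out of scope
    ## when it returns, and discard columns you aren't interested in.
    summarise_big_file <- function(path) {
      df <- read.csv(path)           # large object exists only inside this call
      df <- df[, c("id", "value")]   # drop the uninteresting columns early
      mean(df$value)                 # df is eligible for collection on return
    }

    ## After removing a genuinely large object, gc() may prompt R to
    ## return memory to the operating system.
    x <- numeric(1e7)   # roughly 80 MB of doubles
    rm(x)
    gc()
    ```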
      August 3, 2020 4:49 PM IST
  • Supposedly R uses only RAM. That's just not true on a Mac (and I suspect it's not true on Windows either). If it runs out of RAM, it will start using virtual memory. Sometimes, but not always, processes will 'recognize' that they need to run gc() and free up memory. When they don't, you can see it in Activity Monitor: all the RAM is occupied and disk access jumps up. I find that when I am doing large Cox regression runs, I can avoid spilling over into virtual memory (with its slow disk access) by preceding the calls with gc(); cph(...)
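    The pattern described above, sketched with a stand-in model: cph() comes from the rms package, so a base-R lm() fit is substituted here to keep the example self-contained.

    ```r
    ## Force a collection immediately before a memory-hungry fit, so freed
    ## pages are handed back before the big allocation. lm() stands in for
    ## the cph() call mentioned above.
    gc()                                # collect first...
    fit <- lm(mpg ~ wt, data = mtcars)  # ...then run the heavy fit
    ```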
      September 19, 2020 5:53 PM IST
  • Explicitly calling gc will free some memory "now"... so if other processes need the memory, it might be a good idea. For example, before calling system() or similar. Or perhaps when you're "done" with the script and R will sit idle for a while until the next job arrives; again, so that other processes get more memory.

    If you just want your script to run faster, it won't matter, since R will call gc later if it needs to. It might even make things slower, since the normal GC cycle might never have needed that collection at all.

    ...but if you want to measure time for instance, it is typically a good idea to do a GC before running your test. This is what system.time does by default.
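    This default is visible in system.time()'s gcFirst argument:

    ```r
    ## system.time() runs gc() first by default (gcFirst = TRUE), so the
    ## timing isn't distorted by a collection triggered mid-measurement.
    t_default <- system.time(sum(runif(1e6)))                  # gcFirst = TRUE
    t_raw     <- system.time(sum(runif(1e6)), gcFirst = FALSE) # skip the pre-GC
    ```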
      September 19, 2020 5:55 PM IST
  • "Maybe." I don't really have a definitive answer. But the help file suggests that there are really only two reasons to call gc():

    1. You want a report of memory usage.
    2. After removing a large object, "it may prompt R to return memory to the operating system."

    Since repeated calls can slow down a large simulation, I have tended to do it only after removing something large. In other words, I don't think it makes sense to call it systematically unless you have a good reason to.
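    Both uses fit in a couple of lines; the report mentioned in reason 1 is simply gc()'s return value:

    ```r
    ## gc() returns a matrix reporting memory use; the help page calls
    ## this report its primary purpose.
    x <- numeric(5e6)   # ~40 MB of doubles
    rm(x)               # remove the large object...
    report <- gc()      # ...then collect, keeping the usage report
    report[, "used"]    # Ncells and Vcells currently in use
    ```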

      September 19, 2020 5:56 PM IST