Classic running out of memory... huh? what? <long>

Thomas Charron twaffle at gmail.com
Thu Jun 11 12:30:16 EDT 2009


On Thu, Jun 11, 2009 at 11:54 AM, <bruce.labitt at autoliv.com> wrote:
> Anyways, the program seems to run out of memory after processing many
> blocks.  So either there is a memory leak, or something else going on.
> Any suggestions?

  valgrind

  It's what's for dinner.  :-D

> In both cases, if I use "free" I see that free memory is all the way down
> to 160MB during the file write.  This seems absurdly low somehow ;)
> Any good memory tracking tools?  I have used valgrind but not gained much
> insight.  Must be operator error...

  What tests did you perform using valgrind?  The 'simple' running of
it will just look at things like memory leaks, and if you're cleaning
up after you run, it won't always flag that as an error.  Have you
tried the massif heap profiler?

  http://valgrind.org/docs/manual/ms-manual.html
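
  A minimal run, assuming your binary is ./your_program (swap in the
real name):

    valgrind --tool=massif ./your_program
    ms_print massif.out.<pid>

  massif snapshots the heap over time, so a real leak shows up as a
steadily climbing profile rather than just a total at exit.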

  The only other suggestion I'd have is to store the data in
memory-mapped files.  Not as fast as having it all resident, but not
as slow as direct file access.
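
  A rough sketch of the idea in C, assuming a plain binary data file
(the name data.bin is just a placeholder):

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("data.bin", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

        /* Map the whole file read-only; the kernel pages data in on
           demand instead of keeping the entire set resident. */
        void *p = mmap(NULL, (size_t)st.st_size, PROT_READ,
                       MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED) { perror("mmap"); return 1; }

        /* ... walk the data through p here ... */

        munmap(p, st.st_size);
        close(fd);
        return 0;
    }

  Pages you haven't touched recently can get evicted and re-read from
disk as needed, which keeps the resident footprint down.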

  Another trick I've used when dealing with massive amounts of data is
to use 'fast' media, i.e. flash instead of a hard drive.  Memory
mapping to this sort of media worked well for me.

  But I have to say, 32 gigs of resident memory sounds like a metric
arseload of data.  Perhaps it's best to look at what you're keeping
and how.  How big is your data set?

-- 
-- Thomas


