Classic running out of memory... huh? what? <long>
Thomas Charron
twaffle at gmail.com
Thu Jun 11 13:27:01 EDT 2009
On Thu, Jun 11, 2009 at 1:22 PM, <bruce.labitt at autoliv.com> wrote:
> Thomas Charron <twaffle at gmail.com> wrote on 06/11/2009 12:30:16 PM:
>> Another trick I've used when dealing with massive amounts of data is
>> to use 'fast' media, aka, flash instead of a hard drive. Memory
>> mapping to this sort of media worked well for me.
> I would love to run a fast drive. However, all I have available is NFS
> on gbit e-net. My sustained file write rate is ~45 MiB/sec (377.5e6
> bits/sec). That sounds moderately ok until one realizes the file size
> (one chunk) is ~4.5 GiB.
You don't have physical access to the machine? Even a USB drive can
give better performance than that.
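For reference, here's a rough sketch of the memory-mapping trick I
mentioned above, in Python. The path and record size are made up for
illustration; the point is just that the kernel pages data in from the
(fast) device on demand, so resident memory stays close to what you
actually touch rather than the size of the whole file:

    import mmap

    # Hypothetical path on a fast (flash) device; adjust to your layout.
    path = "/mnt/flash/chunk.dat"

    with open(path, "rb") as f:
        # Map the whole file read-only; pages are faulted in on access.
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        try:
            record_size = 4096  # made-up record size
            for offset in range(0, mm.size(), record_size):
                record = mm[offset:offset + record_size]
                # ... process one record at a time ...
                pass
        finally:
            mm.close()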
>> But I have to say, 32 gigs resident memory sounds like a metric
>> arseload of data. Perhaps it's best to look at what you're keeping and
>> how. How big is your data set?
> The end dataset is even bigger. I have to worry about stuff fitting on
> disk. I just got a 1TB disk. It will last me about 10 runs (~100GB
> files). Good thing they are cheap.
Any ability to preprocess the files? I don't know what kind of data
you're talking about here, but multipass processing of the data could
result in less memory usage. A sketch of what I mean is below.
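Something like the following is what I have in mind: a first pass that
reduces each chunk to a small partial result, and a second pass that
combines the partials, so only one chunk's worth of data is ever held
at once. The path, chunk size, and the reduction itself are stand-ins
for whatever your data actually needs:

    CHUNK_SIZE = 64 * 1024 * 1024  # read 64 MiB at a time; tune to taste

    def pass_one(path):
        # First pass: boil each chunk down to a small partial result.
        partials = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                # Stand-in reduction; replace with your real per-chunk work.
                partials.append(sum(chunk))
        return partials

    def pass_two(partials):
        # Second pass: combine the small per-chunk results.
        return sum(partials)

    result = pass_two(pass_one("/path/to/one/chunk.dat"))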
--
-- Thomas