I slurped in a large file using File::Slurp, but given the size of the file, it looks like it is either being held in memory twice, or perhaps it got inflated by being upgraded to 16-bit Unicode. How can I best diagnose a problem like this in Perl?
The file I pulled in is 800 MB, and my Perl process that parses this data has approximately 1.6 GB allocated at runtime.
I realize my reasoning could be wrong, but I'm not sure of the most effective way to prove or disprove my theory.
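To illustrate the kind of diagnostics I'm after, here is a rough sketch using the core utf8::is_utf8 check plus the CPAN modules Devel::Size and Devel::Peek (the variable contents are just a stand-in):

use strict;
use warnings;
use Devel::Size qw(total_size);
use Devel::Peek qw(Dump);

# Stand-in for the slurped file contents.
my $data = "pretend this is the slurped file";

# If this prints "yes", Perl has upgraded the string to its internal
# UTF-8 representation, which can inflate memory usage.
print "UTF-8 flag set: ", (utf8::is_utf8($data) ? "yes" : "no"), "\n";

# Bytes Perl actually allocated for the scalar; compare against the
# on-disk file size to spot inflation.
print "Allocated bytes: ", total_size($data), "\n";

# Dumps the scalar's internals (flags, buffer length) to STDERR.
Dump($data);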
Update:
I've eliminated sneaky character encoding from the list of suspects. It looks like I'm copying the variable at some point; I just can't figure out where.
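To show what I mean by copying, here is a hypothetical illustration (the sub names and parsing are made up) of how a copy can sneak in through an ordinary assignment:

use strict;
use warnings;

sub parse_by_value {
    # @_ aliases the caller's scalar, but this assignment copies the
    # whole string, doubling memory for the duration of the sub.
    my ($contents) = @_;
    # ... parse $contents here ...
}

sub parse_by_ref {
    # Taking a reference avoids the copy entirely.
    my ($contents_ref) = @_;
    # ... parse $$contents_ref here ...
}

my $data = "pretend this is 800 MB of file contents";
parse_by_value($data);     # copies the whole string
parse_by_ref(\$data);      # copies only a small reference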
Update 2:
I've now done some more research, and it turns out that simply retrieving the data from File::Slurp is what causes the problem. Looking through the documentation, I found that I can have it return a scalar ref, i.e.
my $data = read_file($file, binmode => ':raw', scalar_ref => 1);
Then I don't get the memory inflation. That makes sense and is the most logical way to handle the data in my situation.
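For anyone with the same problem, working with the scalar ref looks roughly like this (the file path and the parse loop are illustrative):

use strict;
use warnings;
use File::Slurp qw(read_file);

my $file = 'big_input.dat';    # hypothetical path

# read_file returns a reference, so the 800 MB string exists in
# memory exactly once.
my $data = read_file($file, binmode => ':raw', scalar_ref => 1);

# Dereference in place rather than copying into a new scalar;
# "my $copy = $$data;" would double memory again.
while ($$data =~ /some_pattern/g) {
    # ... process each match ...
}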
The information about how to find out which variables exist, etc. was helpful nonetheless, thanks.
memory-management memory-leaks perl
Colin Newell