I'm using Archive::Tar::Streamed to archive a 4 GB directory containing many very small source-code files and some jars, on a system with 4 GB of RAM. I chose Archive::Tar::Streamed because loading the entire directory into memory (as plain Archive::Tar does) causes an out-of-memory condition. I read files one by one from the directory in depth-first order using File::Find and write each one to the tar file with Archive::Tar::Streamed's add method, so at any given point only one file should be resident in memory. Even so, the %MEM of my Perl process grows gradually and has reached 20% (roughly 800 MB), although no single file in the directory is anywhere near that size. I have two questions: 1) Why is %MEM increasing gradually? 2) Why has it gone as high as 20%? Is it because Archive::Tar is not freeing the memory after it has written a file's contents to the tar archive?
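For reference, the approach described above looks roughly like this (the source directory and archive name are placeholders):

```perl
use strict;
use warnings;
use File::Find;
use Archive::Tar::Streamed;

# Placeholder paths for illustration only.
my $src     = '/path/to/source';
my $archive = 'backup.tar';

open my $fh, '>', $archive or die "open $archive: $!";
binmode $fh;
my $tar = Archive::Tar::Streamed->new($fh);

# File::Find traverses depth-first; add() streams one file at a
# time to the open handle, so no whole-archive copy is built in RAM.
find(sub {
    return unless -f;              # skip directories, symlinks, etc.
    $tar->add($File::Find::name);
}, $src);

close $fh or die "close $archive: $!";
```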
-
It is not normal for your VSZ memory envelope to grow when you process your data in a stream-oriented manner, and even more so for RES (which is what %MEM reflects), which should stabilize in this kind of application.
Therefore I would strongly suspect that references to the file data are being kept around (e.g. in a hash), or (less likely) that circular references are being created.
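To illustrate the second case: Perl uses reference counting, so a cycle of references is never reclaimed unless one link is weakened. This minimal sketch (core modules only) shows the pattern and the usual fix:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

# A circular reference: each hash holds a reference to the other,
# so neither refcount can drop to zero and the pair would leak.
my $a = {};
my $b = { peer => $a };
$a->{peer} = $b;

# Weakening one link breaks the cycle: once the last strong
# reference to $b goes away, both structures are reclaimed and
# the weak reference in $a->{peer} becomes undef.
weaken($a->{peer});
```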
See this post, "are there any tools for finding memory leaks in my perl program", for tips on tracking down which variables (whether in your code or in the module you are using) grow the memory envelope or keep references around.
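As a concrete starting point, Devel::Size (a CPAN module) can tell you whether a given structure, for instance the tar object itself, is accumulating data between iterations. The hash here merely simulates file contents being retained across a loop:

```perl
use strict;
use warnings;
use Devel::Size qw(total_size);   # CPAN module

my %retained;
for my $i (1 .. 3) {
    # Simulate a file's contents being kept around after processing.
    $retained{$i} = 'x' x 1_000_000;

    # total_size() walks the whole structure; if this number keeps
    # growing between iterations, something is holding on to data.
    printf "after file %d: %d bytes retained\n",
        $i, total_size(\%retained);
}
```

Calling total_size() on your $tar object after every add() would show immediately whether Archive::Tar::Streamed itself is the structure that grows.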
Cheers, V.