[reportlab-users] Generating huge reports and memory consumption

Alessandro Praduroux reportlab-users@reportlab.com
Fri, 3 Jan 2003 15:06:15 +0100


Hi all

I am using reportlab to generate huge print runs (more than 40,000 pages of
output); basically it's a letter that should be sent to 40,000 customers.

I use the following approach:

        # open the data file
        f = open(data, "r")
        for line in xreadlines.xreadlines(f):
            # for each line of data, append what's needed to the Story
            self.pageOut(line)
            self.i += 1
            # to avoid using up all memory, print out a pdf every 100 pages
            if (self.i % 100) == 0:
                doc = SimpleDocTemplate(c + "." + str(self.i) + ".pdf")
                # build 'consumes' all Story content
                doc.build(self.Story, onFirstPage=self.layoutPages,
                          onLaterPages=self.layoutPages)
                doc = None
                print "printed so far: %d" % self.i

The problem I have is that memory usage keeps growing even though I print out
chunks of 100 pages, to the point that all available memory is consumed (the
PC I am using is pretty tight) and the script is no longer scheduled by the OS
once I reach the limit of 10,000 pages printed.

Playing with gc, I noticed that one object remains unfreed for each page I
print, but I can't determine exactly which object it is or where it's created.

Some more info: for each page I print a couple of images (PIL Image objects
allocated at the beginning of the script) and some fixed text in
'self.layoutPages'.

Any suggestions?


-- 
Linux webmaster & sysadmin
pradu@pradu.it