[reportlab-users] Generating huge reports and memory consumption
Alessandro Praduroux
reportlab-users@reportlab.com
Mon, 6 Jan 2003 20:23:25 +0100
On Monday 06 January 2003 12:18, Robin Becker wrote:
> In article <thq6$KAE$VG+EwGt@jessikat.demon.co.uk>, Robin Becker
> <robin@reportlab.com> writes
>
> >Hi, thanks for your report on the latest CVS _rl_accel.c source. Can you
> >say which test was the one that had the memory increase? When I run the
> >last test to 40000 pages I see essentially no increase at all under
> >win32 (the memory oscillates up and down near 6.9Mb).
> >
> >I will attempt this under freeBSD to see what happens there.
>
> OK the freeBSD memory for that test is also completely static. If you're
> getting memory increases I guess I need to know a bit more about the
> content.
With the simple script you posted near the start of this thread I get no
memory increase, but with my real-life example I see a small increase over
time. I'm attaching my script, along with a couple of data files (obviously,
all data is fake to protect privacy :))
All you have to do is run
$ python avviso.py
and watch the slight memory increase.
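To put a number on the "slight memory increase" rather than eyeballing top, one could sample the process's peak RSS between batches of pages. This is only a sketch: it uses Python's standard resource module, and build_batch is a placeholder workload, not the actual loop in avviso.py.

```python
import resource

def rss_kb():
    # Peak resident set size of this process; Linux reports it in kilobytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def build_batch():
    # Placeholder workload -- in the real test this would be one batch of
    # report pages generated by avviso.py.
    return ["x" * 100 for _ in range(1000)]

baseline = rss_kb()
for i in range(5):
    build_batch()
    print("after batch %d: %d KB (baseline %d KB)" % (i + 1, rss_kb(), baseline))
```

If the printed figure keeps climbing batch after batch instead of levelling off, that points at a genuine leak rather than normal allocator behaviour.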
Here is my setup:
bash-2.05b$ uname -a
Linux server 2.4.19-k7-smp #1 SMP Tue Nov 19 04:02:50 EST 2002 i686 unknown
unknown GNU/Linux
bash-2.05b$ python
Python 2.2.2 (#1, Dec 18 2002, 10:36:37)
[GCC 2.95.4 20011002 (Debian prerelease)] on linux2
ReportLab: CVS version from yesterday evening, ~22:00 CET
Distro is Debian sid.
Let me know if you need any more details or if I can do any more testing.
Bye
--
Linux webmaster & sysadmin
pradu@pradu.it