[reportlab-users] Platypus tables with large numbers of rows

Mike Kent mrmakent at cox.net
Thu Apr 12 22:43:12 EDT 2007


(I'm reposting this because of the recent mailing list problems, and the
fact that this message has not shown up in the mailing list digests I'm
getting)

Forgive me if this has already been addressed, but I've been reading
back thru the mailing list archives, and I've not seen it mentioned yet.

I'm new to ReportLab, but having great success with a reporting solution
involving SQLAlchemy and ReportLab for a real-world business
application. However, I'm concerned about the memory requirements when
generating tables with many hundreds or thousands of rows.

A Platypus table requires that you give it a list of all of the rows in
the table. That means that all of the data for all of the rows in the
table has to be in memory at the same time. From experimentation, it
appears that a table will not accept an iterator for a sequence of rows,
which would allow me to do lazy evaluation of the rows; it must be an
actual list.
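To illustrate what I mean, here is a rough sketch (the chunked() helper is
my own, not part of ReportLab) of the kind of lazy batching I was hoping
for. Since Table() seems to need a real list, the best I can do is
materialize the rows in bounded-size batches rather than all at once, and
build one smaller Table per batch:

```python
from itertools import islice

def chunked(rows, size):
    """Yield successive lists of at most `size` rows from any iterable.

    `rows` can be a lazy source (e.g. a SQLAlchemy result iterator), so
    only one batch of rows is ever held in memory by this helper at a time.
    """
    it = iter(rows)
    while True:
        batch = list(islice(it, size))  # materialize just this batch
        if not batch:
            return
        yield batch

# Hypothetical usage with Platypus (untested against large datasets):
# each batch would become its own Table flowable, e.g.
#
#   story = []
#   for batch in chunked(query_results, 200):
#       story.append(Table(batch))
#   doc.build(story)
#
# Note this only bounds the size of each individual Table's row list;
# the story itself still ends up referencing all the data.
```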

Am I right about this? Is there no way of getting a table to use lazy
evaluation for the rows?
