Memory consumption and big tables

PostPosted: Fri Oct 30, 2009 8:11 am
by cbrunet
Hi!

I'm working on a big database (more than 12k records, more than 30 fields). Many Xataface functions aren't working because Xataface tries to load the entire table into memory to perform its operations. I had to rewrite the CSV export function to export one row at a time. Functions like update_set and copy_replace also use a lot of memory when applied to the entire record set. I think some of the database access functions should be rewritten to work on a few rows at a time instead of on the entire table.
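
For illustration, the row-at-a-time approach looks roughly like this. This is a minimal sketch in plain PDO, not the actual Xataface code or my actual rewrite; the connection details and table name are placeholders:

    <?php
    // Stream a large table to CSV one row at a time, so memory use
    // stays roughly constant regardless of table size.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    // Unbuffered query, so the MySQL driver doesn't pull the whole
    // result set into PHP memory before we start reading it.
    $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
    $stmt = $pdo->query('SELECT * FROM big_table');

    $out = fopen('php://output', 'w');
    $first = true;
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        if ($first) {
            fputcsv($out, array_keys($row)); // header row, once
            $first = false;
        }
        fputcsv($out, $row); // one record written, then discarded
    }
    fclose($out);

The key points are the unbuffered query and writing each row out immediately instead of accumulating the whole set in an array.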

Charles.

PostPosted: Fri Oct 30, 2009 5:28 pm
by shannah
Yes. There are certain functions that load the entire record set. In the general case Xataface will only load the number of records specified by the -limit directive (default is 30). For some actions this limit directive is raised to a very high value so that the action can get everything.
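
For reference, the limit shows up as a GET parameter on list requests, something like the following (the -skip parameter here is my recollection of the paging offset and may differ):

    index.php?-table=my_table&-action=list&-limit=30&-skip=0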

Unfortunately, actions like copy set aren't feasible on really large sets as currently written, because they require everything to be loaded into memory. A possible future improvement would be to change these actions to perform their work incrementally.
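
In that spirit, here is a minimal sketch of what incremental processing could look like, again using plain PDO rather than Xataface's internals; the table, column, and connection details are all placeholders:

    <?php
    // Process a large record set in fixed-size batches instead of
    // loading it all at once.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $batchSize = 500;
    $lastId = 0;
    do {
        // Keyset pagination: seek past the last processed primary key,
        // which stays fast even deep into the table.
        $stmt = $pdo->prepare(
            'SELECT id, status FROM big_table WHERE id > ?'
            . ' ORDER BY id LIMIT ' . (int)$batchSize
        );
        $stmt->execute(array($lastId));
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        foreach ($rows as $row) {
            // apply the per-record update/copy logic here, one row at a time
            $lastId = $row['id'];
        }
    } while (count($rows) === $batchSize);

Only one batch of records is ever held in memory, so the peak footprint is bounded by the batch size rather than the table size.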