Memory consumption and big tables

Postby cbrunet » Fri Oct 30, 2009 8:11 am

Hi!

I'm working on a big database (more than 12k records, more than 30 fields). Many functions of Xataface aren't working because Xataface tries to load the entire table into memory to perform its operations. I had to rewrite the CSV export function to export one row at a time. Functions like update_set and copy_replace also use a lot of memory when applied to the entire record set. I think some of the database access functions should be rewritten to work on only a few rows at a time instead of on the entire table.
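
For reference, here is a minimal sketch of the row-at-a-time approach I mean, written against the plain mysql_* API; the function name, table name, and connection handling are placeholders for illustration, not Xataface's actual export code:

<?php
// Minimal sketch: stream the result set and emit one CSV row at a
// time, so memory use stays flat however big the table is.
// $conn is an open mysql link; $table is a placeholder table name.
function export_csv_streaming($conn, $table) {
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="export.csv"');
    $out = fopen('php://output', 'w');

    // An unbuffered query hands rows over as they are fetched,
    // instead of materializing the whole table in PHP's memory.
    $res = mysql_unbuffered_query("SELECT * FROM `$table`", $conn);

    $first = true;
    while ($row = mysql_fetch_assoc($res)) {
        if ($first) {
            fputcsv($out, array_keys($row)); // column headers, once
            $first = false;
        }
        fputcsv($out, array_values($row));   // one record per pass
    }
    fclose($out);
}
?>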

Charles.
--
Charles Brunet
http://cbrunet.net

Postby shannah » Fri Oct 30, 2009 5:28 pm

Yes. There are certain functions that load the entire found set. In the general case Xataface will only load the number of records specified by the -limit directive (default is 30). For some actions this limit is raised to a very high value so that everything gets loaded.
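
For example, the limit is just one of the standard query parameters, so a list request looks something like this (the table name is a placeholder):

index.php?-table=mytable&-limit=30&-skip=0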

Unfortunately, actions like copy/replace aren't feasible on really large sets because they require everything to be loaded into memory. A possible future improvement would be to change these actions to perform their work incrementally.
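
Something along these lines is the idea, sketched with the plain mysql_* API rather than Xataface's internals; the integer id primary key, function name, and batch size are assumptions for illustration:

<?php
// Sketch of an incremental update: walk the found set by primary key
// and update one slice per pass, so the full set is never in memory.
function update_set_incremental($conn, $table, $set, $where, $batch = 500) {
    $lastId = 0;
    while (true) {
        // Fetch the next slice of matching ids past the cursor.
        $res = mysql_query(
            "SELECT id FROM `$table` WHERE ($where) AND id > $lastId ".
            "ORDER BY id LIMIT $batch", $conn);
        $ids = array();
        while ($row = mysql_fetch_row($res)) {
            $ids[] = (int) $row[0];
        }
        if (count($ids) == 0) break;  // found set exhausted
        mysql_query("UPDATE `$table` SET $set ".
                    "WHERE id IN (".implode(',', $ids).")", $conn);
        $lastId = end($ids);          // advance the cursor
    }
}
?>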

