
Feature request: make db backup/restore handle LARGE files
Reported by dleffler | November 22nd, 2014 @ 02:39 PM
Currently, expFile::dumpDatabase(), expFile::parseDatabase() & expFile::restoreDatabase() all work on the entire database/.eql file at once (unless the table selection is small), which may exhaust available memory (especially on ecommerce sites). We need to fix this by:
- Change the file($filename) calls in parseDatabase() & restoreDatabase() to use fopen()/fgets() and handle the .eql file line by line rather than all at once (see the sketch after this list).
- Change dumpDatabase() to incrementally output/write the .eql file instead of building the entire .eql file in a variable and then echoing/writing it.
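
A minimal sketch of the line-by-line idea, not the actual expFile code: fgets() reads from a handle returned by fopen(), so only one line is held in memory at a time, where file() returns the whole file as an array. The $handler callback here is a hypothetical stand-in for the real per-line restore/parse logic.

```php
<?php
// Sketch: stream an .eql file with fopen()/fgets() instead of file(),
// which loads every line into an array in memory at once.
// $handler is a hypothetical callback applying one line of the dump.
function restoreFromEql($filename, callable $handler)
{
    $handle = fopen($filename, 'r');
    if ($handle === false) {
        return false;  // could not open the .eql file
    }
    while (($line = fgets($handle)) !== false) {
        $handler(rtrim($line, "\r\n"));  // process one line, then discard it
    }
    fclose($handle);
    return true;
}
```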
Comments and changes to this ticket
dleffler December 20th, 2014 @ 01:47 PM
- Assigned user set to dleffler
- Milestone set to 2.3.3
dleffler August 20th, 2015 @ 09:09 PM
restoreDatabase() has already been updated to use fgets(), which handles the file in small chunks instead of loading the entire file into memory.
parseDatabase() is only used for importing module items and is still memory-based, since we return the entire set of records as a data structure to allow individual selection of items for import...HOWEVER, it should be less likely to consume all memory since it only deals with one module's table at a time.
dumpDatabase() can be converted, but it appears that stdout buffers the entire output and can therefore still exhaust available memory, so we'll have to write to temporary files and output them as a download. See #1308
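
A rough sketch of that temporary-file approach, assuming output buffering is off: write the dump one table at a time to a temp file, then stream that file as the download instead of accumulating the whole dump in a variable or the stdout buffer. The $dumpTable callback is a hypothetical per-table dumper, not the real expFile API.

```php
<?php
// Sketch: incrementally write the .eql dump to a temp file, then send
// the file as a download. $dumpTable is a hypothetical callback that
// returns one table's worth of .eql text.
function dumpToDownload(array $tables, callable $dumpTable, $downloadName)
{
    $tmpname = tempnam(sys_get_temp_dir(), 'eql');
    $out = fopen($tmpname, 'w');
    foreach ($tables as $table) {
        fwrite($out, $dumpTable($table));  // only one table's dump in memory at once
    }
    fclose($out);

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');
    header('Content-Length: ' . filesize($tmpname));
    readfile($tmpname);  // readfile() streams the file in chunks
    unlink($tmpname);    // clean up the temp file
}
```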
dleffler August 20th, 2015 @ 09:09 PM
- Tag changed from database, ecommerce to database, ecommerce, importexport