Database Backup Utility
#1

[eluser]AgentPhoenix[/eluser]
On this matter, the user guide says the following:

Quote:Note: Due to the limited execution time and memory available to PHP, backing up very large databases may not be possible. If your database is very large you might need to backup directly from your SQL server via the command line, or have your server admin do it for you if you do not have root privileges.

So my question is, what does CI consider "very large?" Are we talking 1MB or 100MB? Does anyone have any real world experience with this? Do the CI devs know? And what does the utility return if in fact the database is too large to be backed up?
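
For context, I'm calling the utility more or less the way the user guide shows (the prefs and path here are just illustrative):

Code:
$this->load->dbutil();

$prefs = array(
    'format'   => 'gzip',        // gzip, zip or txt
    'filename' => 'mybackup.sql'
);

// Build the backup in memory...
$backup = $this->dbutil->backup($prefs);

// ...then write it to disk
$this->load->helper('file');
write_file('/path/to/mybackup.gz', $backup);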
#2

[eluser]cahva[/eluser]
Well, if your database is 16M and your memory_limit is 16M, you definitely won't be backing it up with CI's own backup. The backup eats at least that 16M, and you also have to count all the INSERT statements and other text that comes with it, plus the memory taken by CI with its loaded libraries etc. Memory usage will most probably almost double if you gzip it as well.
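
If you want a rough idea beforehand, you can compare the database size against memory_limit before even trying. Untested sketch, and the connection details are placeholders:

Code:
// Rough pre-flight check: is the database obviously too big
// to back up in memory? Connection details are placeholders.
$db = new mysqli('localhost', 'user', 'pass', 'dbname');

$res = $db->query(
    "SELECT SUM(data_length + index_length) AS size
     FROM information_schema.TABLES
     WHERE table_schema = DATABASE()"
);
$row = $res->fetch_assoc();

// memory_limit is a string like "16M"; this crude conversion
// assumes the M suffix
$limit_bytes = (int) ini_get('memory_limit') * 1048576;

echo ((int) $row['size'] > $limit_bytes / 2)
    ? 'Probably too big for an in-memory backup.'
    : 'Might fit, but remember the gzip and CI overhead.';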

Generally it's better and faster to use mysqldump, either from a shell or, if your host has disabled shell access but you can still run mysqldump from PHP, with exec commands.

For example, if you are not limited by safe mode and you can use the shell command mysqldump, backing up a database is as simple as this:

Code:
exec('mysqldump -u [user] -p[password] dbname > dump.sql');
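
If the credentials can contain shell metacharacters, here is a slightly more defensive variation (untested sketch; $user, $pass and $db are placeholders you fill in yourself):

Code:
// Escape each argument and check mysqldump's exit status
$cmd = sprintf(
    'mysqldump -u %s -p%s %s > dump.sql',
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($db)
);
exec($cmd, $output, $status);

if ($status !== 0) {
    // mysqldump exits non-zero on failure
    echo 'Backup failed with status '.$status;
}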

..or with safe mode set to on you can try this, although memory_limit can also be an issue here because the whole dump ends up in a variable:

Code:
ob_start();
passthru('mysqldump -u [user] -p[password] dbname');
$out = ob_get_clean();   // capture the dump from the output buffer
$fp = fopen('dump.sql', 'w');
fwrite($fp, $out);
fclose($fp);
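
Side note: on PHP 5 the fopen/fwrite/fclose dance can be replaced with a single call:

Code:
file_put_contents('dump.sql', $out);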
#3

[eluser]samseko[/eluser]
[quote author="cahva" date="1224553790"]
Code:
ob_start();
passthru('mysqldump -u [user] -p[password] dbname');
$out = ob_get_clean();   // capture the dump from the output buffer
$fp = fopen('dump.sql', 'w');
fwrite($fp, $out);
fclose($fp);
[/quote]

Thanks cahva, this got me out of a mess. To prompt a download of the result in gz format:
Code:
$date = date("Y-m-d");
$name = $date.'-sitename.gz';          // e.g. 2008-10-21-sitename.gz
$this->load->helper('download');
force_download($name, gzencode($out)); // $out is the dump from above
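
If you also want to keep a copy on the server instead of only pushing it at the browser, CI's file helper works too (the path is just an example):

Code:
$this->load->helper('file');
write_file('/path/to/'.$name, gzencode($out));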
Hope that is of use to others, too.
Cheers
#4

[eluser]bretticus[/eluser]
In cases of having to upload database data, I came across this little gem called Big Dump. It let me upload an 80 MB database to a crappy 1and1 cheapo account with no shell access.

It uses a staggered import model. Worked quite well. Thought I'd pass it on.
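
For anyone curious how the staggered approach works, here is a stripped-down illustration of the idea (my own simplification, not Big Dump's actual code; the connection details and chunk size are placeholders):

Code:
// Import a dump a few hundred statements at a time, then redirect
// to ourselves with the next file offset, so no single request
// hits PHP's execution time limit.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$limit  = 500; // statements per request

$db = new mysqli('localhost', 'user', 'pass', 'dbname'); // placeholders
$fp = fopen('dump.sql', 'r');
fseek($fp, $offset);

$count = 0;
$query = '';
while ($count < $limit && ($line = fgets($fp)) !== false) {
    $trimmed = trim($line);
    if ($trimmed === '' || substr($trimmed, 0, 2) === '--') {
        continue; // skip blank lines and SQL comments
    }
    $query .= $line;
    if (substr($trimmed, -1) === ';') { // one full statement collected
        $db->query($query);
        $query = '';
        $count++;
    }
}

if (!feof($fp)) {
    header('Location: ?offset='.ftell($fp)); // hand off the next chunk
} else {
    echo 'Import complete.';
}
fclose($fp);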



