Sync live server and development server

#1
[eluser]mvdg27[/eluser]
Hi guys,

I have an issue, that's not directly PHP related, but since this forum has always been very helpful to me, I'm gonna give this one a go as well.

Currently I run two dedicated servers, a live server and a development server. On the live server I'm hosting a centralized CMS system for several clients now.

The development server I want to use to develop the CMS without worrying about making mistakes on the production server. After everything is working smoothly on the development server, I will deploy the application on the production server.

Now I think it would be useful to be able to test changes made on the development server against the actual websites, without having to mess with the production server. So I'm looking for a way to periodically copy the website files from the production server to the development server. This way it's easier to develop and test new modules in 'real life situations' without risking messing up the live server. In short: I'm looking for a way to sync the website data (not the application itself) between the two servers (from production to development).

Now I can do this manually, but there should be a way to do this automatically, right?

Just to be clear and give all the info:
- my servers are running DirectAdmin as an administration panel
- the website files (template files, images, etc.) are in separate hosting accounts
- each website has a separate database, but all databases are part of one and the same hosting account for all websites

Does anyone have any experience with this and can maybe share some tips?

Thanks in advance. -Michiel

#2
[eluser]jwindhorst[/eluser]
You should be able to export your database to SQL files, and then repopulate the dev server with that data.

If you are using MySQL as your DB this can be done pretty easily with phpMyAdmin, but there is a file size limit on the import end of things.

If you're not using MySQL, try a Google search for a front-end GUI tool for your DB of choice, or get savvy with the CLI! Wink
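For the MySQL case specifically, the command-line client sidesteps phpMyAdmin's import size limit entirely. A minimal sketch (the user name, database name, and file name here are made up):

```shell
# Export on the production server (mysqldump ships with MySQL)
mysqldump -u dbuser -p clientdb > clientdb.sql

# ...copy clientdb.sql over to the development server, then import it:
mysql -u dbuser -p clientdb < clientdb.sql
```

Both commands prompt for the password because of the bare `-p` flag.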

#3
[eluser]mvdg27[/eluser]
Hi jwindhorst,

Thanks for your reply. I am aware of the fact that I can use phpmyadmin to export a database, upload and import it to the development server and have a working copy there. That's actually what I'm doing now. The thing is that I want to automate this process or at least minimize my workload. I'm getting more customers and all the copying is taking up too much time. And I don't just want to copy the database, but the files that make up the website (templates/graphics etc.) as well.

So all in all, I'm looking for a more sophisticated method here.

Thanks for your input anyway!

Cheers, Michiel

#4
[eluser]jwindhorst[/eluser]
I see, my apologies for oversimplifying. There are still some decent options out there though. My first choice would be a bash script that would export the DB, tar up the files, ftp them to the new server, untar and reimport for me.

Or look into Springloops. They're a hosted SVN service that also offers a deployment system. I've used it a few times, and it really does add sanity when you're in a situation where you have to manage numerous clients and deployments. Of course you still have to get the DB files into your repo, but hey, sanity is sanity.

#5
[eluser]jedd[/eluser]
Hmmm ... I think in general it's a Bad Idea to use production data in a testing system. There are security and privacy issues to contend with. From a more pragmatic point of view, your test/development system should have data that is carefully designed to bother your code with edge and corner cases, amongst other stresses. Production data rarely provides a good test set.

Having said that, you could look at some kind of log-shipping approach (actually, that's an Oracle and MS-SQL -ism I think). With MySQL you may be able to set up a database to be a slave replica. In all these cases, though, well certainly MySQL and MS-SQL, the replica is basically forced into read-only mode, so would probably be of limited use to you here.

I suspect that you'll have problems with your provider, though, especially on the slave end of that equation. On the master end, too, I expect that anything managed by a 'control panel' will not be geared up to handle such complexities.

#6
[eluser]jedd[/eluser]
I'd assumed your primary focus was database data.

Copying the file system components should be relatively easy with a single rsync command - assuming you can ssh into the boxes.

#7
[eluser]mvdg27[/eluser]
@jwindhorst:
No need for apologies here Wink

Your description "a bash script that would export the DB, tar up the files, ftp them to the new server, untar and reimport" sounds like exactly what I need. The problem in this case is the 'how to'. I'm happy to study programming bash scripts, but I'm gonna need some pointers/ links to tutorials.

@jedd:
I understand your concerns. In this case it's not my main testing environment, but rather an easy way to test new stuff for specific clients (e.g. they want a new module, but first want to see how it's going to work out, stuff like that).

I've looked into making real-time replicas, but that's not really what I'm looking for. Again, it's just to have something relatively up-to-date to work with/show the customer.

I own both my webservers, so I won't have problems with providers. I have full access. I just prefer to use DirectAdmin for the domain administration.

You're right I'm after both database and file system. I've read something about rsync. Maybe you can point out a good resource for my situation to me?

Thanks for the replies.

#8
[eluser]jwindhorst[/eluser]
For the most part, a bash script is just a group of command line commands bundled in a file. So if you can figure out how to do it on the command line, you can relatively easily put it into a file.
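For instance, a two-line file is already a working script (the names here are made up):

```shell
#!/bin/bash
# backup.sh - bundle a site directory into a dated tarball
tar czf "site-$(date +%F).tar.gz" /home/client/public_html
```

Make it executable with `chmod +x backup.sh`, run it as `./backup.sh`, and you've written a bash script.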

rsync is great for syncing servers, lots and lots of options. If you have CLI access to a *nix server, most applications have a man page. 'man rsync', for example, will tell you how to use rsync. 'man ftp', same thing, and 'man tar', you guessed it.

CLI is fun, I just hope the penguins don't come after me for that crack about bash scripting being nothing but a group of commands.

#9
[eluser]jedd[/eluser]
Quote:Your description "a bash script that would export the DB, tar up the files, ftp them to the new server, untar and reimport" sounds like exactly what I need. The problem in this case is the 'how to'. I'm happy to study programming bash scripts, but I'm gonna need some pointers/ links to tutorials.

As the other J-man pointed out, bash scripts are fairly easy. Even if you don't realise it yet Wink

The big question is whether you can have any downtime on your boxes to do this stuff. MySQL can do hot backups, but I've always found them to be as slow as a wet week in August (Northern Hemispherists should adapt this aphorism to taste). Cold backups - where the DB is offline - are blindingly fast, but obviously require the DB to be down. A lot depends on the complexity and size of the DB, of course.

If hot, you should google 'mysql hot backup' and then consider cold.

If cold, then it's easy - you usually stop the daemon, tar up the database, start the daemon. My script looks like this:
Code:
#!/bin/bash
/etc/init.d/mysql  stop
tar cf - /var/lib/mysql  |  bzip2  --fast > /path/to/database.dump
/etc/init.d/mysql  start

And the middle line is only that complicated because I want to really minimise the downtime of the DB, and bzip is faster than gzip (tar natively supports the latter, not the former). I digress.

There's probably something to having regular backups of your production site anyway.


Quote:I've read something about rsync. Maybe you can point out a good resource for my situation to me?

I've met the bloke who wrote rsync, btw. Disturbingly smart guy.

If you can ssh directly into your development box, then it's really easy, because you can run the script on there and have it pull the data from production. Again, you may want to stage it locally, and do the copy from production --> local PC --> development box - as it gives you another backup on the local PC.

In any case, assume ssh to development is hunky dory, and my paths are all made up here, you'd run:

Code:
$  rsync  -av --delete  ssh.productionbox.com:/var/www/stuff/   /var/www/copyofstuff/

-a is the magic 'do everything you'd expect' option (retain permissions, recursive copy, etc)
-v is verbose - you'd turn that off if you didn't want to watch it (or you were scripting it)
--delete means that if it finds files on development that aren't on production any more, it'll delete them. It will NOT do anything to production, mind.
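And since the original question was about doing this periodically, a command like that can go straight into cron on the development box; a sketch with the same made-up paths:

```shell
# Edit the crontab with 'crontab -e' and add a line like this
# (min hour day-of-month month day-of-week  command):

# Pull the production files down every night at 03:00, logging the run
0 3 * * * rsync -a --delete ssh.productionbox.com:/var/www/stuff/ /var/www/copyofstuff/ >> /var/log/sitesync.log 2>&1
```

For an unattended run like this you'd want key-based ssh set up, since cron can't type a password for you.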

#10
[eluser]mvdg27[/eluser]
Hi Jedd,

Thanks for your great reply. This is really helpful information to learn more about this.

Your bash script tars all the MySQL databases, doesn't it? I've been looking at "mysqldump" on the MySQL reference page. That would allow me to make dumps of only the specific databases associated with the CMS/website, wouldn't it? Would that be a sensible idea?
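Something like this is what I have in mind, I suppose (database name and user are made up; --single-transaction keeps the dump consistent without taking MySQL offline, if I'm reading the manual right):

```shell
# Dump a single client database while MySQL stays online
mysqldump -u cms_user -p --single-transaction cms_client1 \
  | gzip > cms_client1.sql.gz

# Restore it on the development box
gunzip < cms_client1.sql.gz | mysql -u cms_user -p cms_client1
```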

The rsync command looks really easy like this. I have ssh access, so I'm going to try this one out!

Thanks again, Michiel

