Backup Methods

Discussion in 'Hosting Discussions' started by raxafarian, Jan 8, 2004.

  1. raxafarian

    raxafarian Participant

    50
    56
    +0
    If this fits better in another topic, please move.

    I have a dedicated server and have daily backups of all sites to a secondary hard drive. I also have a mysqldump performed every morning at the point of least usage. I was downloading this mysqldump every day or so to have a local offsite backup. Face it... the database IS your forum. I'd rather overdo the backups than find I can't recover a recent one.
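
    For anyone curious what that looks like in practice, here's a rough sketch; the schedule, credentials and paths are placeholders, not my exact setup:

    Code:
    # crontab: dump every morning at 5 AM, our point of least usage
    0 5 * * * /root/scripts/db_backup.sh
    
    # db_backup.sh - a minimal version of the idea
    #!/bin/bash
    mysqldump -uDBUSER -pDBPASS --all-databases | gzip > /backup/mysql/forum-$(date +%Y%m%d).sql.gz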

    I worked out a deal with another vbulletin.com user who also has a dedicated server. We exchanged 1 GB of space and 10 GB of transfer. Now I send a daily database dump to my space on that server. No extra cost. I download a copy to my local machine every 4-7 days, but still have a backup moved offsite daily.

    Maybe I'm just paranoid :)
     
  2. Ryan Crocker

    Ryan Crocker Participant

    76
    0
    +1
    No, that's a very good habit to get into. If anything goes wrong, you'll have almost all your data, so you can get the site restored and up to date quickly.
     
  3. eva2000

    eva2000 Habitué

    1,190
    837
    +266
    I used to have my 6+ GB of MySQL databases backed up every 12 hrs to a separate IDE backup drive via a cron'd mysqldump script, then once or twice a week I'd manually make a backup and transfer it to a 2nd backup server, and once a week download a copy to my local PC.. unfortunately the IDE backup drive, SCSI drive and controller, and CPU all died, so everything's been replaced except the IDE backup drive, which I'm still waiting on :eek:
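
    (the every-12-hours part is just a cron schedule like this; the script path is a placeholder)

    Code:
    # run the mysqldump script at midnight and noon
    0 0,12 * * * /root/scripts/mysqlbackup.sh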
     
  4. wizard1974uk

    wizard1974uk Tazmanian Gremlin

    5,766
    790
    +20
    I'm on a shared server and have a daily cronjob running to back up my database. I get emailed if there is a problem with the cronjob.
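
    If anyone wants the same, cron mails you whatever the job prints, so a crontab like this (address and path are placeholders) does it, as long as the script only produces output when something fails:

    Code:
    # crontab: any output from the job gets mailed here
    MAILTO=admin@example.com
    # nightly backup at 3 AM
    0 3 * * * /home/myuser/bin/db_backup.sh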
     
  5. Raz

    Raz Aspirant

    32
    56
    +0
    I do what eva used to do, but also do a daily remote backup to some cheap webhost. The remote backup is encrypted automatically through PGP, so I don't have any real worries.
     
  6. eva2000

    eva2000 Habitué

    1,190
    837
    +266
    Interesting.. worth discussing further how you do this encrypted remote backup
     
  7. Wayne Luke

    Wayne Luke Tazmanian

    5,793
    0
    +35
    I just use cPanel's backup command about once a week.
     
  8. welo

    welo Enthusiast

    230
    0
    +0
    Here's something I ran across a while back that you guys might want to look at. It allows you to set a cron job to any interval you wish to automatically perform DB backups. If your DB isn't too big, you can even set it up to mail the backups to you.
     
  9. Raz

    Raz Aspirant

    32
    56
    +0
    Sure no problem.

    I've got this in my backup script
    Code:
    nice -n 19 gpg --encrypt --recipient "<INSERT NAME>" ${backup}/<BACKUPNAME>.sql.bz2
    I then use ncftpput to place it on the other server via FTP.

    The public key is on the server and the private key is encrypted and stored on my PC.

    I've also got a little script running on the cheap host which makes sure that only a certain number of backups are kept, so I don't go over my disk usage.
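
    Rough shape of the upload and the rotation, with host, login and paths as placeholders rather than my real ones:

    Code:
    # upload the encrypted dump to the cheap host
    ncftpput -u BACKUPUSER -p BACKUPPASS remote.example.com /backups ${backup}/<BACKUPNAME>.sql.bz2.gpg
    
    # on the cheap host: keep only the 7 newest backups
    cd /backups && ls -1t *.gpg | tail -n +8 | xargs -r rm --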
     
  10. eva2000

    eva2000 Habitué

    1,190
    837
    +266
    ncftpput? Might need another thread on what that is... your post is worthy of a more extensive how-to guide for forum admins :)
     
  11. Raz

    Raz Aspirant

    32
    56
    +0
    ncftpput is part of the ncftp package. IIRC it's installed by default on Red Hat.
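
    The basic form, if you want to try it (host, login and file here are made up):

    Code:
    # ncftpput [options] host remote-directory local-files
    ncftpput -u myuser -p mypass ftp.example.com /backups /backup/forum.sql.bz2.gpg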

    Is a HOWTO guide appropriate for these forums?
     
  12. The Sandman

    The Sandman Administrator

    24,419
    1,822
    +2,382
    :yup:
     
  13. silver_2000

    silver_2000 Neophyte

    6
    51
    +0
    There is a SQL backup script that I found a while back that I modified and started using.

    My DBs are small (<50 MB) and I FTP them to my home PC every night.

    The script is fired off by cron at 2 AM. It only gets the data tables, skipping the search index for example, then zips them and FTPs them to my home PC using the day as a file descriptor. So if I do nothing, I always have 30 days of backups on my home PC, and 2 days on the server.
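
    The heart of it is just mysqldump with the rebuildable tables excluded and the day of the month in the filename; the names here are placeholders for my actual ones:

    Code:
    # skip the search index (it can be rebuilt) and name the dump by day of month
    mysqldump -uDBUSER -pDBPASS --ignore-table=forum.searchindex forum | gzip > forum-$(date +%d).sql.gz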

    I think I posted the script, or a version of it, on the vbulletin site. It also has email options for really small DBs.

    If anyone is interested I can find it and post it here

    Doug
     
  14. vB Floris

    vB Floris vbfans.com

    1,372
    642
    +24
    I do daily backups to the web server, and the hosting provider makes a backup of the whole system every 24 hours.

    Once a week I make a big proper backup to my hard drive.
     
  15. welo

    welo Enthusiast

    230
    0
    +0
    Hey Doug. I'd be interested in taking a look at this if you can drum it up.
     
  16. tamarian

    tamarian Enthusiast

    212
    46
    +0
    I rent FTP backup space at the same data center where I'm hosted (EV1) and run a nightly script. The script backs up all the web files and databases into gzipped tarballs and uploads them daily through ncftpput to the FTP backup space. It rotates them on a weekly basis, so we always have Monday, Tuesday, Wednesday, etc. instances of our site and forum. I occasionally download to my home server, but the data is over 1 GB compressed, so when the s*it hits the fan, it will require half a day of uploading. Having space on the same network helps, as it's on 10 Mbps.
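
    The weekday name does all the rotation work; something like this (paths, host and login are placeholders):

    Code:
    DAY=$(date +%A)   # Monday, Tuesday, ...
    tar czf /backup/site-$DAY.tar.gz /home/site/public_html
    mysqldump -uDBUSER -pDBPASS forumdb | gzip > /backup/db-$DAY.sql.gz
    # each weekday overwrites the same-named file from the week before
    ncftpput -u BACKUPUSER -p BACKUPPASS ftp.backup.example.com / /backup/site-$DAY.tar.gz /backup/db-$DAY.sql.gz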
     
  17. Tony-TXB

    Tony-TXB Neophyte

    4
    1
    +0
    Here's a shell script I use with cron:

    Code:
    #!/bin/bash
    
    # MySQL hostname
    DBHOST='localhost'
    
    # MySQL username
    DBUSER='root'
    
    # MySQL password
    DBPASSWD='yourpass'
    
    # Local directory for dump files
    LOCALDIR=/your/backup/dir/
    
    # Remove old backup files (-f so the script doesn't complain on the first run)
    rm -f "$LOCALDIR"*.gz
    
    cd "$LOCALDIR" || exit 1
    
    DBS=`/usr/local/mysql/bin/mysql -u$DBUSER -p$DBPASSWD -h$DBHOST -e "show databases"`
    
    for DATABASE in $DBS
    do
            # skip the "Database" header line that mysql prints above the list
            if [ "$DATABASE" != "Database" ]; then
                    FILENAME=$DATABASE.gz
                    /usr/local/mysql/bin/mysqldump -u$DBUSER -p$DBPASSWD -h$DBHOST $DATABASE | gzip --best > "$LOCALDIR$FILENAME"
            fi
    done
    
    exit 0
    
    
    Save this as mydbdump.sh and then use cron to call it every night. :D
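
    The crontab entry for the nightly run is just one line; 2 AM here is arbitrary, and the path is wherever you saved the script:

    Code:
    # run the dump script every night at 2 AM
    0 2 * * * /root/scripts/mydbdump.sh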

    As you can see, all backups are removed before this script even starts new ones. The reason this is safe for me? Because this local dir is synced with a remote disk. If you'd rather just overwrite the files each night, you'll want to remove this:

    Code:
    # Remove old backup files (-f so the script doesn't complain on the first run)
    rm -f "$LOCALDIR"*.gz
    
     
  18. Anonymous

    Anonymous Habitué

    1,132
    613
    +273

    Putting all that on the command line is a security problem. You should never put a password on the command line unless you are the only user on the server.
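
    One common fix: put the credentials in a file only your user can read and let the client pick them up from there, so nothing shows up in ps:

    Code:
    # ~/.my.cnf, chmod 600
    [client]
    user=backupuser
    password=yourpass
    
    # then no -p on the command line at all
    mysqldump --all-databases | gzip > backup.sql.gz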
     
  19. Tony-TXB

    Tony-TXB Neophyte

    4
    1
    +0
    If you have root access and can access cron, I don't think it would be an issue, considering that you run the server.

    Yes, you can see the password by issuing a "ps aux" ...
     
  20. Anonymous

    Anonymous Habitué

    1,132
    613
    +273
    Root access and no other customers on the server.

    Be careful folks.
     