Database backups

  • Database backups

    I'm having a problem when I try to save a backup on the webserver.

    When I click Save on the "Backup database to a file on the server" option, it starts running through the database tables, but it always stops short (the bottom of the page never appears) and always finishes on vbchat_store (I have several hacks installed). When I check the saved file on the webspace, the file size never matches; it's a different size every time!

    This is frustrating and highly annoying, as I prefer to back up regularly, but it has only just started happening. I've made sure nothing else is using any bandwidth, yet it always does the same thing.

    Please help!!

  • #2
    It sounds like the size of your database has grown beyond the file/time limits of PHP/MySQL. You're going to have to dump each table individually, or use phpMyAdmin, which can gzip the dump and hopefully finish within those limits.

    Shell access (telnet/SSH) is still the best solution; perhaps your hosting provider can enable it on request.
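
    For example, a minimal mysqldump sketch run over SSH on the server; the database name, user and filenames below are placeholders, so adjust them to your own setup:

        # dump the whole database and gzip it in one pass
        mysqldump -u forumuser -p forumdb | gzip > forumdb-backup.sql.gz

        # or dump one table at a time if the full dump still hits the limits
        mysqldump -u forumuser -p forumdb vbchat_store | gzip > vbchat_store.sql.gz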



    • #3
      I have phpMyAdmin and can back up that way; I just prefer going through the AdminCP, as it makes life a little easier.

      For the record, the database currently sits at just over 12 MB. I wouldn't have said that was too large by any measure, but it may be too large for PHP, as you suggested.

      Thanks for your help.

      Jason



      • #4
        SSH/telnet access plus mysqldump is the only way to make safe backups.

        Your database is small now, but it will grow, and getting into the habit of using mysqldump is best.
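
        As a minimal sketch of that habit, assuming shell access and cron on the host (the paths, database name and credentials file are placeholders):

            # crontab -e entry: gzipped mysqldump every night at 03:00
            0 3 * * * mysqldump --defaults-extra-file=/home/forumuser/.my.cnf forumdb | gzip > /home/forumuser/backups/forumdb-$(date +\%F).sql.gz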
        :: Always Back Up Forum Database + Attachments BEFORE upgrading !
        :: Nginx SPDY SSL - World Flags Demo [video results]
        :: vBulletin hacked forums: Clean Up Guide for VPS/Dedicated hosting users [ vbulletin.com blog summary ]



        • #5
          Even if it were only 2 MB: if your setting is to time out after 30 s and the download takes 31 s, it gets cut off and you end up with an incomplete download.

          I have a 2 Mbit downstream, so 12 MB within 30 or 60 s is not a problem for me. Usually I can download up to 100 MB through the AdminCP without much trouble, but 150 to 200 MB is always risky. Through phpMyAdmin the server can gzip the dump for you, so it is smaller and you can fetch a lot more data.
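
          If you want to see which PHP limits the backup script runs under, one way (assuming command-line access; the CLI php.ini can differ from the web one, so check phpinfo() in the browser as well) is:

              # show the directives most likely to cut a long-running backup short
              php -i | grep -E 'max_execution_time|memory_limit'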

          Additionally, make a directory like 'backup' and chmod it to 777, then point the AdminCP at that directory so it dumps to the server instead. The dump then runs at the server's speed rather than your download speed, so it can write more data in less time and has a much better chance of finishing, if you don't want to use phpMyAdmin and downloading through the browser doesn't work. When it is done, you can fetch the file to your computer over FTP.
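
          A minimal sketch of that server-side approach, assuming shell (or FTP) access to create the directory; the path is a placeholder for wherever your forum lives:

              # create the target directory and make it writable by the webserver
              mkdir /home/forumuser/public_html/forum/backup
              chmod 777 /home/forumuser/public_html/forum/backup

              # once the AdminCP dump finishes, fetch it with any FTP/SFTP client, e.g.
              # sftp forumuser@example.com:/home/forumuser/public_html/forum/backup/backup.sql .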

