#StackBounty: #mysql #backup #mysqldump Back up a MySQL database with a large number of tables (>100,000)

Bounty: 50

I know, really bad database design, but here we are. I have a forum platform (based on PunBB), and for each forum I've generated a new set of tables. Really bad idea.

Time has passed and now I have more than 100,000 tables (SHOW TABLES; SELECT FOUND_ROWS(); returns 112965 rows in set (1.21 sec)). Performance is great, though, as the tables effectively do the job of indexes, and a direct reference to a single table is ultra-fast.
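
For reference, the same count can be read straight from information_schema; a minimal sketch (assuming the schema is named a, as in the mysqldump command below):

mysql -u root -pXXXX -e "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'a';"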

The issue is that now I am trying to back everything up and move to another server. Of course, it takes forever. I've launched a mysqldump:

mysqldump --max_allowed_packet=500M --force --opt -u root -pXXXX a > fullbackup.sql 

And it’s still processing, a little more than 12 hours! The backup is already 546 Mb in size and Mysql is still alive and working.

I’ve tried to copy directly the Mysql files but I’ve ran into the issue that a lot of tables has been corrupted.

Any ideas on how to speed this up?


