I was migrating the Bacula director from a FreeBSD box onto a Red Hat server. The former was a proof-of-concept kind of setup that had been left running for a while, and by now it was low on disk space. The Bacula MySQL database was taking up around 1GB on disk, and there wasn’t even that much free space left.
I remembered there was an interesting way of checking how much space a MySQL database occupies from within the mysql prompt, and a quick Google search turned it up:
SELECT table_schema "Data Base Name",
sum( data_length + index_length ) / 1024 / 1024 "Data Base Size in MB",
sum( data_free ) / 1024 / 1024 "Free Space in MB"
FROM information_schema.TABLES
GROUP BY table_schema;
Interestingly enough, it also shows the “free space” inside the database files. In my case that was around 400MB, so I was hoping that dumping the database wouldn’t result in too big a file. What I thought I’d do was pipe the dump over ssh to the other server, which was on the same LAN anyhow. I did it like this:
freebsd# mysqldump -u bacula bacula | gzip | ssh redhat "cat > bacula.db.gz"
And it worked out pretty well. To my great amazement, the resulting file was only 20MB! I did a local dump on the FreeBSD box as well, and it came out the same size all right. I tried comparing md5 sums, but the headers of the dumps contain the current date/time, so that didn’t work. I think I saw in the Bacula manual that the catalog database grows over time, but it surprised me nevertheless.
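One way around the timestamp problem is to strip mysqldump’s comment lines (which carry the dates) before hashing. A minimal sketch, using stand-in files rather than the real bacula dumps:

```shell
# Two fake "dumps" that are identical except for mysqldump's
# timestamped footer comment (stand-ins for the real files).
printf 'CREATE TABLE t (id INT);\n-- Dump completed on 2011-01-01\n' > dump_a.sql
printf 'CREATE TABLE t (id INT);\n-- Dump completed on 2011-02-02\n' > dump_b.sql

# Drop the SQL comment lines so only actual content gets hashed.
sum_a=$(grep -v '^--' dump_a.sql | md5sum | cut -d' ' -f1)
sum_b=$(grep -v '^--' dump_b.sql | md5sum | cut -d' ' -f1)

[ "$sum_a" = "$sum_b" ] && echo "dumps match"
```

Recent versions of mysqldump also accept `--skip-dump-date`, which omits the timestamp at dump time and makes the files directly comparable.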
Afterwards I loaded the catalog into the Red Hat server’s MySQL with:
[root@redhat ~]# gunzip -c ~/bacula.db.gz | mysql -u bacula bacula
and checking the database size from MySQL showed it now takes 100MB.
One final step I had to take was upgrading the database structure, as I was also migrating from bacula-3 to bacula-5. Bacula-dir complained with:
Version error for database "bacula". Wanted 12, got 11
So I did the upgrade using the update_bacula_tables.mysql script (located under /usr/libexec/bacula/ on Red Hat), which printed a few warnings but otherwise left bacula-dir running fine.

So, if you are looking for some disk space to free up, check whether your Bacula catalog isn’t bloated. My setup does around 12 jobs and up to 100GB/30k files a day, and had been running for about a year, I would imagine, so your mileage will vary depending on your backup intensity.
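The version number that bacula-dir complains about is stored in the catalog itself, so you can check it before and after running the upgrade script. A sketch, assuming the standard Bacula schema:

```sql
-- The Bacula catalog records its schema version in the Version table;
-- per the error above, bacula-5 wants VersionId = 12 here.
SELECT VersionId FROM Version;
```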