Holy False Alarm, Batman!

When I get a notification from one of my SQL Servers that a job failed or something is broken, I leap into action. Those alerts are like the Bat-Signal for a DBA. They are a big deal, and I don’t want to see them unless whatever caused them really is a big deal. So I am annoyed when I get what I call stupid alerts – alerts that are technically correct but don’t actually mean anything, or are caused by a trivial, transitory condition. A perfect example is the alert my transaction log backup job sends out when it fails.

A real, legitimate alert would signal something like a lack of disk space, corruption, or some other equally serious issue. But for a time, I was getting alerts from this job whenever a new database had been created. This is, in part, due to the way our backup policy is implemented. We take full backups once a week, differential backups daily, and transaction log backups hourly. Since a transaction log backup cannot be made until an initial full backup has been made, new databases would often cause my transaction log backup job to fail, since it runs every hour. (They would also cause my differential backup job to fail, but the transaction log backup job runs more frequently, so it was the main culprit.) Even when I was the one who created the database and knew I had to create a full backup right away, I would sometimes get the failed job alert if I happened to create the database right before the transaction log backup job ran. Annoying. The alert really doesn’t mean anything because I know the database hasn’t had a full backup yet – I just created it! It likely doesn’t even have any data in it. But technically, the alert is correct.
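For reference, this is the failure the job keeps tripping over. A log backup against a database that has never had a full backup fails immediately (the database name and path here are made up for illustration; the exact message text can vary slightly by version):

```sql
-- Hypothetical brand-new database with no full backup yet
CREATE DATABASE NewDB;

-- This log backup fails because there is no base full backup to build on:
BACKUP LOG NewDB TO DISK = 'C:\Backups\NewDB.trn';
-- Msg 4214: BACKUP LOG cannot be performed because there is
-- no current database backup.
```

The job step sees that error, reports failure, and out goes the alert – even though nothing is actually wrong.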

So to overcome this problem, I modified my transaction log backup and differential backup maintenance plans to include an initial step. This step checks to see if there are any databases that haven’t had a full backup created yet and, if so, makes one. Then any transaction log or differential backups can be made with no problems.

We back up to disk at our company, with each database’s backups in their own directory. My code, shown below, places these initial full backups in one folder, which I called InitialBackups. So that I remember what this directory is for, I keep a ReadMe.txt file in it telling me, or anyone else who comes across the directory, what these backups are and that they can be deleted once the regularly scheduled full backup job has made a backup of the database. (Actually, since this is a subfolder of where my regular backup files are stored, my regular maintenance plan cleanup job deletes them automatically after a certain amount of time.)

DECLARE @InitialBackupPath varchar(1000)
DECLARE @dbname varchar(256)
DECLARE @BackupPath varchar(1260)

SET @InitialBackupPath = '<insert your backup directory here>\'

-- Exclude tempdb (which cannot be backed up) and database snapshots
-- (source_database_id is non-NULL only for snapshots)
DECLARE DatabaseCursor CURSOR FOR
SELECT name FROM sys.databases
WHERE name <> 'tempdb' AND source_database_id IS NULL

OPEN DatabaseCursor

FETCH NEXT FROM DatabaseCursor INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN

	-- Take an initial full backup of any database that has never
	-- had one (backupset type 'D' = full database backup)
	IF NOT EXISTS (SELECT 1
	               FROM msdb.dbo.backupset
	               WHERE type = 'D' AND database_name = @dbname)
	BEGIN
		SET @BackupPath = @InitialBackupPath + @dbname + '.bak'
		BACKUP DATABASE @dbname TO DISK = @BackupPath
	END

	FETCH NEXT FROM DatabaseCursor INTO @dbname

END

CLOSE DatabaseCursor
DEALLOCATE DatabaseCursor

UPDATE: I ran into a problem with this code this morning. I had just created a database snapshot and, shortly thereafter, my transaction log backup failed. It turns out the SELECT statement that defines the cursor was picking up the snapshot. A snapshot cannot be backed up, so the job failed. I added the `source_database_id IS NULL` condition to the cursor’s SELECT to exclude snapshots. Currently, database snapshots are an Enterprise-only feature, but the code will still work on non-Enterprise editions as well.
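If you want to see why that filter works: a snapshot’s row in sys.databases carries the id of its source database in source_database_id, while regular databases have NULL there. A quick sketch (the database and snapshot names, logical file name, and path are all made up for illustration):

```sql
-- Hypothetical snapshot of a database named SalesDB
CREATE DATABASE SalesDB_Snap
ON (NAME = SalesDB_Data, FILENAME = 'C:\Snapshots\SalesDB.ss')
AS SNAPSHOT OF SalesDB;

-- Regular databases: source_database_id IS NULL
-- Snapshots:         source_database_id = DB_ID() of the source database
SELECT name, source_database_id
FROM sys.databases;
```

So `WHERE source_database_id IS NULL` keeps every real database and drops every snapshot, on any edition.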
