This topic is not necessarily specific to keeping AX2012 functional; however, the older your implementation is, the more likely poorly maintained backups will become an issue. I am going to start with the assumption that everyone has a back-up plan in place. This post will not directly cover back-up best practices like the 3-2-1 rule or recovery objectives. You should be reviewing your back-up plan regularly to ensure it still meets current business requirements.
The purpose of this post is to ensure that when you do need your back-ups, they are consistent and reliable.
The first step in this process is to ensure that the back-ups are consistent. At my company, this is done by scheduled data refreshes of various pre-production environments using the production backups. A multi-step process accomplishes this, and each step will be detailed and discussed below.
- SQL job for nightly back-ups
- PowerShell script to move back-ups off-site
- PowerShell script to restore SQL backup to repository
- PowerShell script to refresh data into pre-production environment
SQL Job for Nightly Back-Up
I'm going to be quick here, as everyone has their own back-up strategy. The important part is to ensure that you are using checksums on your back-ups; it is as simple as adding the WITH CHECKSUM option to the BACKUP statement in your back-up script. Note that enabling CHECKSUM will add some time to your back-ups, since the page checksums are verified as the backup is written. I do not recommend enabling CHECKSUM on your transaction log back-ups; instead, enable it only on the full backups.
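If you drive the backup from PowerShell rather than a T-SQL job step, the SqlServer module's Backup-SqlDatabase cmdlet exposes the same option as a -Checksum switch. A minimal sketch, assuming hypothetical server, database, and file names:
# Full backup with checksum generation, equivalent to WITH CHECKSUM in T-SQL
# 'SQLPROD01', 'AXDB', and the backup file path are placeholders for illustration
Import-Module SqlServer
Backup-SqlDatabase -ServerInstance 'SQLPROD01' -Database 'AXDB' -BackupFile 'D:\Backups\Full\AXDB_Full.bak' -Checksum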
PowerShell script to move back-ups off-site
This step is optional; however, if you do not already have an automated offsite solution, this one is simple.
We use a straightforward PowerShell script to take the latest back-up from our full back-up directory and copy it to an offsite network location.
The PowerShell Get-ChildItem cmdlet is used to get the latest file in a location:
$LatestBackUp = Get-ChildItem -Path '' -File | Sort-Object LastWriteTime -Descending | Select-Object -First 1
Get-ChildItem uses the -File switch to return only the files in the location. The list is then piped to the Sort-Object cmdlet, which sorts it by LastWriteTime with the newest first. Finally, the Select-Object cmdlet selects only the first item in the list.
The variable $LatestBackUp will now contain the latest file in the specified network location. Its Name property can be passed to RoboCopy, along with the source and destination folders, to copy the file to another location:
Robocopy.exe ('Path to Source Folder') ('Path to Destination Folder') $LatestBackUp.Name /r:2 /w:10 /v /Log:C:\ScriptResults\CopyLog\Copy.txt
Remember to add a \ at the end of the folder paths for this to work correctly with RoboCopy.
We also use RoboCopy with the /mir parameter to make an offsite copy of both our modelstores and installation files, as shown in the sketch below. All of this runs together in a single scheduled script.
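As a sketch with placeholder paths, a mirrored modelstore copy looks like the line below. Note that /mir makes the destination an exact mirror and will delete files there that no longer exist in the source, so point it at a dedicated folder.
# Mirror the modelstore folder to the offsite share (both paths are placeholders)
Robocopy.exe 'D:\AX\ModelStores' '\\OffsiteServer\AXBackup\ModelStores' /mir /r:2 /w:10 /v /Log:C:\ScriptResults\CopyLog\MirrorLog.txt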
PowerShell script to restore SQL backup to repository
The PowerShell script to restore SQL is a bit more complicated.
#******AX Back-Up Restore and Validation Script******************
#Parameters
Param([string]$DBServer = "", [string]$LatestBackUpLocation = "")
Import-Module SqlServer
#SQL Restore Variables
$relocateData = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('', '')
$relocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('', '')
#Latest File Test Variables
$LatestBackUp = Get-ChildItem -Path $LatestBackUpLocation -File | Sort-Object LastWriteTime -Descending | Select-Object -First 1
$LatestBackupPath = $LatestBackUp.FullName
#Check that back-up date is correct
if(Test-Path $LatestBackUp.PSPath -NewerThan ((Get-Date).AddDays(-1)) ) #We are looking for the previous night's back-up.
{
write-host "Backup is latest"
}
else
{
write-host "Back-up is out of date"
Exit
}
Restore-SqlDatabase -ServerInstance $DBServer -Database '' -BackupFile $LatestBackupPath -ReplaceDatabase -Checksum -RelocateFile @($relocateData, $relocateLog) -Verbose
#******END******************
First, you will need to install the SqlServer module for PowerShell on the computer where the script will run. Once this has been done, we can move on to developing the script.
To start, use the Import-Module SqlServer command to load the module into the session.
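If the module is not already installed, it is available from the PowerShell Gallery. A quick sketch, assuming the machine has gallery access:
# One-time install of the SqlServer module from the PowerShell Gallery
Install-Module -Name SqlServer -Scope CurrentUser
# Then import it in each session (or at the top of the script)
Import-Module SqlServer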
Before crafting the restore command, we need to deal with the variables that will store the backup location and the restore target. You could hard-code the backup location and find the latest back-up in the same manner as the back-up copy script, but we have found it helpful to use parameters so that the same script can be run against multiple environments. Parameters in PowerShell are straightforward, at least for our purposes.
#Parameters
Param([string]$DBServer = "", [string]$LatestBackUpLocation = "")
This command defines two optional parameters: one containing the name of the database instance to restore into, and the other the path to the backup files.
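The values are then supplied when the script is invoked; for example (the script name, server, and path here are all hypothetical):
# Hypothetical invocation against a test instance
.\RestoreAxBackup.ps1 -DBServer 'SQLTEST01' -LatestBackUpLocation '\\BackupServer\AX\Full'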
We will be using the Restore-SqlDatabase cmdlet to restore the SQL database, and it requires a couple of variables to be set up to hold the restored database's file locations. You only need these variables if you are moving the database files to a different location from the one recorded in the backup.
#SQL Restore Variables
$relocateData = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('', '')
$relocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('', '')
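The RelocateFile constructor takes the logical file name from the backup and the physical path to restore it to. A sketch with hypothetical names (you can list a backup's logical file names with the T-SQL statement RESTORE FILELISTONLY):
# Map the backup's logical file names to new physical paths (all names are placeholders)
$relocateData = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('AXDB_Data', 'D:\SQLData\AXDB_Data.mdf')
$relocateLog = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('AXDB_Log', 'E:\SQLLogs\AXDB_Log.ldf')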
It may be helpful to make sure that you are only restoring the latest back-up. The exact nature of this section will depend on your back-up timing. In our case we run our SQL back-ups at 5:30 PM, copy them at 8 PM, and restore them at 4 AM the next morning. As a result, we verify that the back-up is from yesterday.
#Latest File Test Variables
$LatestBackUp = Get-ChildItem -Path $LatestBackUpLocation -File | Sort-Object LastWriteTime -Descending | Select-Object -First 1
$LatestBackupPath = $LatestBackUp.FullName
#Check that back-up date is correct
if(Test-Path $LatestBackUp.PSPath -NewerThan ((Get-Date).AddDays(-1)) ) #We are looking for the previous night's back-up.
{
write-host "Backup is latest"
}
else
{
write-host "Back-up is out of date"
Exit
}
The first section finds the latest file in the directory specified by the parameter variable. Next, an if statement tests that the file is from yesterday or later. A backup taken today will also pass this test; this is intentional, so that the script can also be used with backups that are run and restored on the same day.
With these pieces complete we can build the restore command.
Restore-SqlDatabase -ServerInstance $DBServer -Database '' -BackupFile $LatestBackupPath -ReplaceDatabase -Checksum -RelocateFile @($relocateData, $relocateLog) -Verbose
- ServerInstance: the instance to restore into, held in a parameter.
- Database: the name of the database to restore into.
- BackupFile: the path to the latest backup.
- ReplaceDatabase: allows the existing database to be overwritten.
- Checksum: validates the checksums to verify the integrity of the data; it must be combined with backups taken WITH CHECKSUM.
- RelocateFile: allows the database files to be relocated from their original locations; it requires the RelocateFile objects set up above.
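Filled in with the same hypothetical names used above, the command would read:
# Hypothetical restore into a test instance, relocating both database files
Restore-SqlDatabase -ServerInstance 'SQLTEST01' -Database 'AXDB_TEST' -BackupFile $LatestBackupPath -ReplaceDatabase -Checksum -RelocateFile @($relocateData, $relocateLog) -Verbose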
I will expand on this script in future blog posts; this example is a simplified version of what we run every day. Additional features include logging, e-mail notification, and a Try..Catch block to capture errors during the restore.
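As a minimal sketch of the error-handling piece, assuming the variables defined earlier, the restore can be wrapped so a failure is reported instead of ending the script silently:
# Minimal error-handling sketch; our production version also logs and sends e-mail
try {
    # 'AXDB_TEST' is a placeholder database name
    Restore-SqlDatabase -ServerInstance $DBServer -Database 'AXDB_TEST' -BackupFile $LatestBackupPath -ReplaceDatabase -Checksum -RelocateFile @($relocateData, $relocateLog) -ErrorAction Stop
    Write-Host "Restore completed successfully"
}
catch {
    Write-Host "Restore failed: $_"
    # A Send-MailMessage notification could go here
    Exit 1
}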
PowerShell script to refresh data into pre-production environment
This is another area that I will expand on in a future post. We use a custom solution here that I am not able to share; there are, however, approaches native to the Microsoft stack, which I will share.
Conclusion
Automating your backup and restore process is important for maintaining the integrity of your back-ups. While many companies have dedicated database administrators to maintain their backups, many more do not. If you are an ERP administrator who is also responsible for your database backups, I hope this has provided some insight into how to manage them and ensure their reliability.