Local Storage Backup Flow
On the Local Storage (or Console Storage) tab, the Primary backup path must always be used. The backup file is saved to the Primary backup path first, and the RMAD server then copies it to the Additional backup path. If the backup cannot be saved to the Primary backup path, it is never copied to the Additional backup path.
We strongly recommend that the Primary backup path always point to a local drive and folder on the RMAD server itself, for example "D:\RMADBackups". The Additional backup path, by contrast, is typically a UNC share (CIFS/SMB), for example "\\<hostname>\RMADBackups".
If access to your UNC share is restricted by IP address, you only need to allow the RMAD server's IP address access to that share.
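As a quick sanity check before scheduling backups, you can confirm from a PowerShell prompt on the RMAD server that both paths are reachable. This is a minimal sketch; the paths below are the example placeholders from above, so substitute your own:

Test-Path 'D:\RMADBackups'            # Primary backup path (local drive on the RMAD server)
Test-Path '\\<hostname>\RMADBackups'  # Additional backup path (UNC share)

If the second command returns False while the first returns True, check the share's IP restrictions before troubleshooting RMAD itself.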
Another option: Use the Remote Storage tab instead
Alternatively, you can enter the share name in either of the paths (Primary or Additional) on the "Remote Storage" or "DC Storage" tab:

Both the Primary and Additional backup paths on this tab are managed by the settings at the bottom of the properties tab. Things to note when using Remote Storage:
1) Retention for both the primary and additional backup paths is governed by a single setting, so the two retention periods are always identical.
2) The Backup Agent on each Domain Controller copies the backup to both paths independently, so a failure to write to the Primary backup path does not affect the Additional backup path.
3) If access to the share is restricted by IP address (several enterprise backup solutions do this), you need to allow access from the IP address of every DC you back up, which could be several addresses (see the sketch below).
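If you need to build that allow list, the following sketch resolves the IPv4 address of every DC in a collection, using the same Get-RMADCollectionItem cmdlet the cleanup script relies on. The collection name "ACME AD" is the example used later in this article; adjust it for your environment:

# Run on the RMAD server after loading the RMAD cmdlets
# (see the Import-Module lines in the script below)
$DCNames = (Get-RMADCollectionItem -Name "ACME AD").ComputerName
ForEach ($DC in $DCNames) {
    [System.Net.Dns]::GetHostAddresses($DC) |
        Where-Object { $_.AddressFamily -eq 'InterNetwork' } |
        ForEach-Object { Write-Host "$DC : $($_.IPAddressToString)" }
}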
Because of the IP address restrictions some enterprise backup solutions impose, it may be better to use the Local Storage tab's Additional backup path and then run the script shared in this article.
Implementing the Script
The script is intended to be run from the "Advanced" tab of a computer collection, on the console side, AFTER backup creation is complete.

You must enter the same backup path in the $backupPath variable; the script will then delete any files that
(a) contain the DC name in the file name or path, and
(b) are older than the number of days set in the $nDays variable below.
The Delete_old_backups.ps1 script
Below is the script. Pay attention to the lines marked with "# <-" comments: you should comment, uncomment, or modify their values to match your environment.
Note: This script is not Quest-supported code. It has not gone through Quest's QA process and comes with no warranty, express or implied.
<#
This script will find backups for computers in a computer collection on a UNC path,
then delete those backups older than $nDays.

It's intended to be run within an RMAD computer collection to manage backups sent to
the "Additional backup path" on the Local Storage tab, for RMAD versions 10.3.1 and earlier.

Assumptions:
- Must be run on the RMAD server
- RMAD must be installed in its default location (C:\Program Files\Quest\...)
- Collection only contains DC names (not containers, AD LDS servers, or list files)
- Backup path must include the %COMPUTER% variable

Written by Brian Hymer, Quest Software
WARNING: This script has no warranty, express or implied.
#>

# Load RMAD cmdlets
cd 'C:\Program Files\Quest\Recovery Manager for Active Directory Forest Edition\'
Import-Module -DisableNameChecking ".\QuestSoftware.RecoveryManager.AD.PowerShell64.dll"
Import-Module ".\QuestSoftware.RecoveryManager.AD.PowerShellFE.dll" # <- You don't really need this module

# Set variables
$collectionName = "ACME AD"           # <- Specify the Computer Collection name, or, on version 10.3.1 or higher, comment this variable out
$backupPath = "\\10.1.1.142\rmadcifs" # <- Change to the UNC share you use in the Additional backup path
$nDays = 5                            # <- Number of days old; adjust as you see fit
$today = Get-Date

Write-Host -ForegroundColor Cyan "This script will recursively look for backups in '$backupPath' and delete any that are more than $nDays days old."

# Get DC names from the collection
$DCNames = (Get-RMADCollectionItem -Name $collectionName).ComputerName
#Write-Host -ForegroundColor Cyan "DCs in collection $collectionName : " $DCNames

# Find .bkf files on the share (use escapes)
Write-Host "finding .bkf files..."
$Test = Get-ChildItem -Filter *.bkf -Path $backupPath -Recurse |
    Select-Object Name, CreationTime, FullName

# The meat of the script.
# Double ForEach loops to compare file names to DC names, one DC at a time.
# If there's a match, check the file age and delete it if older than the $nDays value.
ForEach ($DC in $DCNames) {
    Write-Host # adding a blank line
    Write-Host "Checking DC: " $DC
    ForEach ($File in $Test) {
        # File must match a DC name in the collection
        if ($File.FullName -match $DC) {
            # Get the file creation time
            $creationTime = $File.CreationTime
            # Calculate the age of the file in days
            $ageInDays = ($today - $creationTime).Days
            Write-Host -ForegroundColor Green "Matched file: " $File.FullName "(age: $ageInDays days)"
            # Check if the file is older than $nDays
            if ($ageInDays -gt $nDays) {
                # If true, delete the file
                #Remove-Item -Path $File.FullName -Force # <- Uncomment to actually delete files older than $nDays
                Write-Host -ForegroundColor Yellow "Deleted file: " $File.FullName "(age: $ageInDays days)"
            }
        }
    }
}
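Before wiring the script into a collection, it is worth a manual dry run on the RMAD server with the Remove-Item line still commented out, so the script only reports what it would delete. The C:\Scripts location below is a hypothetical example; run the script from wherever you saved it:

PS C:\Scripts> .\Delete_old_backups.ps1

Once the "Deleted file:" output lists exactly the backups you expect to lose, uncomment the Remove-Item line to perform real deletions.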