The goal of this setup is as follows. Data is collected with many users hitting the database during production hours, and we want to avoid additional server load causing performance issues. All reporting is therefore done from a staging server, so the production server remains functional and responsive at all times.
Production server is PD-AumTech_PC1
The staging reporting server is SG-AumTech_PC1
Database name: EmployeeDB
We set up log shipping so that the reporting server receives the previous day's changes from the production server once a day; the previous day's data is available on the staging server by 8 AM.
We evaluated replication and scheduled SQL backups, but in the end log shipping was the best option for this problem.
Currently the following jobs are created, and the steps below are set up sequentially for the log shipping solution.
Important Note: The EmployeeDB database on production must be in the full recovery model. The very first time, take a full backup of EmployeeDB from the production environment, then restore it to the staging server using that same full backup. EmployeeDB must be restored on the staging server in standby/read-only mode (RESTORE ... WITH STANDBY), since only the changes in the .trn log files coming from the production server will be applied on a daily basis. Also be sure that SQL Server and SQL Server Agent are running under a network domain account; the account must have access to the file share where the transaction log files are stored. If the two servers run under different network accounts, make sure both accounts have access to the network file share path.
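The initial setup above can be sketched in T-SQL. This is a sketch, not the exact commands used; the backup file name and the standby (undo) file path are assumptions for illustration, so substitute your own:

```sql
-- On PD-AumTech_PC1: ensure the full recovery model, then take the initial full backup.
ALTER DATABASE EmployeeDB SET RECOVERY FULL;
BACKUP DATABASE EmployeeDB
    TO DISK = '\\AumTechLSNetworkShare\DBBackupFolder\EmployeeDB_init.bak';

-- On SG-AumTech_PC1: restore in standby/read-only mode so later .trn files can be applied.
-- The standby (undo) file path below is a hypothetical example.
RESTORE DATABASE EmployeeDB
    FROM DISK = '\\AumTechLSNetworkShare\DBBackupFolder\EmployeeDB_init.bak'
    WITH STANDBY = 'C:\SQL_LogFiles\EmployeeDB_standby.tuf';
```

The WITH STANDBY option is what keeps the database readable for reporting between restores while still allowing further transaction log files to be applied.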
1. Using the log shipping wizard, the following jobs will be created. The log shipping job on PD-AumTech_PC1 is "LSBackup_EmployeeDB". This job creates a transaction log file at 7 AM in the network path "\\AumTechLSNetworkShare\DBBackupFolder". Both the staging server and the production server must have access to this network path.
2. The log shipping job on SG-AumTech_PC1 is "LSCopy_EmployeeDB". This job runs at 7:15 AM and copies the transaction log (.trn) file created at 7 AM on the production server to a local path on the staging server machine, "\\SG-AumTech_PC1\c$\SQL_LogFiles" (C:\SQL_LogFiles).
3. The log shipping job on SG-AumTech_PC1 is "LSRestore_EmployeeDB". This job applies the .trn file from C:\SQL_LogFiles to the database on the staging server, updating it with all data captured in the transaction log over the previous 24 hours.
These three jobs continue to run daily, keeping the data on the staging server up to date.
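To confirm the daily cycle is healthy, the built-in log shipping monitor procedure can be queried. This is a sketch assuming the monitor runs on one of these two servers:

```sql
-- Shows the last backup, copy, and restore times for each log shipping pair,
-- along with whether any alert threshold has been exceeded.
EXEC master.dbo.sp_help_log_shipping_monitor;
```

If the last restore time on SG-AumTech_PC1 is more than 24 hours old, the chain has likely stalled and the troubleshooting steps below apply.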
Additional Notes: If for any reason log shipping stops working, the best way to reset the entire process is as follows.
1. Check that the SQL Server Agent service on the staging server is running and that it has access to the network path where the .trn files are stored.
2. If for any reason you need to redo the setup, take a full backup of the database WITHOUT the COPY_ONLY option, then restore the database to the staging server. Keep the same jobs and schedule. You can run the following steps manually to make sure the setup is still working.
A. On PD-AumTech_PC1, run the log shipping job "LSBackup_EmployeeDB". This job creates a transaction log file in the path "\\AumTechLSNetworkShare\DBBackupFolder".
B. Next, on SG-AumTech_PC1, run the log shipping job "LSCopy_EmployeeDB". This job copies the .trn file to the local path on the staging machine, "\\SG-AumTech_PC1\c$\SQL_LogFiles" (C:\SQL_LogFiles).
C. Finally, on SG-AumTech_PC1, run the log shipping job "LSRestore_EmployeeDB". This job applies the .trn file from C:\SQL_LogFiles to the database, updating it with all data from the transaction log of the previous 24 hours.
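Steps A through C above can be run from T-SQL instead of the SQL Server Agent UI, using the standard msdb procedure for starting agent jobs (job names as created by the wizard):

```sql
-- On PD-AumTech_PC1: kick off the backup job manually.
EXEC msdb.dbo.sp_start_job @job_name = N'LSBackup_EmployeeDB';

-- On SG-AumTech_PC1: then run the copy and restore jobs, in order,
-- waiting for each to finish before starting the next.
EXEC msdb.dbo.sp_start_job @job_name = N'LSCopy_EmployeeDB';
EXEC msdb.dbo.sp_start_job @job_name = N'LSRestore_EmployeeDB';
```

Note that sp_start_job returns as soon as the job starts, so check the job history before running the next step in the chain.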
Be sure to think about how your production database backups should be set up and when the database logs should be shrunk for disk space, since either activity can interfere with log shipping. At that point you may have to follow the steps listed in the additional notes above.
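When the log file does need shrinking, it can be done on the production server after the daily backup job has run, along these lines. This is a sketch; the logical log file name 'EmployeeDB_log' and the 1024 MB target size are assumptions, so check the real name first:

```sql
USE EmployeeDB;

-- Look up the actual logical name of the log file before shrinking.
SELECT name, type_desc FROM sys.database_files;

-- Shrink the log file to a target size of 1024 MB (hypothetical value).
DBCC SHRINKFILE ('EmployeeDB_log', 1024);
```

Shrinking only reclaims space already freed by log backups; if the shrink has no effect, the next LSBackup run usually frees more of the log first.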