SQL – AumTechHub.Com (http://aumtechhub.com) – Just share it!!!

UTC Time For North America – EST
http://aumtechhub.com/utc-time/ (Sat, 09 Apr 2022)

Eastern Time: 2025-04-17 06:28:28 EDT

UTC Time: 2025-04-17 10:28:28 UTC

This is the UTC time corresponding to the Eastern time zone in North America, based on the current Eastern time.
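As an illustrative sketch (assuming SQL Server 2016 or later, where AT TIME ZONE is available, and the built-in Windows time zone name 'Eastern Standard Time'), the same conversion can be done directly in T-SQL:

```sql
-- Convert the current UTC time to North American Eastern time.
-- 'Eastern Standard Time' is the Windows time zone ID; AT TIME ZONE
-- handles daylight saving automatically (EST vs. EDT).
SELECT
    GETUTCDATE() AT TIME ZONE 'UTC' AS UtcTime,
    GETUTCDATE() AT TIME ZONE 'UTC'
                 AT TIME ZONE 'Eastern Standard Time' AS EasternTime;
```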

How to get table size in a database?
http://aumtechhub.com/how-to-get-table-size-in-a-database/ (Thu, 11 Nov 2021)

This query returns the size of each table in a SQL Server database. It is a useful way to monitor database growth and decide what restrictions might be needed to keep the database size from growing out of control. For example, you may want to put a limit on file upload size for a table that stores documents.

-- Replace DatabaseName with the name of your database.
SELECT
    t.NAME AS TableName,
    p.rows AS TableRowCounts,
    CONVERT(DECIMAL, SUM(a.total_pages)) * 8 / 1024 AS TotalSpaceMB,
    CONVERT(DECIMAL, SUM(a.total_pages)) * 8 / 1024 / 1024 AS TotalSpaceGB,
    SUM(a.used_pages) * 8 / 1024 AS UsedSpaceMB,
    SUM(a.used_pages) * 8 / 1024 / 1024 AS UsedSpaceGB
FROM DatabaseName.sys.tables t
INNER JOIN DatabaseName.sys.indexes i
    ON t.OBJECT_ID = i.object_id
INNER JOIN DatabaseName.sys.partitions p
    ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN DatabaseName.sys.allocation_units a
    ON p.partition_id = a.container_id
LEFT OUTER JOIN DatabaseName.sys.schemas s
    ON t.schema_id = s.schema_id
WHERE t.is_ms_shipped = 0
  AND i.OBJECT_ID > 255
GROUP BY t.Name, s.Name, p.Rows
ORDER BY UsedSpaceGB DESC, t.Name
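For a quick check of a single table rather than the whole database, the built-in sp_spaceused procedure reports similar numbers. This is a minimal sketch; the table name dbo.Documents is a hypothetical placeholder:

```sql
-- Space used by one table (replace dbo.Documents with your table name).
EXEC sp_spaceused N'dbo.Documents';
```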

Database Mail Error: Failed to initialize sqlcmd library with error number -2147467259
http://aumtechhub.com/database-mail-error-failed-to-initialize-sqlcmd-library-with-error-number-2147467259/ (Wed, 11 Aug 2021)

The first example below fails with this error because sp_send_dbmail runs under the msdb context, and the query is specified without the database name, so the view cannot be resolved:

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'dbMail',
    @recipients = 'dharmyog.com@gmail.com',
    @subject = 'Test Email',
    @body = 'Test Email',
    @query = 'select * from [dbo].[vwEmployees] Where EmployeeAge = 25',
    @attach_query_result_as_file = 1;

The correct query for the example above fully qualifies the view with the database name:

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'dbMail',
    @recipients = 'dharmyog.com@gmail.com',
    @subject = 'Test Email',
    @body = 'Test Email',
    @query = 'select * from EmployeeDB.dbo.vwEmployees Where EmployeeAge = 25',
    @attach_query_result_as_file = 1;

This way the stored procedure still executes under the msdb context, but the query succeeds because the full database hierarchy is referenced. So the database name has to be included in the query. (Alternatively, sp_send_dbmail accepts an @execute_query_database parameter that sets the database context for @query.)

How to get timezone information?
http://aumtechhub.com/how-to-get-timezone-information/ (Thu, 04 Mar 2021)

Storing UTC time and applying offset values is one of the best ways to make sure you capture the correct local time for a region. To get time zone information directly from SQL Server, run the following query.

SELECT * FROM sys.time_zone_info
ORDER BY name
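Building on that catalog view, a sketch of how one of its time zone names can be used in practice (assuming SQL Server 2016+ for AT TIME ZONE; 'Eastern Standard Time' is just one example name returned by the view):

```sql
-- Current UTC offset and DST status for one time zone from the view.
SELECT name, current_utc_offset, is_currently_dst
FROM sys.time_zone_info
WHERE name = 'Eastern Standard Time';

-- Apply that zone to convert a UTC value to local time.
SELECT SYSUTCDATETIME() AT TIME ZONE 'UTC'
                        AT TIME ZONE 'Eastern Standard Time' AS LocalTime;
```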

How to rerun a SQL query after a delay?
http://aumtechhub.com/how-to-rerun-sql-query-after-delay/ (Thu, 21 Jan 2021)

This sample shows how to rerun the same query with a delay. The batch below re-executes 15 times (via the GO 15 directive) with a 10-second delay between runs. A good example of when to use this is when you want to check the status of an ongoing backup.

SELECT L.NAME AS DBName, R.percent_complete AS [Progress%]
FROM Master.dbo.sysdatabases L
LEFT JOIN sys.dm_exec_requests R ON L.DBID = R.Database_ID
WHERE percent_complete > 0
RAISERROR('', 0, 1) WITH NOWAIT   -- flush the output buffer immediately
WAITFOR DELAY '00:00:10'          -- pause 10 seconds before the next run
GO 15                             -- repeat the whole batch 15 times

How to autorecover a SQL script in SSMS?
http://aumtechhub.com/how-to-autorecover-sql-script-in-ssms/ (Tue, 01 Sep 2020)

Sometimes when SQL Server Management Studio crashes, you end up losing your work. Often it restarts on its own, loads the scripts that were not saved, and gives you the option to reopen them. Other times it does not show this option, especially when SSMS does not auto-start after the crash. In that case you can still get your work back from the following path:

This is the path if you have the SSDT version installed for VS2017:

C:\Users\AumTechHubUser1\Documents\Visual Studio 2017\Backup Files\Solution1

Database Backup Percentage Complete Status
http://aumtechhub.com/database-backup-percentage-complete-status/ (Sat, 29 Aug 2020)

When performing a backup with scripts, it is not easy to get the percentage complete of the running backup. This simple script shows the status of any backup currently running on the server.

SELECT
    L.NAME AS DBName,
    R.percent_complete AS [Progress%]
FROM Master.dbo.sysdatabases L
LEFT JOIN sys.dm_exec_requests R ON L.DBID = R.Database_ID
WHERE R.Command LIKE '%BACKUP%'

Alternatively, you can change the WHERE clause and use the following:

SELECT
    L.NAME AS DBName,
    R.percent_complete AS [Progress%]
FROM Master.dbo.sysdatabases L
LEFT JOIN sys.dm_exec_requests R ON L.DBID = R.Database_ID
WHERE percent_complete > 0

If no backups are running, the query returns no rows.
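A variation on the queries above can also estimate when the backup will finish. This is a sketch using the estimated_completion_time column of sys.dm_exec_requests, which reports the remaining time in milliseconds:

```sql
-- Progress plus an estimated finish time for any running backup.
SELECT
    DB_NAME(R.database_id) AS DBName,
    R.command,
    R.percent_complete AS [Progress%],
    R.start_time,
    DATEADD(ms, R.estimated_completion_time, GETDATE()) AS EstimatedFinish
FROM sys.dm_exec_requests R
WHERE R.command LIKE '%BACKUP%';
```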

How to get a list of new or modified database objects?
http://aumtechhub.com/how-to-get-list-of-new-or-modified-database-objects/ (Sat, 29 Aug 2020)

This simple query tells us whether any database objects were created within the last 24 hours. You can modify the DATEADD parameters to get a larger time span.

select * from sys.objects where create_date > dateadd(D, -1, getdate())

This second query tells us whether any database objects were modified within the last 24 hours. Again, you can modify the DATEADD parameters to get a larger time span.

select * from sys.objects where modify_date > dateadd(D, -1, getdate())
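The two checks can be combined into one sketch that covers the last 7 days and excludes system objects (using the is_ms_shipped flag on sys.objects):

```sql
-- Objects created or modified in the last 7 days, newest first.
SELECT name, type_desc, create_date, modify_date
FROM sys.objects
WHERE (create_date > DATEADD(D, -7, GETDATE())
       OR modify_date > DATEADD(D, -7, GETDATE()))
  AND is_ms_shipped = 0
ORDER BY modify_date DESC;
```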

How to setup Database Log Shipping?
http://aumtechhub.com/how-to-setup-database-log-shipping/ (Sat, 29 Aug 2020)

The goal of this setup is as follows. Many users hit the database during production hours, and we want to avoid additional server load causing performance issues. So the goal is to run all reporting from a staging server, so that the production server remains functional and performant at all times.

Production server is PD-AumTech_PC1

The staging reporting server is SG-AumTech_PC1

Database name: EmployeeDB

We set up log shipping so that the reporting server receives the daily changes from the production server once a day, giving us the previous day's data on the staging server by 8 AM.

In order to do this we looked at replication and SQL backups, but in the end log shipping was the best option for this problem.

The following jobs are created, and the steps below are set up sequentially for the log shipping solution.

Important Note: The EmployeeDB database on production has to be in the full recovery model. The very first time, you must take a full backup of EmployeeDB from the production environment, then restore it to the staging server using that same full backup. The database must be restored with the STANDBY option so it stays in standby/read-only mode on the staging server, because only the changes in the TRN log files coming from the production server will be applied on a daily basis. Also make sure the SQL Server and SQL Server Agent services run under a network domain account that has access to the file share where the transaction log files are stored. If the two servers run under different network accounts, make sure both accounts have access to the network file share path.

1. Using the log shipping wizard, the following jobs are created. The job on PD-AumTech_PC1 is "LSBackup_EmployeeDB". It creates a transaction log file at 7 AM in the network path "\\AumTechLSNetworkShare\DBBackupFolder". Both the staging server and the production server must have access to this network path.

2. The log shipping job on SG-AumTech_PC1 is "LSCopy_EmployeeDB". It runs at 7:15 AM and copies the transaction log file (created at 7 AM on the production server) to a local path on the staging server, "\\SG-AumTech_PC1\c$\SQL_LogFiles".

3. The log shipping job on SG-AumTech_PC1 is "LSRestore_EmployeeDB". It applies the TRN file from c:\SQL_LogFiles to the database on the staging server, updating it with all data from the previous day's log backup.

These three jobs continue to run and update the data on the staging server daily.
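To check whether the daily cycle is healthy, SQL Server's built-in log shipping monitor tables in msdb can be queried. This is a sketch assuming the standard monitor tables that the log shipping wizard populates:

```sql
-- Run on the staging (secondary) server: shows the last copied and
-- last restored TRN file for each log-shipped database.
SELECT
    secondary_database,
    last_copied_file,
    last_copied_date,
    last_restored_file,
    last_restored_date
FROM msdb.dbo.log_shipping_monitor_secondary;
```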

Additional Notes: If for any reason log shipping stops working, the best way to reset the entire process is the following.

1. Check that the SQL Server Agent service on the staging server is running and that it has access to the network path where the TRN files are stored.

2. If for any reason you need to redo the setup, take a full backup of the database WITHOUT the COPY_ONLY option, then restore the database to the staging server. Keep the same jobs and schedule. You can run the following steps manually to make sure the setup is still working.

A. Run the log shipping job "LSBackup_EmployeeDB" on PD-AumTech_PC1. It creates a transaction log file in the path "\\AumTechLSNetworkShare\DBBackupFolder".

B. Next, run the log shipping job "LSCopy_EmployeeDB" on SG-AumTech_PC1. It copies the TRN file to the local path "\\SG-AumTech_PC1\c$\SQL_LogFiles" (c:\SQL_LogFiles) on the staging server.

C. Finally, run the log shipping job "LSRestore_EmployeeDB" on SG-AumTech_PC1. It applies the TRN file from c:\SQL_LogFiles to the database, updating it with all data from the previous TRN file.

Be sure to think about how your production database backups should be set up and when the database logs should be shrunk to reclaim disk space, at which point you may have to follow the steps listed in the additional notes above.

How to get the total row count of all database tables?
http://aumtechhub.com/how-to-get-total-row-count-of-all-database-tables/ (Thu, 27 Aug 2020)

This script creates a temp table and stores the name of each database table along with its total row count.

CREATE TABLE #TableRowCounts
(
TableName varchar(150),
TotalRows int
)

EXEC sp_MSForEachTable @command1 = 'INSERT #TableRowCounts (TableName, TotalRows) SELECT ''?'', COUNT(*) FROM ?'

SELECT TableName, TotalRows FROM #TableRowCounts ORDER BY TableName, TotalRows DESC

DROP TABLE #TableRowCounts
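Since sp_MSForEachTable is an undocumented, unsupported procedure, a catalog-view alternative is worth knowing. This sketch reads row counts from sys.partitions (index_id 0 is a heap, 1 is the clustered index) and needs no temp table:

```sql
-- Approximate row count per table from catalog metadata.
SELECT
    s.name AS SchemaName,
    t.name AS TableName,
    SUM(p.rows) AS TotalRows
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
JOIN sys.partitions p ON t.object_id = p.object_id
WHERE p.index_id IN (0, 1)   -- heap or clustered index only
GROUP BY s.name, t.name
ORDER BY TotalRows DESC;
```

Unlike the COUNT(*) approach, this does not scan each table, so it is fast even on large databases, though the counts are metadata-based approximations.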

