Best practice to manipulate a large database

Best practice to manipulate (select, insert, update, delete) a large database.

Recently I found a database with 36 GB of data in a single table, which had neither a primary key nor an index (clustered or non-clustered). This table holds TPS audit data (a couple of million records) spanning more than 3 years. I couldn’t even execute a select statement, since the database gave a “timeout expired” error.

I’ve tried various methods and finally found this simple and efficient way to delete data in batches using a row count. Note that @@ROWCOUNT returns an int; if a single statement can affect more than 2,147,483,647 rows, use the ROWCOUNT_BIG() function instead, which returns a bigint.
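As a minimal sketch of the ROWCOUNT_BIG() variant (using the same table as below), the function must be read immediately after the data-modification statement, just like @@ROWCOUNT, because any intervening statement resets it:

```sql
DELETE
FROM [Employer2003].[dbo].[Attendance_Audit_1]
WHERE [UpdatedOn] < '2010-01-01';

-- Capture the affected-row count as a bigint right away;
-- @@ROWCOUNT would truncate counts above 2,147,483,647.
SELECT ROWCOUNT_BIG() AS RowsDeleted;
```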

 

Normal ‘delete statement’:

DELETE
FROM [Employer2003].[dbo].[Attendance_Audit_1]
WHERE [UpdatedOn] < '2010-01-01' AND [date] < '2010-01-01'

 

New ‘delete statement’:

SET ROWCOUNT 1000

delete_more:
DELETE
FROM [Employer2003].[dbo].[Attendance_Audit_1]
WHERE [UpdatedOn] < '2010-01-01' AND [date] < '2010-01-01'

IF @@ROWCOUNT > 0 GOTO delete_more

SET ROWCOUNT 0
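The same batching idea can also be written with DELETE TOP (n) inside a WHILE loop. Microsoft has deprecated the use of SET ROWCOUNT with DELETE, INSERT, and UPDATE statements, so on newer SQL Server versions a sketch like the following (same table, same batch size of 1000) is the safer equivalent:

```sql
WHILE 1 = 1
BEGIN
    -- Delete at most 1000 matching rows per iteration,
    -- keeping each transaction's lock footprint small.
    DELETE TOP (1000)
    FROM [Employer2003].[dbo].[Attendance_Audit_1]
    WHERE [UpdatedOn] < '2010-01-01' AND [date] < '2010-01-01';

    -- Stop once a batch deletes nothing.
    IF @@ROWCOUNT = 0 BREAK;
END
```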

 

By removing these records a thousand at a time, you can dramatically reduce the number of locks that accumulate per transaction and prevent lock escalation; in other words, this reduces deadlocks and improves the performance of your SQL DBMS.
