I have a database in Simple mode from which I wish to delete several million rows of one table (not all rows so 'TRUNCATE TABLE' is no good), without making the transaction log file grow to several GB in size. Any ideas? I'm almost at the point where I might create a new table and copy the records I want to keep instead...
Thanks, Simon.|||Assuming the rows have a primary key and you can select the PK from the Primary table, you could set up a secondary table (PKToDelete) holding, say, 10,000 PK values, populate that table with a SELECT, and then join the PKToDelete and Primary tables for the delete operation. Upon successful completion of the delete, truncate the PKToDelete table, and repeat at periodic intervals until all Primary table deletes are complete.
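A minimal sketch of that staged approach; the table and column names (Primary, PKToDelete, pk, some_filter) are placeholders, not from the original question:

```sql
-- One-time setup: staging table for the keys to delete in this batch
CREATE TABLE PKToDelete (pk int PRIMARY KEY)

-- Repeat the following three statements until the INSERT adds zero rows:

-- 1. Stage the next 10,000 keys that match the delete criteria
INSERT INTO PKToDelete (pk)
SELECT TOP 10000 p.pk
FROM [Primary] p
WHERE p.some_filter = 'some value'

-- 2. Delete only the staged rows from the big table
DELETE p
FROM [Primary] p
JOIN PKToDelete d ON d.pk = p.pk

-- 3. Empty the staging table for the next batch
TRUNCATE TABLE PKToDelete
```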
Since you are in simple mode, the transaction log will be truncated at the checkpoint after each batched delete commits, thereby preventing major growth of the transaction log.|||declare @batchsize int
set @batchsize=10000
set rowcount @batchsize
delete from bigtab where something='some value'
while @@rowcount=@batchsize
begin
    delete from bigtab where something='some value'
end
set rowcount 0
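Note that SET ROWCOUNT as a way of limiting DML statements is deprecated; on SQL Server 2005 and later the same loop can be written with DELETE TOP, which is the recommended form going forward. A sketch, reusing the same hypothetical table and predicate:

```sql
-- Batched delete using DELETE TOP instead of SET ROWCOUNT
declare @batchsize int
set @batchsize=10000

delete top (@batchsize) from bigtab where something='some value'
while @@rowcount=@batchsize
begin
    delete top (@batchsize) from bigtab where something='some value'
end
```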
Edit:
PS: You can do the same for updates, e.g.
declare @batchsize int
set @batchsize=10000
set rowcount @batchsize
update bigtab set type='shrub'
where id between 2867127 and 1908917872
and type='tree'
while @@rowcount=@batchsize
begin
    update bigtab set type='shrub'
    where id between 2867127 and 1908917872
    and type='tree'
end
set rowcount 0
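The update loop can likewise be rewritten with UPDATE TOP on SQL Server 2005 and later. One detail worth noting: UPDATE TOP picks an arbitrary n matching rows per pass, and it is the `type='tree'` predicate that keeps already-updated rows out of later batches, so the loop still terminates. A sketch with the same table and values:

```sql
-- Batched update using UPDATE TOP instead of SET ROWCOUNT
declare @batchsize int
set @batchsize=10000

update top (@batchsize) bigtab set type='shrub'
where id between 2867127 and 1908917872
and type='tree'
while @@rowcount=@batchsize
begin
    update top (@batchsize) bigtab set type='shrub'
    where id between 2867127 and 1908917872
    and type='tree'
end
```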