INSERT large number of records 
 INSERT large number of records
I want to run an uncomplicated INSERT statement (no joins) into another
table (2.5 million rows).  This is a one-time deal.  Given such a large set,
will this fill up my transaction log and cause problems?  Is there a way in
Transact-SQL to have a loop run the insert and commit the transaction after
every 10,000 records or so, and then start with the next subset of records?
If so, can anyone give me a hint as to how to go about this?

Thanks ahead of time,

Brian Moore



Fri, 13 Apr 2001 03:00:00 GMT
 INSERT large number of records

One way of doing this is to use a table lock on the table you're selecting
from.  This will prevent SQL Server from running out of locks.

Example:

INSERT INTO mytable
SELECT *
FROM fromtable (TABLOCKX)

Quote:

>I want to run an uncomplicated INSERT statement (no joins) into another
>table (2.5 Million rows).  This is a one time deal.  Given such a large set,
>will this fill up my Transaction log and cause problems.  Is there a way in
>Transact SQL to have a loop run the insert and commit this transaction after
>every 10,000 records or so, and then start with the next subset of records?
>If so, can anyone give me a hint as to how to go about this?

>Thanks ahead of time,

>Brian Moore



Fri, 13 Apr 2001 03:00:00 GMT
 INSERT large number of records
Michael,

Quote:
>One way of doing this is to use a table lock on the table you're selecting
>from.  This will prevent SQL Server from running out of locks.

This does not help with log space, however.

Roy



Sat, 14 Apr 2001 03:00:00 GMT
 INSERT large number of records
Brian,

I usually manage to do this sort of thing in steps based on the
clustered index key of the source table.  In the example below the
source table has an eight-digit order number as the first column of
the clustered index.  The column values are all over the lot, but tend
to cluster in ranges.  There is a simpler version I use for tables
clustered on a date, where I just use dateadd to increment a datetime
one day at a time.

If you cannot turn on the truncate log on checkpoint option, you can
either add a DUMP TRAN to the loop - perhaps not every iteration - or
run the dumps manually in another window.

I hope this helps.

Roy

set nocount on

/* Step through the source in ranges of the clustered index key.
   OrderNo stands in for the eight-digit order-number column that
   leads the clustered index - adjust the name and the step size
   to suit your table. */
declare @lo int, @hi int
select @lo = 0

WHILE @lo < 100000000
BEGIN

   select @hi = @lo + 1000000

   INSERT Target
   SELECT *
     FROM DWJDE..Source
    WHERE OrderNo >= @lo
      AND OrderNo <  @hi

   /* A DUMP TRAN could go here - perhaps only every few
      iterations - to keep the log from filling. */

   select @lo = @hi

END

set nocount off
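If the key values are too irregular to step through in ranges, a variant - just a sketch, reusing the Target/Source names above and a hypothetical OrderNo key - is to cap each batch with SET ROWCOUNT and loop until nothing is left to copy:

```sql
-- Sketch only: assumes OrderNo uniquely identifies rows in both tables.
SET ROWCOUNT 10000          -- cap every INSERT at 10,000 rows

WHILE 1 = 1
BEGIN
   INSERT Target
   SELECT s.*
     FROM DWJDE..Source s
    WHERE NOT EXISTS (SELECT 1 FROM Target t
                       WHERE t.OrderNo = s.OrderNo)

   IF @@ROWCOUNT = 0 BREAK  -- nothing left to copy

   -- each batch commits on its own, so the log can be dumped between passes
END

SET ROWCOUNT 0              -- restore the default
```

The NOT EXISTS check gets slower as Target fills up, so an index on Target's OrderNo column helps a lot here.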

Quote:

>I want to run an uncomplicated INSERT statement (no joins) into another
>table (2.5 Million rows).  This is a one time deal.  Given such a large set,
>will this fill up my Transaction log and cause problems.  Is there a way in
>Transact SQL to have a loop run the insert and commit this transaction after
>every 10,000 records or so, and then start with the next subset of records?
>If so, can anyone give me a hint as to how to go about this?

>Thanks ahead of time,

>Brian Moore



Sat, 14 Apr 2001 03:00:00 GMT
 INSERT large number of records
It depends on what your database system allows you to do.  For example, in
Informix On-Line you can commit your transactions when the log is 75% full,
or every X number of minutes (I usually leave it at 5 minutes).  Most
databases will commit when the log is X% full; check your database
documentation on this.

If you can lock the table in exclusive mode while doing the insert, it will
also help to speed things up.  If you have indexes, drop them before doing
the insert and re-create them afterwards.

Good luck.



Quote:
> I want to run an uncomplicated INSERT statement (no joins) into another
> table (2.5 Million rows).  This is a one time deal.  Given such a large set,
> will this fill up my Transaction log and cause problems.  Is there a way in
> Transact SQL to have a loop run the insert and commit this transaction after
> every 10,000 records or so, and then start with the next subset of records?
> If so, can anyone give me a hint as to how to go about this?

> Thanks ahead of time,

> Brian Moore



Tue, 17 Apr 2001 03:00:00 GMT
 