Importing a 66 gigabyte csv file into SQL2000 using DTS 
 Importing a 66 gigabyte csv file into SQL2000 using DTS
Hi,

I have to import 230 gigabytes into SQL 2000; the largest table (*.csv format) is 66 gigabytes!

Does anyone out there have experience using DTS with files of this size?

Any help, tips, etc. are greatly appreciated.

Andr



Mon, 02 May 2005 00:23:35 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS

The best way is to use the Bulk Insert task in your DTS package, or to call the bcp command line directly. Look in the SQL Server documentation for the details.
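
For illustration, a minimal sketch of the BULK INSERT form (the table and file names are placeholders, not taken from this thread); the DTS Bulk Insert task exposes the same engine functionality from the package designer:

-- Basic comma-separated load into an existing table
BULK INSERT dbo.BigTable
FROM 'D:\staging\bigtable.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
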
Denis






Mon, 02 May 2005 01:33:55 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS
Adjust the fetch batch size and the commit size. The commit size is the most important one in my experience; you do not want 66 GB trying to commit in one go.

The BULK INSERT SQL command is, I believe, the fastest.
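
A minimal sketch of what that might look like with BULK INSERT (names and the batch size are illustrative only, not from this thread):

-- Commit in batches instead of as one 66 GB transaction
BULK INSERT dbo.BigTable
FROM 'D:\staging\bigtable.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 100000,  -- each 100,000-row batch commits as its own transaction
    TABLOCK                    -- table lock; helps the load qualify for minimal logging
)
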

--
Darren Green (SQL Server MVP)
http://www.sqldts.com

Check out the PASS Community Summit - Seattle, the largest and only user
event entirely dedicated to SQL Server, November 19-22.
http://www.sqlpass.org/events/seattle/index.cfm





Mon, 02 May 2005 01:50:35 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS
Darren is correct: BULK INSERT issued from within SQL Server is the fastest method. If you try to commit this as one big batch, something is going to give.

I personally would switch to the Simple recovery model for the load and back up afterwards.
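
Something along these lines, as a rough sketch (the database and path names are placeholders):

-- Switch to the Simple recovery model for the duration of the load
ALTER DATABASE MyBigDb SET RECOVERY SIMPLE

-- ... run the BULK INSERT load here ...

-- Take a full backup once the load has finished
BACKUP DATABASE MyBigDb TO DISK = 'E:\backup\MyBigDb_after_load.bak'

-- (Switch back to the previous recovery model afterwards if required)
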

--

Allan Mitchell (Microsoft SQL Server MVP)
MCSE, MCDBA
www.SQLDTS.com
I support PASS - the definitive, global community for SQL Server
professionals - http://www.sqlpass.org






Mon, 02 May 2005 03:57:31 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS
Is BULK INSERT issued from within SQL Server different from bcp or the Bulk Insert task in DTS?

I will go with the BULK INSERT statement from within SQL, thanks for the tip. (Also thanks to Darren and Denis.)

The data (the whole DB, not just the 66 GB table) will grow to about 1.3 terabytes next year. (It already is that large, but it remains on the mainframe.)
I know SQL Server can handle this, but how about *.csv files in combination with bulk inserting them?

Has anyone imported bigger csv files in one piece using the described method? Please let me know, so I can tell the project manager that it has been done before and that SQL Server can do the job now and in the future.

Andr





Mon, 02 May 2005 07:21:23 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS
BULK INSERT, DTS, and bcp are similar in principle but different. They have similar requirements and in some cases follow the same code path inside SQL Server. There is a great presentation on high-performance data loading in SQL Server at http://www.sqldev.net.

bcp is an external process to SQL Server, BULK INSERT runs in-process with SQL Server (and gains performance because of that), and DTS uses OLE DB from an external process.

Bigger files than this have been bulk inserted into SQL Server for TPC-H benchmark runs and for customers.

Take a look at the following:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/opti...
p_tun_1a_5gyt.asp
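
As a rough illustration of the external-process route, the bcp equivalent of the BULK INSERT statements shown earlier in the thread would look something like this (all names are placeholders; run from a command prompt):

bcp MyDb.dbo.BigTable in D:\staging\bigtable.csv -c -t, -S myserver -T -b 100000

(-c = character data, -t, = comma field terminator, -S = server name, -T = trusted connection, -b = rows per committed batch.)
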

--
-Euan

Please reply only to the newsgroup so that others can benefit. When posting,
please state the version of SQL Server being used and the error number/exact
error message text received, if any.

This posting is provided "AS IS" with no warranties, and confers no rights.





Mon, 02 May 2005 16:18:07 GMT
 Importing a 66 gigabyte csv file into SQL2000 using DTS
There was also a great comparison a while back in SQL Server Magazine.

I think they all import at a rate of knots, but BULK INSERT holds the record.

--

Allan Mitchell (Microsoft SQL Server MVP)
MCSE, MCDBA
www.SQLDTS.com
I support PASS - the definitive, global community for SQL Server
professionals - http://www.sqlpass.org





Tue, 03 May 2005 04:11:22 GMT
 



 