DTS and importing CSV/ascii files? 
 DTS and importing CSV/ascii files?

Alan

With DTS you don't need to set up a separate transformation for each column;
rather, you build one DTS package that includes one data pump for each
source text file --> table combination.
DTS lets you 'see' the source text file as if it were a table, with rows and
columns.
So your transformation would probably look like:

TextFile1 --> Table1
TextFile2 --> Table2
...
TextFile(n) --> Table(n)

If you have more than one processor, you would also benefit from real
parallel execution.
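
To give you an idea of the shape of it, here is a rough VBScript sketch against the DTS
object model: one SQL Server connection, then one flat-file connection, one data pump task
and one step per file --> table pair. All the server, database, file and table names are
made up, the flat-file delimiter/qualifier properties are left at their defaults, and the
exact ProgIDs and property names are worth checking against a package you script out from
the designer, so treat it as an outline rather than finished code.

Option Explicit

Dim oPackage, oSrcConn, oDestConn, oTask, oPump, oXform, oStep, i
Dim aFiles, aTables

' Hypothetical file --> table pairs; in practice you would build this list
' from your 240 predefined tables.
aFiles  = Array("c:\import\customers.csv", "c:\import\orders.csv")
aTables = Array("dbo.Customers", "dbo.Orders")

Set oPackage = CreateObject("DTS.Package")
oPackage.Name = "ImportCsvFiles"

' One destination connection to SQL Server (SQLOLEDB provider).
Set oDestConn = oPackage.Connections.New("SQLOLEDB")
oDestConn.ID = 1
oDestConn.Name = "Destination"
oDestConn.DataSource = "(local)"
oDestConn.Catalog = "MyDatabase"
oDestConn.UseTrustedConnection = True
oPackage.Connections.Add oDestConn

For i = 0 To UBound(aFiles)
    ' One flat-file connection per source file (DTSFlatFile provider);
    ' delimiter and text-qualifier properties are left at their defaults here.
    Set oSrcConn = oPackage.Connections.New("DTSFlatFile")
    oSrcConn.ID = 100 + i
    oSrcConn.Name = "Source_" & i
    oSrcConn.DataSource = aFiles(i)
    oPackage.Connections.Add oSrcConn

    ' One data pump task per file --> table combination.
    Set oTask = oPackage.Tasks.New("DTSDataPumpTask")
    Set oPump = oTask.CustomTask
    oPump.Name = "Pump_" & i
    oPump.SourceConnectionID = oSrcConn.ID
    oPump.SourceObjectName = aFiles(i)
    oPump.DestinationConnectionID = oDestConn.ID
    oPump.DestinationObjectName = aTables(i)

    ' A single copy-columns transformation; with no explicit column lists it
    ' maps source to destination columns by ordinal position.
    Set oXform = oPump.Transformations.New("DTS.DataPumpTransformCopy")
    oXform.Name = "CopyAll_" & i
    oPump.Transformations.Add oXform

    oPackage.Tasks.Add oTask

    ' One step per task; steps with no precedence constraints run in parallel.
    Set oStep = oPackage.Steps.New
    oStep.Name = "Step_" & i
    oStep.TaskName = oPump.Name
    oPackage.Steps.Add oStep
Next

oPackage.Execute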

--
BG


Quote:
> I wish to write a program to import a number of CSV files into predefined
> tables on a database. Can anyone suggest the best method to use?
> I have explored using a datapump but it seems as though I have to set up a
> transformation for each column, which would be very tedious for 240 tables (I
> could be wrong here). Any suggestions or code samples would be greatly
> appreciated.

> TIA
> Alan



Thu, 27 Jun 2002 03:00:00 GMT
 DTS and importing CSV/ascii files?

Also, check out the Bulk Insert task. It should give you very good performance
for this kind of load.
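
If the files line up exactly with the destination tables and you don't need any per-column
processing, the Bulk Insert task can be driven from the object model in much the same way.
A minimal sketch follows; the names and paths are made up and the property names are from
memory, so verify them against Books Online or a package scripted out from the designer.

Option Explicit

Dim oPackage, oConn, oTask, oBulk, oStep

Set oPackage = CreateObject("DTS.Package")
oPackage.Name = "BulkInsertCsv"

' Destination SQL Server connection; the Bulk Insert task loads through it.
Set oConn = oPackage.Connections.New("SQLOLEDB")
oConn.ID = 1
oConn.Name = "Destination"
oConn.DataSource = "(local)"
oConn.Catalog = "MyDatabase"
oConn.UseTrustedConnection = True
oPackage.Connections.Add oConn

' One Bulk Insert task per file; it wraps BULK INSERT, so it is very fast but
' does no column-level transformations.
Set oTask = oPackage.Tasks.New("DTSBulkInsertTask")
Set oBulk = oTask.CustomTask
oBulk.Name = "BulkInsert_Customers"
oBulk.ConnectionID = oConn.ID
oBulk.DataFile = "c:\import\customers.csv"
oBulk.DestinationTableName = "MyDatabase.dbo.Customers"
oBulk.FieldTerminator = ","
oPackage.Tasks.Add oTask

Set oStep = oPackage.Steps.New
oStep.Name = "Step_BulkInsert"
oStep.TaskName = oBulk.Name
oPackage.Steps.Add oStep

oPackage.Execute

One caveat: like BULK INSERT itself, the Bulk Insert task has no notion of a text
qualifier, so files with quoted fields need a format file or an unquoted layout.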

--
BG


Quote:
> I wish to write a program to import a number of CSV files into predefined
> tables on a database. Can anyone suggest the best method to use?
> I have explored using a datapump but it seems as though I have to set up a
> transformation for each column, which would be very tedious for 240 tables (I
> could be wrong here). Any suggestions or code samples would be greatly
> appreciated.

> TIA
> Alan



Thu, 27 Jun 2002 03:00:00 GMT
 DTS and importing CSV/ascii files?
I wish to write a program to import a number of CSV files into predefined
tables on a database. Can anyone suggest the best method to use?
I have explored using a datapump but it seems as though I have to set up a
transformation for each column, which would be very tedious for 240 tables (I
could be wrong here). Any suggestions or code samples would be greatly
appreciated.

TIA
Alan



Fri, 28 Jun 2002 03:00:00 GMT
 DTS and importing CSV/ascii files?
Itzik,

Thanks for your speedy response. I am still a little confused, though, as I set
up the DataPump from within my program, but it gives me an error because I haven't
set up any transformations. How do I set up a transformation at the file
level rather than at the column level?

Thanks again
Alan


Quote:
> Alan

> With DTS you don't need to set up a separate transformation for each column;
> rather, you build one DTS package that includes one data pump for each
> source text file --> table combination.
> DTS lets you 'see' the source text file as if it were a table, with rows and
> columns.
> So your transformation would probably look like:

> TextFile1 --> Table1
> TextFile2 --> Table2
> ...
> TextFile(n) --> Table(n)

> If you have more than one processor, you would also benefit from real
> parallel execution.

> --
> BG



> > I wish to write a program to import a number of CSV files into predefined
> > tables on a database. Can anyone suggest the best method to use?
> > I have explored using a datapump but it seems as though I have to set up a
> > transformation for each column, which would be very tedious for 240 tables (I
> > could be wrong here). Any suggestions or code samples would be greatly
> > appreciated.

> > TIA
> > Alan



Fri, 28 Jun 2002 03:00:00 GMT
 DTS and importing CSV/ascii files?
I would recommend that you first build an example of your transformation
through the DTS package designer. Once you understand the concept, you
should have no problems working through the Object Model.
The file is represented in the DTS package as a connection, and the pump
represents the transformation process between the connection to the file and
the connection to the table. The file should look to you like a table with
rows, and with columns within those rows. All you are left with is mapping
between the appropriate columns.

Suppose you have the following file called c:\temp\authors.txt:
"172-32-1176","White","Johnson","408 496-7223","10932 Bigge Rd.","Menlo
Park","CA","94025",True
"213-46-8915","Green","Marjorie","415 986-7020","309 63rd St.
#411","Oakland","CA","94618",True
"238-95-7766","Carson","Cheryl","415 548-7723","589 Darwin
Ln.","Berkeley","CA","94705",True
"267-41-2394","O'Leary","Michael","408 286-2428","22 Cleveland Av. #14","San
Jose","CA","95128",True
"274-80-9391","Straight","Dean","415 834-2919","5420 College
Av.","Oakland","CA","94609",True
"341-22-1782","Smith","Meander","913 843-0462","10 Mississippi
Dr.","Lawrence","KS","66044",False
"409-56-7008","Bennet","Abraham","415 658-9932","6223 Bateman
St.","Berkeley","CA","94705",True
"427-17-2319","Dull","Ann","415 836-7128","3410 Blonde St.","Palo
Alto","CA","94301",True
"472-27-2349","Gringlesby","Burt","707 938-6445","PO Box
792","Covelo","CA","95428",True
"486-29-1786","Locksley","Charlene","415 585-4620","18 Broadway Av.","San
Francisco","CA","94130",True
"527-72-3246","Greene","Morningstar","615 297-2723","22 Graybar House
Rd.","Nashville","TN","37215",False
"648-92-1872","Blotchet-Halls","Reginald","503 745-6402","55 Hillsdale
Bl.","Corvallis","OR","97330",True
"672-71-3249","Yokomoto","Akiko","415 935-4228","3 Silver Ct.","Walnut
Creek","CA","94595",True
"712-45-1867","del Castillo","Innes","615 996-8275","2286 Cram Pl. #86","Ann
Arbor","MI","48105",True
"722-51-5454","DeFrance","Michel","219 547-9982","3 Balding
Pl.","Gary","IN","46403",True
"724-08-9931","Stringer","Dirk","415 843-2991","5420 Telegraph
Av.","Oakland","CA","94609",False
"724-80-9391","MacFeather","Stearns","415 354-7128","44 Upland
Hts.","Oakland","CA","94612",True
"756-30-7391","Karsen","Livia","415 534-9219","5720 McAuley
St.","Oakland","CA","94609",True
"807-91-6654","Panteley","Sylvia","301 946-8853","1956 Arlington
Pl.","Rockville","MD","20853",True
"846-92-7186","Hunter","Sheryl","415 836-7128","3410 Blonde St.","Palo
Alto","CA","94301",True
"893-72-1158","McBadden","Heather","707 448-4982","301
Putnam","Vacaville","CA","95688",False
"899-46-2035","Ringer","Anne","801 826-0752","67 Seventh Av.","Salt Lake
City","UT","84152",True
"998-72-3567","Ringer","Albert","801 826-0752","67 Seventh Av.","Salt Lake
City","UT","84152",True

You make a text-file-source connection to it, a connection to a table with
an appropriate format, and a data pump between them. The first column is
referred to as Col1, the second as Col2, and so on.
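
Scripted against the DTS object model, that example comes out roughly like the VBScript
below. The server, database and destination table names are hypothetical, and the
flat-file property names, the transform ProgID and the Col1/Col2-style source column names
are from memory, so check them against a package you save from the designer before relying
on them.

Option Explicit

Dim oPackage, oFileConn, oSqlConn, oTask, oPump, oXform, oStep

Set oPackage = CreateObject("DTS.Package")
oPackage.Name = "ImportAuthorsTxt"

' Source: the text file, exposed through the DTS flat-file provider.
Set oFileConn = oPackage.Connections.New("DTSFlatFile")
oFileConn.ID = 1
oFileConn.Name = "AuthorsFile"
oFileConn.DataSource = "c:\temp\authors.txt"
oFileConn.ConnectionProperties("File Format").Value = 1          ' delimited
oFileConn.ConnectionProperties("Column Delimiter").Value = ","
oFileConn.ConnectionProperties("Text Qualifier").Value = """"
oFileConn.ConnectionProperties("Row Delimiter").Value = vbCrLf
oFileConn.ConnectionProperties("First Row Column Name").Value = False
oPackage.Connections.Add oFileConn

' Destination: a SQL Server table with a matching layout (hypothetical name).
Set oSqlConn = oPackage.Connections.New("SQLOLEDB")
oSqlConn.ID = 2
oSqlConn.Name = "PubsServer"
oSqlConn.DataSource = "(local)"
oSqlConn.Catalog = "pubs"
oSqlConn.UseTrustedConnection = True
oPackage.Connections.Add oSqlConn

' The data pump between the two connections.
Set oTask = oPackage.Tasks.New("DTSDataPumpTask")
Set oPump = oTask.CustomTask
oPump.Name = "Pump_Authors"
oPump.SourceConnectionID = oFileConn.ID
oPump.SourceObjectName = "c:\temp\authors.txt"
oPump.DestinationConnectionID = oSqlConn.ID
oPump.DestinationObjectName = "[pubs].[dbo].[authors_import]"

' One copy-columns transformation mapping the file's ordinal columns
' (Col1, Col2, ... as exposed by the flat-file connection) onto the table.
Set oXform = oPump.Transformations.New("DTS.DataPumpTransformCopy")
oXform.Name = "CopyColumns"
oXform.SourceColumns.AddColumn "Col1", 1
oXform.DestinationColumns.AddColumn "au_id", 1
oXform.SourceColumns.AddColumn "Col2", 2
oXform.DestinationColumns.AddColumn "au_lname", 2
oXform.SourceColumns.AddColumn "Col3", 3
oXform.DestinationColumns.AddColumn "au_fname", 3
' ...and so on for the remaining columns; alternatively, leave both column
' collections empty and the copy transformation maps every column by ordinal.
oPump.Transformations.Add oXform
oPackage.Tasks.Add oTask

Set oStep = oPackage.Steps.New
oStep.Name = "Step_Pump_Authors"
oStep.TaskName = oPump.Name
oPackage.Steps.Add oStep

oPackage.Execute

That single copy transformation is the answer to the 'file level rather than column
level' question: you define one transformation per pump, not one per column.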

--
BG


Quote:
> Itzik,

> Thanks for your speedy response. I am still a little confused, though, as I set
> up the DataPump from within my program, but it gives me an error because I haven't
> set up any transformations. How do I set up a transformation at the file
> level rather than at the column level?

> Thanks again
> Alan



> > Alan

> > With DTS you don't need to set up a separate transformation for each column;
> > rather, you build one DTS package that includes one data pump for each
> > source text file --> table combination.
> > DTS lets you 'see' the source text file as if it were a table, with rows and
> > columns.
> > So your transformation would probably look like:

> > TextFile1 --> Table1
> > TextFile2 --> Table2
> > ...
> > TextFile(n) --> Table(n)

> > If you have more than one processor, you would also benefit from real
> > parallel execution.

> > --
> > BG



> > > I wish to write a program to import a number of CSV files into predefined
> > > tables on a database. Can anyone suggest the best method to use?
> > > I have explored using a datapump but it seems as though I have to set up a
> > > transformation for each column, which would be very tedious for 240 tables (I
> > > could be wrong here). Any suggestions or code samples would be greatly
> > > appreciated.

> > > TIA
> > > Alan



Fri, 28 Jun 2002 03:00:00 GMT
 DTS and importing CSV/ascii files?
Alan,
It would probably be helpful to build a sample package in the designer and then
convert it to VB using the script package utility that is supplied on the CD.
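
Once you have scripted the designer-built package to VB and saved it back to the server,
running it from your own program is just a load-and-execute. A minimal VBScript sketch,
with a made-up server and package name:

Option Explicit

Dim oPackage
Set oPackage = CreateObject("DTS.Package")

' 256 = DTSSQLStgFlag_UseTrustedConnection; pass 0 plus a login/password otherwise.
oPackage.LoadFromSQLServer "(local)", "", "", 256, "", "", "", "ImportCsvFiles"
oPackage.Execute
oPackage.UnInitialize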

-Euan

Quote:
> Itzik,

> Thanks for your speedy response. I am still a little confused, though, as I set
> up the DataPump from within my program, but it gives me an error because I haven't
> set up any transformations. How do I set up a transformation at the file
> level rather than at the column level?

> Thanks again
> Alan



> > Alan

> > With DTS you don't need to set up a separate transformation for each column;
> > rather, you build one DTS package that includes one data pump for each
> > source text file --> table combination.
> > DTS lets you 'see' the source text file as if it were a table, with rows and
> > columns.
> > So your transformation would probably look like:

> > TextFile1 --> Table1
> > TextFile2 --> Table2
> > ...
> > TextFile(n) --> Table(n)

> > If you have more than one processor, you would also benefit from real
> > parallel execution.

> > --
> > BG



> > > I wish to write a program to import a number of CSV files into predefined
> > > tables on a database. Can anyone suggest the best method to use?
> > > I have explored using a datapump but it seems as though I have to set up a
> > > transformation for each column, which would be very tedious for 240 tables (I
> > > could be wrong here). Any suggestions or code samples would be greatly
> > > appreciated.

> > > TIA
> > > Alan



Fri, 28 Jun 2002 03:00:00 GMT
 

 Relevant Pages 

1. Dts Import Question:Importing Csv file in an existing SQL2000 table with row-level conditions

2. How to import CSV (comma separated value) ascii file

3. SQL 7.0 DTS Import of CSV files

4. Importing unicode CSV file via DTS

5. DTS - importing CSV file drops last record

6. Importing unicode CSV file via DTS

7. DTS import error from CSV file - cannot insert value NULL

8. SQL 7.0 DTS Import of CSV files

9. DTS import csv - dynamic file name

10. DTS-CSV import file has dangling comma at the end of the record

11. Importing a 66 gigabyte csv file into SQL2000 using DTS

12. DTS-CSV file import with dangling comma after last column


 