Channel: Questions in topic: "bulk-insert"

Import text file in SQL 2005

We have a few text files with a huge number of records, and I have to import them into a staging area (temporary tables) for further processing. I just wanted to know the best way to import them into tables:

1. BCP
2. BULK INSERT
3. SSIS
4. Or any other better way.
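
For reference, a minimal `BULK INSERT` sketch into a staging table; the path, table name, and terminators below are placeholders, not details from the question:

    -- Minimal sketch: load a delimited text file into a staging table.
    -- Path, table name, and terminators are hypothetical.
    BULK INSERT dbo.StagingTable
    FROM 'C:\imports\datafile.txt'
    WITH (
        FIELDTERMINATOR = ',',    -- column delimiter in the file
        ROWTERMINATOR   = '\n',   -- end-of-row marker
        BATCHSIZE       = 10000,  -- commit in batches
        TABLOCK                   -- helps qualify for minimal logging
    );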

bcp / Bulk Insert difference

The general differences are that BULK INSERT can only import while bcp can both import and export, and that bcp runs outside SQL Server while BULK INSERT runs inside it. What are some advantages of running inside or outside of the SQL Server process space?
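
For context, a hedged sketch of the two equivalent loads (server, database, and file names are placeholders). One practical consequence of the in-process/out-of-process split: `BULK INSERT` resolves the file path as the SQL Server service account on the server, while bcp resolves it on the client machine running the command.

    -- In-process load: the file is read by the SQL Server service itself.
    BULK INSERT dbo.TargetTable
    FROM '\\server\share\data.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    -- Out-of-process equivalent, run from a command prompt (not T-SQL):
    --   bcp MyDb.dbo.TargetTable in \\server\share\data.txt -c -t, -S myserver -T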

Error: Cannot bulk load. Invalid number of columns in the format file

Hi team, I am trying to do a BULK INSERT from a monster data file into a SQL Server 2005 table with a format file. Here's the command I use:

    BULK INSERT AdventureWorks..BulkTestTable
    FROM '\\john\data\report.dat'
    WITH (FORMATFILE = '\\john\formats\report.fmt', TABLOCK, BATCHSIZE = 10000)

The format file is of version 7.0 and it has 1211 columns in the host file, of which I am trying to insert 882 columns into my table. The data file I use is contiguous in nature, and each column is defined with a specific width. I have checked and rechecked the format file and the data file for leading spaces and whether all the data lengths and positions match, and all seem to be fine. Still I am getting this error:

> Msg 4822, Level 16, State 1, Line 1
> Cannot bulk load. Invalid number of columns in the format file "\\john\formats\report.fmt"

The interesting fact is that when I try to do the same thing using the **bcp command-line utility**, I am able to load the data file successfully into my table with the same format file. Please help! :-(
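
A hedged guess at one possible cause, since the format file itself isn't shown: `BULK INSERT` expects the format file to describe every field in the host file, and host fields you don't want loaded must be mapped to server column 0 rather than omitted. A minimal version 7.0 sketch with hypothetical names and widths (fixed-width fields use "" terminators):

    7.0
    3
    1  SQLCHAR  0  10  ""      1  KeepMe
    2  SQLCHAR  0  20  ""      0  SkipMe
    3  SQLCHAR  0  30  "\r\n"  2  KeepMeToo

Here the second host field is read but discarded (server column 0), so the host-field count stays at the full width of the file while only a subset lands in the table.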

Using bcp Or Bulk Insert To Insert Data From A CSV File With 2 Identical Columns?

Hi, I'm new to SQL Server and wondering how the below may be possible. I have a CSV file that contains the following columns: CLIENT, NOTES, NOTES. As you can see, there are 2 NOTES columns, both of which need to be imported automatically using bcp or BULK INSERT into the table DATA_LOAD, which has the following columns: CLIENTNO, NOTES_FIELD1, NOTES_FIELD2. How can I use bcp or BULK INSERT to successfully load all 3 columns into their appropriate places in the table? Thanks in advance.
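
One hedged approach: a non-XML format file maps fields by position rather than by header name, so the duplicate NOTES header is harmless as long as the header row is skipped. A sketch with assumed field widths:

    9.0
    3
    1  SQLCHAR  0  50   ","     1  CLIENTNO      ""
    2  SQLCHAR  0  500  ","     2  NOTES_FIELD1  ""
    3  SQLCHAR  0  500  "\r\n"  3  NOTES_FIELD2  ""

Then the load skips the header row (path is a placeholder):

    BULK INSERT dbo.DATA_LOAD
    FROM 'C:\imports\clients.csv'
    WITH (FORMATFILE = 'C:\imports\clients.fmt', FIRSTROW = 2);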

Improving SSIS Insert Performance

I have an SSIS package which loads a fact table, and I am trying to improve the package's performance. I have gotten to the point where the slowest part of the package is inserting the records into the warehouse fact table. Here's what I have so far:

- I am using an OLE DB data connection (instead of ADO) to connect to the SQL Server.
- I am performing a bulk insert (fast table load) and unchecked the "check constraints" box. I set the batch load to 10,000 rows.
- Before the data load, I am disabling all non-clustered indexes, and then I rebuild them after the load completes (sketched after this list).
- My clustered index is an identity key.
- The database and log are sized appropriately so I don't run into any auto-growth issues while loading the table.
- The destination table is on a SQL Server 2008 installation.
- Each row inserted is no greater than 308 bytes (I have a couple of varchar columns for the business key).

Are there any additional suggestions for improving insert performance? I'm at the point now where the table loads in 2 hours (which is half of my load window and roughly where I want to be), but the source query and transformations finish in closer to 40 minutes if I attach a row count instead of doing the insert, so I was hoping that maybe I was missing something which could shave off some more time.
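
For reference, a minimal sketch of the disable/rebuild step mentioned above; the table and index names are hypothetical:

    -- Before the load: a disabled non-clustered index is not maintained during inserts.
    ALTER INDEX IX_FactSales_BusinessKey ON dbo.FactSales DISABLE;

    -- ... run the SSIS fast load here ...

    -- After the load: REBUILD re-enables and rebuilds the index in one step.
    ALTER INDEX IX_FactSales_BusinessKey ON dbo.FactSales REBUILD;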

Bulk insert escaping newlines

I need to `BULK INSERT` [VantageRx Drug][1] database rows that may contain newlines and/or tabs. Is there a way to escape these characters so that I don't have to try to find a unique field terminator and row terminator? [1]: http://www.multum.com/VantageRxDB.htm
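
For what it's worth, `BULK INSERT` has no escape mechanism for embedded newlines or tabs, so the usual workaround is exactly the one the question hopes to avoid: terminators that cannot occur in the data. A hedged sketch with placeholder names and multi-character terminators:

    -- Sketch: multi-character terminators chosen because they cannot appear in the data.
    -- Table, path, and terminator strings are hypothetical.
    BULK INSERT dbo.DrugStaging
    FROM 'C:\imports\vantagerx.txt'
    WITH (
        FIELDTERMINATOR = '|~|',     -- delimiter the exporter emits between fields
        ROWTERMINATOR   = '|~|\n'    -- row marker distinct from embedded newlines
    );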

Bulk Insert Data (Excel File)

Can you please advise me which is the best method of importing data into SQL Server from Excel (2007): BULK INSERT or OPENROWSET? Currently I've tried both and neither is working, and before I continue to battle I thought I'd see which is the best option. Many thanks.
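
One hedged note: `BULK INSERT` reads only flat files, so for a native .xlsx the usual route is OPENROWSET through the ACE OLE DB provider. A sketch, assuming the provider is installed and ad hoc distributed queries are enabled (the path and sheet name are placeholders):

    -- Ad hoc read of an Excel 2007 (.xlsx) sheet via the ACE provider.
    SELECT *
    INTO dbo.ExcelStaging
    FROM OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',
        'Excel 12.0;Database=C:\imports\Workbook.xlsx;HDR=YES',
        'SELECT * FROM [Sheet1$]'
    );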

Dynamic SQL to run Bulk Insert

Hi, I wonder if anyone can help. I am creating dynamic SQL using variables so that I can use the same script for various imports. The script is:

    DECLARE @sSourceType AS VARCHAR(10)
    DECLARE @sPath AS VARCHAR(50)
    DECLARE @sFileName AS VARCHAR(50)
    DECLARE @sSourceTableName AS VARCHAR(50)
    DECLARE @sSourceFMTFile AS VARCHAR(50)
    DECLARE @iSourceID AS INT
    DECLARE @SQL AS VARCHAR(500)

    SET @iSourceID = (SELECT TOP 1 iID FROM dbo.SX_SourceFiles WHERE iID NOT IN (SELECT iSourceID FROM SX_ImportHold))
    SET @sPath = (SELECT TOP 1 sPath FROM dbo.SX_SourceFiles WHERE iID NOT IN (SELECT iSourceID FROM SX_ImportHold))
    SET @sFileName = (SELECT TOP 1 sFileName FROM SX_SourceFiles WHERE iID NOT IN (SELECT iSourceID FROM SX_ImportHold))
    SET @sSourceTableName = (SELECT TOP 1 sSourceTableName FROM dbo.SX_SourceFiles WHERE iID NOT IN (SELECT iSourceID FROM SX_ImportHold))
    SET @sSourceFMTFile = (SELECT TOP 1 sSourceFMTFile FROM dbo.SX_SourceFiles WHERE iID NOT IN (SELECT iSourceID FROM SX_ImportHold))

    PRINT @sSourceTableName
    EXEC ('Truncate table ' + @sSourceTableName)
    PRINT @sSourceTableName + ' has been truncated'

    SET @SQL = ('BULK INSERT ' + @sSourceTableName + ' FROM ''' + @sPath + @sFileName + ''' WITH (FORMATFILE = ''' + @sSourceFMTFile + ''')')
    EXEC @SQL

When you copy the printed script, which looks like this:

    BULK INSERT PriceImportHoldingEW FROM 'C:\Auto Import\Auto Import\EandWTest.txt' WITH (FORMATFILE = 'C:\Auto Import\EandWFormatFile.fmt')

it works fine, but when you try to execute it within the script it fails with this error message:

> Msg 911, Level 16, State 4, Line 27
> Database 'BULK INSERT PriceImportHoldingEW FROM 'C:\Auto Import\Auto Import\EandWTest' does not exist. Make sure that the name is entered correctly.

I am now completely stumped. Any ideas gratefully received. Many thanks.
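
A hedged observation that may explain the Msg 911 above: `EXEC @SQL` (no parentheses) treats the variable's contents as a stored procedure name, while `EXEC (@SQL)` executes the contents as a T-SQL batch. A minimal sketch:

    -- EXEC @SQL        -- interprets the string in @SQL as a procedure name
    EXEC (@SQL);        -- executes the string in @SQL as a batch
    -- Or, with an NVARCHAR variable, the parameterizable form:
    -- EXEC sp_executesql @SQL;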

Moving data in one column into multiple columns

We have 2 tables in different databases on the same server, and I have to import data from one to the other on a weekly basis. The source table has many pieces of information in the **Values** column, comma-separated. They need to be split out and put into different columns, as shown below.

Source table:

    Server   GroupID  MessegeID  Values
    ABC123   XYZ      0123       (UserID= 01, ManagerID= 03, Date= 05/23/11, Database= ABC, Role= XYZ, Object= Tables)

Destination table:

    Server   UserID  ManagerID  Database  Role  Object
    ABC123   0100    0300       ABCDEF    XYZ   Tables

What is the best way to do this? Thank you.
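
One hedged approach, assuming the **Values** string always carries the same `name= value` pairs: strip the parentheses and pull each named token with `CHARINDEX`/`SUBSTRING`. The helper function below is hypothetical, and the table and column names are taken from the example above:

    -- Hypothetical helper: return the value following 'tag=' in a comma-separated list.
    CREATE FUNCTION dbo.ufn_GetToken (@list VARCHAR(4000), @tag VARCHAR(50))
    RETURNS VARCHAR(200)
    AS
    BEGIN
        DECLARE @start INT, @end INT;
        SET @start = CHARINDEX(@tag + '=', @list);
        IF @start = 0 RETURN NULL;
        SET @start = @start + LEN(@tag) + 1;               -- skip past 'tag='
        SET @end = CHARINDEX(',', @list + ',', @start);    -- next comma (or end of list)
        RETURN LTRIM(RTRIM(SUBSTRING(@list, @start, @end - @start)));
    END

    -- Weekly load: split each Values string into the destination columns.
    INSERT INTO DestDb.dbo.Destination ([Server], UserID, ManagerID, [Database], [Role], [Object])
    SELECT  s.[Server],
            dbo.ufn_GetToken(v.clean, 'UserID'),
            dbo.ufn_GetToken(v.clean, 'ManagerID'),
            dbo.ufn_GetToken(v.clean, 'Database'),
            dbo.ufn_GetToken(v.clean, 'Role'),
            dbo.ufn_GetToken(v.clean, 'Object')
    FROM SourceDb.dbo.Source AS s
    CROSS APPLY (SELECT REPLACE(REPLACE(s.[Values], '(', ''), ')', '')) AS v(clean);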

SQLBulkCopy Timeout

I'm copying millions of records between identical tables in different SQL Server 2005 databases. It's a simple VB.NET 2005 archive program where the user enters an archive-before date. I'm using SqlBulkCopy, but am getting timeout errors. From searching for answers, I've played with the BulkCopyTimeout (30-600) and BatchSize. But what I've determined in my testing is that no matter how many records I'm copying in the run, it times out on the last batch of records. It doesn't seem to know what to do with the last partial batch. Am I missing a setting?

- BatchSize = 100; 296 records to copy: it copies 200, but not the last 96
- BatchSize = 1000; 415,713 records to copy: it copies 415,000, but not the last 713

    Try
        Using sourceConnection As SqlConnection = New SqlConnection(ProdDB)
            sourceConnection.Open()

            ' Perform an initial count on the source table.
            Dim sqlCount As String = "SELECT COUNT(*) FROM tblTranDetail WHERE PostDate < '" & Me.DateTimePicker1.Value & "';"
            Dim commandRowCount As New SqlCommand(sqlCount, sourceConnection)
            Dim countRecs As Long = System.Convert.ToInt32(commandRowCount.ExecuteScalar())
            MessageBox.Show("Records to purge: " & countRecs)

            If countRecs > 0 Then
                Try
                    ' Get data from the source table as a SqlDataReader.
                    Dim sqlGetData As String = "SELECT * FROM tblTranDetail WHERE PostDate < '" & Me.DateTimePicker1.Value & "';"
                    Dim commandSourceData As SqlCommand = New SqlCommand(sqlGetData, sourceConnection)
                    Dim reader As SqlDataReader = commandSourceData.ExecuteReader()

                    ' Bulk copy the reader's rows into the archive database.
                    Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(ArchiveDB)
                        bulkCopy.DestinationTableName = "tblTranDetail"
                        bulkCopy.BulkCopyTimeout = My.Settings.BulkCopyTO
                        bulkCopy.BatchSize = My.Settings.BulkCopyBatch
                        bulkCopy.WriteToServer(reader)
                    End Using
                Catch ex As Exception
                    MessageBox.Show("Error copying data: " & ex.Message & ex.ToString)
                End Try
            End If
        End Using
    Catch ex As Exception
        MessageBox.Show("Error selecting prod data to archive: " & ex.Message & ex.ToString)
    End Try

Bulk Insert and format file not working

I hope someone can help with my dilemma. It's probably something super simple that I'm missing, but I need to get to the bottom of it so I can do future bulk loads.

Problem: errors when using BULK INSERT with a format file (created by bcp). Format file sample:

    9.0
    29
    1   SQLINT    0  4    ","     1   PERSONID  ""
    2   SQLINT    0  4    ","     2   SecondID  ""
    3   SQLNCHAR  2  10   ","     3   Field1
    4   SQLNCHAR  2  10   ","     4   Field2
    ...
    29  SQLNCHAR  2  200  "\r\n"  29  Field29

**Example data:**

    1111111,4,NO,NO,NO,YES,NO,1111111,111111,111111,Hourly - Active,#222222222222222222,,,,,,,,,,,,,Lots of text here,#222222222222222222,08/31/2011,,

The error says the column in row one is being truncated due to a conversion failure. When I use the SSMS import wizard, it sees the first two columns as float. If I'm using a format file to force SQL to use a certain data type, why is it still trying to convert these columns to something else? I hope there is enough here to explain the problem. Any help would be greatly appreciated, as I do not want to keep importing these files into our production database by hand. I would like to bulk insert into a temp table, do my updating as necessary, then remove the temp table. Thanks so much! K

**Additional Information (16SEP2011)**

Create table script:

    CREATE TABLE #TempTable
    (
        PERSONID int, SecondID int,
        Field1 NVARCHAR(5), Field2 NVARCHAR(5), Field3 NVARCHAR(5), Field4 NVARCHAR(5),
        Field5 NVARCHAR(5), Field6 NVARCHAR(10), Field7 NVARCHAR(10), Field8 NVARCHAR(10),
        Field9 NVARCHAR(50), Field10 NVARCHAR(100), Field11 NVARCHAR(50), Field12 NVARCHAR(100),
        Field13 NVARCHAR(50), Field14 NVARCHAR(100), Field15 NVARCHAR(50), Field16 NVARCHAR(100),
        Field17 NVARCHAR(50), Field18 NVARCHAR(5), Field19 NVARCHAR(5), Field20 NVARCHAR(5),
        Field21 NVARCHAR(50), Field22 NVARCHAR(50), Field23 NVARCHAR(100), Field24 NVARCHAR(100),
        Field25 NVARCHAR(100), Field26 NVARCHAR(100), Field27 NVARCHAR(100)
    )

    -- Grab the data into the temporary table
    BULK INSERT #TempTable
    FROM 'D:\sqlupdates\SampleData.csv'
    WITH (FORMATFILE = 'D:\sqlupdates\SampleData-n.fmt')  -- specify filename to use here

    SELECT * FROM #TempTable

Here is the bcp syntax used to create the format file:

    bcp AdventureWorks2008R2.HumanResources.Department format nul -T -n -f Department-n.fmt

I used native (-n) because the character version wasn't working; plus I have to do an inner join on the PERSONID, which is an int, so I was trying to save myself from doing a cast after the fact.

Format file:

    9.0
    29
    1   SQLINT    0  4    ","     1   PERSONID  ""
    2   SQLINT    0  4    ","     2   SecondId  ""
    3   SQLNCHAR  2  10   ","     3   Field1    SQL_Latin1_General_CP1_CI_AS
    4   SQLNCHAR  2  10   ","     4   Field2    SQL_Latin1_General_CP1_CI_AS
    5   SQLNCHAR  2  10   ","     5   Field3    SQL_Latin1_General_CP1_CI_AS
    6   SQLNCHAR  2  10   ","     6   Field4    SQL_Latin1_General_CP1_CI_AS
    7   SQLNCHAR  2  10   ","     7   Field5    SQL_Latin1_General_CP1_CI_AS
    8   SQLNCHAR  2  20   ","     8   Field6    SQL_Latin1_General_CP1_CI_AS
    9   SQLNCHAR  2  20   ","     9   Field7    SQL_Latin1_General_CP1_CI_AS
    10  SQLNCHAR  2  20   ","     10  Field8    SQL_Latin1_General_CP1_CI_AS
    11  SQLNCHAR  2  100  ","     11  Field9    SQL_Latin1_General_CP1_CI_AS
    12  SQLNCHAR  2  200  ","     12  Field10   SQL_Latin1_General_CP1_CI_AS
    13  SQLNCHAR  2  100  ","     13  Field11   SQL_Latin1_General_CP1_CI_AS
    14  SQLNCHAR  2  200  ","     14  Field12   SQL_Latin1_General_CP1_CI_AS
    15  SQLNCHAR  2  100  ","     15  Field13   SQL_Latin1_General_CP1_CI_AS
    16  SQLNCHAR  2  200  ","     16  Field14   SQL_Latin1_General_CP1_CI_AS
    17  SQLNCHAR  2  100  ","     17  Field15   SQL_Latin1_General_CP1_CI_AS
    18  SQLNCHAR  2  200  ","     18  Field16   SQL_Latin1_General_CP1_CI_AS
    19  SQLNCHAR  2  100  ","     19  Field17   SQL_Latin1_General_CP1_CI_AS
    20  SQLNCHAR  2  10   ","     20  Field18   SQL_Latin1_General_CP1_CI_AS
    21  SQLNCHAR  2  10   ","     21  Field19   SQL_Latin1_General_CP1_CI_AS
    22  SQLNCHAR  2  10   ","     22  Field20   SQL_Latin1_General_CP1_CI_AS
    23  SQLNCHAR  2  100  ","     23  Field21   SQL_Latin1_General_CP1_CI_AS
    24  SQLNCHAR  2  100  ","     24  Field22   SQL_Latin1_General_CP1_CI_AS
    25  SQLNCHAR  2  200  ","     25  Field23   SQL_Latin1_General_CP1_CI_AS
    26  SQLNCHAR  2  200  ","     26  Field24   SQL_Latin1_General_CP1_CI_AS
    27  SQLNCHAR  2  200  ","     27  Field25   SQL_Latin1_General_CP1_CI_AS
    28  SQLNCHAR  2  200  ","     28  Field26   SQL_Latin1_General_CP1_CI_AS
    29  SQLNCHAR  2  200  "\r\n"  29  Field27   SQL_Latin1_General_CP1_CI_AS

Sample data:

    1111111,4,NO,NO,NO,YES,NO,0011001,011110,1234,Hourly - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,
    2222222,47,NO,NO,YES,NO,NO,0022002,022220,5678,Salaried - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,
    3333333,15,NO,NO,YES,NO,NO,0033003,033330,9101,Salaried - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,

Error message:

    Msg 4863, Level 16, State 4, Line 1
    Bulk load data conversion error (truncation) for row 1, column 1 (PERSONID).
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

OK, I hope this gives enough info to resolve this. Thanks!! K
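
A hedged aside that may be relevant here: bcp's -n switch produces a *native* format file, where SQLINT means four raw binary bytes and SQLNCHAR carries a two-byte length prefix, while a .csv is plain character data, which a character format file (bcp -c, SQLCHAR fields, prefix length 0) describes instead. If that is the mismatch, the file would be regenerated and adjusted roughly like this (widths are assumptions, and the remaining rows follow the same pattern):

    bcp AdventureWorks2008R2.HumanResources.Department format nul -T -c -f Department-c.fmt

    9.0
    29
    1  SQLCHAR  0  12  ","  1  PERSONID  ""
    2  SQLCHAR  0  12  ","  2  SecondId  ""
    3  SQLCHAR  0  20  ","  3  Field1    SQL_Latin1_General_CP1_CI_AS
    ...

BULK INSERT will still convert the character PERSONID field into the table's int column implicitly, so the inner join on an int key would not need a cast.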

It's interesting, try this: problem in bulk insert using a stored procedure, needing to pass a single quote

I have written a procedure to do a bulk insert. The statement I want to generate is:

    BULK INSERT test
    FROM 'D:\bcp\check.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    )
    GO

The procedure is:

    CREATE PROCEDURE BULKINSERTTXT
    (
        @TABLENAME VARCHAR(50),
        @PATHNAME VARCHAR(50)
    )
    AS
    BEGIN
        DECLARE @SSQL NVARCHAR(50);
        SET @SSQL = SELECT ' BULK INSERT ' + '@TABLENAME' + ' FROM '' ' + '@PATHNAME' + ' '' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' ) GO'
        EXEC sp_ExeCuteSQL @SSQL
    END

The problem with this is that the query becomes:

    BULK INSERT @TABLENAME FROM ' @PATHNAME ' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) GO

so @PATHNAME is not treated as a parameter. I also tried without the single quotes in the dynamic query, as:

    BULK INSERT @TABLENAME FROM @PATHNAME WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) GO

and executed the procedure as:

    EXEC BULKINSERTTXT 'tablename', ''D:\bcp\check.txt''

but this also gives an error. Kindly provide a solution for this.
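
For comparison, a hedged sketch of how this pattern is usually written: the *values* of the variables are concatenated into the string rather than quoted as literals, `GO` is omitted (it is a batch separator for client tools, not T-SQL), and the quotes around the path are doubled inside the string:

    CREATE PROCEDURE BULKINSERTTXT
    (
        @TABLENAME SYSNAME,
        @PATHNAME  VARCHAR(260)
    )
    AS
    BEGIN
        -- Sized large enough to hold the whole generated command.
        DECLARE @SSQL NVARCHAR(1000);
        -- Concatenate the variable values into the command, doubling the
        -- single quotes that must surround the file path.
        SET @SSQL = N'BULK INSERT ' + QUOTENAME(@TABLENAME) +
                    N' FROM ''' + REPLACE(@PATHNAME, '''', '''''') + N'''' +
                    N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
        EXEC sp_executesql @SSQL;
    END

It would then be called with ordinary single quotes: `EXEC BULKINSERTTXT 'test', 'D:\bcp\check.txt';`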

minimally logged

How does the SQL engine work during a minimally logged operation? In other words, how are transactions logged under the bulk-logged recovery model?
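
Roughly, and hedged: under the bulk-logged model, qualifying bulk operations log page and extent allocations rather than every inserted row, and the bulk-changed extents are then captured in the next log backup. A sketch of the usual pattern, with hypothetical names:

    -- Switch to bulk-logged around a large load (hypothetical database/table names).
    ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

    -- With TABLOCK on an eligible target, this load can be minimally logged:
    -- only allocations go to the log, not each inserted row.
    BULK INSERT dbo.BigStaging
    FROM 'C:\imports\big.txt'
    WITH (TABLOCK);

    ALTER DATABASE MyDb SET RECOVERY FULL;
    -- Take a log backup afterwards: it contains the bulk-changed extents, and
    -- point-in-time restore is not possible within that log interval.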

Bulk load format file: need to use '¡' as field delimiter.

Hi all. I got a project to bulk load from a flat file (sent by another source). They are sending a format file too, along with the data file. The field delimiter they are using is '¡' (without quotes). I need to use this delimiter from the format file for the bulk load. I tried both a .fmt file and an XML format file with this delimiter for BULK INSERT, but it gives me an error on reading the delimiter. Please help.
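
One hedged thought: '¡' is outside 7-bit ASCII, so the format file's own encoding and the load's code page both have to agree with how the data file was written. A sketch of an XML format file carrying the delimiter (field names, sizes, and the code page are assumptions):

    <?xml version="1.0"?>
    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <RECORD>
        <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="¡" MAX_LENGTH="50"/>
        <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="50"/>
      </RECORD>
      <ROW>
        <COLUMN SOURCE="1" NAME="Col1" xsi:type="SQLVARYCHAR"/>
        <COLUMN SOURCE="2" NAME="Col2" xsi:type="SQLVARYCHAR"/>
      </ROW>
    </BCPFORMAT>

and the matching load, pinning the code page explicitly:

    BULK INSERT dbo.Staging
    FROM 'C:\imports\data.txt'
    WITH (FORMATFILE = 'C:\imports\data.xml', CODEPAGE = '1252');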

Bulk insert comma delimited file

I have a file which is comma-delimited. There are text fields and numeric fields. There are double quotes within the text fields, and within the text fields there are also commas. When I try to load it (BULK INSERT) I get a field-mismatch error. Is there a way I can avoid this error? I can load either by SSIS package or by BULK INSERT. Thanks in advance for the help.

Bulk insert text qualifier data

I have a file which is comma-delimited. The text qualifier is a double quote (""). There are commas within the text qualifier. How can I bulk insert this file into a SQL Server table? I tried using a format file, but it keeps giving me the error below:

> The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.

I changed the size but still got the same message. Here's my format file:

    10.0
    2
    1 SQLCHAR 0 100 ","     1  CDate  ""
    2 SQLCHAR 0 50  "\r\n"  10 Ord_ID ""

And below is a data file example:

    "7/17/2013 10:36:40 AM","32309224"

Is there any way I can bulk insert the file without a format file? Thanks.
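
For what it's worth, `BULK INSERT` before SQL Server 2017 has no text-qualifier concept, so the usual workaround is to fold the quotes into the terminators. A hedged sketch against the two-column sample above: the first format-file field consumes each row's opening quote and is discarded via server column 0, the second is terminated by the `","` sequence, and the third by the closing quote plus newline.

    10.0
    3
    1 SQLCHAR 0 0   "\""      0 FIRST_QUOTE ""
    2 SQLCHAR 0 100 "\",\""   1 CDate       ""
    3 SQLCHAR 0 50  "\"\r\n"  2 Ord_ID      ""

On SQL Server 2017 and later, the built-in CSV mode handles the qualifier directly, without a format file (table and path are placeholders):

    BULK INSERT dbo.Orders
    FROM 'C:\imports\orders.csv'
    WITH (FORMAT = 'CSV', FIELDQUOTE = '"');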

Insert only new data from Excel into SQL 2008 R2 (bulk data)

Respected geeks! Good day to you, and thanks in advance. It's urgent, and I am new to my company, so please help me complete my task. I have an Excel file containing crores of rows (tens of millions), and I have a table that mirrors the structure of that Excel file. I will load records daily from the Excel file into that table; if a record matches an existing row I have to update it, and if it does not match I have to insert it. What would be the best practice for this? Waiting for your valuable advice. Thanks a ton.
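
One hedged pattern for this: bulk load the daily Excel extract into a staging table first, then upsert it with `MERGE` (available from SQL Server 2008) on the business key. All names below are hypothetical:

    -- Upsert the staged Excel rows into the target table.
    MERGE dbo.TargetTable AS t
    USING dbo.ExcelStaging AS s
        ON t.BusinessKey = s.BusinessKey          -- assumed matching key
    WHEN MATCHED THEN
        UPDATE SET t.Col1 = s.Col1,
                   t.Col2 = s.Col2
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (BusinessKey, Col1, Col2)
        VALUES (s.BusinessKey, s.Col1, s.Col2);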

deadlock when bulk inserting in two different tables

Hello, I am experiencing deadlocks when performing bulk inserts on two different tables with the .NET SqlBulkCopy class. The tool using SqlBulkCopy converts data from a Sybase database to an empty SQL Server database. For maximum performance, all indexes are disabled in my SQL Server database (except for the clustered indexes), as are the foreign keys and the check constraints. I have used this tool successfully on environments running SQL Server 2008 R2, but this time I am using it on a SQL Server 2008 environment (build 10.0.2531). I have multiple deadlock graphs that I could show you (see my post below). In the graph, I see that the object FK_TRA_REG_UNKNOWN_A_TRA_BCI_ is locked. This is a foreign key between the tables TRA_REG and TRA_BCI_ID_INFO, which are both present in the deadlock graph. This foreign key is disabled, so how could it be locked? Kind greetings.

I am having problems with Bulk Insert.

I get this error when the step runs as a job:

> You do not have permission to use the bulk load statement. [SQLSTATE 42000] (Error 4834). The step failed.

This is my syntax:

    BULK INSERT Prepaids.dbo.StdPPDGL_Import
    FROM '\\Main01\Filetransfers\Accounting\glosppd.txt'
    WITH
    (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = '|'
    );
    GO

This works just fine as a query but will not run as a job. I am running it as a domain account that has permissions assigned to the folder. I have added the GRANT INSERT permission, and I have tried changing to the OPENROWSET syntax to no avail. Thanks for any suggestions.
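
For reference (hedged, since the job's execution context isn't shown): Error 4834 concerns the server-level bulk-load permission rather than folder or table INSERT rights, and it is typically granted like this, to the login the job step actually runs under:

    -- Server-level permission required for BULK INSERT / OPENROWSET(BULK ...).
    USE master;
    GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\ServiceAccount];  -- hypothetical login
    -- Equivalently, membership in the bulkadmin fixed server role:
    -- EXEC sp_addsrvrolemember 'DOMAIN\ServiceAccount', 'bulkadmin';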

Stored procedure for importing data from flat files. Need some tips.

I am currently working on an SP to automate the process of importing flat-file data into SQL Server. I'd appreciate some advice on the best way to realize it, taking into account the things I need (a skeleton of the dynamic part follows this list):

1. File path and name, delimiter character, and table name as variables.
2. Logging the import process to one file per import task (errors and output results, like bcp produces).
3. Processing flat files with different column orders.
4. Checking whether the data we are trying to import already exists in the database (no duplicates).
5. Batch import.
6. Error catching.

Okay, now my thoughts. I would use either bcp or BULK INSERT, declaring the delimiter, source file, and table as variables. To identify the column order, I think we should parse the first row (the column names) and use an intermediate table, from which we then INSERT...SELECT into the destination table in the right order. Check whether the data already EXISTS before importing. I still don't know about logging to one file with a timeline. Any other errors that may occur? Any advice would be appreciated. Thank you in advance.
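
A minimal, hedged skeleton of the dynamic part, with hypothetical names, covering points 1, 4, and 6 above; logging and column-order detection would bolt on around it:

    CREATE PROCEDURE dbo.usp_ImportFlatFile      -- hypothetical name
        @FilePath   NVARCHAR(260),
        @Delimiter  NCHAR(1),
        @TableName  SYSNAME
    AS
    BEGIN
        BEGIN TRY
            -- 1. Variables drive one dynamic BULK INSERT into an intermediate table.
            DECLARE @sql NVARCHAR(MAX) =
                N'BULK INSERT dbo.ImportStaging'
              + N' FROM ''' + REPLACE(@FilePath, '''', '''''') + N''''
              + N' WITH (FIELDTERMINATOR = ''' + @Delimiter
              + N''', ROWTERMINATOR = ''\n'', BATCHSIZE = 10000)';
            EXEC sp_executesql @sql;

            -- 4. Move only rows not already present (key/columns are placeholders).
            DECLARE @move NVARCHAR(MAX) =
                N'INSERT INTO ' + QUOTENAME(@TableName) + N' (KeyCol, Col1)'
              + N' SELECT s.KeyCol, s.Col1 FROM dbo.ImportStaging AS s'
              + N' WHERE NOT EXISTS (SELECT 1 FROM ' + QUOTENAME(@TableName)
              + N' AS t WHERE t.KeyCol = s.KeyCol)';
            EXEC sp_executesql @move;
        END TRY
        BEGIN CATCH
            -- 6. Minimal error capture; a real version would write to a log table/file.
            THROW;   -- SQL Server 2012+; use RAISERROR on older versions
        END CATCH
    END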