Channel: Questions in topic: "bulk-insert"

Bulk insert Error in SSIS 2005

Hi, I am using SQL Server 2005. When I run a Bulk Insert task in SSIS I get the error below:

```
[Bulk Insert Task] Error: An error occurred with the following error message:
"Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The OLE DB provider "BULK" for linked server "(null)" reported an error.
The provider did not give any information about the error.
Bulk load: An unexpected end of file was encountered in the data file.".
```

Destination table:

```sql
create table Cricketerinfo (cid int, cname varchar(30), cposition varchar(30), cage int)
```

Text file (SAMPLE.TXT):

```
1,sachin,opener,35
2,dravid,middle,32
3,ponting,middle,3
4,azhar,lower,36
```

I have attached the SSIS process; see the attachment for reference: ![alt text][1] Any suggestions?

[1]: /storage/temp/1546-bulk+insert+error+in+ssis.jpg
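
A common cause of the "unexpected end of file" message is a mismatch between the terminators the Bulk Insert task expects and what the file actually contains (for example, a final line with no terminator at all). As a point of comparison, here is a minimal plain T-SQL load for the same table and file; the path is an assumption, and the terminators should be adjusted to match the real file:

```sql
-- Minimal sketch: load SAMPLE.TXT into Cricketerinfo.
-- Path is a placeholder; terminators must match the actual file.
BULK INSERT dbo.Cricketerinfo
FROM 'C:\data\SAMPLE.TXT'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'   -- try '\r\n' if the file uses Windows line endings
);
```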

Moving data from one column into multiple columns

We have two tables in different databases on the same server, and we have to import data from one to the other on a weekly basis. The source table packs many pieces of information into its **Values** column, comma separated. They need to be split out and put into different columns, as shown below.

Source table:

| Server | GroupID | MessageID | Values |
|--------|---------|-----------|--------|
| ABC123 | XYZ | 0123 | (UserID= 01, ManagerID= 03, Date= 05/23/11, Database= ABC, Role= XYZ, Object= Tables) |

Destination table:

| Server | UserID | ManagerID | Database | Role | Object |
|--------|--------|-----------|----------|------|--------|
| ABC123 | 0100 | 0300 | ABCDEF | XYZ | Tables |

What is the best way to do this? Thank you.
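
One common approach is to extract each `Name= value` pair with string functions. A minimal sketch follows; the function, database, and table names are placeholders (not from the original post), and it assumes every row packs its pairs in the same `Name= value, Name= value` form:

```sql
-- Sketch: pull one 'Name= value' pair out of the packed column.
-- dbo.ExtractValue, SourceDB, and DestDB are hypothetical names.
CREATE FUNCTION dbo.ExtractValue (@packed VARCHAR(4000), @name VARCHAR(100))
RETURNS VARCHAR(4000)
AS
BEGIN
    DECLARE @start INT, @end INT;
    SET @start = CHARINDEX(@name + '=', @packed);
    IF @start = 0 RETURN NULL;
    SET @start = @start + LEN(@name) + 1;              -- skip past 'Name='
    SET @end = CHARINDEX(',', @packed + ',', @start);  -- next comma (or end of string)
    RETURN LTRIM(RTRIM(SUBSTRING(@packed, @start, @end - @start)));
END;
GO

-- Weekly load: strip the parentheses, then split each pair into its own column.
INSERT INTO DestDB.dbo.Destination ([Server], UserID, ManagerID, [Database], [Role], [Object])
SELECT s.[Server],
       dbo.ExtractValue(v.clean, 'UserID'),
       dbo.ExtractValue(v.clean, 'ManagerID'),
       dbo.ExtractValue(v.clean, 'Database'),
       dbo.ExtractValue(v.clean, 'Role'),
       dbo.ExtractValue(v.clean, 'Object')
FROM SourceDB.dbo.Source AS s
CROSS APPLY (SELECT REPLACE(REPLACE(s.[Values], '(', ''), ')', '') AS clean) AS v;
```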

SQLBulkCopy Timeout

I'm copying millions of records between identical tables in different SQL Server 2005 databases. It's a simple VB.NET 2005 archive program where the user enters an archive-before date. I'm using SqlBulkCopy, but am getting timeout errors. From searching for answers I've played with BulkCopyTimeout (30-600) and BatchSize, but what I've determined in my testing is that no matter how many records I'm copying in the run, it times out on the last batch of records. It doesn't seem to know what to do with the last partial batch. Am I missing a setting?

- BatchSize = 100; 296 records to copy: it copies 200, but not the last 96
- BatchSize = 1000; 415,713 records to copy: it copies 415,000, but not the last 713

```vb
Try
    Using sourceConnection As SqlConnection = New SqlConnection(ProdDB)
        sourceConnection.Open()

        ' Perform an initial count on the destination table.
        ' (Note: this builds the SQL by concatenating the picker value into the string.)
        Dim sqlCount As String = "SELECT COUNT(*) FROM tblTranDetail WHERE PostDate < '" & Me.DateTimePicker1.Value & "';"
        Dim commandRowCount As New SqlCommand(sqlCount, sourceConnection)
        Dim countRecs As Long = System.Convert.ToInt32(commandRowCount.ExecuteScalar())
        MessageBox.Show("Records to purge: " & countRecs)

        If countRecs > 0 Then
            Try
                ' Get data from the source table as a SqlDataReader.
                Dim sqlGetData As String = "SELECT * FROM tblTranDetail WHERE PostDate < '" & Me.DateTimePicker1.Value & "';"
                Dim commandSourceData As SqlCommand = New SqlCommand(sqlGetData, sourceConnection)
                Dim reader As SqlDataReader = commandSourceData.ExecuteReader

                Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(ArchiveDB)
                    bulkCopy.DestinationTableName = "tblTranDetail"
                    bulkCopy.BulkCopyTimeout = My.Settings.BulkCopyTO
                    bulkCopy.BatchSize = My.Settings.BulkCopyBatch
                    bulkCopy.WriteToServer(reader)
                End Using
            Catch ex As Exception
                MessageBox.Show("Error copying data" & ex.Message & ex.ToString)
            End Try
        End If
    End Using
Catch ex As Exception
    MessageBox.Show("error selecting prod data to archive" & ex.Message & ex.ToString)
End Try
```

Bulk Insert and format file not working

I hope someone can help with my dilemma. It's probably something super simple that I'm missing, but I need to get to the bottom of it so I can do future bulk loads.

Problem: using BULK INSERT with a format file (created by bcp).

Format file sample:

```
9.0
29
1  SQLINT   0 4   ","    1  PERSONID ""
2  SQLINT   0 4   ","    2  SecondID ""
3  SQLNCHAR 2 10  ","    3  Field1
4  SQLNCHAR 2 10  ","    4  Field2
...
29 SQLNCHAR 2 200 "\r\n" 29 Field29
```

**Example data:**

```
1111111,4,NO,NO,NO,YES,NO,1111111,111111,111111,Hourly - Active,#222222222222222222,,,,,,,,,,,,,Lots of text here,#222222222222222222,08/31/2011,,
```

The error says column one of row one is being truncated due to a conversion failure. When I use the SSMS import wizard it sees the first two columns as float. If I'm using a format file to force SQL to use a certain data type, why is it still trying to convert these columns to something else? I hope there is enough here to explain the problem. Any help would be greatly appreciated, as I do not want to keep importing these files into our production database by hand. I would like to bulk insert into a temp table, do my updating as necessary, then remove the temp table. Thanks so much! K

**Additional information (16SEP2011)**

Create table script:

```sql
CREATE TABLE #TempTable (
    PERSONID int, SecondID int,
    Field1  NVARCHAR(5),  Field2  NVARCHAR(5),   Field3  NVARCHAR(5),  Field4  NVARCHAR(5),
    Field5  NVARCHAR(5),  Field6  NVARCHAR(10),  Field7  NVARCHAR(10), Field8  NVARCHAR(10),
    Field9  NVARCHAR(50), Field10 NVARCHAR(100), Field11 NVARCHAR(50), Field12 NVARCHAR(100),
    Field13 NVARCHAR(50), Field14 NVARCHAR(100), Field15 NVARCHAR(50), Field16 NVARCHAR(100),
    Field17 NVARCHAR(50), Field18 NVARCHAR(5),   Field19 NVARCHAR(5),  Field20 NVARCHAR(5),
    Field21 NVARCHAR(50), Field22 NVARCHAR(50),  Field23 NVARCHAR(100), Field24 NVARCHAR(100),
    Field25 NVARCHAR(100), Field26 NVARCHAR(100), Field27 NVARCHAR(100)
)

-- grab the data into the temporary table
BULK INSERT #TempTable
FROM 'D:\sqlupdates\SampleData.csv'
WITH (FORMATFILE = 'D:\sqlupdates\SampleData-n.fmt')  -- specify filename to use here

SELECT * FROM #TempTable
```

Here is the bcp syntax used to create the format file:

```
bcp AdventureWorks2008R2.HumanResources.Department format nul -T -n -f Department-n.fmt
```

I used native (`-n`) because the character version wasn't working, plus I have to do an inner join on PERSONID, which is an int, so I was trying to save myself a cast after the fact.

Format file:

```
9.0
29
1  SQLINT   0 4   ","    1  PERSONID ""
2  SQLINT   0 4   ","    2  SecondId ""
3  SQLNCHAR 2 10  ","    3  Field1   SQL_Latin1_General_CP1_CI_AS
4  SQLNCHAR 2 10  ","    4  Field2   SQL_Latin1_General_CP1_CI_AS
5  SQLNCHAR 2 10  ","    5  Field3   SQL_Latin1_General_CP1_CI_AS
6  SQLNCHAR 2 10  ","    6  Field4   SQL_Latin1_General_CP1_CI_AS
7  SQLNCHAR 2 10  ","    7  Field5   SQL_Latin1_General_CP1_CI_AS
8  SQLNCHAR 2 20  ","    8  Field6   SQL_Latin1_General_CP1_CI_AS
9  SQLNCHAR 2 20  ","    9  Field7   SQL_Latin1_General_CP1_CI_AS
10 SQLNCHAR 2 20  ","    10 Field8   SQL_Latin1_General_CP1_CI_AS
11 SQLNCHAR 2 100 ","    11 Field9   SQL_Latin1_General_CP1_CI_AS
12 SQLNCHAR 2 200 ","    12 Field10  SQL_Latin1_General_CP1_CI_AS
13 SQLNCHAR 2 100 ","    13 Field11  SQL_Latin1_General_CP1_CI_AS
14 SQLNCHAR 2 200 ","    14 Field12  SQL_Latin1_General_CP1_CI_AS
15 SQLNCHAR 2 100 ","    15 Field13  SQL_Latin1_General_CP1_CI_AS
16 SQLNCHAR 2 200 ","    16 Field14  SQL_Latin1_General_CP1_CI_AS
17 SQLNCHAR 2 100 ","    17 Field15  SQL_Latin1_General_CP1_CI_AS
18 SQLNCHAR 2 200 ","    18 Field16  SQL_Latin1_General_CP1_CI_AS
19 SQLNCHAR 2 100 ","    19 Field17  SQL_Latin1_General_CP1_CI_AS
20 SQLNCHAR 2 10  ","    20 Field18  SQL_Latin1_General_CP1_CI_AS
21 SQLNCHAR 2 10  ","    21 Field19  SQL_Latin1_General_CP1_CI_AS
22 SQLNCHAR 2 10  ","    22 Field20  SQL_Latin1_General_CP1_CI_AS
23 SQLNCHAR 2 100 ","    23 Field21  SQL_Latin1_General_CP1_CI_AS
24 SQLNCHAR 2 100 ","    24 Field22  SQL_Latin1_General_CP1_CI_AS
25 SQLNCHAR 2 200 ","    25 Field23  SQL_Latin1_General_CP1_CI_AS
26 SQLNCHAR 2 200 ","    26 Field24  SQL_Latin1_General_CP1_CI_AS
27 SQLNCHAR 2 200 ","    27 Field25  SQL_Latin1_General_CP1_CI_AS
28 SQLNCHAR 2 200 ","    28 Field26  SQL_Latin1_General_CP1_CI_AS
29 SQLNCHAR 2 200 "\r\n" 29 Field27  SQL_Latin1_General_CP1_CI_AS
```

Sample data:

```
1111111,4,NO,NO,NO,YES,NO,0011001,011110,1234,Hourly - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,
2222222,47,NO,NO,YES,NO,NO,0022002,022220,5678,Salaried - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,
3333333,15,NO,NO,YES,NO,NO,0033003,033330,9101,Salaried - Active,#9101111111111111111111,,,,,,,,,,,,,There is a long string of text here. xxxxxx,#9101111111111111111111,08/31/2011,,
```

Error message:

```
Msg 4863, Level 16, State 4, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (PERSONID).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
```

OK, I hope this gives enough info to resolve this. Thanks!! K
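
For what it's worth, in a bcp format file the `SQLINT`/`SQLNCHAR` types with a prefix length describe *native* (binary) bcp output, while this data file is plain comma-separated text. A text file is normally described with `SQLCHAR`, prefix length 0, and the terminator doing the work; the server still converts the text into the table's int/nvarchar types. A sketch of what the character-mode file could look like (the max-length values here are guesses, and fields 4-28 are elided):

```
9.0
29
1  SQLCHAR 0 12  ","    1  PERSONID ""
2  SQLCHAR 0 12  ","    2  SecondId ""
3  SQLCHAR 0 10  ","    3  Field1   SQL_Latin1_General_CP1_CI_AS
   ...                     (fields 4-28 follow the same SQLCHAR pattern)
29 SQLCHAR 0 200 "\r\n" 29 Field27  SQL_Latin1_General_CP1_CI_AS
```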

Problem in bulk insert using a stored procedure: need to pass a single quote

I have written a procedure to bulk insert. The standalone statement works:

```sql
BULK INSERT test
FROM 'D:\bcp\check.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
```

The procedure is:

```sql
CREATE PROCEDURE BULKINSERTTXT
(
    @TABLENAME VARCHAR(50),
    @PATHNAME  VARCHAR(50)
)
AS
BEGIN
    DECLARE @SSQL NVARCHAR(50);
    SET @SSQL = ' BULK INSERT ' + '@TABLENAME' + ' FROM '' ' + '@PATHNAME' + ' '' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' ) GO'
    EXEC sp_executesql @SSQL
END
```

The problem with this is that the query it builds comes out as:

```sql
BULK INSERT @TABLENAME FROM ' @PATHNAME ' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) GO
```

so @PATHNAME is not treated as a parameter. I also tried the dynamic query without the single quotes, as:

```sql
BULK INSERT @TABLENAME FROM @PATHNAME WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) GO
```

and executed the procedure as:

```sql
EXEC BULKINSERTTXT 'tablename', 'D:\bcp\check.txt'
```

but this also errors. Kindly provide a solution for this.
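
The core issue is that `'@TABLENAME'` and `'@PATHNAME'` inside quotes are literal text, not the parameter values: the values have to be concatenated into the string (and `NVARCHAR(50)` is too short to hold the statement). A minimal corrected sketch, assuming the caller is trusted; `QUOTENAME` guards the table name since BULK INSERT cannot take it as a real parameter:

```sql
CREATE PROCEDURE BULKINSERTTXT
(
    @TABLENAME SYSNAME,
    @PATHNAME  NVARCHAR(260)
)
AS
BEGIN
    DECLARE @SSQL NVARCHAR(MAX);

    -- Concatenate the parameter VALUES into the statement; doubling the
    -- quotes embeds the path as a string literal inside the dynamic SQL.
    SET @SSQL = N'BULK INSERT ' + QUOTENAME(@TABLENAME)
              + N' FROM ''' + REPLACE(@PATHNAME, '''', '''''') + N''''
              + N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';

    EXEC sp_executesql @SSQL;  -- no GO inside dynamic SQL: GO is a batch separator for client tools
END
GO

-- Usage:
EXEC BULKINSERTTXT 'test', 'D:\bcp\check.txt';
```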

Minimally logged operations

How does the SQL engine work during a minimally logged operation? In other words, can someone explain how transactions are logged under the bulk-logged recovery model?
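
For context: under the bulk-logged (or simple) recovery model, bulk operations such as BULK INSERT, SELECT INTO, and index rebuilds can be minimally logged. Instead of writing every row to the transaction log, the engine logs which extents it allocated and filled, which keeps the log small during the load (a log backup taken afterwards still captures the changed extents, so it can be large). A sketch of the usual preconditions, with placeholder names:

```sql
-- Sketch: conditions commonly required for a minimally logged bulk load.
-- Database, table, file, and backup paths here are placeholders.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;   -- or SIMPLE

BULK INSERT dbo.TargetTable
FROM 'C:\data\load.txt'
WITH (
    TABLOCK,                 -- a table lock is one of the requirements for minimal logging
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);

-- Switch back and take a log backup so the log chain stays usable.
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP LOG MyDb TO DISK = 'C:\backup\MyDb_log.trn';
```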

Bulk load format file: need to use '¡' as field delimiter

Hi all . I got a project to bulk load from a flat file (sent by another source). They are sending format file too with the data file . The field delimiter they are using is '¡'(without quotes). I need to use this delimiter from format file for bulk load. I tried both .fmt file and xml with this delimiter for bulk insert ,but its giving me error on reading this delimiter. please help
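
One thing that sometimes helps with extended-ASCII delimiters is telling BULK INSERT which code page the file uses, so the byte for '¡' is interpreted correctly. A sketch with placeholder names, on the assumption that the file is Windows-1252 encoded (where '¡' is byte 0xA1):

```sql
-- Sketch: bulk load with '¡' as the field delimiter.
-- Table, path, and CODEPAGE are assumptions; match CODEPAGE to the file's real encoding.
BULK INSERT dbo.TargetTable
FROM 'C:\data\feed.txt'
WITH (
    CODEPAGE        = '1252',
    FIELDTERMINATOR = '¡',
    ROWTERMINATOR   = '\n'
);
```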

Bulk insert comma delimited file

I have a file which is comma delimited. There are text fields and numeric fields; the text fields contain double quotes and also embedded commas. When I try to load it (bulk insert) I get a field mismatch error. Is there a way I can avoid this error? I can load either via an SSIS package or bulk insert. Thanks in advance for the help.
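
If a newer engine is an option: from SQL Server 2017 onwards, BULK INSERT understands quoted CSV natively, which handles commas inside quoted text fields. A sketch with placeholder names:

```sql
-- Sketch (requires SQL Server 2017+): let the engine parse quoted CSV.
-- Table and path are placeholders.
BULK INSERT dbo.TargetTable
FROM 'C:\data\input.csv'
WITH (
    FORMAT     = 'CSV',   -- RFC 4180 style parsing
    FIELDQUOTE = '"',     -- text qualifier; commas inside quotes are kept
    FIRSTROW   = 2        -- assumption: skip a header row
);
```

On older versions, the usual workaround is an SSIS flat file source with the text qualifier set to `"`, since the flat-file parser honors qualifiers where plain BULK INSERT does not.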

Bulk insert text qualifier data

I have a file which is comma delimited. The text qualifier is a double quote ("). There are commas within the qualified text. How can I bulk insert this file into the SQL Server table? I tried using a format file but it keeps giving me the error below:

```
The bulk load failed. The column is too long in the data file for row 1, column 1.
Verify that the field terminator and row terminator are specified correctly.
```

I changed the size but still got the same message. Here's my format file:

```
10.0
2
1 SQLCHAR 0 100 ","    1  CDate  ""
2 SQLCHAR 0 50  "\r\n" 10 Ord_ID ""
```

And below is an example from the data file:

```
"7/17/2013 10:36:40 AM" "32309224"
```

Is there any way I can bulk insert the file without a format file? Thanks.
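
One common format-file trick is to treat the quotes as part of the terminators: a dummy first field (server column 0, so it is skipped) swallows the opening quote, and the terminators between fields include the closing and opening quotes. A sketch, assuming a two-column target table and that the fields in the file really are separated by `" "` (quote, space, quote) as in the sample line; if the real separator is a comma, change the middle terminator to `\",\"`:

```
10.0
3
1 SQLCHAR 0 2   "\""     0 Dummy  ""
2 SQLCHAR 0 100 "\" \""  1 CDate  ""
3 SQLCHAR 0 50  "\"\r\n" 2 Ord_ID ""
```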

Insert only new data from Excel into SQL 2008 R2 (bulk data)

Respected geeks, good day to you, and thanks in advance; this is urgent, and as I am new to my company, please help me complete my task. I have an Excel file with crores (tens of millions) of rows, and a table that holds a copy of that Excel data. I will load new records from the Excel file into that table daily; if a record matches an existing row I have to update it, and if it does not match I have to insert it. What would be the best practice for this? Waiting for your valuable advice; thanks a ton.
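
The usual shape for this on SQL Server 2008 R2 is to bulk load the spreadsheet into a staging table first, then MERGE it into the target on the business key. A sketch in which the table names, the key column, and the data columns are all assumptions:

```sql
-- Sketch: upsert staged Excel rows into the main table.
-- MainTable, StagingTable, RecordID, Col1, Col2 are placeholders.
MERGE dbo.MainTable AS t
USING dbo.StagingTable AS s
    ON t.RecordID = s.RecordID          -- match on the business key
WHEN MATCHED THEN
    UPDATE SET t.Col1 = s.Col1,
               t.Col2 = s.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (RecordID, Col1, Col2)
    VALUES (s.RecordID, s.Col1, s.Col2);
```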

Deadlock when bulk inserting into two different tables

Hello, I am experiencing deadlocks when performing bulk inserts into two different tables with the .NET SqlBulkCopy class. The tool using SqlBulkCopy converts data from a Sybase database into an empty SQL Server database. For maximum performance, all indexes in my SQL Server database are disabled (except the clustered indexes), as are the foreign keys and the check constraints. I have used this tool successfully on SQL Server 2008 R2 environments, but this time I am using it on a SQL Server 2008 environment (build 10.0.2531). I have multiple deadlock graphs that I could show you (see my post below). In one graph I see that the object FK_TRA_REG_UNKNOWN_A_TRA_BCI_ is locked. This is a foreign key between the tables TRA_REG and TRA_BCI_ID_INFO, which are both present in the deadlock graph. This foreign key is disabled, so how could it be locked? Kind greetings
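
For anyone reproducing this, a quick way to confirm that the constraint really is disabled on the affected server (the constraint name is taken from the deadlock graph above):

```sql
-- Check whether the foreign key from the deadlock graph is disabled.
SELECT name, is_disabled, is_not_trusted
FROM sys.foreign_keys
WHERE name = 'FK_TRA_REG_UNKNOWN_A_TRA_BCI_';
```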

I am having problems with Bulk Insert

I'm getting this error: "You do not have permission to use the bulk load statement. [SQLSTATE 42000] (Error 4834). The step failed." This is my syntax:

```sql
BULK INSERT Prepaids.dbo.StdPPDGL_Import
FROM '\\Main01\Filetransfers\Accounting\glosppd.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = '|'
);
GO
```

This works just fine as a query but will not run as a job. I am running it as a domain account that has permissions assigned to the folder. I have granted INSERT permission, and I have tried changing to the OPENROWSET syntax, to no avail. Thanks for any suggestions.
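
Error 4834 is specifically about the server-level bulk load permission rather than folder or INSERT rights, so one thing worth checking is whether the account the job step actually runs under (the SQL Agent service account or a proxy) holds that permission. A sketch, with the account name as a placeholder:

```sql
-- Grant the server-level right to run BULK INSERT (equivalent to bulkadmin membership).
-- Replace the login with the account the job step executes under.
GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\AgentOrProxyAccount];
-- or, on older syntax:
EXEC sp_addsrvrolemember 'DOMAIN\AgentOrProxyAccount', 'bulkadmin';
```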

Stored procedure for importing data from flat files. Need some tips.

I am currently working on a stored procedure to automate the process of importing flat-file data into SQL Server. I would appreciate some useful advice on the best way to build it, taking into account the things I need:

1. File path and name, delimiter character, and table name as variables.
2. Logging the import in one file per import task (errors and output results, like bcp produces).
3. Processing flat files with different column orders.
4. Checking whether the data we are trying to import already exists in the database (don't replicate it).
5. Batch import.
6. Error catching.

Now my thoughts: I would use either bcp or BULK INSERT, declaring the delimiter, source file, and table as variables. To identify the column order, I think we should parse the first row (the column names) and load into an intermediate table, from which we then INSERT...SELECT into the destination table in the right order, checking with EXISTS before importing so data isn't duplicated. I still don't know about logging to one file with a timeline. What other errors might occur? Any advice would be appreciated. Thank you in advance.
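
As a starting point, here is a skeleton of the parameter-driven BULK INSERT part; all names are placeholders, and the column-reordering, EXISTS check, and logging steps are left as outlines:

```sql
-- Sketch: parameter-driven import into a staging table.
-- Procedure, table, and path names are placeholders.
CREATE PROCEDURE dbo.ImportFlatFile
    @FilePath  NVARCHAR(260),
    @Delimiter NVARCHAR(10),
    @TableName SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);

    SET @sql = N'BULK INSERT ' + QUOTENAME(@TableName)
             + N' FROM ''' + REPLACE(@FilePath, '''', '''''') + N''''
             + N' WITH (FIELDTERMINATOR = ''' + REPLACE(@Delimiter, '''', '''''') + N''','
             + N' ROWTERMINATOR = ''\n'', FIRSTROW = 2,'   -- row 1 holds the column names
             + N' BATCHSIZE = 10000, MAXERRORS = 0)';

    BEGIN TRY
        EXEC sp_executesql @sql;
        -- Next steps: reorder columns via INSERT...SELECT into the destination,
        -- skip rows that already EXISTS there, and write an audit/log row.
    END TRY
    BEGIN CATCH
        -- Error catching: capture ERROR_MESSAGE() into your log table/file here.
        THROW;  -- assumes SQL Server 2012+; use RAISERROR on older versions
    END CATCH
END
```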


Is it possible to right trim a flat file when bulk inserting using an SSIS package?

I've been tasked with modifying an SSIS package so that the flat files that are inserted into the database via a bulk insert are right-trimmed. This is something new to me. The issue at hand is that several flat files get inserted using a bulk insert; however, occasionally the files have carriage returns that sit too far to the right, which in turn causes the insert to fail. Currently we have to verify manually that the files are in the correct format before the package is run. We want to modify the package so that it does this for us. I understand that I can create a data flow with a derived column to do this, but there are approximately 20 files for which I need it done, and I'm looking for a faster, easier way.
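
One alternative to touching 20 data flows is to keep the bulk inserts as they are, land each file in a staging table, and right-trim everything in one generic pass before moving the rows on. A sketch that builds an UPDATE trimming every character column of a given staging table (the table name is a placeholder):

```sql
-- Sketch: RTRIM every char/varchar column of a staging table in one statement.
DECLARE @TableName SYSNAME = N'dbo.StagingTable';  -- placeholder
DECLARE @sql NVARCHAR(MAX);

SELECT @sql = N'UPDATE ' + @TableName + N' SET '
    + STUFF((SELECT N', ' + QUOTENAME(c.name) + N' = RTRIM(' + QUOTENAME(c.name) + N')'
             FROM sys.columns AS c
             JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
             WHERE c.object_id = OBJECT_ID(@TableName)
               AND t.name IN ('char', 'varchar', 'nchar', 'nvarchar')
             FOR XML PATH('')), 1, 2, N'');

EXEC sp_executesql @sql;
```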

Direct or indirect insertion into a table

I have a PowerShell script which saves its output in CSV format. Now I have two options: either import from the CSV file into the SQL Server table, or skip creating the CSV file and insert into the table directly with SQL commands over a SQL connection. Which choice is better from a security standpoint? P.S. The disk-space consideration seems to favour the latter option.

Loading a file with special characters and Japanese characters

Hi group, I am trying to load a file containing Japanese and special characters. The problem is that the Japanese characters load when I convert my UTF-8 file to UTF-16 and bcp it in (but the special characters do not load in this scenario), while the special characters load if I convert UTF-8 to ANSI and then bcp in (but the Japanese characters do not load in that case). I want to load both kinds of characters. It's urgent; any help appreciated. Bunch of thanks.
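
Since UTF-16 can represent both the Japanese text and the special characters, one approach worth trying is to keep the UTF-16 conversion and load the file as Unicode, making sure the destination columns are nchar/nvarchar; if they are varchar, both routes will lose whichever characters the column's code page cannot represent. A sketch with placeholder names using BULK INSERT's wide-character mode (bcp's `-w` switch is the equivalent on the command line):

```sql
-- Sketch: load the UTF-16 version of the file as Unicode text.
-- Table and path are placeholders; target columns should be nchar/nvarchar.
BULK INSERT dbo.TargetTable
FROM 'C:\data\input_utf16.txt'
WITH (
    DATAFILETYPE    = 'widechar',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);
```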

Bulk Insert and non-local file via ODBC

Is it possible to use bulk insert operations where the file is not on the server machine? I.e., a program (a Perl script using ODBC) running on a client machine creates a temp text file and, via the ODBC connection, bulk inserts it into a remote SQL Server table. E.g.:

```perl
$dbh->do("BULK INSERT FROM \'c:\\data\\otchk.txt\'   -- local file (client)
          WITH ( FIRSTROW = 2,
                 FIELDTERMINATOR = \'\t\',
                 ROWTERMINATOR   = \'\n\',
                 MAXERRORS = 20 )");
```
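
For what it's worth, BULK INSERT opens the file from the SQL Server machine's point of view, so a path like `c:\data\otchk.txt` refers to the server's C: drive, not the client's. The usual workaround is to write the temp file to a share the server can reach and pass a UNC path (note that the statement also needs a target table name, which the snippet above omits). A sketch of the T-SQL the script would send, with placeholder names:

```sql
-- Sketch: the server reads the file over UNC; its service account needs rights to the share.
-- Table, machine, share, and file names are placeholders.
BULK INSERT dbo.otchk
FROM '\\ClientMachine\share\otchk.txt'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    MAXERRORS = 20
);
```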

Datatype change: CHAR to VARCHAR

I have a table, 'TestTable', with 1.5 crore (15 million) records and some 18 indexes. TestTable has CHAR columns that I need to change to VARCHAR columns, so I am creating a new table, TestTable_Stage. My question: when should I create the indexes on TestTable_Stage — after pushing the data from TestTable to TestTable_Stage, or before pushing the data? Please reply ASAP. Regards, Karthik
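
For a one-off rebuild like this, one common pattern is to load the staging table first and create the indexes afterwards, since each index is then built once in bulk rather than maintained row by row during the load. A sketch with assumed column and index names:

```sql
-- Sketch: load first, index afterwards. Column and index names are placeholders.
INSERT INTO dbo.TestTable_Stage WITH (TABLOCK)  -- TABLOCK helps toward minimal logging
SELECT Col1, Col2, Col3
FROM dbo.TestTable;

-- Build the indexes only after the data is in place.
CREATE CLUSTERED INDEX CIX_TestTable_Stage ON dbo.TestTable_Stage (Col1);
CREATE NONCLUSTERED INDEX IX_TestTable_Stage_Col2 ON dbo.TestTable_Stage (Col2);
```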

How to handle SQL Server bcp bulk insert when the data file has more fields than the format file defines

I am using a non-XML format file to perform a bulk import from a data file.

Format file:

```
12.0
4
1 SQLCHAR 0 7   "\t"   1 DepartmentID ""
2 SQLCHAR 0 100 "\t"   2 Name         ""
3 SQLCHAR 0 100 "\t"   3 GroupName    ""
4 SQLCHAR 0 24  "\r\n" 4 ModifiedDate ""
```

I have defined 4 fields in the format file, but the data file sometimes contains 4 fields and sometimes more than 4. How can I handle this situation dynamically without failures? I don't want to make any changes to the format file manually.
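
When the field count varies, one robust pattern is to stop describing individual fields to bcp at all: load each whole line into a single wide staging column (only the row terminator appears in the format file) and split it in T-SQL, where extra fields can simply be ignored. A sketch with assumed names:

```sql
/* One-column format file (e.g. OneColumn.fmt) -- the only terminator described
   is the row terminator, so the field count per line no longer matters:

   12.0
   1
   1 SQLCHAR 0 8000 "\r\n" 1 RawLine ""
*/

-- Staging table; names and paths below are placeholders.
CREATE TABLE dbo.RawImport (RawLine VARCHAR(8000));

BULK INSERT dbo.RawImport
FROM 'C:\data\input.txt'
WITH (FORMATFILE = 'C:\data\OneColumn.fmt');

-- Then split the first four tab-separated fields out of RawLine in T-SQL
-- (CHARINDEX/SUBSTRING or a splitter function) and discard any extra fields.
```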