
Errors during bulk insert

Posted Date: September 02, 2010    Points: 0    Category: ASP.Net
Hi, while writing a stored procedure in SQL 2008 for a bulk insert, I am getting several errors. Part of the SP is written below for your reference. I need to know where I am going wrong. Any help will be thankfully acknowledged.

-Learn24x7

----------
CREATE PROCEDURE Sp_BulkInsert
(
    @Directory nvarchar(256),
    @TblName nvarchar(50),
    @Collation nvarchar(50),
    @FieldSeparator nvarchar(50),
    @RowTerminator nvarchar(50)
)
AS
BEGIN
    SET NOCOUNT ON;

    BULK INSERT [@TblName]
    FROM ''+@Directory+'' --Error1: see below
    WITH --Error2: see below
    (
        FIRSTROW = 2,
        CODEPAGE = (SELECT CAST(@Collation AS INT)), --Error3: see below
        FIELDTERMINATOR = ''+@FieldSeparator+'',
        ROWTERMINATOR = ''+@RowTerminator+''
    )
END
GO
----------

FROM ''+@Directory+'' --Error1: An object or column name is missing or empty. For SELECT INTO statements, verify each column has a name. For other statements, look for empty alias names. Aliases defined as "" or [] are not allowed. Change the alias to a valid name.

WITH --Error2: Incorrect syntax near the keyword 'with'. If this statement is a commo
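For what it's worth, all three errors come from the same limitation: BULK INSERT does not accept variables or expressions for the table name, the file path, or the values inside the WITH clause, and CODEPAGE cannot be a subquery. The usual workaround is to assemble the whole statement as dynamic SQL. A minimal sketch along those lines, not a tested drop-in (the procedure name, the quoting, and the assumption of a one-part table name are illustrative only):

CREATE PROCEDURE dbo.usp_BulkInsertDynamic   -- illustrative name only
    @Directory      nvarchar(256),
    @TblName        nvarchar(50),
    @Collation      nvarchar(50),   -- assumed to hold a code page value, e.g. '1252' or 'ACP'
    @FieldSeparator nvarchar(50),
    @RowTerminator  nvarchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql nvarchar(max);

    -- Build the whole statement as a string; QUOTENAME assumes a one-part table name.
    SET @sql = N'BULK INSERT ' + QUOTENAME(@TblName) +
               N' FROM ''' + REPLACE(@Directory, '''', '''''') + N'''' +
               N' WITH (FIRSTROW = 2,' +
               N' CODEPAGE = ''' + @Collation + N''',' +
               N' FIELDTERMINATOR = ''' + @FieldSeparator + N''',' +
               N' ROWTERMINATOR = ''' + @RowTerminator + N''');';

    EXEC sp_executesql @sql;
END
GO

Because the values are concatenated into the statement rather than parameterized, they should only ever come from trusted input.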


More Related Resource Links


Bulk insert into specific columns?

Can I bulk insert into specific columns when bulk inserting a text file, or do I have to bulk insert into all columns of a table? The extra column in the table is an identity integer, so I had to create a view without the identity integer to make it work. Maybe there is a more direct way? If so, what would be the syntax? I am a beginner-to-intermediate SQL guy.

strSQL = "BULK " & _
         "INSERT CustomersUpload " & _
         " FROM 'C:\MydataFile.txt' " & _
         " WITH " & _
         " ( " & _
         " FIELDTERMINATOR = '|', " & _
         " ROWTERMINATOR = '\n' " & _
         " )"

Thanks
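The view you created is in fact one of the two standard answers; the other is a format file (the FORMATFILE option) that maps only the columns present in the text file. A minimal sketch of the view route, keeping the CustomersUpload table name from the post but with hypothetical columns:

-- Hypothetical columns: an identity column plus the two columns that exist in the file.
CREATE TABLE dbo.CustomersUpload
(
    CustomersUploadID int IDENTITY(1,1) PRIMARY KEY,
    CustomerName      varchar(100),
    CustomerPhone     varchar(30)
);
GO

-- The view exposes only the file's columns.
CREATE VIEW dbo.vw_CustomersUpload
AS
SELECT CustomerName, CustomerPhone
FROM dbo.CustomersUpload;
GO

-- Bulk insert targets the view; the identity value is generated as usual.
BULK INSERT dbo.vw_CustomersUpload
FROM 'C:\MydataFile.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

BULK INSERT can target a view as long as all of the view's columns come from a single base table, so this is about as direct as it gets without a format file.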

Unable to prepare the SSIS bulk insert for data insertion

Hi,

I am using SQL Server Destinations in my data flow tasks. I had been running this package on the server until I encountered this error:

OnError,,,LOAD AND UPDATE Dimension Tables,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1071636367,0x,Unable to prepare the SSIS bulk insert for data insertion.
OnError,,,Load Dimensions,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1071636367,0x,Unable to prepare the SSIS bulk insert for data insertion.
OnError,,,Discount Reason, ISIS Condition, ISIS Defect, ISIS Repair, ISIS Section, ISIS Symptom, Job Status, Parts, Purchase SubOrder Type, Service Contract, Service Reason, Service Type, TechServiceGrp, WarrantyType, Branch, Wastage Reason,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1073450974,0x,SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Dim_T_ISISDefect" (56280) failed with error code 0xC0202071. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

What could be the reason for this? I don't usually have an error.

cherriesh

SQL ServerDestination error - Unable to prepare the SSIS bulk insert for data insertion.

Having searched the forum, this one clearly has form... However, beyond assisting those who have fallen at the first hurdle (i.e. forgetting/not knowing that they cannot execute the package remotely to the instance of SQL Server into which they are inserting), the issues raised by others have not been addressed. Thus I am bringing nothing new to the table here - just providing an executive summary of problems which others have run into, written about, but not received answers for.

First the complete error:

Description: Unable to prepare the SSIS bulk insert for data insertion.  End Error
Error: 2008-01-15 04:55:27.58  Code: 0xC004701A  Source: <xxx> DTS.Pipeline  Description: component "<xxx>" failed the pre-execute phase and returned error code 0xC0202071.  End Error
DTExec: The package execution returned DTSER_FAILURE (1).  Started: 4:53:34 AM  Finished: 5:00:00 AM  Elapsed: 385.384 seconds.  The package execution failed.  The step failed.

Important points:
It mostly works - it produces no error more than 9 times out of 10.
It fails on random dataflows - my package has several dataflows, (mostly) executing concurrently. Where the error occurs it does not do so on the same dataflow each time: on one run it'll fail on dataflow A whilst B, C, D and E succeed, then A-E will all succeed (and continue doing so for the next ten runs thereafter), and

Get error message from bulk insert task.

Hello, I've created a dtsx package that bulk inserts some data from a CSV file into a SQL table (using the BULK INSERT task). Because this package is scheduled to run automatically, I also added an EXECUTE SQL task that inserts a row into a 'log table'. It is linked to the BULK INSERT task with a 'success constraint (green arrow)', so I know about all the successes. But I am more interested in the failures, so I also added another EXECUTE SQL task and linked it to the BULK INSERT task with the 'failure constraint (red arrow)'.

Question: how can I get the error message from the BULK INSERT task (like 'file does not exist' or 'error converting .....') and store it in a field in the 'log table'?

Thanks in advance, Peter
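The Bulk Insert Task does not expose the error text as an output on the red arrow; inside SSIS the message generally has to be picked up from the System::ErrorDescription variable in an OnError event handler. A different, hedged option is to swap the Bulk Insert Task for an Execute SQL Task that runs the BULK INSERT inside TRY/CATCH and writes ERROR_MESSAGE() to the log table itself. A minimal T-SQL sketch with hypothetical table and file names:

BEGIN TRY
    -- Hypothetical target table and file path.
    BULK INSERT dbo.TargetTable
    FROM 'C:\Import\data.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    INSERT INTO dbo.LoadLog (LoadDate, Outcome, ErrorText)
    VALUES (GETDATE(), 'Success', NULL);
END TRY
BEGIN CATCH
    -- ERROR_MESSAGE() carries the text you are after ('file does not exist', conversion errors, ...).
    INSERT INTO dbo.LoadLog (LoadDate, Outcome, ErrorText)
    VALUES (GETDATE(), 'Failure', ERROR_MESSAGE());
END CATCH;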

How to do a parameterized bulk insert into SQL Server 2008 R2 from C# (.NET 4.0)

Hi, I'm trying to execute a BULK INSERT with a parameterized SqlCommand from a C# .NET 4.0 project. This is the code:

private System.Data.SqlClient.SqlCommand sqlCmd2;
private string sqlCmd2String = "BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs FROM @name WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)";
private System.Data.SqlClient.SqlParameter[] sqlCmd2Params = new System.Data.SqlClient.SqlParameter[] {
    new System.Data.SqlClient.SqlParameter("@name", System.Data.SqlDbType.VarChar) };
(...)
sqlCmd2Params[0].Value = source;
sqlCmd2.Parameters.Clear();
sqlCmd2.Parameters.AddRange(sqlCmd2Params);
sqlCmd2.ExecuteNonQuery();

where source is a string containing the full path to a file, i.e. a value like "C:\\TEMP\\MSGTRK20100701-1.LOG_EXHUB01". When I execute this code, I get an Exception with Message "Incorrect syntax near '@name'." In the database trace I see an RPC:Completed event with TextData equal to:

exec sp_executesql N'BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs FROM @name WITH (FIELDTERMINATOR='','', ROWTERMINATOR='' '', FIRSTROW=2)',N'@name varchar(36)',@name='C:\TEMP\MSGTRK20100701-1.LOG_EXHUB01'

I tried using verbatim string literals, but that doesn't seem to help. Any idea?
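The error is expected: BULK INSERT only accepts a string literal for the FROM clause, so it cannot be parameterized the way an ordinary statement can. The same dynamic-SQL workaround sketched under the first post applies here; one hedged variant is to wrap it in a stored procedure and have the C# code call that procedure with a normal @name parameter (the procedure name below is hypothetical):

-- Hypothetical wrapper procedure; the SqlCommand then becomes a plain
-- "EXEC dbo.usp_LoadParsedLog @name" with @name passed as a normal parameter.
CREATE PROCEDURE dbo.usp_LoadParsedLog
    @name nvarchar(260)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql nvarchar(max);

    -- The file path has to end up as a string literal inside the statement.
    SET @sql = N'BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs' +
               N' FROM ''' + REPLACE(@name, '''', '''''') + N'''' +
               N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';

    EXEC sp_executesql @sql;
END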

Bulk Insert Problem

I am trying to perform a bulk insert on a pipe-delimited file (approx 25 columns, 5000+ rows of data). Normally this is no problem; however, this file has a header and a trailer record with no delimiters (1 column). I could use a utility like grep to remove the header/trailer record, but I would like to capture the entire file inside SQL Server.

I can bulk insert the entire file into a single column (wide) that would include the delimiters. My problem from there is to process the single-column data into the appropriate rows/columns. I am looking for a set-based solution as opposed to a cursor-type solution. XML? Seems slow. Cross apply? Changing the file format is not a viable solution, and I'm likely to have more than a few of these oddball formats with a variable number of columns. Any ideas?

I thank you for your time on this puzzle.
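For a few thousand rows the XML split is normally quick enough. A hedged sketch of the whole single-wide-column route (table and path names are hypothetical): the header and trailer are the only lines with no pipes, so they can be excluded by content, and the remaining lines are split set-based.

-- Staging table: one wide column holding each raw line, delimiters and all.
CREATE TABLE dbo.RawLines (LineText varchar(8000));
GO

BULK INSERT dbo.RawLines
FROM 'C:\Import\pipefile.txt'      -- hypothetical path
WITH (ROWTERMINATOR = '\n');       -- no FIELDTERMINATOR given, so each line lands in the one column
GO

-- Filter out the header/trailer by content, then convert each line to XML and
-- pull the fields out set-based (values containing &, < or > would need escaping first).
SELECT x.value('/c[1]', 'varchar(100)') AS Col01,
       x.value('/c[2]', 'varchar(100)') AS Col02,
       x.value('/c[3]', 'varchar(100)') AS Col03   -- ...continue through Col25
FROM (
    SELECT CAST('<c>' + REPLACE(LineText, '|', '</c><c>') + '</c>' AS xml) AS x
    FROM dbo.RawLines
    WHERE CHARINDEX('|', LineText) > 0             -- header and trailer have no pipes
) AS d;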

Bulk insert CSV files

Hello, I am trying to bulk insert a CSV file, and I can't seem to find the right row terminator. When I use \m I get an error message that an unexpected EOF has occurred, which usually means a carriage return at the end of the file. I removed the carriage return from the file, but I am still getting the same error message. I have tried the following row delimiters: \r, \n, {CR}, {CR}{LF}. These "work" in that the data is indeed bulk loaded into the table, however it dumps the majority of the data in the last column, so 7 or 8 columns are stuffed into one. Has anyone had this issue before? Any help would be appreciated.

Dave
Dave SQL Developer
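Most of the data ending up in the last column is the classic symptom of the row terminator never matching, which often means the file uses Unix-style LF-only line endings. One hedged workaround is to pass a literal CHAR(10) as the terminator via dynamic SQL (the table and path below are hypothetical); on later builds the hexadecimal form ROWTERMINATOR = '0x0a' may also be accepted.

-- Hypothetical table and file; the row terminator is a real line-feed character.
DECLARE @sql nvarchar(max);

SET @sql = N'BULK INSERT dbo.TargetTable' +
           N' FROM ''C:\Import\file.csv''' +
           N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + CHAR(10) + N''', FIRSTROW = 2);';

EXEC sp_executesql @sql;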

Permission Denied to SA on Bulk insert

Dear All,

I am experiencing a strange problem (SQL 2005, build 9.0.1399), receiving the following error:

Msg 4834, Level 16, State 4, Line 1
You do not have permission to use the bulk load statement.

I am running the stored procedure SP1. This stored procedure calls another one, SP2. Stored procedure SP2 contains a dynamic SQL statement to insert data into the temporary table, created by SP1 and given to SP2 via a parameter name. Something like:

SET @szSQL = 'BULK INSERT ' + @TName + ' FROM ' + '''' + @FilePath + '''' +
             ' WITH ( ' +
             ' DATAFILETYPE = ''char'', ' +
             ' FIELDTERMINATOR = ' + '''' + @Terminator + ''', ' +
             ' ROWTERMINATOR = ''\n'' ' + ' ) '
EXEC (@szSQL)

If I execute SP1, I receive the error:

Msg 4834, Level 16, State 4, Line 1
You do not have permission to use the bulk load statement.

But I can execute SP2 with no problem. All operations are done under the SA login. Any thoughts/ideas?

Thanks
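Since the login is sa, one thing worth checking (sketched below) is which principal the statement actually executes under; an EXECUTE AS clause on SP1, or impersonation anywhere in the call chain, can strip server-level permissions such as ADMINISTER BULK OPERATIONS even for sa's session. Running the first query from inside SP1 shows the context the BULK INSERT really sees; the grant uses SQL Server 2005 syntax and 'SomeLogin' is a placeholder.

-- What the current context runs as, and whether that principal may bulk load.
SELECT SUSER_SNAME()                                               AS login_name,
       IS_SRVROLEMEMBER('bulkadmin')                               AS is_bulkadmin,
       HAS_PERMS_BY_NAME(NULL, NULL, 'ADMINISTER BULK OPERATIONS') AS has_bulk_load_permission;

-- Granting the permission to a specific login (SQL Server 2005 syntax):
EXEC sp_addsrvrolemember @loginame = 'SomeLogin', @rolename = 'bulkadmin';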

BULK INSERT with a UNC path does not work even with delegation

I have a stored procedure that uses BULK INSERT with a UNC path to another server. I have enabled both the server and the domain account the SQL Server services are running under to allow all delegation. The UNC path allows Everyone Full Control. I can log into my computer using the SQL Server service account and browse to the file and open it. I can log onto the SQL Server with the service account or my account and browse to the file and open it. Whenever I run the stored procedure or execute the BULK INSERT from SSMS I get the following:

Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\nas\shared\ZipSales\FLAT.txt" could not be opened. Operating system error code 5(Access is denied.).

I can log into the server console using my user account, open SSMS on the server, and the stored procedure or the raw BULK INSERT work just fine. It's only a problem when I execute it from an end user workstation. Moving the data file to the SQL server works fine, but that is not a viable long-term solution. After scouring the Internet for most of two days, I finally stumbled across another post (http://social.msdn.microsoft.com/Forums/en-US/sqlsetupandupgrade/thread/512fe230-b94a-4fe0-83c8-201a150bbcec) that has this bit: "After watching forums, etc. for almost a year regarding this issue, we ran a MS Support Reque

Alternative to Bulk Insert

Hi, I have an ASP.NET web site which uses a bulk insert to insert a large number of records into a SQL Server (2008) database. I'm now migrating this application to a VB.NET application. I'm trying to avoid uploading large txt files to the SQL Server in order for the bulk insert to work; I'd like to have the txt file on the client machine. The problem is that when I specify the text file (for the bulk insert) on the client machine, I get an error saying that the file could not be found, due to the fact that the server is expecting the file on its own file system.

The way the system works is this: the user specifies an Excel spreadsheet which is read by the application. It may contain several thousand records. I then want to insert the records into a SQL Server database. I want to avoid iterating through a DataTable and using an insert query for each row in the DataTable, as it just takes too long. On the web-based system (IIS and SQL Server on the same server) the DataTable is written to a text file and then the text file is used for a bulk insert. I'd like to read the spreadsheet into a text file on the client, but cannot use this for the bulk insert as the server cannot recognise the file.

Is there a way of using a data source text file which is not on the SQL Server, other than using shared folders? Alternatively is there ano
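One hedged alternative that avoids both server-side files and row-by-row inserts: SQL Server 2008 table-valued parameters let the VB.NET application pass the whole DataTable to a stored procedure in a single call (the SqlParameter is given SqlDbType.Structured and the DataTable as its value); SqlBulkCopy pointed at the same DataTable is the other no-file option. A minimal sketch with hypothetical names and columns:

-- A table type the client code can fill and pass in one round trip.
CREATE TYPE dbo.UploadRow AS TABLE
(
    Col1 varchar(100),
    Col2 varchar(100),
    Col3 varchar(100)
);
GO

CREATE PROCEDURE dbo.usp_InsertUpload
    @rows dbo.UploadRow READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- One set-based insert instead of thousands of single-row inserts.
    INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3
    FROM @rows;
END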

Need an Idea for doing bulk compare and insert


Hi Friends,

I have a scenario where we want to move some 30k records from a staging table to a main table. While moving, we need to compare the columns in the staging table against the history table and populate some extra columns alongside the staging columns.

I will give one example.

Let's take a staging table with scol1, scol2, scol3.

The main table has mcol1, mcol2, mcol3, mcol4, mcol5, mcol6.

The history table has hcol1, hcol2, hcol3, hcol4, hcol5, hcol6.

So I will move the staging table into the main table by comparing columns scol1, scol2, scol3 with hcol1, hcol2, hcol3, and mcol4, mcol5, mcol6 will contain 1 if the values differ or 0 if they contain the same value.
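A minimal sketch of one way to do this as a single set-based statement; the table names are taken from the description and it assumes scol1 is the key that matches a staging row to its history row (adjust the join to the real key, and add NULL handling if the compared columns are nullable):

INSERT INTO dbo.MainTable (mcol1, mcol2, mcol3, mcol4, mcol5, mcol6)
SELECT s.scol1,
       s.scol2,
       s.scol3,
       -- 1 when the value differs from history (or no history row exists), otherwise 0.
       CASE WHEN h.hcol1 IS NULL OR s.scol1 <> h.hcol1 THEN 1 ELSE 0 END,
       CASE WHEN h.hcol2 IS NULL OR s.scol2 <> h.hcol2 THEN 1 ELSE 0 END,
       CASE WHEN h.hcol3 IS NULL OR s.scol3 <> h.hcol3 THEN 1 ELSE 0 END
FROM dbo.StagingTable AS s
LEFT JOIN dbo.HistoryTable AS h
       ON h.hcol1 = s.scol1;      -- assumed matching key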



SQL insert very slow on bulk table



We are using SQL Server 2005 x86 Enterprise Edition. We have one table which has 13 crores (130 million) records. When we insert a record into that table it takes around 30 seconds. Can you please help optimize it?

The table currently has 1 primary key (clustered index) and 4 non-clustered indexes.

Please help.
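A 30-second single-row insert usually points to blocking, page splits on the clustered key, or heavily fragmented indexes rather than raw table size. Two quick diagnostics are sketched below (the table name is hypothetical); if fragmentation turns out to be high, rebuilding or reorganizing the non-clustered indexes is the usual next step.

-- What a slow session is waiting on, and whether it is blocked.
SELECT r.session_id, r.wait_type, r.wait_time, r.blocking_session_id
FROM sys.dm_exec_requests AS r
WHERE r.session_id <> @@SPID;

-- Fragmentation of the clustered index and the four non-clustered indexes.
SELECT i.name, ps.index_type_desc, ps.avg_fragmentation_in_percent, ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.BigTable'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id;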



Keeping the Line Order in BULK INSERT


When loading a 50MB file with

FROM 'C:\data\proddesc.txt'

the beginning of the file gets shifted to approximately the middle. Otherwise the lines are in order (so the document starts in the middle, progresses to the bottom, and continues at the top).


Kalman Toth, SQL Server & Business Intelligence Training; SQL 2008 GRAND SLAM
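Rows in a SQL Server table have no inherent order, so the "shift" is simply the order the engine happened to return the rows in. A hedged workaround is to load into a staging table that has an IDENTITY column (exposed through a view, as in the identity-column post above, so the single-field file still maps cleanly) and always read back with ORDER BY. A single, non-parallel BULK INSERT normally assigns the identity values in file order, although an explicit line number inside the file is the only hard guarantee. Sketch with hypothetical names and a single wide text column:

CREATE TABLE dbo.ProdDescStaging
(
    LineNumber int IDENTITY(1,1) PRIMARY KEY,
    LineText   varchar(max)
);
GO

-- The view hides the identity column so the file's single field maps cleanly.
CREATE VIEW dbo.vw_ProdDescStaging AS
SELECT LineText FROM dbo.ProdDescStaging;
GO

BULK INSERT dbo.vw_ProdDescStaging
FROM 'C:\data\proddesc.txt'
WITH (ROWTERMINATOR = '\n');
GO

-- Only an ORDER BY makes the read-back order deterministic.
SELECT LineNumber, LineText
FROM dbo.ProdDescStaging
ORDER BY LineNumber;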

Read Uncommitted (Bulk Insert) and Performance


Hi all,

I have issues with concurrency.

There have been times when updates happen on my fact table while some users read data from it at the same time; the reads take a very long time and there are blocking issues.

1. My inserts happen as bulk inserts. I would like to know: if I change my query to use NOLOCK, will it still be blocked by the bulk insert when I read? Does a bulk insert take an exclusive lock, so I can't read the data?

2. Is changing my query to use NOLOCK beneficial?

Please help me.
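For what it's worth: a BULK INSERT takes ordinary row/page locks unless TABLOCK is specified, in which case it takes a table-level bulk update lock. Either way, a NOLOCK / READ UNCOMMITTED reader is generally not blocked because it requests no shared locks, but it can return uncommitted rows that are later rolled back, so it only suits approximate reporting; row-versioning (READ_COMMITTED_SNAPSHOT) is the cleaner alternative if dirty reads are not acceptable. A minimal sketch of the hint, with a hypothetical fact table:

-- NOLOCK readers are not blocked by the bulk load, but they can see uncommitted data.
SELECT f.DateKey, SUM(f.Amount) AS Amount
FROM dbo.FactSales AS f WITH (NOLOCK)
GROUP BY f.DateKey;

-- The same effect for every query in the session:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;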

Bulk insert does not work correctly


I have a data file with 12 fields and ";" as the field terminator.

But in some rows of the data file the last fields can be absent.

When I use BULK INSERT to load it into my table in the DB, using a format file where all the fields are described, some rows (one with absent fields and the next one) are inserted into the same row (into the last fields).

For example, in the table:

1_1;2_1;3_1;;;;6_1;;;9_1(there are row terminator symbols) 1_2;2_2;3_2;(4_2;5_2;6_2;7_2;8_2;9_2;10_2;11_2;12_2) - these are in one field

1_3;2_3;3_3;4_3;5_3;6_3;7_3;8_3;9_3;10_3;11_3;12_3 - the next row in the table is normal

How can I solve this?

Incremental Bulk insert


I have an Excel file with 30k records and a SQL Server table with the same data. Periodically I have to bulk insert only the updated records from the Excel sheet into the table using SSIS.

*Business demands that I should not truncate the table; only the updates should get reflected when the SSIS package runs.
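One hedged approach that satisfies the no-truncate requirement: have the SSIS data flow land the Excel rows in a staging table, then run a MERGE (available in SQL Server 2008) from an Execute SQL Task so that only new and changed rows touch the target. A minimal sketch with hypothetical table and column names:

MERGE dbo.TargetTable AS t
USING dbo.ExcelStaging AS s
    ON t.BusinessKey = s.BusinessKey
WHEN MATCHED AND (t.Col1 <> s.Col1 OR t.Col2 <> s.Col2) THEN
    -- Only rows whose values actually changed are updated.
    UPDATE SET t.Col1 = s.Col1,
               t.Col2 = s.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Col1, Col2)
    VALUES (s.BusinessKey, s.Col1, s.Col2);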

Bulk Insert Task Failure



I am trying to load a flat file with the Bulk Insert Task. However, I keep getting this error. I understand that it may be a permission/security issue, so I made sure that I have bulkadmin and sysadmin rights.

I even tried to place the file on my local drive (C:), but I keep getting the same error:

[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "C:\Deployment\AtmDialog20100713.txt" could not be opened. Operating system error code 3(The system cannot find the path specified.).".

Any ideas on how to fix this?
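The path in a Bulk Insert Task is opened by the SQL Server service on the SQL Server machine, not by the machine running the package, so operating system error 3 usually means C:\Deployment\AtmDialog20100713.txt exists on the workstation but not on the server (or the service account cannot reach it). One quick, hedged check from the server's point of view uses the long-standing but undocumented xp_fileexist procedure:

-- Run against the SQL Server the Bulk Insert Task connects to; "File Exists" is 1
-- only if that server itself can see the path.
EXEC master.dbo.xp_fileexist 'C:\Deployment\AtmDialog20100713.txt';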

