
Bulk Insert Problem

Posted Date: September 10, 2010 | Category: Sql Server
I am trying to perform a bulk insert on a pipe-delimited file (approx 25 columns, 5000+ rows of data). Normally this is no problem; however, this file has a header and trailer record with no delimiters (1 column). I could use a utility like grep to remove the header/trailer record, but I would like to capture the entire file inside SQL Server. I can bulk insert the entire file into a single (wide) column that would include the delimiters. My problem from there is to process the single-column data into the appropriate rows/columns.

I am looking for a set-based solution as opposed to a cursor-type solution. XML? Seems slow. Cross Apply? Changing the file format is not a viable solution; I’m likely to have more than a few of these oddball formats with a variable number of columns. Any ideas? I thank you for your time on this puzzle.
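One set-based way to attack this, sketched below with hypothetical object names (a #RawLines staging table, a made-up file path, and generic Col01/Col02/... output names): bulk insert every raw line of the file into a single wide column, filter out the undelimited header/trailer rows, then split each remaining line on the pipe with a tally table and pivot the pieces back into columns. This is only a sketch under those assumptions, not a tested solution for the actual file layout.
----------
-- Stage the whole file, one raw line per row.
CREATE TABLE #RawLines
(
    LineId   INT IDENTITY(1, 1) PRIMARY KEY,
    LineText VARCHAR(8000) NULL
);

-- No pipe FIELDTERMINATOR here (and the sketch assumes the file contains no tabs),
-- so each full line, delimiters included, lands in the single wide column.
BULK INSERT #RawLines
FROM 'C:\Data\oddball.txt'
WITH (ROWTERMINATOR = '\n');

WITH Tally AS
(
    -- Numbers 1..8000, one per possible character position in a line.
    SELECT TOP (8000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects a CROSS JOIN sys.all_objects b
),
Split AS
(
    -- One row per field: n marks each position where a field starts.
    SELECT r.LineId,
           ROW_NUMBER() OVER (PARTITION BY r.LineId ORDER BY t.n) AS ColNo,
           SUBSTRING(r.LineText, t.n,
                     CHARINDEX('|', r.LineText + '|', t.n) - t.n) AS ColValue
    FROM #RawLines r
    JOIN Tally t
      ON t.n <= LEN(r.LineText) + 1
     AND SUBSTRING('|' + r.LineText, t.n, 1) = '|'
    WHERE r.LineText LIKE '%|%'        -- skips the undelimited header/trailer rows
)
SELECT LineId,
       MAX(CASE WHEN ColNo = 1 THEN ColValue END) AS Col01,
       MAX(CASE WHEN ColNo = 2 THEN ColValue END) AS Col02,
       MAX(CASE WHEN ColNo = 3 THEN ColValue END) AS Col03
       -- ...continue out to the ~25 real columns, then INSERT into the target table.
FROM Split
GROUP BY LineId;
----------
Because the header and trailer rows stay behind in #RawLines, the whole file is still captured inside SQL Server, which seems to be the other requirement here.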


More Related Resource Links

Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 34. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Problem with SQL Bulk Insert


I am trying to execute a query to perform a bulk insert from a CSV/TXT file on the local machine. I am trying to run the process from a VB.NET application, but the query also fails when executed in Query Manager (also run locally). I run the query and get the following:

Msg 4861, Level 16, State 1, Line 1

Cannot bulk load because the file "c:\aamledger\AAMLedger.csv" could not be opened. Operating system error code 3(The system cannot find the path specified.).

I have found a lot of advice about security settings, but I am the domain admin and the SQL Server services run under domain admin logins.
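For what it is worth, the path in a BULK INSERT statement is opened by the SQL Server service on the machine where the instance runs, not by the client issuing the query, so error code 3 usually means the folder does not exist as seen from the server. A minimal sketch of the usual workaround when the instance is remote, using a hypothetical share and table name (the terminator/FIRSTROW options are guesses at the CSV layout):
----------
-- Share name, table name and options below are assumptions, not the poster's objects.
BULK INSERT dbo.AAMLedger
FROM '\\MyWorkstation\Ledger\AAMLedger.csv'   -- UNC path readable by the SQL Server service account
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
----------
If the instance really is local, it is worth double-checking that c:\aamledger exists on that machine and that the service account can traverse it.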

Problem inserting POCO into Velocity cache



I want to put a POCO object into the Velocity cache. How can I serialize an ICollection for the Velocity cache, or exclude the ICollection from serialization?

I get this exception:

Type 'System.Collections.Generic.ICollection`1[Model.Collection]' cannot be serialized. Consider marking it with the DataContractAttribute attribute, and marking all of its members you want serialized with the DataMemberAttribute attribute.  If the type is a collection, consider marking it with the CollectionDataContractAttribute.  See the Microsoft .NET Framework documentation for other supported types.



Nested ListView FK problem with insert



My problem is this: I have a nested ListView that works fine, but if I need to do an insert and the nested ListView has not retrieved any values (that is, no posts created yet), I can't get hold of the FK that I need. If I put a HiddenField in the parent ListView with the FK, how can I find it in code-behind when I do an insert?

Can I bulk insert into specific columns when bulk inserting a text file, or do I have to bulk insert into all columns of a table? The extra column in the table is an identity integer, so I had to create a view without the identity integer to make it work. Maybe there is a more direct way? If so, what would be the syntax? I am a beginner-to-intermediate SQL guy.

strSQL = "BULK " & _
         "INSERT CustomersUpload " & _
         " FROM 'C:\MydataFile.txt' " & _
         " WITH " & _
         " ( " & _
         " FIELDTERMINATOR = '|', " & _
         " ROWTERMINATOR = '\n' " & _
         " )"

Thanks
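The view the poster already built is one standard answer: BULK INSERT can target a view as long as every column maps to the same base table, so a view that simply omits the identity column lets the pipe-delimited file line up with the remaining columns positionally. A minimal sketch with hypothetical column names (a non-XML format file that skips the identity column would be the other common route):
----------
-- Hypothetical non-identity columns; substitute the real ones from CustomersUpload.
CREATE VIEW dbo.CustomersUpload_Load
AS
SELECT CustomerName, Address1, City
FROM dbo.CustomersUpload;
GO

BULK INSERT dbo.CustomersUpload_Load
FROM 'C:\MydataFile.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');
----------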

Insert into and stored procedure problem

Hello, I am trying to create a stored procedure which will insert data into a temp table:

CREATE PROCEDURE GetDataForUpdate
AS
if exists (select * from sys.objects where name = 'GetDataForUpdate_temp_tb')
begin
    DROP TABLE GetDataForUpdate_temp_tb
end
go
WITH cte as
(
    select Sp.[Item No_], Sp.[Starting Date], It.[No_], It.[Manufacturer Code],
           It.[Description], Sp.[Unit Price],
           row_number() over (partition by Sp.[Item No_] order by Sp.[Starting Date] desc) as rn
    from dbo.[Main-db$Sales Price] AS Sp
    JOIN dbo.[Main-db$Item] AS It ON Sp.[Item No_] = It.[No_]
    WHERE Sp.[Sales Code] = 'RETAIL' AND Sp.[Item No_] LIKE 'I%'
),
cte2 as
(
    SELECT [Item No_], SUM(ISNULL(Quantity, 0)) AS Qty
    FROM [Main-db$Item Ledger Entry]
    WHERE [Item No_] LIKE 'I%' AND ([Location Code] = 'WH-AB-#2')
    GROUP BY [Item No_]
)
SELECT cte.[Manufacturer Code], cte.[Description],
       CONVERT(int, cte2.[Qty]) AS 'Qty', CONVERT(int, cte.[Unit Price]) AS 'Unit Price'
INTO GetDataForUpdate_temp_tb
FROM cte
JOIN cte2 ON cte.[Item No_] = cte2.[Item No_]
WHERE rn = 1
ORDER BY cte.[Item No_], [Starting Date]

But after executing this, only this part of the query is left in the stored procedure:

set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER PROCEDURE [dbo].[GetDataForUpdate]
AS
if exists (select * from sys.objects where name = 'GetDataForUpdate_temp_tb')
begin
DROP TABLE GetDa
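The behaviour described is exactly what GO causes: GO is a batch separator understood by SSMS and sqlcmd, not a T-SQL statement, so the CREATE PROCEDURE batch ends at the GO and only the first IF/DROP block becomes the procedure body. A minimal sketch of the restructured procedure, with the original SELECT ... INTO query indicated by a comment rather than repeated:
----------
CREATE PROCEDURE dbo.GetDataForUpdate
AS
BEGIN
    SET NOCOUNT ON;

    -- Drop the previous copy of the staging table; no GO, so this stays in the same batch.
    IF OBJECT_ID('dbo.GetDataForUpdate_temp_tb', 'U') IS NOT NULL
        DROP TABLE dbo.GetDataForUpdate_temp_tb;

    -- The original WITH cte ... SELECT ... INTO dbo.GetDataForUpdate_temp_tb
    -- statement goes here, unchanged, as the next statement in the batch.
END
GO
----------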

SQL Insert with Index Scan Problem....

Hi, I'm having a very strange problem when inserting rows into a table. Let me try to explain a little better:

Table1 (for example, Book) has a lot of fields, one foreign key to Table2 (idVolume), and an identity primary key (idBook). This table has a clustered index only on the primary key, and it holds something like 10 million rows.

Table2 (for example, Volume) has a lot of fields and an identity primary key (idVolume). This table has hundreds of rows.

Now, the problem starts when I make an insertion into Table2. When I made an insertion, it was taking minutes to finish, so I stopped it. Looking into the execution plan of the query, there was something very strange: there seems to be a clustered index scan on Table1, and since Table1 has so many rows, this index scan takes forever.

Can anyone explain why the insertion on Table2 causes a clustered index scan on Table1? It really doesn't make any sense to me.

Thanks in advance.
João Teixeira

Basic data type problem with database insert

Hi, I'm trying to write some values into a database, but I think there's an issue with the data types. Here's what I've got:

public static string Hello(string pageId, string inOut)
{
    string itemId = "";
    string period = "";
    var cookie = HttpContext.Current.Request.Cookies["visitorGUID"].Value;
    SqlConnection conn;
    SqlCommand comm;
    conn = new SqlConnection(GlobalSettings.DbDSN);
    comm = new SqlCommand("INSERT INTO myTable (memberGUID) VALUES (@memberGUID);", conn);
    comm.Parameters.Add("@memberGUID", System.Data.SqlDbType.Text);
    comm.Parameters["@memberGUID"].Value = cookie;
    try
    {
        conn.Open();
        comm.ExecuteNonQuery();
        return "Item added: cookie = " + cookie + " / pageId = " + pageId + " / inOut = " + inOut +
               " / pageId = " + pageId + " / itemId = " + itemId + " period = " + period;
    }
    catch
    {
        return "Item caught: cookie = " + cookie + " / pageId = " + pageId + " / inOut = " + inOut +
               " / pageId = " + pageId + " / itemId = " + itemId + " period = " + period;
    }
    finally
    {
        conn.Close();
    }
}

At the moment I'm just trying to insert the value of the cookie into the database as a test. It works fine when I

Errors during bulk insert

Hi, while writing a stored procedure in SQL 2008 for a bulk insert, I am getting several errors. Part of the SP is written below for your reference. Need to know where I am going wrong. Any help will be thankfully acknowledged.
-Learn24x7
----------
CREATE PROCEDURE Sp_BulkInsert
(
    @Directory nvarchar(256),
    @TblName nvarchar(50),
    @Collation nvarchar(50),
    @FieldSeparator nvarchar(50),
    @RowTerminator nvarchar(50)
)
AS
BEGIN
    SET NOCOUNT ON;

    BULK INSERT [@TblName]
    FROM ''+@Directory+''                              --Error1: see below
    WITH                                               --Error2: see below
    (
        FIRSTROW = 2,
        CODEPAGE = (SELECT CAST(@Collation AS INT)),   --Error3: see below
        FIELDTERMINATOR = ''+@FieldSeparator+'',
        ROWTERMINATOR = ''+@RowTerminator+''
    )
END
GO
----------
FROM ''+@Directory+'' --Error1: An object or column name is missing or empty. For SELECT INTO statements, verify each column has a name. For other statements, look for empty alias names. Aliases defined as "" or [] are not allowed. Change the alias to a valid name.
WITH --Error2: Incorrect syntax near the keyword 'with'. If this statement is a commo
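All three errors share one root cause: BULK INSERT does not accept variables or expressions for the table name, the file path, or the WITH options, so the statement has to be assembled as a string and executed dynamically. A minimal sketch under that assumption, keeping the original parameter list (the REPLACE calls are only naive quote-escaping, and callers must be trusted since the values end up concatenated into the statement):
----------
CREATE PROCEDURE dbo.Sp_BulkInsert
    @Directory      nvarchar(256),
    @TblName        nvarchar(50),
    @Collation      nvarchar(50),
    @FieldSeparator nvarchar(50),
    @RowTerminator  nvarchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql nvarchar(max);

    -- Build the whole BULK INSERT statement as one string, then execute it.
    SET @sql = N'BULK INSERT ' + QUOTENAME(@TblName)
             + N' FROM '''              + REPLACE(@Directory, '''', '''''')      + N''''
             + N' WITH ( FIRSTROW = 2,'
             + N' CODEPAGE = '''         + REPLACE(@Collation, '''', '''''')      + N''','
             + N' FIELDTERMINATOR = '''  + REPLACE(@FieldSeparator, '''', '''''') + N''','
             + N' ROWTERMINATOR = '''    + REPLACE(@RowTerminator, '''', '''''')  + N''' );';

    EXEC sys.sp_executesql @sql;
END
GO
----------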

Unable to prepare the SSIS bulk insert for data insertion

Hi, I am using SQL Server Destinations in my data flow tasks. I was running this package on the server until I encountered this error:

OnError,,,LOAD AND UPDATE Dimension Tables,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1071636367,0x,Unable to prepare the SSIS bulk insert for data insertion.
OnError,,,Load Dimensions,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1071636367,0x,Unable to prepare the SSIS bulk insert for data insertion.
OnError,,,Discount Reason, ISIS Condition, ISIS Defect, ISIS Repair, ISIS Section, ISIS Symptom, Job Status, Parts, Purchase SubOrder Type, Service Contract, Service Reason, Service Type, TechServiceGrp, WarrantyType, Branch, Wastage Reason,,,10/24/2007 1:22:23 PM,10/24/2007 1:22:23 PM,-1073450974,0x,SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Dim_T_ISISDefect" (56280) failed with error code 0xC0202071. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

What could be the reason for this? I don't usually have an error.

cherriesh

SQL Server Destination error - Unable to prepare the SSIS bulk insert for data insertion.

Having searched the forum, this one clearly has form... However, beyond assisting those who have fallen at the first hurdle (i.e. forgetting/not knowing that they cannot execute the package remotely to the instance of SQL Server into which they are inserting), the issues raised by others have not been addressed. Thus I am bringing nothing new to the table here - just providing an executive summary of problems which others have run into, written about, but not received answers for.

First the complete error:

Description: Unable to prepare the SSIS bulk insert for data insertion.  End Error
Error: 2008-01-15 04:55:27.58     Code: 0xC004701A     Source: <xxx> DTS.Pipeline
Description: component "<xxx> failed the pre-execute phase and returned error code 0xC0202071.  End Error
DTExec: The package execution returned DTSER_FAILURE (1).  Started:  4:53:34 AM  Finished: 5:00:00 AM  Elapsed:  385.384 seconds.  The package execution failed.  The step failed.

Important points:
- It mostly works - it produces no error more than 9 times out of 10.
- It fails on random dataflows - my package has several dataflows, (mostly) executing concurrently. Where the error occurs it does not do so on the same dataflow each time: on one run it'll fail on dataflow A whilst B, C, D and E succeed, then A-E will all succeed (and continue doing so for the next ten runs thereafter), and

Get error message from bulk insert task.

Hello, I've created a dtsx package that bulk inserts some data from a CSV file into a SQL table (using the BULK INSERT TASK). Because this package is scheduled to run automatically, I also added an EXECUTE SQL TASK that will insert a row into a 'log table'. It is linked to the BULK INSERT task with a 'Success constraint' (green arrow), so I know about all the successes. But I am more interested in the failures, so I also added another EXECUTE SQL TASK and linked it to the BULK INSERT task with the 'Failure constraint' (red arrow).

Question: how can I get the error message from the BULK INSERT TASK (like 'file does not exist' or 'error converting .....') and store it in a field in the 'log table'?

Thanks in advance, Peter

How to do a parameterized bulk insert into SQL Server 2008 R2 from C# (.NET 4.0)

Hi, I'm trying to execute a BULK INSERT with a parameterized SqlCommand from a C# .NET 4.0 project. This is the code:

private System.Data.SqlClient.SqlCommand sqlCmd2;
private string sqlCmd2String = "BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs FROM @name WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)";
private System.Data.SqlClient.SqlParameter[] sqlCmd2Params = new System.Data.SqlClient.SqlParameter[] {
    new System.Data.SqlClient.SqlParameter("@name", System.Data.SqlDbType.VarChar)
};

(...)

sqlCmd2Params[0].Value = source;
sqlCmd2.Parameters.Clear();
sqlCmd2.Parameters.AddRange(sqlCmd2Params);
sqlCmd2.ExecuteNonQuery();

where source is a string containing the full path to a file, i.e. a value like "C:\\TEMP\\MSGTRK20100701-1.LOG_EXHUB01". When I execute this code, I get an Exception with the message "Incorrect syntax near '@name'." In the database trace I see an RPC:Completed event with TextData equal to

exec sp_executesql N'BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs FROM @name WITH (FIELDTERMINATOR='','', ROWTERMINATOR='' '', FIRSTROW=2)',N'@name varchar(36)',@name='C:\TEMP\MSGTRK20100701-1.LOG_EXHUB01'

I tried using verbatim string literals, but that doesn't seem to help. Any idea?
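BULK INSERT is one of the statements whose file name must be a literal, so it cannot be parameterized the way an ordinary query can; the sp_executesql call in the trace fails for exactly that reason. One workaround, sketched below as a stored procedure with a hypothetical name, is to accept the path as a normal parameter and build the literal into dynamic SQL on the server side:
----------
CREATE PROCEDURE dbo.BulkLoadParsedLogs
    @name nvarchar(260)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql nvarchar(max);

    -- Splice the file path into the statement as a literal (naive quote-escaping only).
    SET @sql = N'BULK INSERT ExchangeTrackingLogs.dbo.ParsedLogs FROM '''
             + REPLACE(@name, '''', '''''')
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';

    EXEC sys.sp_executesql @sql;
END
GO
----------
The C# code can then execute "EXEC dbo.BulkLoadParsedLogs @name" with an ordinary SqlParameter instead of parameterizing the BULK INSERT text itself.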

Bulk insert CSV files

Hello, I am trying to bulk insert a CSV file, and I can't seem to find the right row terminator. When I use \m I get an error message that an unexpected EOF has occurred, which usually means a carriage return at the end of the file. I removed the carriage return from the file, but I am still getting the same error message. I have tried the following row delimiters: \r, \n, {CR}, {CR}{LF}. These work, in that the data is indeed bulk loaded into the table; however, it dumps the majority of the data into the last column, so 7 or 8 columns are stuffed into 1. Has anyone had this issue before? Any help would be appreciated.

Dave, SQL Developer
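One thing worth checking, since removing the trailing carriage return changed nothing: if the file ends its lines with a bare line feed (LF only, no CR), none of the escape-sequence terminators above may match what is actually in the file. A hedged sketch of a common workaround is to splice a real CHAR(10) into a dynamically built statement (table name and path below are hypothetical):
----------
DECLARE @sql nvarchar(max);

-- ROWTERMINATOR is set to an actual line-feed character via CHAR(10),
-- which avoids guessing at escape sequences like \n or {CR}{LF}.
SET @sql = N'BULK INSERT dbo.TargetTable
             FROM ''C:\Data\import.csv''
             WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + CHAR(10) + N''');';

EXEC (@sql);
----------
If the columns still collapse into the last field after that, a field-terminator mismatch (for example commas inside quoted CSV values, which BULK INSERT does not understand) is the more likely culprit.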

Permission Denied to SA on Bulk insert

Dear All,

I am experiencing a strange problem (SQL 2005, build 9.0.1399) and receiving the following error:

Msg 4834, Level 16, State 4, Line 1
You do not have permission to use the bulk load statement.

I am running stored procedure SP1. This stored procedure calls another one, SP2. Stored procedure SP2 contains a dynamic SQL statement to insert data into the temporary table created by SP1 and passed to SP2 via a table-name parameter. Something like:

SET @szSQL = 'BULK INSERT ' + @TName + ' FROM ' + '''' + @FilePath + '''' +
             ' WITH ( ' +
             ' DATAFILETYPE = ''char'', ' +
             ' FIELDTERMINATOR = ' + '''' + @Terminator + ''', ' +
             ' ROWTERMINATOR = ''\n'' ' + ' ) '
EXEC (@szSQL)

If I execute SP1 I receive the error:

Msg 4834, Level 16, State 4, Line 1
You do not have permission to use the bulk load statement.

But I can execute SP2 with no problem. All operations are done under the SA login. Any thoughts/ideas?

Thanks

BULK INSERT with a UNC path does not work even with delegation

I have a stored procedure that uses BULK INSERT with a UNC path to another server. I have enabled both the server and the domain account the SQL Server services are running under to allow all delegation. The UNC path allows Everyone Full Control. I can log into my computer using the SQL Server service account and browse to the file and open it. I can log onto the SQL Server with the service account or my account and browse to the file and open it. Whenever I run the stored procedure or execute the BULK INSERT from SSMS I get the following:

Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\nas\shared\ZipSales\FLAT.txt" could not be opened. Operating system error code 5(Access is denied.).

I can log into the server console using my user account, open SSMS on the server, and the stored procedure or the raw BULK INSERT work just fine. It's only a problem when I execute it from an end-user workstation. Moving the data file to the SQL Server works fine, but that is not a viable long-term solution. After scouring the Internet for most of two days, I finally stumbled across another post (http://social.msdn.microsoft.com/Forums/en-US/sqlsetupandupgrade/thread/512fe230-b94a-4fe0-83c8-201a150bbcec) that has this bit: "After watching forums, etc. for almost a year regarding this issue, we ran a MS Support Reque

Alternative to Bulk Insert

Hi, I have an ASP.NET web site which uses a bulk insert to insert a large number of records into a SQL Server (2008) database. I'm now migrating this application to a VB.NET application, and I'm trying to avoid uploading large txt files to the SQL Server in order for the bulk insert to work. I'd like to have the txt file on the client machine. The problem is that when I specify the text file (for the bulk insert) on the client machine, I get an error saying that the file could not be found, due to the fact that the server expects the file on its own file system.

The way the system works is this: the user specifies an Excel spreadsheet which is read by the application. It may contain several thousand records. I then want to insert the records into a SQL Server database. I want to avoid iterating through a DataTable and using an insert query for each row in the DataTable, as it just takes too long. On the web-based system (IIS and SQL Server on the same server) the DataTable is written to a text file and then the text file is used for a bulk insert. I'd like to read the spreadsheet into a text file on the client, but cannot use this for the bulk insert as the server cannot recognise the file.

Is there a way of using a data source text file which is not on the SQL Server, other than using shared folders? Alternatively is there ano