
Is SQLXMLBulkLoad4Class performance competitive for large file importing?

Posted By:      Posted Date: December 04, 2010    Points: 0    Category: Sql Server

I posted a SQLXMLBulkLoad4Class question in the XML and the .NET Framework forum, but no one answered, so I'm trying my luck here.

According to http://msdn.microsoft.com/en-us/library/ms171721.aspx -- "You can insert XML data into a SQL Server database by using an INSERT statement and the OPENXML function; however, the Bulk Load utility provides better performance when you need to insert large amounts of XML data." -- I thought the "SQLXML Bulkload 4.0 Type Library" was the component to use for fast imports of large XML files.

I created 10 tables with auto-increment primary keys and foreign keys in a SQL Server database and used the following code to shred a 390 MB file into those tables without any transformation.  It took around an hour.  What a disappointment!  Flattening the file into an in-memory table and importing it with SqlBulkCopy, or using SSIS, took only around 6 minutes.

Has anyone else experienced this?  Is SQLXMLBulkLoad4Class simply that slow, or am I misusing it?




                SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class objBL = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
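For reference, a typical invocation looks roughly like the sketch below (the connection string, file paths, and mapping schema are placeholders, not the poster's actual values). The XSD mapping schema drives how elements are shredded into the 10 tables, and a transacted load is notably slower than a non-transacted one.

    SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class objBL = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
    // Placeholder connection string and paths -- adjust to your environment
    objBL.ConnectionString = "Provider=SQLOLEDB;Data Source=(local);Initial Catalog=MyDb;Integrated Security=SSPI;";
    objBL.ErrorLogFile = @"C:\temp\bulkload_errors.log";
    objBL.KeepIdentity = false;   // let SQL Server generate the auto-increment keys
    objBL.Transaction = false;    // transacted loads buffer to disk and are slower
    objBL.Execute(@"C:\temp\mapping.xsd", @"C:\temp\data.xml");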


More Related Resource Links

How to read and analyze a very large log file with high efficiency and performance?

It is a very large .txt file (more than 3 MB), produced every day. The content is a system log of user activity, like the following:

2007-11-01 18:20:42,983 [4520] INFO GetXXX() SERVICE START
2007-11-01 18:21:42,983 [4520] WARNING USER ACCESS DENIED
2007-11-01 18:22:42,983 [4520] ERROR INPUT PARAMETER IS NULL CAN NOT CONVERT TO INT32
2007-11-01 18:23:59,968 [4520] INFO USER LOGOUT

Any ideas? Thanks.
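One hedged sketch of a streaming approach (the log format is assumed from the sample lines above): File.ReadLines enumerates the file lazily, so only one line is held in memory at a time regardless of file size.

    using System;
    using System.Collections.Generic;
    using System.IO;

    static void AnalyzeLog(string path)
    {
        // Tally entries per severity level without loading the whole file
        var counts = new Dictionary<string, int>();
        foreach (string line in File.ReadLines(path))   // lazy enumeration (.NET 4+)
        {
            // Assumed layout: date time [thread] LEVEL message...
            string[] parts = line.Split(new[] { ' ' }, 5);
            if (parts.Length < 4)
                continue;
            string level = parts[3];                    // INFO, WARNING, ERROR, ...
            int n;
            counts.TryGetValue(level, out n);
            counts[level] = n + 1;
        }
        foreach (KeyValuePair<string, int> kv in counts)
            Console.WriteLine(kv.Key + ": " + kv.Value);
    }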

Importing large dat file into sql server database

I have a .dat file consisting of thousands of columns that I have to insert into a database, spread across multiple data tables. I am looking for the best possible way to do that. I have looked into bulk insert as well, but is there a better way, such as normalizing the data in the data access layer and inserting into the tables, or creating SSIS packages? Any help is highly appreciated.
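As a rough illustration of the data-access-layer route (the delimiter, column layout, and table names below are made up, since the real .dat layout isn't given): parse the file into one DataTable per target table, then bulk-insert each table with SqlBulkCopy, which is generally far faster than row-by-row INSERTs.

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static void LoadDatFile(string path, string connString)
    {
        // Hypothetical target: a single "Customers" table; repeat per table
        var customers = new DataTable("Customers");
        customers.Columns.Add("Name", typeof(string));
        customers.Columns.Add("City", typeof(string));

        foreach (string line in File.ReadLines(path))   // streams the .dat file
        {
            string[] fields = line.Split('|');          // assumed delimiter
            customers.Rows.Add(fields[0], fields[1]);   // route fields per table here
        }

        using (var bulk = new SqlBulkCopy(connString))
        {
            bulk.DestinationTableName = "dbo.Customers";
            bulk.WriteToServer(customers);
        }
    }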

While Importing Excel 2007 file to Datatable - headerrow problem


Hi there,


I am trying to simply extract Excel data from an uploaded file and put it into a DataTable. In this case the Excel file has 3 rows, but when I fill the DataTable I only see a row count of 2.

I tried changing HDR:NO to HDR:YES and vice versa, but no luck.

What am I doing wrong? (Note: the Excel file cannot have a header row.)


// "HDR=NO" (equals sign, not colon) is the correct Extended Properties syntax.
// A malformed "HDR:NO" entry is most likely ignored, so the provider falls back
// to its default of treating the first row as a header -- one row goes missing.
string connstr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + pFilePath +
                 ";Extended Properties=\"Excel 12.0;IMEX=1;HDR=NO;\"";
OleDbConnection conn = new OleDbConnection(connstr);
conn.Open();   // GetOleDbSchemaTable needs an open connection
DataTable dtTables = conn.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Tables, null);
string strTablename = dtTables.Rows[0]["TABLE_NAME"].ToString();
string strSQL = "SELECT * FROM [" + strTablename + "]";

OleDbCommand cmd = new OleDbCommand(strSQL, conn);

DataTable dt = new DataTable();
OleDbDataAdapter da = new OleDbDataAdapter(cmd);
da.Fill(dt);   // the original snippet never filled dt
// With HDR=NO in effect, the row count here is 3 as expected




importing .csv file into MS SQL 2005

Greetings :-) I need to import data from my Excel .csv file into a database in MS SQL Server 2005. Can someone show me the steps? Is there a way to do this with a wizard, since I'm not a DBA? Thanks in advance for your assistance.

Upload large file via webservice problem in vb.net 1.1

Hi All, I am uploading a large file via a web service using the code below, in VB.NET 1.1. The problem is that VB.NET 1.1 does not support the Using block, and I would like to know what equivalent construct can replace Using in VB.NET 1.1.

Public Function UploadLargeFile(ByVal FileName As String, ByVal buffer As Byte(), ByVal Offset As Long) As Boolean
    Dim retVal As Boolean = False
    Try
        Dim FilePath As String = Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory.ToString() & "\fpath\", FileName)
        If Offset = 0 Then
            File.Create(FilePath).Close()
        End If
        Using fs As New FileStream(FilePath, FileMode.Open, FileAccess.ReadWrite, FileShare.Read)
            fs.Seek(Offset, SeekOrigin.Begin)
            fs.Write(buffer, 0, buffer.Length)
        End Using
        retVal = True
    Catch ex As Exception
        Throw ex
    End Try
    Return retVal
End Function
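For what it's worth, Using is essentially compiler shorthand for a Try/Finally that disposes the object, so a VB.NET 1.1-compatible rewrite might look like this sketch (requires Imports System.IO):

    ' A sketch of the same function without Using, for VB.NET 1.1
    Public Function UploadLargeFile(ByVal FileName As String, ByVal buffer As Byte(), ByVal Offset As Long) As Boolean
        Dim retVal As Boolean = False
        Dim fs As FileStream = Nothing
        Try
            Dim FilePath As String = Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory.ToString() & "\fpath\", FileName)
            If Offset = 0 Then
                File.Create(FilePath).Close()
            End If
            fs = New FileStream(FilePath, FileMode.Open, FileAccess.ReadWrite, FileShare.Read)
            fs.Seek(Offset, SeekOrigin.Begin)
            fs.Write(buffer, 0, buffer.Length)
            retVal = True
        Finally
            ' Using is shorthand for Try/Finally + Dispose; on Framework 1.1 call Close()
            If Not fs Is Nothing Then
                fs.Close()
            End If
        End Try
        Return retVal
    End Function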

Uploading Large File (40mb) fails... (webhttpbinding) (Azure)

I'm using JavaScript to upload a file (a 40 MB file) to my WCF service hosted on Azure. Here are the snippets from the web.config for the service:

<system.serviceModel>
  <bindings>
    <webHttpBinding>
      <binding transferMode="Streamed" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647"
               openTimeout="00:25:00" closeTimeout="00:25:00" sendTimeout="00:25:00" receiveTimeout="00:25:00"
               name="WebConfiguration">
      </binding>
    </webHttpBinding>
  </bindings>
  <behaviors>
    <endpointBehaviors>
      <behavior name="RestfulEndpointBehavior">
        <webHttp/>
      </behavior>
    </endpointBehaviors>
    <serviceBehaviors>
      <behavior name="CPUploadServiceBehavior">
        <serviceMetadata httpGetEnabled="true" httpGetUrl=""/>
        <serviceDebug includeExceptionDetailInFaults="false"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
  <services>
    <service name="CPUpload" behaviorConfiguration="CPUploadServiceBehavior">
      <endpoint add

Performance Point: Code blocks are not allowed in this file.

Hi All,

I'm trying to use PerformancePoint 2010 in SharePoint 2010. I have started the application in Central Administration and created a site collection based on the Business Intelligence Center template. I can browse the new site collection, but when I try to open the "Start using PerformancePoint Services" link I get the following error:

"An error occurred during the processing of /PerformancePoint/Pages/ppssample.aspx. Code blocks are not allowed in this file."

Is there any way to fix it? And is there any good source for understanding and deploying PerformancePoint 2010?

Thanks All,
Vit

Importing a .txt file

Hi, I'm currently struggling with this problem: I am using Microsoft Visual Studio 2008, and the database in my web application is an .mdf file. My supervisor gave me two files, student.txt and staff.txt, and I need to import both into my web application. I am new to this language and have no idea how to do this. Please provide some sample code for this problem. Thank you.
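A hedged starting point (the real column layout of student.txt is unknown, so a tab-delimited "id, name" format and the table/column names are assumed): read the file line by line and insert each row with a parameterized command.

    using System.Data.SqlClient;
    using System.IO;

    static void ImportStudents(string filePath, string connString)
    {
        // For an .mdf database, use an AttachDbFilename-style connection string
        using (var conn = new SqlConnection(connString))
        {
            conn.Open();
            foreach (string line in File.ReadAllLines(filePath))
            {
                string[] fields = line.Split('\t');        // assumed delimiter
                using (var cmd = new SqlCommand(
                    "INSERT INTO Students (StudentId, Name) VALUES (@id, @name)", conn))
                {
                    cmd.Parameters.AddWithValue("@id", fields[0]);
                    cmd.Parameters.AddWithValue("@name", fields[1]);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }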

importing excel file into sql server 2005 in asp .net using c#

Dear friends, I want to import an Excel file into SQL Server 2005 in ASP.NET using C#. Can anyone help me? Rgds, RK
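One common pattern, sketched with placeholder sheet name, file path, and destination table: read the worksheet into a DataTable via the OLE DB provider, then push it to SQL Server with SqlBulkCopy.

    using System.Data;
    using System.Data.OleDb;
    using System.Data.SqlClient;

    static void ImportExcel(string xlsPath, string sqlConnString)
    {
        // Jet provider for .xls files; use the ACE provider for .xlsx
        string excelConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + xlsPath +
                           ";Extended Properties=\"Excel 8.0;HDR=YES;\"";
        var dt = new DataTable();
        using (var da = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", excelConn))
            da.Fill(dt);

        using (var bulk = new SqlBulkCopy(sqlConnString))
        {
            bulk.DestinationTableName = "dbo.ImportedData";   // hypothetical target
            bulk.WriteToServer(dt);
        }
    }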

Getting error while importing xml file under my SSP

Using the Application Definition Designer, I connected to a database and added a table. Then I exported the XML file from the Application Definition Designer to my local folder. After that, in Central Administration under my SSP, I tried to import the XML file from my local folder. During the import process, I got the following:

Application definition was successfully imported. 2 warning(s) found. Consider fixing the warnings and updating the application definition.

Warnings:
No method instance of type SpecificFinder defined for application 'BDC ', entity 'GetInfoDetails'. Profile page creation skipped.
No method instance of type SpecificFinder defined for application 'BDC ', entity 'GetQualification'. Profile page creation skipped.

Please help me find a solution for this ASAP.

400 Bad Request when trying to send large file to service using basichttp streaming.


Hi All,


I am trying to send a large file (approx. 1.5 GB) to a WCF service for upload, using basic HTTP streaming.

I have increased all the timeouts and quotas in the server's and client's config files to quite large values, but I am still getting the Bad Request error on the client.

Also, the Bad Request message is always returned 1 minute 30 seconds after the request is sent to the server.

The most unusual thing is that when I use Fiddler or Charles to debug the communication, everything works fine. Could it be that Fiddler keeps the connection to IIS or the service alive, so the request doesn't get timed out?




Following are the details:

Server Config:


 <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
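To help isolate whether the 90-second failure comes from the service or the client, one option is a minimal .NET test client that streams the request body instead of buffering it (URL and file path below are placeholders; a non-.NET client would need the equivalent settings):

    using System;
    using System.IO;
    using System.Net;

    static void UploadStreamed(string url, string filePath)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.AllowWriteStreamBuffering = false;   // stream; don't hold 1.5 GB in memory
        request.SendChunked = true;                  // no up-front Content-Length needed
        request.Timeout = 25 * 60 * 1000;            // match the 25-minute binding timeouts
        request.ReadWriteTimeout = 25 * 60 * 1000;

        using (Stream body = request.GetRequestStream())
        using (FileStream file = File.OpenRead(filePath))
        {
            var buffer = new byte[64 * 1024];
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                body.Write(buffer, 0, read);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
            Console.WriteLine(response.StatusCode);
    }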

retrieve large file from database?


I have a database table that works as a file repository.  Binaries are stored in there, and I want to pull the "large" ones out in chunks.  Some of these files are in excess of 500 MB.  My business rules dictate that files over 5 MB are transmitted in chunks; files under 5 MB I can load into memory and send out directly.  I got the uploading in chunks to work, but how do I pull the data out of the DB in chunks?

Right now I'm getting hit with a System.OutOfMemoryException, but when I recreate a byte array of the SAME size (empty, though), it doesn't break.

Download Chunks (DAL)

public byte[] getBytesByDataID(int chunkSize, string dataID)
{
    string query = "SELECT data.data " +
        " FROM data " +

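The standard chunked-read approach, sketched under the assumption that the table and column are both named "data" with a "dataID" key (as the truncated query above suggests): CommandBehavior.SequentialAccess makes SqlDataReader.GetBytes pull the BLOB from the server a chunk at a time instead of materializing the whole column, which avoids the OutOfMemory exception.

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static void StreamBlobToFile(string connString, string dataID, string outPath, int chunkSize)
    {
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(
            "SELECT data.data FROM data WHERE dataID = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", dataID);
            conn.Open();
            // SequentialAccess streams the column instead of buffering it
            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            using (FileStream output = File.Create(outPath))
            {
                if (reader.Read())
                {
                    var buffer = new byte[chunkSize];
                    long offset = 0;
                    long read;
                    while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }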
Large File upload Problem.


I have a problem with large file uploads. A few users complain that their files either fail to upload or take 20 to 30 minutes.

I set everything up correctly and checked on different machines, where it worked fine, but a few customers out of 5000 still complain about it. Any idea what I can check on the client machines?

I set my upload limit to 100 MB, properly configured, and tested it. But one strange thing happens: when I try to upload a group of files totaling more than 100 MB (with no individual file in the group over 100 MB), SharePoint lets it through, and I can see the 300 MB group of files at once in Explorer view.

My understanding was that the limit applies to an individual file or to a group of files. Any thoughts?

Thanks in advance.

I am the SharePoint administrator.

Using response.outputstream.write to stream large file fails in firefox



I have some code to stream files (for the security of the files). The code works for all files in IE, and for small files in Firefox, but if the file is large (e.g. 750 MB), Firefox hangs before it shows the open/save dialog box.

However, if I take out Response.ClearHeaders and Response.Flush, the files do download, but every large file downloaded by Firefox is corrupt because the reported file size is wrong.  Filenames are GUIDs, but I took the "-" out in case that was an issue.

If anyone can shed some light on this, I would be very grateful.

The function is below.


Private Sub StreamFile(ByRef response As HttpResponse, ByVal sFile As String)
    Dim fs As FileStream = Nothing
    Dim sFileName As String = Path.GetFileName(sFile)

    Dim oFile As New FileInfo(sFile)

    Try
        fs = New FileStream(sFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
    Catch ex As Exception
        ' note: swallowing the exception here hides failures to open the file
    End Try

    'response.ClearHeaders() 'makes file size wrong in firefox
    response.AddHeader("Content-Length", oFile.Length.ToString())
    'To forcefully download, even for Excel, PDF files, reg

SSIS flat file performance improvement


I have 10 text files averaging 6 GB in size; 4 of the 10 average around 10 GB.

We have 10 SSIS packages that load those text files into 2 staging tables, 5 packages per table.

Inside each package there is only a flat file source, a data conversion for all columns from the source, and a write into the destination table.

This process takes around 5 hours to complete. The sizes of the 2 tables after the load completes are 60 M and 40 M.

The other thing I have observed is that all the packages run on one schedule, and that schedule has 10 different SQL Server jobs (one per package). Does that have an impact on the processing time, as all these jobs will be

Importing XML file getting an error


I'm trying to import an XML file using an XML Source, but I get an error when I try to generate the XSD:

"Unable to infer the XSD from the XML file. The XML contains multiple namespaces."

I have been researching using an XML Task instead, but I couldn't figure out how to strip

xmlns:dt="urn:schemas-microsoft-com:datatypes" & dt:dt="xxxxxx"

from the XML source. Without those, I have no issues. Any ideas would be appreciated. Thanks!


  <?xml version="1.0" encoding="ISO-8859-1" ?>
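As a hedged pre-processing workaround (file names here are placeholders): rewriting every element into the empty namespace and dropping the namespace declarations and namespace-qualified attributes such as dt:dt produces a file the XML Source can infer an XSD from.

    using System.Linq;
    using System.Xml.Linq;

    static void StripNamespaces(string inPath, string outPath)
    {
        XDocument doc = XDocument.Load(inPath);
        foreach (XElement el in doc.Descendants().ToList())
        {
            el.Name = el.Name.LocalName;   // move the element into no namespace
            // Drop xmlns declarations and namespace-qualified attributes (e.g. dt:dt)
            el.Attributes()
              .Where(a => a.IsNamespaceDeclaration || a.Name.Namespace != XNamespace.None)
              .Remove();
        }
        doc.Save(outPath);
    }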

File Import Performance



I'm looking for some tips / suggestions for the best way to handle file imports. Essentially, I have two scenarios where files are imported, and I would like to have one unified solution for importing.

The first scenario is files dropped into an FTP folder that are PGP encrypted. During processing, I want to decrypt these in memory so as never to have an unencrypted file sitting on the server. The second scenario is a user uploading a file via the .NET FileUpload control; these files will NOT be encrypted. Natively, I can grab FileUpload.FileBytes to get a byte array of the data.

I'm using a 3rd party library (File Helpers) to parse the file and import it into the database. The File Helpers library will accept any of the following: a file path, a string, or a stream (TextReader). In this case, I think the stream is the way to go.

Ultimately, my question is: what is the best way to go from a byte array to a TextReader? I am doing the following:

MemoryStream ms = new MemoryStream(FileBytes);    // wrap the uploaded bytes in a stream
StreamReader strmReader = new StreamReader(ms);   // StreamReader derives from TextReader

FileBytes is a byte array taken from the FileUpload control. With the PGP files, I can decrypt straight into a MemoryStream, so I'll just hand that to a StreamReader and be good.

Thanks for the help!

