.NET Tutorials, Forums, Interview Questions And Answers
Out of memory when trying to validate very large XML file

Posted Date: October 15, 2010    Points: 0    Category: ASP.Net

Hello everyone,

I am trying to perform XSD validation against very large XML files (over 400 MB). When I run the following code (which I pulled out of my application, hard-coding the file names for this example), I get a "System.OutOfMemoryException":

    string XSDFile = @"D:\TestFiles\DatasetFiles\DiCi_XSD_Test\dc-schema-1.6.xsd";
    string XMLFile = @"D:\TestFiles\DatasetFiles\DiCi_XSD_Test\10304.0.1.xml";

    XmlReaderSettings settings = new XmlReaderSettings();
    settings.Schemas.Add(null, XSDFile);
    settings.ValidationType = ValidationType.Schema;
    XmlDocument document = new XmlDocument();
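A likely cause is that `XmlDocument` builds the entire 400 MB file as an in-memory DOM before validation even starts. A validating `XmlReader` streams the document node by node instead, so memory use stays flat regardless of file size. A minimal sketch (the post's file paths are replaced with inline strings so it is self-contained; for the real files you would pass `XSDFile` to `Schemas.Add` and `XMLFile` to `XmlReader.Create` exactly as in the original code):

```csharp
using System;
using System.IO;
using System.Xml;
using System.Xml.Schema;

class StreamingValidation
{
    static int errorCount = 0;

    static void Main()
    {
        // Tiny inline schema/document standing in for the 400 MB files.
        const string xsd = @"<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>
  <xs:element name='root' type='xs:int'/>
</xs:schema>";
        const string xml = "<root>not-a-number</root>";

        var settings = new XmlReaderSettings();
        settings.Schemas.Add(null, XmlReader.Create(new StringReader(xsd)));
        settings.ValidationType = ValidationType.Schema;
        // Without a handler the first error throws; with one, errors are reported
        // via callbacks and reading continues.
        settings.ValidationEventHandler += (s, e) => errorCount++;

        // The reader streams and validates in one pass; no DOM is ever built,
        // so memory stays flat even for multi-hundred-MB inputs.
        using (var reader = XmlReader.Create(new StringReader(xml), settings))
        {
            while (reader.Read()) { }
        }
        Console.WriteLine(errorCount);
    }
}
```

For the real files, replace the two `StringReader`s with the file paths; nothing else changes.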


More Related Resource Links

File upload in chunks, or avoiding buffering in memory before writing to disk?


What are the options for handling file uploads to reduce the memory footprint? Is there a way to upload in chunks? Is there a way to stream the upload directly to disk instead of loading the entire file into server memory?
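Both are possible. In .NET 4 and later, `HttpRequest.GetBufferlessInputStream()` exposes the request body as a forward-only stream, which can be copied straight to a `FileStream` in small chunks so the upload is never held in memory. A minimal sketch of the chunked copy, with `MemoryStream`s standing in for the request stream and the disk file so it runs anywhere (the 8 KB chunk size is an arbitrary choice):

```csharp
using System;
using System.IO;

class ChunkedCopy
{
    // Copy the input stream to the output in fixed-size chunks so the
    // whole upload is never buffered in memory at once.
    static long CopyInChunks(Stream input, Stream output, int chunkSize)
    {
        byte[] buffer = new byte[chunkSize];
        long total = 0;
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    static void Main()
    {
        var source = new MemoryStream(new byte[100000]); // stands in for Request.GetBufferlessInputStream()
        var dest = new MemoryStream();                   // stands in for a FileStream on disk
        Console.WriteLine(CopyInChunks(source, dest, 8192));
    }
}
```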


Consistently running out of page file memory with full text indexer

Using MS SQL Server 2008 SP1 x64 Standard Edition on Windows 2008 R2 Enterprise, I'm about to full-text index, for the first time, 1 table and 2 views. The table contains about 250,000 entries with a data space of 180 MB. As soon as I activate full-text indexing, the fdhost.exe task starts to consume, slowly but surely, all the available page file space (this can easily be watched using Resource Monitor and the Commit Charge graph on the Memory tab). Once all the virtual memory has been consumed, the server becomes unusable: it can't open any new windows any more, and RDP stops working.

The machine specs are as follows:

- 12 GB of RAM
- 80 GB free on hard disk out of 136 GB
- 8 CPUs
- Custom-size paging file between 24 GB and 60 GB (originally this was system-managed, but then the server ran out of memory sooner)
- Max SQL Server memory set to 6 GB (first 10 GB, then 8 GB)

I've set the max full-text crawl range to 8. During the indexing, the 8 CPUs are a bit busy for a while, but not excessively. What is astonishing is that there is almost no use of physical memory during the indexing (I can see an increase from 2 GB to 3 GB, which still leaves plenty of RAM available). Does anybody have an idea how I can convince fdhost.exe to consume physical memory and leave the paging memory alone? Or what else can I try?

Upload large file via webservice problem in vb.net 1.1

Hi All, I am uploading a large file via a web service using the code below in VB.NET 1.1. The problem is that VB.NET 1.1 does not support the `Using` block. I would like to know what other construct I can use in its place in VB.NET 1.1.

    Public Function UploadLargeFile(ByVal FileName As String, ByVal buffer As Byte(), ByVal Offset As Long) As Boolean
        Dim retVal As Boolean = False
        Try
            Dim FilePath As String = Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory.ToString() & "\fpath\", FileName)
            If Offset = 0 Then
                File.Create(FilePath).Close()
            End If
            Using fs As New FileStream(FilePath, FileMode.Open, FileAccess.ReadWrite, FileShare.Read)
                fs.Seek(Offset, SeekOrigin.Begin)
                fs.Write(buffer, 0, buffer.Length)
            End Using
            retVal = True
        Catch ex As Exception
            Throw ex
        End Try
        Return retVal
    End Function
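A `Using` block is just compiler shorthand for construct, use, then dispose in a `Finally` clause, and that expansion is available in any framework version. The C# sketch below shows the equivalent try/finally shape (the VB.NET 1.1 translation is mechanical: `Try`/`Finally` with `fs.Close()` in the `Finally` block); the offset-write logic mirrors the function above:

```csharp
using System;
using System.IO;

class TryFinallyWrite
{
    // Equivalent of a Using block: construct, use, and release in a
    // finally clause so the handle is closed even if the write throws.
    static void WriteAt(string path, byte[] buffer, long offset)
    {
        FileStream fs = null;
        try
        {
            fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.Read);
            fs.Seek(offset, SeekOrigin.Begin);
            fs.Write(buffer, 0, buffer.Length);
        }
        finally
        {
            if (fs != null) fs.Close();
        }
    }

    static void Main()
    {
        string path = Path.GetTempFileName();
        WriteAt(path, new byte[] { 1, 2, 3 }, 0);
        WriteAt(path, new byte[] { 4, 5 }, 3);
        Console.WriteLine(new FileInfo(path).Length); // 3 + 2 bytes written
        File.Delete(path);
    }
}
```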

Uploading Large File (40mb) fails... (webhttpbinding) (Azure)

I'm using JavaScript to upload a file (40 MB) to my WCF service hosted on Azure. Here are the snippets from the web.config for the service:

    <system.serviceModel>
      <bindings>
        <webHttpBinding>
          <binding transferMode="Streamed" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647"
                   openTimeout="00:25:00" closeTimeout="00:25:00" sendTimeout="00:25:00" receiveTimeout="00:25:00"
                   name="WebConfiguration">
          </binding>
        </webHttpBinding>
      </bindings>
      <behaviors>
        <endpointBehaviors>
          <behavior name="RestfulEndpointBehavior">
            <webHttp/>
          </behavior>
        </endpointBehaviors>
        <serviceBehaviors>
          <behavior name="CPUploadServiceBehavior">
            <serviceMetadata httpGetEnabled="true" httpGetUrl=""/>
            <serviceDebug includeExceptionDetailInFaults="false"/>
          </behavior>
        </serviceBehaviors>
      </behaviors>
      <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
      <services>
        <service name="CPUpload" behaviorConfiguration="CPUploadServiceBehavior">
          <endpoint add
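One thing worth checking in a setup like this: with `aspNetCompatibilityEnabled="true"`, ASP.NET's own request-size limit applies in addition to the WCF quotas, and the default `maxRequestLength` is 4096 KB (4 MB), which a 40 MB upload exceeds. A sketch of the fragment to raise it (values illustrative; `maxRequestLength` is in KB, `executionTimeout` in seconds):

```xml
<system.web>
  <!-- 65536 KB = 64 MB; raise executionTimeout for slow uploads -->
  <httpRuntime maxRequestLength="65536" executionTimeout="1500" />
</system.web>
```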

How to read and analyze a very large log file with high efficiency and performance?

It is a very large .txt file (more than 3 MB), produced every day; the content is the user's system log, like below:

    2007-11-01 18:20:42,983 [4520] INFO GetXXX() SERVICE START
    2007-11-01 18:21:42,983 [4520] WARING USER ACCESS DENIED
    2007-11-01 18:22:42,983 [4520] ERROR INPUT PARAMETER IS NULL CAN NOT CONVERT TO INT32
    2007-11-01 18:23:59,968 [4520] INFO USER LOGOUT

Any idea? Thanks.
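For files of this shape, reading one line at a time keeps memory use independent of file size. A minimal sketch that tallies lines per severity (the sample rows here spell WARNING out, and a `StreamReader` over the real .txt file would replace the `StringReader` used to keep the sketch self-contained):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class LogScan
{
    // Count log lines per severity while streaming one line at a time;
    // ReadLine never holds more than a single line in memory.
    static Dictionary<string, int> CountLevels(TextReader reader)
    {
        var counts = new Dictionary<string, int>();
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            foreach (string level in new[] { "INFO", "WARNING", "ERROR" })
            {
                if (line.Contains(level))
                {
                    counts[level] = counts.TryGetValue(level, out int n) ? n + 1 : 1;
                    break;
                }
            }
        }
        return counts;
    }

    static void Main()
    {
        string log = "2007-11-01 18:20:42,983 [4520] INFO GetXXX() SERVICE START\n" +
                     "2007-11-01 18:21:42,983 [4520] WARNING USER ACCESS DENIED\n" +
                     "2007-11-01 18:22:42,983 [4520] ERROR INPUT PARAMETER IS NULL\n";
        var counts = CountLevels(new StringReader(log));
        Console.WriteLine(counts["INFO"] + " " + counts["WARNING"] + " " + counts["ERROR"]);
    }
}
```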

400 Bad Request when trying to send large file to service using basichttp streaming.


Hi All,


I am trying to send a large file (approx. 1.5 GB) to the WCF service for upload using basic HTTP streaming.

I have increased all the timeouts and quotas in the server's and client's config files to quite large values, but I am still getting a Bad Request error on the client.

Also, the Bad Request message is always returned 1 minute 30 seconds after sending the request to the server.

The most unusual thing is that when I use Fiddler or Charles to debug the communication, everything works fine. Could it be that Fiddler keeps the connection to IIS or the service alive, so the request doesn't get timed out, or something like that?




Following are the details:

Server Config:


 <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
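For reference, the post doesn't include the binding element itself; for basic HTTP streaming, the binding typically needs `transferMode="Streamed"` and raised size/timeout quotas, roughly along these lines (names and values illustrative, not taken from the post):

```xml
<basicHttpBinding>
  <binding name="StreamedUpload"
           transferMode="Streamed"
           maxReceivedMessageSize="2147483647"
           sendTimeout="00:30:00"
           receiveTimeout="00:30:00" />
</basicHttpBinding>
```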

retrieve large file from database ?


I have a database table that works as a file repository. Currently there are binaries stored in there, and I want to pull the "large" ones out in chunks. Some of these files are in excess of 500 MB. I have business rules that dictate that if a file is >5 MB, it is transmitted in chunks; if <5 MB, I can load it into memory and rip it out. I got uploading in chunks to work, but how do I pull the data out of the DB in chunks?

Right now I'm getting hit with a System.OutOfMemoryException. But when I recreate a byte array of the SAME size (empty, though) it doesn't break.

Download Chunks (DAL)

    public byte[] getBytesByDataID(int chunkSize, string dataID)
    {
        string query = "SELECT data.data " +
            " FROM data " +
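On the database side, the usual pattern is a `SqlDataReader` opened with `CommandBehavior.SequentialAccess`, calling `reader.GetBytes(...)` repeatedly so only one chunk of the blob is materialized at a time instead of a 500 MB `byte[]`. The sketch below shows the chunking loop itself over a `Stream` so it is self-contained and runnable; against SQL Server, `reader.GetBytes` with a running offset would take the place of `blob.Read`:

```csharp
using System;
using System.IO;

class ChunkReader
{
    // Pull a blob in fixed-size pieces rather than one huge byte[].
    // With SQL Server, the same loop uses a SqlDataReader opened with
    // CommandBehavior.SequentialAccess and reader.GetBytes(...) here.
    static int ReadChunks(Stream blob, int chunkSize, Action<byte[], int> consume)
    {
        byte[] chunk = new byte[chunkSize];
        int chunks = 0;
        int read;
        while ((read = blob.Read(chunk, 0, chunk.Length)) > 0)
        {
            consume(chunk, read); // e.g. append to a FileStream on disk
            chunks++;
        }
        return chunks;
    }

    static void Main()
    {
        var blob = new MemoryStream(new byte[25000]); // stands in for the DB column
        int chunks = ReadChunks(blob, 8192, (buf, len) => { });
        Console.WriteLine(chunks); // 25000 bytes in 8192-byte chunks
    }
}
```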

Sizing a Pagefile on Servers with Large Amounts of Memory


I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16 GB or 32 GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2 GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32 GB of physical memory and make the pagefile 1.5 x 32 GB, I have a 48 GB pagefile. 10% of this is 4.8 GB, which I would hope never to see consumed.


Any thoughts?


Thanks,    Dave

Large File upload Problem.


I have a problem with large file uploads. A few users are complaining that their files either fail to upload or take 20 to 30 minutes.

I set everything up correctly, and I checked on different machines and it worked fine, but a few customers out of 5,000 still complain. Any idea what I can check on the client machine?

I set my upload limit to 100 MB, configured it properly, and tested it. But one strange thing happens: when I try to upload a group of files totaling more than 100 MB (with no individual file in the group over 100 MB), SharePoint lets it through, and I can see a 300 MB group of files uploaded at once in Explorer view.

My understanding was that when you set the limit, it applies to individual files and to groups of files alike. Any thoughts?

Thanks in advance.

I am the SharePoint administrator.

Validate every error in XML file



I am using XmlReader to validate an XML message against a schema using a handler. Whenever an error occurs, it throws an exception and the XmlReader can't continue reading. But my requirement is to list every error in the XML message. Could anybody please help me out?
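Attaching a `ValidationEventHandler` to the `XmlReaderSettings` changes the behavior from throw-on-first-error to callback-per-error, so the reader keeps going and every problem can be collected. A minimal self-contained sketch (inline schema and message with two invalid values, standing in for the real ones):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.Schema;

class CollectAllErrors
{
    static void Main()
    {
        const string xsd = @"<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>
  <xs:element name='root'>
    <xs:complexType>
      <xs:sequence>
        <xs:element name='n' type='xs:int' maxOccurs='unbounded'/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>";
        const string xml = "<root><n>1</n><n>oops</n><n>also bad</n></root>";

        var errors = new List<string>();
        var settings = new XmlReaderSettings();
        settings.Schemas.Add(null, XmlReader.Create(new StringReader(xsd)));
        settings.ValidationType = ValidationType.Schema;
        // With a handler attached, each validation error is reported here
        // instead of aborting the read with an XmlSchemaValidationException.
        settings.ValidationEventHandler += (s, e) => errors.Add(e.Message);

        using (var reader = XmlReader.Create(new StringReader(xml), settings))
        {
            while (reader.Read()) { }
        }
        Console.WriteLine(errors.Count); // one message per invalid element
    }
}
```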

Thanks & Regards,


How to avoid memory exhaustion when handling large image files in C#?


I have been developing a Forms application for handling a large number of image files. The number of image files can exceed 1,000, and each image is about 2 MB. The code is as follows:

    PictureBox[] pb = new PictureBox[iPictureBoxNumMax];
    Label[] lb = new Label[iPictureBoxNumMax];

    // 1st portion: create the PictureBoxes
    for (int i = 0; i < iPictureBoxNum; i++)
    {
        lb[i] = new Label();
        pb[i] = new PictureBox();
        pb[i].ImageLocation = @sImageListAll[i];
    }

    // 2nd portion: dispose of them again
    for (int i = 0; i < iPictureBoxNum; i++)
    {
        pb[i].Dispose();
        lb[i].Dispose();
    }

(1) If the number of image files is less than 300, the PictureBox creation code (the 1st portion) works. If the number is larger than that, a "memory exhaustion" error message is displayed.

(2) However, the second portion of the code (pb[i].Dispose()) doesn't seem to free the memory, since re-running the first portion gives the same "memory exhaustion" error message.

What should I do ?


Using response.outputstream.write to stream large file fails in firefox



I have some code to stream files (for security of the files). The code works for all files in IE, and for small files in Firefox, but if the file is large (e.g. 750 MB), Firefox hangs before it shows the open/save dialog box.

However, if I take out Response.ClearHeaders and Response.Flush, the download proceeds, but all the files downloaded by Firefox are corrupt because the reported file size is wrong. Filenames are GUIDs, but I took the "-" out in case that was an issue.

If any one can shed some light on this I would be very grateful.

Function is Below. 


    Private Sub StreamFile(ByRef response As HttpResponse, ByVal sFile As String)
        Dim fs As FileStream = Nothing
        Dim sFileName As String = Path.GetFileName(sFile)

        Dim oFile As New FileInfo(sFile)

        Try
            fs = New FileStream(sFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
        Catch ex As Exception

        End Try

        'response.ClearHeaders() 'makes file size wrong in firefox
        response.AddHeader("Content-Length", oFile.Length)
        'To forcefully download, even for Excel, PDF files, reg

Large files over HttpHandler and IIS7: out of memory



I have a problem with large responses and IIS7: the server runs out of memory.

I've written the test code below, which works pretty much like my real code. When I start to download the file, I can see memory usage rise until it hits 100%, and Firefox complains about a lost connection to the server; it looks like IIS7 does not release its cache or something. It works in IIS6, by the way.

Thanks in advance, Anders

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Threading;

namespace HttpHandlerTest
{
    public class FileHandler : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            int buffSize = 8192;
            int itterations = 1024 * 500;
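A plausible culprit with a response this size is ASP.NET's output buffering: unless it is turned off, everything written to `OutputStream` can accumulate server-side. A hedged sketch of the handler body (shown as comments, since `HttpContext` isn't available outside IIS; the runnable part just checks the scale of the repro's numbers):

```csharp
using System;

class StreamedResponseSketch
{
    // Sketch of ProcessRequest for IIS7 (assumption, not a confirmed fix):
    //
    //   context.Response.BufferOutput = false;   // stop ASP.NET buffering the whole response
    //   byte[] buff = new byte[8192];
    //   for (int i = 0; i < itterations; i++)
    //   {
    //       context.Response.OutputStream.Write(buff, 0, buff.Length);
    //       context.Response.Flush();            // push each chunk to the client
    //   }
    static void Main()
    {
        // The repro's numbers: 8192-byte chunks times 1024 * 500 iterations,
        // i.e. roughly a 4 GB response -- far beyond what IIS can buffer.
        long totalBytes = 8192L * 1024 * 500;
        Console.WriteLine(totalBytes);
    }
}
```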

Importing large dat file into sql server database

I have a .dat file consisting of thousands of columns that I have to insert into a database, spread across multiple data tables. I am looking for the best possible way to do that. I have looked into bulk insert, but is there a better way, such as normalizing the data in the data access layer and inserting into the tables, or creating SSIS packages? Any help is highly appreciated.
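One common middle ground is to parse the .dat file into a `DataTable` in the data access layer and hand it to `SqlBulkCopy.WriteToServer`, which streams rows into SQL Server far faster than row-by-row INSERTs. A sketch of the parsing half (the pipe delimiter and header row are assumptions, since the post doesn't describe the .dat layout; the bulk-copy call itself needs a live connection and is noted in the comment):

```csharp
using System;
using System.Data;

class DatToBulkCopy
{
    // Parse delimited rows into a DataTable. With a connection open,
    // new SqlBulkCopy(conn) { DestinationTableName = "..." }
    //     .WriteToServer(table);
    // would then stream the rows into SQL Server.
    static DataTable Parse(string[] lines, char delimiter)
    {
        var table = new DataTable();
        string[] header = lines[0].Split(delimiter);
        foreach (string col in header)
            table.Columns.Add(col);
        for (int i = 1; i < lines.Length; i++)
            table.Rows.Add(lines[i].Split(delimiter));
        return table;
    }

    static void Main()
    {
        // Hypothetical two-column sample standing in for the real .dat file.
        var table = Parse(new[] { "id|name", "1|alpha", "2|beta" }, '|');
        Console.WriteLine(table.Columns.Count + " " + table.Rows.Count);
    }
}
```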

SSRS 2008 log file - maximum memory limit



In my SSRS 2008 log file, in the first few lines, I see these entries:

INFO: Reporting Services starting SKU: Enterprise
resourceutilities!ReportServer_0-3!1874!10/05/2010-13:32:29:: i INFO: Maximum memory limit is 2097152Mb

What does "Maximum memory limit is 2097152Mb" mean?

I have set the maximum memory limit for SSRS to 5 GB.



Infopath 2007 form error - Selected file is too large


We are using an InfoPath form to capture some metadata along with a couple of file attachments. We implemented that and it's working fine.

When we tested uploading a file of more than 5 MB, we got the following error:

"The selected file is too large and is causing the form to exceed the amount of allowable resources. Select Another File"

During our analysis, we found that this can be resolved by changing one of the InfoPath Forms Services configuration settings, "Maximum size of form session state".

Is there any issue (like performance problems) with increasing this limit to allow us to upload the large attachment?

In other words:

In the "Maximum size of form session state" text box, type the maximum session state size in kilobytes. Form-filling sessions that exceed this value will terminate, an error message will be generated, and all form data entered during the session will be lost. The default value is 4096 kilobytes.

If I raise this value to 10240 kilobytes (10 MB), will it cause any performance issues?

Silverlight Uploading Large File to a ListItem Erroring Out


Hi everyone,


Having a problem when trying to upload a large file, say 4 MB, to a list item via a Silverlight web part using the Silverlight Client Object Model. I keep getting the following error: "The remote server returned an error: NotFound."

It's fine for small files.

I've already gone into Central Administration to see if I can adjust the max upload size, and I've increased it; I looked at web.config and increased it there as well, recycled the app pool, restarted the website; done everything.


What could be wrong?


