.NET Tutorials, Forums, Interview Questions And Answers

XP x64 issue with large tiff files

Posted By:      Posted Date: September 30, 2010    Points: 0    Category: WPF

I've got some code which opens and processes a large TIFF file (a bit over 2 GB, with around 1,000 frames in it).

This works fine on Windows 7 x64 but fails with an overflow error on XP x64.

Other software (including the Windows image browser) has no such problem.

Is there something I can do to make my WPF code cope with larger files on XP x64?

using (Stream instream = new FileStream(sourcePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    TiffBitmapDecoder tiffInDecoder = new TiffBitmapDecoder(instream,
        BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.None);
}
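One thing worth trying (a sketch, not a verified fix for the XP x64 codec): keep BitmapCacheOption.None so frames are decoded lazily, and touch only one frame at a time rather than materializing the whole file. ProcessFrame below is a hypothetical per-frame handler standing in for the real processing code.

```csharp
// Sketch: walk the TIFF one frame at a time; only the current frame is decoded.
// ProcessFrame is a placeholder for the actual per-frame processing.
using (Stream instream = new FileStream(sourcePath, FileMode.Open,
                                        FileAccess.Read, FileShare.Read))
{
    TiffBitmapDecoder decoder = new TiffBitmapDecoder(instream,
        BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.None);

    for (int i = 0; i < decoder.Frames.Count; i++)
    {
        ProcessFrame(decoder.Frames[i]);   // decode, use, then let the frame go
    }
}
```

Whether this avoids the overflow depends on the WIC TIFF codec shipped with XP x64; if the failure happens while the decoder parses the file directory itself, decoding lazily won't help.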


Iain Downs


More Related Resource Links

How to get around the file size issue when uploading large files?



In my application the user uploads three files (resume, cover letter, selection criteria).

I don't want users to upload files larger than 4 MB, so in my web.config file I have allowed a maximum of 5 MB: <httpRuntime maxRequestLength="5000"/>. I did this so that I can validate the file and give the user a message that they are trying to upload a file larger than 4 MB.

It all works fine if the user only uploads a resume. But if the user uploads all three files at 4 MB each, my validation does not work and the request ends in a connection timeout.

How can I handle the validation to check the file size of all three files?
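Two details matter here: maxRequestLength is specified in kilobytes, and it caps the whole request, not each file. Three 4 MB files arrive as one roughly 12 MB request, which exceeds the 5000 KB limit before any page code runs, hence the connection timeout rather than a validation message. A sketch, assuming three FileUpload controls with hypothetical names:

```csharp
// Sketch (ASP.NET Web Forms): check each posted file's size server-side.
// fuResume, fuCoverLetter, fuCriteria and lblError are hypothetical control names.
const int maxBytesPerFile = 4 * 1024 * 1024;   // 4 MB per file

FileUpload[] uploads = { fuResume, fuCoverLetter, fuCriteria };
foreach (FileUpload upload in uploads)
{
    if (upload.HasFile && upload.PostedFile.ContentLength > maxBytesPerFile)
    {
        lblError.Text = upload.ID + ": please keep each file under 4 MB.";
        return;
    }
}
```

For the check to run at all, web.config has to admit the combined request, e.g. <httpRuntime maxRequestLength="16384"/> (the value is in KB, so that is 16 MB).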


Issue with InfoPath and large attached files (over 27 MB)



I have an announcement list which has been customized in InfoPath (the "Customize Form" button). Nothing fancy, just adding some comments.

The form opens, and I attach a file over 27 MB (22 MB works with no problem).

I press Save, wait about 20 seconds, and an error occurs: "Error processing the form".

I press Retry; a popup window appears saying "Sending data to the server", and then an error window appears:


A Microsoft SharePoint Server State Service error occurred while processing your request. For more information, contact your server farm administrator. 

Click Start Over to load a new copy of the form. If this error persists, contact the support team for the Web site.

Click Close to exit this message. Correlation ID:cdb26c5b-ea08-4bb9-8525-34d9b16a30c8

I opened the logs and copy-pasted the relevant entry here:


11/04/2010 10:39:58.28 OWSTIMER.EXE (0x0FFC)                   0x130C

Document Conversion of large files



My MOSS 2007 environment fires OutOfMemory exceptions when attempting to launch a custom document converter on files larger than 50 MB (video files). Because my document converter is implemented as a standalone executable, I figure these exceptions are thrown by the SharePoint worker process while downloading/uploading from/to the temp folder, not by my custom converter.

If this is the case, can I apply any workarounds in the server configuration? I tried the /3GB switch without luck. Maybe SP1 will address this issue too?



How to manage large .trc files?

Hello everybody,

With RAID 10, and enough arrays to hold the files separately: will a RAID 10 array be identified as one logical drive, such as C:, D:, etc.?

I want to store trace files on the server to capture a representative workload for DB tuning with DTA. As far as I know, these will be very large, since my OLTP database is very busy from 8 a.m. to 7 p.m. and less busy at other times, and I don't want any event to be missed due to lack of resources. So how do I manage this?

Can I attach an external HD to the dedicated SQL Server to hold the .trc files? Is that advisable?

Please provide any available link for RAID 10 configuration best practices, or a standard RAID 10 configuration for optimum performance, or any suggestion regarding this issue.

Thanks in advance.

Server frequently generates large .mdmp files

Hi, I am running MS SQL Server 2005. The server frequently generates large .mdmp files, which take up C: drive space. Kindly tell me how I can stop these from being created and reclaim the C: drive space. Thanks, iffi

Storing large files (10gb) in MOSS.

Hello, we are planning to develop a portal with a few lists. We have a requirement to store huge files of approximately 10 GB in size. Can anyone let me know if we can have a mechanism where we save only the file metadata to a MOSS list and save the actual file to a file share (data center)? I am looking for a generic service (WCF) that can perform this functionality. If it is possible, can someone provide a series of steps to achieve it? Thanks, kesari suresh

What is the best Hosting model when Uploading Large Files via WCF?

I am building a WCF service where, besides regular CRUD operations, it provides methods to upload large files (up to 15 MB each). The service will serve a number of "clients" that will send data to our back end. While some of the clients are OK with sending one file at a time, others would like to send these files in batches (i.e., several files at a time).

My questions: What is the best option for hosting this service considering the environment (see below)? Will IIS be able to support it, or will it time out? I know you can increase the timeout limit in the config file, but how will it handle a batch of large files at a time? Is a Windows service a better option here? What are the dis/advantages of IIS vs. a Windows service in this scenario?

Current environment: Windows Server 2003 with IIS 6.0 and WCF (.NET 3.5). Thanks in advance!

Advanced search of TIFF files in SharePoint 2010

Hi, I have a question about advanced search in SharePoint 2010 and would love your help. I have a document management system that contains files of different types. A standard search finds the documents, but an advanced search on document properties cannot find the TIFF documents. We installed the TIFF IFilter per the following link: http://www.borghoff.com/post/2010/04/02/Windows-TIFF-IFilter-and-SharePoint-2007.aspx But it does not work. Maybe you have another solution for me? Thank you.

Upload large files from a web page

I am trying to figure out a solution for uploading large files from a web page. I know WCF + streaming is a proper solution for large file transfers, but I am not sure how to implement the WCF client under ASP.NET. Here is the link: http://mark-csharp.blogspot.com/2009/01/wcf-file-transfer-streaming-chunking.html Please advise if you have any solution. Besides, is there any way I could implement a progress bar showing upload progress while the file is being uploaded, and avoid page timeouts?
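A minimal sketch of the streamed-service side, with illustrative names (the linked post covers the client side in more depth). A streamed WCF operation takes a single Stream parameter and requires transferMode="Streamed" on the binding:

```csharp
// Sketch: streamed WCF upload contract and service (names are illustrative).
[ServiceContract]
public interface IFileUploadService
{
    [OperationContract]
    void Upload(Stream fileData);   // binding must use transferMode="Streamed"
}

public class FileUploadService : IFileUploadService
{
    public void Upload(Stream fileData)
    {
        byte[] buffer = new byte[64 * 1024];
        using (FileStream target = File.Create(Path.GetTempFileName()))
        {
            int read;
            while ((read = fileData.Read(buffer, 0, buffer.Length)) > 0)
                target.Write(buffer, 0, read);
        }
    }
}
```

For a progress bar, a pure streamed call gives the page no progress events; splitting the upload into client-tracked chunks (as the linked post does) is the usual workaround.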

Large transaction log files after backups

I have noticed that the transaction logs on my databases have grown dramatically lately. I do have full and differential backups in place for the databases in question. Is there any reason for the sudden increase in size, and is there anything I can do to mitigate it? Some of the databases use the simple recovery model, in which case transaction logs shouldn't be maintained, but I note that the sizes are still huge, and this only started when I began taking full backups and differentials. Thanks.

Report Builder 3.0 issue - can view existing rdl files but cannot create new reports

I have Report Builder 3.0 deployed on my local machine pointing to a remote SQL Server 2008 R2 instance outside of my local domain.  Within this local Report Builder 3.0 instance I am able to open up previously created rdl files from the remote SQL Server 2008 R2 instance that I am connecting to.  However, if I try to create a new report on my local instance of Report Builder 3.0 and attempt to use the same data source as was used to create these existing rdl files, I receive the error: "Unable to connect to data source. The user credentials provided in the connection string do not have permission to connect to Reporting Services."  So why can I open existing rdl files from this remote SQL Server 2008 R2 instance, but not create new ones even though I'm using the same data source as the existing rdl files?  Thanks, Mark

Timeout issue with large data


Getting a timeout error while retrieving a large chunk of data; it works fine for smaller chunks.

We tried increasing the following, but it didn't help. Any suggestions are appreciated.




" at System.Net.HttpWebRequest.GetResponse()\r\n at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)"


Issue with designer files not updating -- and therefore cannot code



I have a problem in which some nested controls on a page are not updated in the designer file. I have deleted the designer file and selected Convert to Web Application, but the controls are still missing.

I know there is a supposed fix at http://blogs.msdn.com/b/webdevtools/archive/2010/03/05/hotfix-for-issue-with-auto-generated-designer-files-not-adding-controls.aspx

Unfortunately the patch won't install and I still have the issue.

Any thoughts? I am running VS 2010 Version 10.0.30319.1 RTMRel.



Upload large files


Hi all,

What is the better way to upload a large file: using a web service, or in the application itself?

If in the application, how can we check that the file has finished uploading?

Actually, I don't want the user to wait for the upload to complete; when it starts uploading, the user should get a response immediately, and the upload should be done in the background.

I am not sure whether this type of task can also be done in a web service, so that the user does not need to wait for the upload to complete.

And one more query: which event fires when the page redirects to another page, Page_Unload or Dispose?

Any suggestions are appreciated.

How to avoid memory exhaustion when handling large image files in C#?


I have been developing a Forms application for handling a large number of image files. There could be more than 1,000 image files, and each is about 2 MB. The code is as follows:

PictureBox[] pb = new PictureBox[iPictureBoxNumMax];
Label[] lb = new Label[iPictureBoxNumMax];

// 1st portion: create the controls and point each PictureBox at its file
for (int i = 0; i < iPictureBoxNum; i++)
{
    lb[i] = new Label();
    pb[i] = new PictureBox();
    pb[i].ImageLocation = sImageListAll[i];
}

// 2nd portion: dispose the PictureBoxes
for (int i = 0; i < iPictureBoxNum; i++)
{
    pb[i].Dispose();
}

(1) If the number of image files is less than 300, the PictureBox generation code (the 1st portion) works. If the number is larger than that, an out-of-memory error is displayed.

(2) However, the second portion of the code (pb[i].Dispose()) doesn't seem to free the memory, since re-running the first portion gives the same out-of-memory error.

What should I do ?
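One approach (a sketch, not the only fix): ImageLocation makes each PictureBox decode and hold the full-size image, so roughly 1,000 of them exhaust memory; decoding each file once and keeping only a small scaled copy bounds the working set. The thumbnail dimensions below are illustrative.

```csharp
// Sketch: decode the full image briefly, keep only a small thumbnail.
Bitmap LoadThumbnail(string path, int width, int height)
{
    using (Image full = Image.FromFile(path))    // full-size image lives only here
    {
        return new Bitmap(full, width, height);  // scaled copy is all we retain
    }
}

// In the creation loop, instead of pb[i].ImageLocation = sImageListAll[i]:
//     pb[i].Image = LoadThumbnail(sImageListAll[i], 120, 90);
// And on cleanup, dispose the image before the control:
//     pb[i].Image.Dispose();
//     pb[i].Dispose();
```

Note that disposing pb[i] alone does not dispose an image assigned to pb[i].Image, which would match the observation in (2) that memory is not reclaimed.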


Large files over HttpHandler and IIS7: out of memory



I have a problem with large responses and IIS7: the server runs out of memory.

I've written the test code below, which works pretty much like my real code. When I start to download the file, I can see the memory usage rise until it hits 100%, and Firefox complains about a lost connection to the server; it looks like IIS7 does not release the cache or something. It works in IIS6, by the way.

Thanks in advance, Anders

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Threading;

namespace HttpHandlerTest
{
    public class FileHandler : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            int buffSize = 8192;
            int itterations = 1024 * 500;   // 8 KB x 512,000 chunks (~4 GB total)

            // Write each chunk straight to the response.
            byte[] buffer = new byte[buffSize];
            for (int i = 0; i < itterations; i++)
            {
                context.Response.OutputStream.Write(buffer, 0, buffSize);
            }
        }
    }
}
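For the symptom described (memory climbing to 100% under IIS7), the usual remedy is to stop ASP.NET from buffering the response and to flush each chunk; a sketch of the same handler body with that change (whether it fully resolves the IIS7 behavior here is untested):

```csharp
// Sketch: stream chunks without accumulating the response in memory.
public void ProcessRequest(HttpContext context)
{
    context.Response.BufferOutput = false;   // don't buffer the whole response
    int buffSize = 8192;
    int itterations = 1024 * 500;

    byte[] buffer = new byte[buffSize];
    for (int i = 0; i < itterations; i++)
    {
        context.Response.OutputStream.Write(buffer, 0, buffSize);
        context.Response.Flush();            // push this chunk to the client now
    }
}
```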
