.NET Tutorials, Forums, Interview Questions And Answers
How to avoid memory exhaustion when handling large image files in C#?

Posted By:      Posted Date: September 27, 2010    Points: 0    Category: Windows Application

I have been developing a Forms application for handling a large number of image files. There could be more than 1,000 image files, and each image is about 2 MB. The code is as follows:

PictureBox[] pb = new PictureBox[iPictureBoxNumMax];
Label[] lb = new Label[iPictureBoxNumMax];

// First portion: create the controls and point each PictureBox at its file.
for (int i = 0; i < iPictureBoxNum; i++)
{
    lb[i] = new Label();
    pb[i] = new PictureBox();
    pb[i].ImageLocation = sImageListAll[i];
}

// Second portion: dispose the controls again.
for (int i = 0; i < iPictureBoxNum; i++)
{
    pb[i].Dispose();
    lb[i].Dispose();
}
(1) If the number of image files is less than 300, the PictureBox generation code (the first portion) works. If the number is larger than that, a "memory exhaustion" error message is displayed.

(2) However, the second portion of the code (pb[i].Dispose()) doesn't seem to free the memory, since re-running the first portion gives the same "memory exhaustion" error message.

What should I do?
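One common approach (a sketch of one possible fix, assuming System.Drawing; the 160x120 thumbnail size is illustrative, not from the post) is to avoid keeping 1,000 full 2 MB images alive at once: load each file, build a small thumbnail, and dispose the full-size image immediately.

```csharp
// Sketch: keep only a small thumbnail per file instead of the full image.
Image LoadThumbnail(string path)
{
    using (Image full = Image.FromFile(path))   // full ~2 MB image
    {
        // GetThumbnailImage copies the pixels, so 'full' can be disposed here.
        return full.GetThumbnailImage(160, 120, null, IntPtr.Zero);
    }
}

// Usage: pb[i].Image = LoadThumbnail(sImageListAll[i]);
// When tearing down, dispose the Image as well as the PictureBox:
//   if (pb[i].Image != null) pb[i].Image.Dispose();
//   pb[i].Dispose();
```

One likely reason the re-run still fails is that disposing a PictureBox does not necessarily release the decoded Image it was displaying; dispose the Image explicitly as above.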



More Related Resource Links

Large files over HttpHandler and IIS7: out of memory



I have a problem with large responses and IIS7: the server runs out of memory.

I've written the test code below that works pretty much like my real code. When I start to download the file I can see the memory usage rise until it hits 100%, and Firefox complains about a lost connection to the server; it looks like IIS7 does not release the cache or something. It works in IIS6, by the way.

Thanks in advance, Anders

using System;
using System.Web;

namespace HttpHandlerTest
{
    public class FileHandler : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            int buffSize = 8192;
            int iterations = 1024 * 500;

            // Stream a large dummy response in chunks (~4 GB total).
            byte[] buffer = new byte[buffSize];
            for (int i = 0; i < iterations; i++)
                context.Response.OutputStream.Write(buffer, 0, buffer.Length);
        }
    }
}

How to detect and avoid memory and resources leaks in .NET application

Despite what a lot of people believe, it's easy to introduce memory and resource leaks in .NET applications. The Garbage Collector, or GC for close friends, is not a magician who will completely relieve you from taking care of your memory and resource consumption.

I'll explain in this article why memory leaks exist in .NET and how to avoid them. Don't worry, I won't focus here on the inner workings of the garbage collector or other advanced characteristics of memory and resource management in .NET.
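As an illustration (my sketch, not taken from the article), the most common .NET "leak" is an event subscription that keeps a short-lived subscriber reachable from a long-lived publisher:

```csharp
// Sketch of a classic .NET leak: the publisher's event keeps every
// subscriber alive until the handler is unsubscribed.
class Publisher
{
    public event EventHandler Tick;
}

class Subscriber
{
    private readonly byte[] _big = new byte[2 * 1024 * 1024]; // ~2 MB payload

    public void Attach(Publisher p) { p.Tick += OnTick; }
    public void Detach(Publisher p) { p.Tick -= OnTick; } // forgetting this leaks
    private void OnTick(object sender, EventArgs e) { }
}
```

As long as the Publisher instance is alive and Detach was never called, every Subscriber (and its 2 MB buffer) stays reachable through the event's delegate list and can never be collected.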

Document Conversion of large files



My MOSS 2007 environment throws OutOfMemory exceptions when attempting to launch a custom Document Converter on files larger than 50 MB (video files). Because my Document Converter is implemented as a standalone executable, I figure these exceptions are thrown by the SharePoint worker process while downloading/uploading from/to the temp folder, not by my custom converter.

If this is the case, can I apply any workarounds in the server configuration? I tried the /3GB switch without luck. Maybe SP1 will address this issue too?



How to manage large .trc files?

Hello everybody,

With RAID 10, with enough arrays to hold the files separately: will a RAID 10 array be identified as one logical drive, such as C:, D:, etc.?

Actually, I want to store trace files on the server to capture a representative workload for DB tuning with DTA. As I know, those will be very large, since my OLTP DB is very busy from 8 a.m. to 7 p.m. and less busy at other times, and I don't want any event to be missed due to lack of resources. So how do I manage this?

Can I attach an external HD to the dedicated SQL Server to hold the .trc files? Is that advisable?

Please provide any available link to configuration best practices for RAID 10, or a standard RAID 10 configuration for optimum performance, or any suggestion regarding this issue.

Thanks in advance

Server frequently generates large mdmp files

Hi, I am running MS SQL Server 2005. The server frequently generates large .mdmp files, which take up C: drive space. Kindly tell me how I can stop these from being created and reclaim the C: drive space. Thanks, iffi

Storing large files (10gb) in MOSS.

Hello, we are planning to develop a portal with a few lists. We have a requirement to store huge files of approximately 10 GB in size. Can anyone let me know if we can have a mechanism whereby we save only the file metadata to a MOSS list and save the actual file to a file share (data center)? I am looking for a generic WCF service that can perform this functionality. If it is possible, can someone provide the series of steps to achieve this? Thanks, kesari suresh

How to avoid sluggish scrolling in a large DataGrid?

If you have a large DataGrid with many virtualized rows and a screen full of columns, scrolling becomes very choppy - only a few updates per second. Here's a repro. Create a WPF app with this XAML:

<Window x:Class="WpfApplication1.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" WindowState="Maximized">
    <DataGrid Name="_grid" AutoGenerateColumns="False" ColumnWidth="60" />
</Window>

and this in the code-behind:

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        for (int i = 0; i < 30; ++i)
        {
            var c = new DataGridTextColumn();
            c.Binding = new Binding();
            c.Header = i;
            _grid.Columns.Add(c);
        }
    }
}

What is the best Hosting model when Uploading Large Files via WCF?

I am building a WCF service where, besides regular CRUD operations, it provides methods to upload large files (up to 15 MB each). The service will serve a number of clients that will send data to our back end. Now, while some of the clients are OK with sending one file at a time, others would like to send these files in batches (i.e. several files at a time).

My questions: What is the best option for hosting this service considering the environment (see below)? Will IIS be able to support it, or will it time out? I know you can increase the timeout limit in the config file, but how will it handle a batch of large files at a time? Is a Windows service a better option here? What are the dis/advantages of IIS vs. a Windows service in this scenario?

Current environment: Windows 2003 Server with IIS 6.0 and WCF (.NET 3.5). Thanks in advance!
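For reference, a streamed WCF upload usually looks like the following minimal sketch (the contract name, target path, and buffer size are illustrative, not from the post):

```csharp
using System.IO;
using System.ServiceModel;

// Sketch of a streamed WCF upload contract; names are illustrative.
[ServiceContract]
public interface IUploadService
{
    [OperationContract]
    void UploadFile(Stream fileData);   // streamed, not buffered in memory
}

public class UploadService : IUploadService
{
    public void UploadFile(Stream fileData)
    {
        // Copy the incoming stream to disk in small chunks, so a 15 MB
        // upload never has to be held in memory all at once.
        byte[] buffer = new byte[8192];
        using (FileStream fs = File.Create(Path.GetTempFileName()))
        {
            int read;
            while ((read = fileData.Read(buffer, 0, buffer.Length)) > 0)
                fs.Write(buffer, 0, read);
        }
    }
}
```

This typically goes with transferMode="Streamed" and a raised maxReceivedMessageSize on the binding; when hosted in IIS, the httpRuntime request-length and execution-timeout limits also apply, which is part of the IIS-vs-Windows-service trade-off being asked about (check the exact settings against your environment).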

WPF TreeView working with Image Files

Hi, I'm working with a TreeView in my WPF app to manage image files on my disk. The ItemsSource of the TreeView is bound to a dynamic List<T> like this:

<TreeView ItemsSource="{Binding Path=Thumbnail, ElementName=Tree}" x:Name="treeViewDocument"
          Width="230" Height="500" Margin="0,0,0,10">
    <TreeView.ItemTemplate>
        <HierarchicalDataTemplate ItemsSource="{Binding Path=MoreThumbnail}">
            <DockPanel HorizontalAlignment="Center" VerticalAlignment="Center">
                <Image Source="{Binding Path=MyImage}" Width="{Binding Path=MySize}"/>
            </DockPanel>
        </HierarchicalDataTemplate>
    </TreeView.ItemTemplate>
</TreeView>

In the code-behind I have:

private List<MyImages> _MyImages;

public Window4()
{
    InitializeComponent();
}

public List<MyImages> Thumbnail
{
    get
    {
        if (_MyImages == null)
        {
            MyImages obj = new MyImages("C:\\Temp\\1.tif", "70");
            _MyImages = new List<MyImages>();
            _MyImages.Add(obj);
        }
        return _MyImages;
    }
}

public class MyImages
{
    public MyImages(string strImg, string strSize)
    {
        MyImage = strImg;
        MySize = strSize;
    }

    public string MyImage { get; set; }
    public string MySize { get; set; }
}

Upload large files from a web page

I am trying to figure out a solution to upload large files from a web page. I know WCF + streaming is a proper solution for large file transfers, but I am not sure how to implement the WCF client under ASP.NET. Here is the link: http://mark-csharp.blogspot.com/2009/01/wcf-file-transfer-streaming-chunking.html Please advise if you have any solution. Besides, is there any way I could implement a progress bar showing the upload progress while the file is being uploaded, and avoid page timeout?

Large transaction log files after backups

I have noticed that the size of the transaction logs on my databases has rocketed lately. I do have full and differential backups in place for the databases in question; is there any reason for the sudden increase in size, and is there anything I can do to mitigate it? Some of the databases use the simple recovery model, in which case transaction logs shouldn't grow like this, but I note that the sizes are still huge, and this only started when I began taking full and differential backups. Thanks.

Huge PDF files when using a Map or Image



Using SQL Server 2008 R2 Reporting Services and Report Builder 3: when a report is run that contains either images (from the database) or a map (which physically is no more than a fifth of an A4 sheet), any generated PDF is vast in size, over 5 MB. This causes issues with emails and, in my experience, does not represent anything like the amount of graphical information on screen.

Is there any way of reducing the rendering size of the images/map, or the finished size of the PDF? Currently we cannot progress with a very simple set of reports that contain several images or a map, and this will cause some disruption to our rollout.


Any ideas gratefully received.






Client image files uploaded to Server are staying locked when accessed again.


I have a web form that, upon user request, uploads an image file from the client to a server folder, makes a thumbnail, saves it to another folder, and then sets the ImageUrl of an image control to that thumbnail. All fine up to here. The problem is that at times (not always, and I could not find the exact pattern) when uploading an existing file, it throws a "... file used by another process ..." error, on both the original image file and the created thumbnail. Is there a way to prevent the locking of these files, or alternately, which object do I need to destroy to release the image file and avoid these contentions? I do dispose the image object immediately after the thumbnail is created. Ideas, anyone?
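One common way to avoid this kind of lock (a sketch, not from the post): Image.FromFile keeps the underlying file open for the lifetime of the Image, so read the bytes yourself and build the image from a memory stream instead:

```csharp
// Sketch: load an image without keeping the file on disk locked.
// Image.FromFile holds the file open until the Image is disposed;
// reading the bytes first releases the file immediately.
Image LoadWithoutLock(string path)
{
    byte[] bytes = File.ReadAllBytes(path);           // file handle closed here
    return Image.FromStream(new MemoryStream(bytes)); // stream must stay alive
}
```

Note that GDI+ requires the MemoryStream to remain undisposed for as long as the Image is in use, so don't wrap it in a using block.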

Sizing a Pagefile on Servers with Large Amounts of Memory


I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16 GB or 32 GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2 GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32 GB of physical memory and make the pagefile only 1.5 x 32 GB, I have a 48 GB pagefile. 10% of this is 4.8 GB, which I would hope I never see consumed.


Any thoughts?


Thanks,    Dave

Upload large files


Hi all,

What is the better way to upload a large file: using a web service, or in the application itself?

If in the application, how can we check that the file has finished uploading?

Actually, I don't want the user to wait for the upload to complete; when it starts uploading, the user should get a response that the file was submitted, and the upload should be done in the background.

I am not sure whether this type of task can be done in a web service too, so that the user does not need to wait for the upload to complete.

And one more query: which event fires when the page redirects to another page? Is it Page_Unload or Dispose?

Any suggestions are appreciated.

XP x64 issue with large tiff files


I've got some code which opens and processes a large TIFF file (a bit over 2 GB, with around 1,000 frames in it).

This works fine on Windows 7 x64 but fails with an overflow error on XP x64.

Other software (including the windows image browser) has no such problem.

Is there something I can do to make my WPF code cope with larger files in XP x64?

using (Stream instream = new FileStream(sourcePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    TiffBitmapDecoder tiffInDecoder = new TiffBitmapDecoder(instream,
        BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.None);
    // ... process tiffInDecoder.Frames here ...
}


Iain Downs