Sizing a Pagefile on Servers with Large Amounts of Memory

Posted Date: September 22, 2010    Category: Sql Server

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory.  However, if you're talking about a server with lots of memory, such as 16GB or 32GB, would following this rule be unnecessary?  With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2GB pagefile.  Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed.  If I have 32GB of physical memory and make the pagefile 1.5 x 32GB, I have a 48GB pagefile.  10% of this is 4.8GB, which I would hope never to see consumed.


Any thoughts?


Thanks,    Dave


More Related Resource Links

Crystal Reports "can grow" fields with large amounts of text: layout problem


A "can grow" field in the details section of a Crystal Report may have so much text that the section becomes larger than the space remaining on the page (the space below the previous record).  The section has vertical lines on it, with "Extend to bottom of section when printing" set to true.  The section starts on a new page (and may continue on the following one).  I want it to start on the page where the last record was laid out and continue onto the new page, without leaving a gap with vertical lines on it.  I'm using Crystal Reports 9 and can't work out how to do it. Has anyone got a solution?

How to check how much memory from a .NET process is allocated in the pagefile?

Hi. How can I check how much of a .NET process's memory is allocated in the pagefile? I tried Task Manager and perfmon, but I could not find any counter/column that would tell me how much memory from a specific .NET process is in the pagefile. Regards
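For what it's worth, a minimal sketch of one way to read a related figure from code, using System.Diagnostics (the process name "MyDotNetApp" is a placeholder). Process.PagedMemorySize64 reports the process's pagefile-backed commit, the same figure as the perfmon "Process \ Page File Bytes" counter. Note this is memory that can be backed by the pagefile, not bytes actually paged out, which Windows does not readily expose per process:

using System;
using System.Diagnostics;

class PagefileCheck
{
    static void Main()
    {
        // "MyDotNetApp" is a hypothetical process name; substitute your own.
        foreach (Process p in Process.GetProcessesByName("MyDotNetApp"))
        {
            Console.WriteLine("{0} (PID {1}): {2:N0} bytes of pagefile-backed commit",
                p.ProcessName, p.Id, p.PagedMemorySize64);
        }
    }
}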

How to avoid memory exhaustion when handling large Image files in C#?


I have been developing a Forms application for handling a large number of Image files. There could be more than 1,000 image files, and each is about 2MB. The code is as follows:

PictureBox[] pb = new PictureBox[iPictureBoxNumMax];
Label[] lb = new Label[iPictureBoxNumMax];

// First portion: create the PictureBoxes and point them at the image files
for (int i = 0; i < iPictureBoxNum; i++)
{
    lb[i] = new Label();
    pb[i] = new PictureBox();
    pb[i].ImageLocation = sImageListAll[i];
}

// Second portion: dispose the PictureBoxes
for (int i = 0; i < iPictureBoxNum; i++)
{
    pb[i].Dispose();
}

(1) If the number of Image files is less than 300, the PictureBox generation code (the first portion) works. If the number is larger than that, a "memory exhaustion" error message is displayed.

(2) However, the second portion of the code (pb[i].Dispose()) doesn't seem to free the memory, since re-running the first portion gives the same "memory exhaustion" error message.

What should I do?
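A common mitigation, sketched under the assumption that only a scaled preview of each image needs to stay on screen: load each file, build a small thumbnail, and dispose the full-size bitmap immediately, so only the thumbnails remain resident (roughly kilobytes each instead of megabytes):

using System.Drawing;

// Hypothetical helper: returns a small copy and releases the full-size image.
static Image LoadThumbnail(string path, int width, int height)
{
    using (Image full = Image.FromFile(path))   // full-size bitmap from the ~2MB file
    {
        return new Bitmap(full, new Size(width, height));   // only this small copy survives
    }
}

// Usage, replacing pb[i].ImageLocation = sImageListAll[i]:
// pb[i].Image = LoadThumbnail(sImageListAll[i], 120, 90);

Note also that setting PictureBox.ImageLocation makes the control load the full image from the path itself; assigning a pre-scaled Image keeps that cost under your control.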


Large files over HttpHandler and IIS7: out of memory



I have a problem with large responses and IIS7: the server runs out of memory.

I've written the test code below, which works pretty much like my real code. When I start to download the file, I can see the memory usage rise until it hits 100%, and Firefox complains about a lost connection to the server. It looks like IIS7 does not release its cache or something. It works in IIS6, by the way.

Thanks in advance, Anders

using System;
using System.Web;

namespace HttpHandlerTest
{
    public class FileHandler : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            int buffSize = 8192;
            int iterations = 1024 * 500;   // 8 KB x 512,000 = ~4 GB response

            // (body reconstructed from the description above)
            byte[] buff = new byte[buffSize];
            for (int i = 0; i < iterations; i++)
            {
                context.Response.OutputStream.Write(buff, 0, buffSize);
            }
        }
    }
}
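A commonly suggested cause for exactly this symptom is ASP.NET response buffering: by default the output is buffered before being sent, so a multi-gigabyte response accumulates in memory. A minimal sketch of that remedy, assuming the test handler above, is to disable buffering so each chunk streams out as it is written:

public void ProcessRequest(HttpContext context)
{
    context.Response.BufferOutput = false;   // stream each chunk instead of buffering the whole response
    byte[] buff = new byte[8192];
    for (int i = 0; i < 1024 * 500; i++)
    {
        context.Response.OutputStream.Write(buff, 0, buff.Length);
        // Alternatively, keep buffering on and call context.Response.Flush() periodically.
    }
}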

Unable to retrieve large amounts of data?


We're using WCF with basicHttpBinding, and have set maxBufferSize and maxReceivedMessageSize to 99999999. We've also set the readerQuotas maxArrayLength to 99999999.

The method I'm calling is supposed to return an array of objects, each containing a few small strings and a couple of bools and integers. If the array has fewer than ~500 elements, I don't have a problem and I receive all of my data. However, if the array is of any meaningful size, I get the following CommunicationException:

"The server did not provide a meaningful reply; this might be caused by a contract mismatch, a premature session shutdown or an internal server error."

I'm thinking that I might need to retrieve the arrays in manageable chunks, but before I start that rewrite, I'd like to see if there is a way to get this done.
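Before rewriting for chunking, a hedged debugging step: this generic fault usually masks a server-side exception thrown while serializing the reply, and one quota not covered by the binding settings above is the DataContractSerializer's separate maxItemsInObjectGraph. Enabling exception details on the service (development only; the contract and class names below are placeholders) often surfaces the real cause:

using System.ServiceModel;

[ServiceContract]
public interface IMyService { /* placeholder contract */ }

// Development-only: returns the real server-side exception in the fault
// instead of the generic "did not provide a meaningful reply" message.
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class MyService : IMyService { }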

Inefficient Query Plans due to Large Amounts of RAM (KB2413549)


Does anyone have any more details on this KB?  MS states this happens when you set max memory to a "large value"; a hard cutoff would be more helpful.  Also, in the versions list in the KB, SQL 2008 Std is not listed, although SQL 2005 Std edition is.  Any help here would be great.  Thanks.



Out of memory when trying to validate very large XML file


Hello everyone,

I am trying to perform XSD validation against very large XML files (over 400MB).  When I run the following code (which I pulled out of my application, with the file names hard-coded for this example), I get a System.OutOfMemoryException:

string XSDFile = @"D:\TestFiles\DatasetFiles\DiCi_XSD_Test\dc-schema-1.6.xsd";
string XMLFile = @"D:\TestFiles\DatasetFiles\DiCi_XSD_Test\10304.0.1.xml";

XmlReaderSettings settings = new XmlReaderSettings();
settings.Schemas.Add(null, XSDFile);
settings.ValidationType = ValidationType.Schema;
XmlDocument document = new XmlDocument();   // loads the entire 400MB file into memory
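A sketch of the usual streaming alternative: XmlDocument loads the entire file into memory, while a validating XmlReader pulls the document through a fixed-size buffer, so memory use stays roughly constant. This reuses the settings object from the code above:

using System;
using System.Xml;
using System.Xml.Schema;

// Report validation problems instead of throwing on the first one.
settings.ValidationEventHandler += (sender, e) => Console.WriteLine(e.Message);

using (XmlReader reader = XmlReader.Create(XMLFile, settings))
{
    while (reader.Read()) { }   // validation happens as the reader advances
}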

Running SharePoint 2010, and on the server w3wp.exe is consuming huge amounts of CPU and memory


I'm running Server 2008 (64-bit) with 8GB of RAM and 4 processors.  We haven't had more than four people in SharePoint at the same time, and performance has been extremely poor (there are moments where things run OK, but it usually doesn't last more than a few minutes).  We're not even doing anything advanced, just a very small document repository with basic searching.

When I log into the server I can see several w3wp.exe processes running, which spike up to 100% of the CPU.  Memory is around 100-800 MB for each process.

Any ideas on how to speed up SharePoint 2010 performance?  Our SharePoint 2003 box runs MUCH faster and has 100+ users and over 3,000 documents.



Out of memory loading a DataTable from a large Oracle table


I've created a WinForms C# application that connects to an Oracle database, loads a DataTable with all the data in an Oracle table, then exports it to a delimited file, which is then imported into a local MySQL database.  This works just fine when the Oracle table isn't too large.  However, I keep getting an out-of-memory exception when I populate my DataTable from a large Oracle table.  I can't manually export the data from Oracle and then manually load it into MySQL, because this has to be seamless to the user.  I'm having difficulty grasping how to solve this problem.  Is there a better solution than what I have for getting from Oracle to MySQL?
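A minimal sketch of one alternative: stream rows straight from a data reader to the delimited file, so only one row is in memory at a time instead of the whole table. The connection string, query, and output path are placeholders, and OracleConnection here assumes the ODP.NET provider (any ADO.NET Oracle provider works the same way through IDataReader):

using System;
using System.Data;
using System.IO;
using Oracle.DataAccess.Client;   // ODP.NET; adjust for your provider

using (var conn = new OracleConnection(connectionString))
using (var cmd = conn.CreateCommand())
using (var writer = new StreamWriter(outputPath))
{
    cmd.CommandText = "SELECT * FROM big_table";   // placeholder query
    conn.Open();
    using (IDataReader reader = cmd.ExecuteReader())
    {
        var fields = new string[reader.FieldCount];
        while (reader.Read())
        {
            for (int i = 0; i < reader.FieldCount; i++)
                fields[i] = Convert.ToString(reader.GetValue(i));
            writer.WriteLine(string.Join("|", fields));   // "|" as the delimiter, for example
        }
    }
}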

large object "out of memory"


We've read a number of MSDN articles, blogs, and forum posts.  But we cannot find a problem or explanation that seems to match what we are seeing.

In a certain configuration our application allocates various arrays of 40 or 80 MB.  At one point we have about 500 MB allocated; then the app asks for an 80 MB array, and an out-of-memory exception is thrown.  But according to a memory analysis tool and perfmon, there is over 1 GB of physical memory available (not to mention an additional 4 GB in the page file).

We are completely confused as to why this memory allocation fails.  It seems to make no sense.

Any ideas?  TIA.

How to detect and avoid memory and resources leaks in .NET application

Despite what a lot of people believe, it's easy to introduce memory and resource leaks in .NET applications. The Garbage Collector, or GC for close friends, is not a magician who will completely relieve you from taking care of your memory and resource consumption.

I'll explain in this article why memory leaks exist in .NET and how to avoid them. Don't worry, I won't focus here on the inner workings of the garbage collector or other advanced characteristics of memory and resource management in .NET.
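A small example of the kind of leak the article means, with hypothetical names: an event subscription keeps the subscriber reachable from the publisher, so the GC cannot collect it until the handler is removed:

using System;

public class Publisher
{
    public event EventHandler SomethingHappened;
}

public class Subscriber
{
    public Subscriber(Publisher p)
    {
        p.SomethingHappened += OnSomething;   // roots this Subscriber in the Publisher
    }

    private void OnSomething(object sender, EventArgs e) { }

    // Without a matching p.SomethingHappened -= OnSomething, every Subscriber
    // created against a long-lived Publisher stays alive as long as it does.
}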

File upload in chunks, or avoiding buffering in memory before writing to disk?


What are the options for handling file uploads so as to reduce the memory footprint?  Is there a way to upload in chunks?  Is there a way to stream the upload directly to disk instead of loading the entire file into server memory?
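One option, sketched as a hypothetical handler: on .NET 4 and later, HttpRequest.GetBufferlessInputStream hands you the request body without ASP.NET buffering it first, so you can copy it to disk in fixed-size chunks (the target path is a placeholder):

using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        Stream input = context.Request.GetBufferlessInputStream();
        byte[] buffer = new byte[8192];
        using (FileStream fs = File.Create(@"C:\uploads\upload.bin"))   // placeholder path
        {
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                fs.Write(buffer, 0, read);
        }
    }
}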


Visual Studio 2008 Memory problems


Visual Studio's memory usage in Task Manager reaches about 900,000 K and I don't know why.  It definitely slows everything down, especially when I rebuild my TableAdapters; rebuilding the project sometimes takes about 30 seconds.

Persist large dataset across ASP.NET pages. No database


Can anyone give me suggestions on how to implement the following requirement?

1. An online wizard (step-by-step operation) processes user requests across several ASP.NET pages.  The session data is stored in memory until the user submits the request.  Once the request is submitted, an XML file is created and the data is written to it.

Case study: the parent page has a "select" button.  Clicking "select" pops up a child page.  The user can select 1,000+ items from a datagrid on the child page.  Once the user clicks "confirm", the selected items from the child page are populated into the selected-items gridview on the parent page.  The user can keep selecting more in the child page, and the selected items are appended to the gridview on the parent page.


- What's the best way to extract strings from a text file, convert them to a dataset in memory, and then present them in a datagrid in the UI?

- What's the best way to persist and manipulate a large dataset across pages?  Session?  ViewState?  Can it be accomplished with LINQ?

The app doesn't interact with a DB.  Once the request is submitted, it will convert the in-memory data to an XML file (see the sketch below).  What technology can be used to accomplish this?
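For the final step, a minimal sketch: if the in-memory data lives in a DataSet (one assumption among several possible shapes), DataSet.WriteXml turns it into an XML file directly; the table contents and output path are placeholders:

using System.Data;

DataSet ds = new DataSet("WizardData");
// ... populate ds.Tables from the data gathered across the wizard pages ...
ds.WriteXml(@"C:\temp\request.xml", XmlWriteMode.WriteSchema);   // placeholder path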

Crystal Report gives System.AccessViolationException: Attempted to read or write protected memory


When a Crystal Report for the same ID is opened again at the same time, or is refreshed, it throws:

"System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt."

The stack trace is:

[AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.]
CrystalDecisions.ReportAppServer.Controllers.DatabaseControllerClass.SetConnectionInfos(ConnectionInfos ConnectionInfos) +0
CrystalDecisions.CrystalReports.Engine.Table.ApplyLogOnInfo(TableLogOnInfo logonInfo) +362
CrystalDecisions.CrystalReports.Engine.ReportDocument.SetTableLogon(Tables tables, String user, String password, String server, String database, Boolean ignoreCase) +258
CrystalDecisions.CrystalReports.Engine.ReportDocument.SetDatabaseLogon(String user, String password, String server, String database, Boolean ignoreCase) +204

How to generate large PDFs?


Hi all,

I am using RDLC reports in my application. I am trying to generate a huge PDF file of almost 1,500-2,000 pages.
I have used ReportViewer and am adding a data source (a generic collection) to the report, but I am getting a System.OutOfMemoryException because it is such a huge file.
Can anyone suggest how to generate large PDF files in ASP.NET/C#?
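If the memory pressure comes from rendering the whole PDF into one byte array, one avenue worth checking: LocalReport.Render has an overload that writes to streams you supply, so the output can go straight to disk rather than through a 2,000-page byte[]. A hedged sketch ("report" is the configured LocalReport; the output folder is a placeholder):

using System.IO;
using Microsoft.Reporting.WebForms;

Warning[] warnings;
report.Render(
    "PDF",
    null,   // default device info
    (name, extension, encoding, mimeType, willSeek) =>
        new FileStream(Path.Combine(@"C:\temp", name + "." + extension),
                       FileMode.Create),
    out warnings);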

Maximum memory for a string? System.OutOfMemoryException occurs when reading the content of a file



I have C++/CLI code which reads a file (a VTK file, similar to a text file) and tries to store it in a string.

here's the code:

String^ ParallelProjectionRenderer::GetVolumeDataAsString(String^ FileName)
{
    // FileName was hard-coded to "D:\\Users\\..." in the test

    vtkSmartPointer<vtkDataSetWriter> sWriter = vtkSmartPointer<vtkDataSetWriter>::New();

    char* sptName = static_cast<char*>(Marshal::StringToHGlobalAnsi(FileName).ToPointer());

    // breaks here with System.OutOfMemoryException when the file size is huge
    String^ result = File::ReadAllText(FileName);

    return result;
}


This code executes fine when the file size is less than 8 MB.

I wanted to know what the maximum memory allocated to a string is.

As a workaround I skipped reading it in the C++ code and tried to read the file contents in ASP.NET C# code and store them there instead.
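For the C# side, a minimal sketch of reading the content in chunks with StreamReader, so no single contiguous string (and no single allocation near the string size limit) has to hold the whole file; "path" is a placeholder:

using System.IO;

using (var reader = new StreamReader(path))
{
    char[] buffer = new char[64 * 1024];
    int read;
    while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Process buffer[0..read) here instead of accumulating one giant string.
    }
}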
