
Regarding performance issues when using MemoryStream

Posted By:    Posted Date: October 05, 2010    Points: 0    Category: .NET Framework


I created a project that many people will use at the same time.

In that project, when a user gets an error, the error is written to one file by creating a MemoryStream object.

If every user gets an error at once, that many objects will be created simultaneously and all of them will write their errors to the same file at the same time. I estimate about 25 hits per second.

Are there any problems or performance issues with using MemoryStream this way? Please help me.
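For what it's worth, MemoryStream objects themselves are cheap managed objects; the risk at ~25 hits per second is several requests appending to the same file at once. Below is a hedged C# sketch of the scenario described, with a lock serializing access to the shared file (ErrorLogger and the log path are hypothetical names, not the poster's code):

using System;
using System.IO;
using System.Text;

public static class ErrorLogger
{
    private static readonly object _sync = new object();
    private const string LogPath = @"C:\logs\errors.log"; // hypothetical path

    public static void Log(Exception ex)
    {
        // Build the log record in a MemoryStream, as the poster describes.
        byte[] record;
        using (var ms = new MemoryStream())
        {
            byte[] text = Encoding.UTF8.GetBytes(
                DateTime.Now + " " + ex.Message + Environment.NewLine);
            ms.Write(text, 0, text.Length);
            record = ms.ToArray();
        }

        // Without this lock, 25 concurrent writers to one file will
        // throw IOExceptions or interleave partial lines.
        lock (_sync)
        {
            using (var fs = new FileStream(LogPath, FileMode.Append, FileAccess.Write))
            {
                fs.Write(record, 0, record.Length);
            }
        }
    }
}

At this rate the lock (or an established logging library such as log4net) is usually enough; the MemoryStream itself is not the bottleneck, contention on the single file is.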


More Related Resource Links

CLR Inside Out: Investigating Memory Issues


Memory issues can manifest in a wide variety of ways. This column shows you how to collect the data you need to determine what types of memory issues you are experiencing.

Claudio Caldato and Maoni Stephens

MSDN Magazine November 2006

Conversion of Byte Array to Memory stream


How do I convert a byte array to a MemoryStream in VB.NET?
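For what it's worth, the conversion is a single constructor call: MemoryStream has an overload that wraps an existing byte array. A minimal C# sketch (the VB.NET equivalent is Dim ms As New MemoryStream(bytes)):

using System.IO;

byte[] bytes = { 1, 2, 3 };

// Wraps the existing array without copying; the stream is readable immediately.
using (MemoryStream ms = new MemoryStream(bytes))
{
    int first = ms.ReadByte(); // reads 1
}

Note that a stream constructed this way has a fixed size; use the parameterless constructor and Write() if you need it to grow.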

Troubleshooting IIS 7 network performance issues and TCP Chimney Offload, Receive Side Scaling, and

There are a lot of posts on http://forums.iis.net related to network performance. Actually, there were two today! The problems can be different, but the common thread seems to be network performance. Windows Server 2008 (and R2) enables a new set of network features by default which has been referred to as the "Scalable Networking Pack"; some people refer to the feature as TCP Chimney Offload. Either way, in my experience this feature causes more issues than it solves. If you are having a network-related or performance issue, this is an easy setting to check, to verify whether disabling some or all of these features resolves the issue.

KB article on TCP Chimney Offload: http://technet.microsoft.com/en-us/network/dd277645.aspx

KB article on how to show your settings and disable them: http://support.microsoft.com/kb/951037

I strongly encourage you to test these changes in a non-production environment before making changes to your production systems!

Enjoy,

Steve Schofield
Microsoft MVP - IIS
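KB 951037 above walks through checking and disabling these features with netsh. As a hedged sketch of that procedure on Server 2008/R2 (run from an elevated command prompt, and confirm the exact syntax against the KB for your OS level):

netsh int tcp show global

netsh int tcp set global chimney=disabled
netsh int tcp set global rss=disabled
netsh int tcp set global netdma=disabled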

SSRS 2008 R2 Add-in Performance Issues

I have a very interesting issue with a customer. They use lots of images in their reports. Report performance is very poor after deploying the report to SharePoint Integrated Mode and executing it from the SharePoint Report Viewer. The single server is set up with SQL Server 2008 R2; SharePoint and Reporting Services are running on the same server. The goal is to be able to use the Report Viewer Web Part and SharePoint filters to create dashboards.

We have tried the following ways to get satisfactory performance out of these reports:

·  When we run the reports in a Page Viewer Web Part in SharePoint against a native SQL 2008 Report Manager on another server, they execute great.

·  When we run the reports in a Page Viewer Web Part in SharePoint against the SharePoint Integrated report server on the same server, they execute much more quickly than when using the Report Viewer against the same SharePoint Integrated report server.

·  When we remove the images from the reports and execute them via the Report Viewer or in the Integrated library, they execute satisfactorily in both Report Manager and SharePoint Integrated - but th

Performance issues

I'm creating a data warehouse for our product and I am having some issues with system performance when posting data to the database. My test application is written in C#, not that I think that is the problem. I am executing stored procedures to perform the updates. The problem is that I am trying to get around three million updates to complete in a day. Part of my problem is that SQL Server starts to eat memory like no other, forcing the OS to page to the point that my whole system becomes unstable. To simplify client-side coding, I have the stored procedure check whether there is a matching record, by key, and if so perform an update, and if not perform an insert. Would it be better to have the client side attempt the insert first with a SQL INSERT statement, then switch to the update if a key violation occurs?

A sample procedure is as follows:

CREATE PROCEDURE [dbo].[sp_UpdateAgents](
    @AppName AS VARCHAR(20),
    @AgentID AS VARCHAR(16),
    @FirstName AS VARCHAR(30) = null,
    @LastName AS VARCHAR(30) = null,
    @PhoneNumber AS VARCHAR(10) = null,
    @Extension AS VARCHAR(10) = null,
    @Station AS VARCHAR(10) = null,
    @PrimaryTeam AS VARCHAR(16) = null
)
AS
BEGIN
    DECLARE @rval AS INT;
    DECLARE @a AS VARCHAR(20);
    DECLARE C CURSOR FOR
        SELECT AppName FROM Agents
        WHERE AppName = @AppName AND AgentID = @AgentID;
    OPEN C;
    FETCH NEXT FROM C INTO @a;
    SET @rval = @@FETCH_
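For what it's worth, a set-based update-then-insert avoids opening a cursor on every call, which matters at three million calls a day. A hedged sketch, not the poster's code, trimmed to a few of the posted columns and assuming Agents is keyed on (AppName, AgentID):

CREATE PROCEDURE [dbo].[sp_UpsertAgent](
    @AppName   AS VARCHAR(20),
    @AgentID   AS VARCHAR(16),
    @FirstName AS VARCHAR(30) = NULL,
    @LastName  AS VARCHAR(30) = NULL
)
AS
BEGIN
    SET NOCOUNT ON;

    -- Try the update first; most rows already exist in a daily feed.
    UPDATE Agents
    SET FirstName = @FirstName,
        LastName  = @LastName
    WHERE AppName = @AppName AND AgentID = @AgentID;

    -- Fall through to an insert only when no row matched.
    IF @@ROWCOUNT = 0
        INSERT INTO Agents (AppName, AgentID, FirstName, LastName)
        VALUES (@AppName, @AgentID, @FirstName, @LastName);
END

Attempting the INSERT first and catching the key violation also works, but paying for an exception on every existing row is usually slower than the update-first pattern when updates dominate.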

Can anyone explain how SQL Server performance scales with CPUs/Cores, memory, disk drives, etc?

I have experienced a situation where our application running on an expensive system with two 4-core CPUs ran slower than on a cheap machine with one dual-core CPU. Both machines had similar memory capacity. In each case, the same query was being run by only one user. It would be nice to find a way to increase performance, but if adding cores doesn't help, what can a poor developer do? Customers get annoyed when they spend a truckload of money on a new server and nothing changes... Another potential problem is when customers have an expensive SAN system. Sometimes the performance is much slower than on a system with conventional storage. Can anyone explain the relationship between SQL Server performance and hardware?

Performance Issues


We have a .NET website which uses a SQL backend, master pages, skins, etc.

Here is the problem. I open the same page with the same data and same controls on the page.

Sometimes it shows this performance pattern, where it takes less than half a second to load the page (notice the highlighted row and duration column):

[trace screenshot not included]

And sometimes it shows this performance pattern, where it takes close to 6 seconds to load the page:

[trace screenshot not included]

Why does this happen? It is the exact same code and the exact same data. Is there something that can be done to the server?

"I have no particular talent. I am merely inquisitive." -- Albert Einstein

Reporting Services 2008 Performance Issues

We are experiencing longer than expected, although intermittent, response times from the Reporting Services server when requesting reports via HTTP. Here's what we've done to try to locate the issue:
  1. We used the Firefox Net panel to locate the request that appears to be the culprit - it is the '...Pages/Report.aspx?ItemPath=...' request. This request sometimes returns in 2 seconds and sometimes takes 20+ seconds.
  2. We have checked the Reporting Services ExecutionLogStorage table (a sample query is sketched after this list) to see the actual report querying, processing, and rendering times; since the report has been up, the total time reported by Reporting Services is less than 3 seconds (even less than 1 second most of the time).
  3. We're almost positive that there isn't a network issue - we're able to ping the server (although on a different subnet) while the report is taking its 20+ seconds to return, and the network has excellent ping results.
  4. We've run a packet sniffer during the HTTP request of the report - the browser sends a few messages to the server, and the server responds.
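For reference, the timings mentioned in step 2 live in the SSRS execution log; here is a hedged sketch of the kind of query used, assuming the standard ReportServer catalog schema for SSRS 2008:

SELECT TOP 20
    TimeStart,
    TimeDataRetrieval,  -- milliseconds spent running dataset queries
    TimeProcessing,     -- milliseconds spent processing the report
    TimeRendering,      -- milliseconds spent rendering
    Status
FROM ReportServer.dbo.ExecutionLogStorage
ORDER BY TimeStart DESC;

If these numbers stay under 3 seconds while the browser sees 20+, the delay is outside report execution (authentication, the viewer page, or the transport), which matches the elimination steps above.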

Data flow task - performance issues


We recently migrated packages from DTS to SSIS 2008. However, performance seems to be drastically reduced (what was taking under 30 minutes is now taking nearly 2 hours).

On analysis we see that the degradation is mainly in the data flow tasks which load from flat files into SQL Server tables (all other factors, such as environment, are unchanged).

Any suggestion on how this can be improved?

Performance issues with calling Java webservices from WPF application?


I'm experiencing some major performance issues when calling a Java webservice from my WPF application. If I call the same webservice from a Java test application, I get a return in under 1 second. If I create a new .NET console application and make the same service call, I get a return in under 1 second as well:

class Program
{
    public Program() { }

    static void Main(string[] args)
    {
        MyService svc = new MyService();
        for (int i = 0; i < 10; i++)
        {
            Console.WriteLine("calling echo()");
            DateTime startTime = DateTime.Now;
            string result = svc.echo("Test " + i);
            DateTime endTime = DateTime.Now;
            TimeSpan ts = endTime.Subtract(startTime);
            Console.WriteLine("Execution Time = " + ts.TotalSeconds + " seconds, result=" + result);
        }
    }
}

However, when I try to call this same service from a WPF application, my response times range from 6 to 8 seconds. I tried moving the webservice calls to a BackgroundWorker thread without any luck. I'm running the sample WPF application on the same machine as the sample .NET console application, so it's not a network-related issue. Can someone explain why the execution times for calling the Java webservice below are so much slower from within a WPF application?

public partial class MainWindow : Window
 private MyServi

Performance issues



I am working on an ASP.NET 2.0 web application.

The client says that the performance of the site is low. I have enabled gzip compression and fixed some of the issues found by checking with Fiddler.

I have also done the steps described in [link missing from the original post].

Still my client says performance has not increased.

Is there any tool which can measure or monitor other factors like network bandwidth, processes running on the client system, etc.?

Also, what other things can be done to increase performance?

Thank you

Sharepoint solution that calls WCF services has performance issues on production machines


I have a SharePoint solution that consists of a set of webparts and other features. The webparts' backend calls a set of WCF web services. When I install the solution on a production machine, it has performance issues: I can open the page and wait 30 minutes until it responds.

If I call the same WCF services from a console application, they work fine on both machines; if I create a SharePoint team site, it works fine on both machines; everything works fine on the dev machine.

The WCF services do some CRUD operations in a database and make some calls to the SharePoint and PerformancePoint APIs.

Development machine specs:


Windows Server 2008 R2 Enterprise, 64-bit

Workflow Designer 4.0 Load() Performance Issue: possible memory leak?


Hello everybody,

I am trying to implement a workflow designer in a Windows Forms application and use the same designer to view different workflows. However, I found that the program becomes slower and slower after loading several workflows.

My steps are as follows:

1. Create a WPF user control, WFDesigner, to wrap the WorkflowDesigner.

2. In the user control WFDesigner, implement an InitiateDesigner method which creates a new WorkflowDesigner instance each time it is called (sketched after this list). I assumed the old WorkflowDesigner instance would be collected by the GC.

3. Create a WinForms application, install the user control created in step 2 into an ElementHost, and put the ElementHost on the form.

4. Add two buttons for testing: one executes WFDesigner.InitiateDesigner() once and the other executes it 100 times.
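For context, here is a hedged sketch of the InitiateDesigner pattern described in step 2, using the WF4 rehosting API; WFDesigner is the poster's control, while designerHost (an assumed Grid inside the WPF user control) and the method body are reconstructions, not the poster's code:

using System.Activities.Presentation;

public void InitiateDesigner(string workflowXamlPath)
{
    // Assumes new DesignerMetadata().Register() (System.Activities.Core.Presentation)
    // was called once at startup to register the designer metadata.
    var designer = new WorkflowDesigner();   // fresh designer per load, as in step 2
    designer.Load(workflowXamlPath);

    // Swap the new view in; the old designer becomes unreferenced here.
    designerHost.Children.Clear();
    designerHost.Children.Add(designer.View);
}

If anything still references a discarded designer (event subscriptions, static services, an undo stack), the GC cannot collect it, which would match the slowdown after repeated loads; a memory profiler snapshot after the 100-load button should show whether old WorkflowDesigner instances survive.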

SQL Server 2008 R2 on Windows Server 2008 R2 Memory Usage issues and not giving memory back


Okay, so I have a physical server I am trying to replace with a virtual server. It does a large load process every night, about 190 GB on one disk.

The VM is:

On vSphere 4.1 ESXi

O/S is Windows Server 2008 R2 Enterprise

SQL version is SQL Server 2008 R2 Enterprise.

4 vCPUs

Now 20 GB of memory

Disks assigned to this virtual machine are all separate LUNs, except for the O/S, which is on a shared LUN that is not busy. So there is one LUN/datastore per disk drive on this SQL virtual machine; Data/Log/TempDB/Backup are on their own LUNs.

All the recommended exclusions are in place for McAfee on a SQL Server.

The ESXi 4.1 Enterprise host is not overcommitted. It has 4 x 6-core 2.6 GHz AMD processors (BL685c G6) and 128 GB of memory. Almost nothing else is running at the time I am testing this load, and the host is not overcommitted or stressed or anywhere near it.

The SQL Server process shows a 127,236 K working set, a 130,040 K peak working set, an 88,828 K private working set, and a commit size of 302,792 K.

It seems that when I check the memory usage in Task Manager it doesn't show much. In Windows 2003 x64 it would show the GB being used, like 8 GB, right in Task Manager, but I don't see that in the Windows 2008 R2 Task Manager. I downloaded Sysinternals tools, and they show the memory on the Windows 2008 R2 box as 15 GB of AWE memory. How can this be? The case is th
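Task Manager on Server 2008 R2 shows only the process working set, and pages that SQL Server has locked (via the Lock Pages in Memory right, which Sysinternals reports as AWE memory) fall outside it. A hedged sketch of the DMV that reports the real figure on SQL Server 2008 R2:

-- Memory actually held by the SQL Server process, including locked pages
SELECT physical_memory_in_use_kb,
       locked_page_allocations_kb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

If locked_page_allocations_kb accounts for the missing ~15 GB, the memory really is in use by the buffer pool and is simply invisible to Task Manager; 'max server memory' in sp_configure is the usual way to cap it.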

How to write memory stream to a physical path


Hello, I have a stream called mStream that I would like to write to a physical path. How would I do it?

This is my approach:

 byte[] buffers = mStream.ToArray();

        using (FileStream fs = new FileStream(filePath + "emailAttachments\\calendar" + usersession + ".ics", FileMode.CreateNew))
            foreach (byte[] buffer in buffers)
                fs.Write(buffer, 0, buffer.Length);

But I'm getting an error: cannot convert byte[] to byte. Please assist. Thanks.
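For what it's worth, the error comes from the foreach: ToArray() returns a single byte[], so iterating over it yields individual bytes, not arrays. A hedged sketch of the corrected write, keeping the poster's path expression:

byte[] buffer = mStream.ToArray(); // one byte[], not a collection of arrays

using (FileStream fs = new FileStream(filePath + "emailAttachments\\calendar" + usersession + ".ics", FileMode.CreateNew))
{
    fs.Write(buffer, 0, buffer.Length);
}

mStream.WriteTo(fs) or File.WriteAllBytes(path, mStream.ToArray()) would accomplish the same in a single call.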

SSIS 2008 SCD Performance Issues



We have 20 dimension tables, and each table will have around 20 million records.

These tables are loaded daily from 5 files, each of 3 million records.

We are currently using the SCD transformation for a Type 2 load of the data (to maintain history in the dimension tables).

But the SCD is taking a long time to insert the data. Below are the statistics I recorded when I executed the package with sample files:

Run1: File1 (0.5 million records) - 2 minutes  (dimension table is empty)
Run2: File2 (0.5 million records) - 13 minutes (table has 589,000 records)
Run3: File3 (0.5 million records) - 26 minutes (table has 1,140,000 records)
Run4: File4 (0.5 million records) - 37 minutes (table has 1,680,000 records)
Run5: File5 (1 million records)   - 51 minutes (table has 2,780,000 records)

Package elapsed time: 2 hr 9 min

1. How do I improve the performance of the SCD? If that's not possible, is there any way to load the table in parallel from the files to achieve better performance?

2. In Informatica, we have a partitioning feature to load data in parallel, which greatly improves performance. Is there any equivalent feature or workaround in SSIS?

Any help would be greatly appreciated.



2008 SSIS and NUMA Memory issues and BLOB Data Types


We're running SSIS 2008, moving data from an Oracle 10g database to a SQL Server 2008 database. SSIS is running on the same machine as the destination SQL Server. The server has 8 GB of RAM and runs Windows 2003 SP2.

The issue we're having is that when our package pulls a BLOB data type from Oracle, it just quits with no errors at about 1.4 million records. We know there are 6 million records in the dataset.

A friend of mine said it was a NUMA memory issue: that memory runs out at some point and SSIS thinks the incoming data is finished. Unfortunately, she said there was no answer.

I was wondering if someone else has had a similar situation moving BLOBs from Oracle?

