Handling a large number of structs

Posted By:      Posted Date: October 01, 2010    Points: 0    Category: .NET Framework


I have to parse a CSV file, filter out lines, add values, and store it to disk again. My idea is to create a struct for the values of each line:

struct LineValues
{
    public int Value0;
    public double Value1;
    public string Value3;
}

Then I want to initialize LineValues instances with values from the file (reading it via StreamReader.ReadLine()) and put them into a List<LineValues>.

The generic List is a reference type, so it is put on the heap. But where do the LineValues instances go? I am afraid this is going to cause a big GC job after doing this on a file with thousands of lines.
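For illustration, a minimal sketch of the approach I have in mind (the comma delimiter and the field order are assumptions):

    using System.Collections.Generic;
    using System.IO;

    struct LineValues
    {
        public int Value0;
        public double Value1;
        public string Value3;
    }

    static class CsvLoader
    {
        // Parse every line into a LineValues and collect them in a List.
        public static List<LineValues> ReadAllValues(string path)
        {
            List<LineValues> result = new List<LineValues>();
            using (StreamReader reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    string[] parts = line.Split(',');
                    LineValues v = new LineValues();
                    v.Value0 = int.Parse(parts[0]);
                    v.Value1 = double.Parse(parts[1]);
                    v.Value3 = parts[2];
                    result.Add(v);
                }
            }
            return result;
        }
    }

(Note that struct elements are stored inline in the List's internal array, so the list adds one large allocation rather than one object per line; only the string field points to separate heap objects.)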

Is there a better solution in terms of performance?

Thanks for your help,



More Related Resource Links

ASP.NET events and Page_Load not firing when DataGrid contains a large number of rows of data

Hi All, I have a DataGrid in an aspx page. Inside the DataGrid I am using around 15 controls, such as Button, DropDownList, and TextBox controls. Once the DataGrid binds, the events in the aspx page stop firing. The issue occurs when the number of rows is greater than 500; with fewer rows it works fine. If anybody knows the solution please let me know. The platform I am working on is ASP.NET 1.1. Regards, Hareesh

Large number of items in SharePoint calendar


We have a department with a calendar that has over 400,000 items. We think it somehow got that big due to users linking it with Outlook. We've turned that off, but the items are still in the SharePoint calendar. This is causing a great deal of blocking when that calendar is accessed (CAML queries): they take several minutes to run and block anyone from doing anything in SharePoint (like saving documents or even accessing SharePoint). We'd like to just delete the 400,000+ items, but I'm not sure which tables are involved. I've found that the Docs and UserData tables each have the 400,000+ records relating to the calendar in question. I don't want to simply delete the records from these tables only. Does anyone know what I need to do to clear the calendar? Is there a way in SharePoint to just delete the whole calendar so that it deletes the records where needed, and then add the calendar back (empty)?
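A hedged sketch of one supported way to clear the items, going through the object model's batch API rather than touching the content database tables directly (the URL and list name are illustrative, and for 400,000 items the batch would need to be sent in chunks rather than all at once):

    using System;
    using System.Text;
    using Microsoft.SharePoint;

    static class CalendarCleaner
    {
        // Build a <Batch> of Delete methods and hand it to ProcessBatchData,
        // which is far faster than deleting items one by one.
        public static void EmptyList(string webUrl, string listName)
        {
            using (SPSite site = new SPSite(webUrl))
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists[listName];
                StringBuilder batch = new StringBuilder(
                    "<?xml version=\"1.0\" encoding=\"UTF-8\"?><Batch>");
                foreach (SPListItem item in list.Items)
                {
                    batch.AppendFormat(
                        "<Method ID=\"{0}\"><SetList>{1}</SetList>" +
                        "<SetVar Name=\"ID\">{0}</SetVar>" +
                        "<SetVar Name=\"Cmd\">Delete</SetVar></Method>",
                        item.ID, list.ID);
                }
                batch.Append("</Batch>");
                web.ProcessBatchData(batch.ToString());
            }
        }
    }

    // usage (names hypothetical):
    //     CalendarCleaner.EmptyList("http://server/site", "Department Calendar");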



How to avoid memory exhaustion when handling large image files in C#?


I have been developing a Forms application for handling a large number of image files. The number of image files could be more than 1,000, and each image is about 2 MB. The code is as follows:

PictureBox[] pb = new PictureBox[iPictureBoxNumMax];
Label[] lb = new Label[iPictureBoxNumMax];

// First portion: create the controls and point each PictureBox at its file.
for (int i = 0; i < iPictureBoxNum; i++)
{
    lb[i] = new Label();
    pb[i] = new PictureBox();
    pb[i].ImageLocation = sImageListAll[i];
}

// Second portion (referenced in (2) below): dispose each PictureBox.
for (int i = 0; i < iPictureBoxNum; i++)
{
    pb[i].Dispose();
}

(1) If the number of image files is less than 300, the PictureBox generation code (the first portion) works. If the number is larger than that, a "memory exhaustion" error message is displayed.

(2) However, the second portion of the code (pb[i].Dispose()) doesn't seem to free the memory, since re-running the first portion gives the same "memory exhaustion" error message.

What should I do?
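One way to reduce the footprint (a hedged sketch, not necessarily what the poster intends): load a small downscaled copy for display and release the full-size decoded image immediately, so a thousand PictureBoxes don't each hold a multi-megabyte bitmap. The thumbnail dimensions are illustrative:

    using System.Drawing;

    static class ThumbnailLoader
    {
        // Load the file, copy it into a small Bitmap, and release the
        // full-size decoded image (and its file handle) right away.
        public static Image Load(string path, int width, int height)
        {
            using (Image full = Image.FromFile(path))
            {
                return new Bitmap(full, new Size(width, height));
            }
        }
    }

    // usage, instead of setting pb[i].ImageLocation:
    //     pb[i].Image = ThumbnailLoader.Load(sImageListAll[i], 120, 90);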


Handling large volume of data using WCF


Hello there

I am trying to load data ranging from 5 MB to 40 MB into my web page using WCF.

Just a brief summary of what I am exactly trying to do:

I created a custom object filled with data, which is passed through WCF to my web page (the client). My web page accesses the WCF service using JavaScript.

There are a couple of issues I am facing:

1. I am unable to transfer data when it exceeds 10 MB.

2. It accumulates more memory when handling objects with a size of 15 MB or more.

Please advise...

Many Thanks


Following are the configurations which I am using in the WCF Service


 <compilation debug="true" />

 <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
  <binding name="basicbinding" closeTimeout="00:01:00"
     openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
     allowCookies="false" bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
     maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647"
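The configuration excerpt is cut off above. One common companion setting (a hedged sketch, not necessarily the poster's missing lines) is the binding's readerQuotas element, whose defaults are far smaller than the 2147483647 limits set here; it nests inside the <binding> element above:

     <readerQuotas maxDepth="64" maxStringContentLength="2147483647"
         maxArrayLength="2147483647" maxBytesPerRead="4096"
         maxNameTableCharCount="2147483647" />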

Handling a large amount of traffic


Hi All,


I will soon be hosting a website that will incur a large amount of traffic, and I will be using many servers to handle it. I have figured out how to partition the database across many servers, and I will be using a CDN to serve the static files, but there is still one huge bottleneck that will slow down my site, and I can't figure out how to partition this problem. When the user types www.mywebsite.com into his browser, all traffic will be directed to one page on one server, and this will kill performance. Is there a service out there, free or not, where the DNS lookup of my site can be load balanced across many servers, so that when a user types in www.mywebsite.com he is not always directed to the same server?

Emulating load in browser of a large number of top-level sites - how to?



We have migrated several web apps - containing several thousand site collections - to SP 2010.

Using a browser to test that all the top-level sites can load is not really an option.

Does anyone know of any technique/tool that can do this?
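One possibility is a small console client that simply requests each top-level site and reports failures. A minimal sketch, assuming the site URLs are listed one per line in a file named sites.txt (both names are illustrative):

    using System;
    using System.IO;
    using System.Net;

    class SiteWarmup
    {
        static void Main()
        {
            foreach (string url in File.ReadAllLines("sites.txt"))
            {
                try
                {
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                    request.UseDefaultCredentials = true; // authenticate as the current user
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        Console.WriteLine("{0}: {1}", url, response.StatusCode);
                    }
                }
                catch (WebException ex)
                {
                    // A site that fails to load shows up here.
                    Console.WriteLine("{0}: FAILED ({1})", url, ex.Message);
                }
            }
        }
    }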

Large number of Insert and Update commands


I have a task where I need to grab commands from an XML file, process them into SQL INSERT/UPDATE commands, and then run the inserts/updates on SQL Server. My problem is that there can be hundreds of thousands of INSERT/UPDATE commands coming from the XML file, and some of the insert and update commands cannot be decoupled. For example, there is a command where I need to insert values across multiple tables and need to use something like SELECT @DataID = scope_identity(); in the query. What is the best way to execute large numbers of INSERT/UPDATE commands where some of the commands cannot be decoupled?
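For the coupled case, one option (a sketch; the Parent/Child tables and columns are made up for illustration) is to keep each dependent pair inside a single transaction, reading SCOPE_IDENTITY() on the same connection before the child insert:

    using System;
    using System.Data.SqlClient;

    static class CoupledInsert
    {
        // Insert a parent row, capture its identity, then insert the child row;
        // both statements succeed or fail together.
        public static void InsertParentChild(SqlConnection conn, string name, string detail)
        {
            using (SqlTransaction tx = conn.BeginTransaction())
            {
                SqlCommand parent = new SqlCommand(
                    "INSERT INTO Parent (Name) VALUES (@name); SELECT SCOPE_IDENTITY();",
                    conn, tx);
                parent.Parameters.AddWithValue("@name", name);
                int parentId = Convert.ToInt32(parent.ExecuteScalar());

                SqlCommand child = new SqlCommand(
                    "INSERT INTO Child (ParentId, Detail) VALUES (@id, @detail);",
                    conn, tx);
                child.Parameters.AddWithValue("@id", parentId);
                child.Parameters.AddWithValue("@detail", detail);
                child.ExecuteNonQuery();

                tx.Commit();
            }
        }
    }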


Slow processing due to large number of threads.


Hello Friends!

I am working on a project in which I am creating around 25 threads in my application. All the threads communicate with each other. The problem is that, due to the large number of threads, the application executes very slowly and my CPU utilization reaches around 100%. How can I overcome this problem? Can I create multiple programs instead of threads, and if so, how will they communicate with each other?
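One direction worth trying (a sketch, under the assumption that the work items are independent enough to queue): hand the work to the shared thread pool instead of 25 dedicated threads, so the scheduler isn't oversubscribed.

    using System;
    using System.Threading;

    class PoolDemo
    {
        static void Main()
        {
            const int workItems = 25;
            int pending = workItems;
            using (ManualResetEvent done = new ManualResetEvent(false))
            {
                for (int i = 0; i < workItems; i++)
                {
                    int id = i; // capture a stable copy for the closure
                    ThreadPool.QueueUserWorkItem(delegate
                    {
                        Console.WriteLine("work item {0} running", id);
                        if (Interlocked.Decrement(ref pending) == 0)
                            done.Set(); // last item signals completion
                    });
                }
                done.WaitOne();
            }
        }
    }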


SSIS : Pulling large number of tables from Source system



We are migrating a data warehouse application from ASP / SQL Server 2000 to .NET 2.0 / SQL Server 2005, and we are currently in the process of coming up with an optimum solution for the backend design using SQL Server 2005. This application pulls data from 3 different source systems (Oracle, SQL Server, and mainframe DB2).

We have 3 different staging databases, one corresponding to each source application database. For example, RetailGarments is an OLTP application with an Oracle backend database; we have a staging SQL Server database called Retailgarments_stg for our data warehouse application. Similarly, OnlineTravelBooking is an OLTP application with a DB2 database, and we have a corresponding OnlineTravelBooking_Staging database for staging this data.

Each of these source systems has more than 100 tables, and we are currently pulling most of the tables to our side from the source.

The staging data from the different data sources are accumulated into a cleansed schema database which is used for reporting and other OLAP requirements.

Our question relates to the data pull from the source systems. The current SQL Server database pulls the data from the source systems using stored procedures with dynamic queries containing linked-server OPENQUERY calls.

We would like to improve the performance of this data pull as part of the SQL Server 2000 to 2005 migration.

SSRS Designer runs very slowly with large number of groups


I've developed a workaround for the recursive parent-child issue where groups do not export to Excel properly (no expand/collapse navigation, no stepped groups). It involves setting up a group for each parent and then applying a filter on the parent's UniqueName; child groups can then be set up with the group set to ParentUniqueName. This works great, except that I now have 30+ groups. The designer (both BIDS and Report Builder) has slowed down to a crawl, where each click or keyboard navigation takes 7+ seconds. This has made report building extremely painful! It is not a hardware problem, as I have tried it on a number of high-powered machines with the same results. The report itself runs relatively quickly.

Is there an upper limit on the number of groups the SSRS designer is meant to work with, or a workaround that I am not aware of?

A large 76 MB XML file: best way to count the number of nodes before processing




For the general processing I am using an XmlReader due to the size of the file, and technically I only need a one-time read per file.

But the external company who made this starts by counting the nodes, to say how many inserts/updates are to be done. This is done by looping through the XML file counting them, and then they recreate the XmlReader again to finally process it.

It just seems a bit clumsy to me. Understandably the size of the file means we can't just load it into memory, but is there another low-cost method to count the nodes without having to recreate the XmlReader each time?

Could we stream the file instead and use XmlDocument or XDocument (giving us LINQ capabilities) to improve the efficiency? I imagine there must be some sort of performance hit re-creating an XmlReader 2 to 3 times.

For example, would this use more memory than XML-reading and looping through the whole thing to get a count?


using (XmlReader reader = XmlReader.Create(filename, settings))
{
    XPathNavigator nav = new XPathDocument(reader).CreateNavigator();
    XPathNodeIterator xPathIt = nav.Select("//root/theNode");
    int c = xPathIt.Count;
}
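For comparison, a sketch of a single forward pass that keeps memory flat (whereas the XPathDocument above reads the whole document into memory); the element name is taken from the XPath above:

    using System.Xml;

    static class NodeCounter
    {
        // Count matching elements in one streaming pass; memory stays flat
        // regardless of file size.
        public static int Count(string filename, string elementName)
        {
            int count = 0;
            using (XmlReader reader = XmlReader.Create(filename))
            {
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element
                        && reader.Name == elementName)
                    {
                        count++;
                    }
                }
            }
            return count;
        }
    }

    // usage: int c = NodeCounter.Count(filename, "theNode");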

InfoPath 2010 - decimal field (2digit) converting large number to scientific notation


Hi Everyone,

I hope there is a simple solution to this issue.

We have built an InfoPath form for the web. Data from the form is mapped and written to SQL Server 2008 tables, and the values are retrieved when the form is reopened. The form is accessed worldwide and uses various currency types. Some of the values that we need in the decimal fields are quite large, e.g. 999,999,999,999,999.00. (We don't use a currency sign because different currencies use different signs.)

When an amount like the one shown is entered and saved, an error occurs and the value is converted to scientific notation. The field lengths are long enough for the value, but the scientific notation still appears. The largest number that I've been able to enter without it converting is 9,999,999,999.00.

Any thoughts on how this conversion to scientific notation can be prevented? This only occurs on the web-based form.

Thanks in advance for your assistance.



Richard D. Lucky Quatier

MOSS 2007 This list or library contains a large number of items.


I have a library showing "This list or library contains a large number of items."

The library contains only 1,300 items.

1. Would performance, e.g. uploading files or changing item properties, be affected?

2. Is there any way I can better organize the items?

3. I know we can create folders in a library; how can I create a subfolder? Or does "subfolder" refer to a content type?

4. Would creating a content type improve performance?


Many thanks.


Every self-respecting programmer should include exception handling techniques. Sometimes your application will generate an error; regardless of whether the programmer or the user was responsible for it, it is up to the former to include the necessary exception handling to keep the program from crashing. The .NET environment provides useful techniques for avoiding disastrous errors, such as try-catch statements and user-defined exceptions.
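A minimal sketch of the two techniques just mentioned (the exception type and message are made up for illustration):

    using System;

    // A user-defined exception type.
    class InvalidOrderException : Exception
    {
        public InvalidOrderException(string message) : base(message) { }
    }

    class Program
    {
        static void Main()
        {
            try
            {
                throw new InvalidOrderException("Quantity must be positive.");
            }
            catch (InvalidOrderException ex)
            {
                // Handle the error instead of letting the program crash.
                Console.WriteLine("Handled: {0}", ex.Message);
            }
        }
    }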

Global Exception Handling with ASP.NET

After your global exception handler has done its work, you'll want to redirect the users of your website to a friendly page that tells them that something has gone wrong, and then provide them with customer support information as well as a link back to your web application's home page.
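A hedged sketch of what such a handler can look like in Global.asax.cs (the logging step and the ErrorPage.aspx name are illustrative):

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_Error(object sender, EventArgs e)
        {
            Exception ex = Server.GetLastError();
            // Log ex somewhere durable here, then clear it so ASP.NET
            // doesn't also show its default error page.
            Server.ClearError();
            Response.Redirect("~/ErrorPage.aspx");
        }
    }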

SQL Server 2005 Try and Catch Exception Handling

I'm pretty excited to see that there is some real error handling for T-SQL code in SQL Server 2005. It's pretty painful to have your wonderfully architected .NET solution tainted by less-than-VBScript error handling for stored procedures in the database. The big difference being the addition of TRY..CATCH blocks. Let's take a look:
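The excerpt ends before the example; a minimal sketch of the TRY...CATCH construct it refers to:

    BEGIN TRY
        -- Division by zero raises an error that transfers control to CATCH.
        SELECT 1 / 0;
    END TRY
    BEGIN CATCH
        SELECT ERROR_NUMBER() AS ErrorNumber,
               ERROR_MESSAGE() AS ErrorMessage;
    END CATCH;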

Convert English to Arabic number without changing any regional settings in .net

Well, most applications that I have worked with were multilingual, supporting an English UI and an Arabic UI.

One of the major issues that we have faced is displaying Arabic numbers without the need to change the regional settings of the PC.

The code below will help you to display Arabic numbers without changing any regional settings.
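The original snippet is not included in this excerpt; a hedged sketch of one common approach, mapping each Western digit to its Arabic-Indic equivalent (U+0660 through U+0669):

    using System.Text;

    static class ArabicDigits
    {
        // Replace each Western digit 0-9 with the corresponding
        // Arabic-Indic digit, leaving all other characters unchanged.
        public static string Convert(string input)
        {
            StringBuilder sb = new StringBuilder(input.Length);
            foreach (char c in input)
            {
                if (c >= '0' && c <= '9')
                    sb.Append((char)('\u0660' + (c - '0')));
                else
                    sb.Append(c);
            }
            return sb.ToString();
        }
    }

    // usage: ArabicDigits.Convert("123.45") returns "١٢٣.٤٥"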