Handling large volume of data using WCF

Posted Date: September 30, 2010 | Category: WCF

Hello there,

I am trying to load data ranging from 5 MB to 40 MB into my web page using WCF.

A brief description of what I am trying to do: I create a custom object filled with data, which is passed through WCF to my web page (the client). The web page accesses the WCF service using JavaScript.

There are a couple of issues I am facing:

1. I am unable to transfer data once it exceeds 10 MB.

2. Memory usage climbs sharply when handling objects of 15 MB or more.

Please advise.

Many Thanks


The following is the configuration I am using in the WCF service:


 <compilation debug="true" />

 <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
  <binding name="basicbinding" closeTimeout="00:01:00"
     openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
     allowCookies="false" bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
     maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647" />
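
Raising the buffer sizes alone is usually not enough: the reader quotas on the binding and, for IIS-hosted services, the ASP.NET request limit also cap large messages. Below is a minimal sketch of the shape such a configuration typically takes; the service and contract names are placeholders, and the same binding values must also be applied in the client's configuration.

 <system.serviceModel>
   <bindings>
     <basicHttpBinding>
       <binding name="basicbinding"
                maxBufferSize="2147483647"
                maxBufferPoolSize="2147483647"
                maxReceivedMessageSize="2147483647">
         <!-- Reader quotas default to small values and also cap large payloads. -->
         <readerQuotas maxDepth="64"
                       maxStringContentLength="2147483647"
                       maxArrayLength="2147483647"
                       maxBytesPerRead="2147483647"
                       maxNameTableCharCount="2147483647" />
       </binding>
     </basicHttpBinding>
   </bindings>
   <services>
     <!-- Placeholder service/contract names; substitute your own. -->
     <service name="MyNamespace.MyService">
       <endpoint address="" binding="basicHttpBinding"
                 bindingConfiguration="basicbinding"
                 contract="MyNamespace.IMyService" />
     </service>
   </services>
 </system.serviceModel>

 <!-- When hosted in IIS/ASP.NET, the ASP.NET limit applies as well: -->
 <system.web>
   <httpRuntime maxRequestLength="2097151" />
 </system.web>

Note that httpRuntime's maxRequestLength is expressed in KB, unlike the WCF limits, which are in bytes.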


More Related Resource Links

Data Points: SQL Server 2005 XML Support, Exception Handling, and More


SQL Server 2005 includes several important improvements to the Transact-SQL (T-SQL) language. One added feature is a new kind of trigger that fires when data definition language (DDL) statements run.

John Papa

MSDN Magazine May 2006
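
As a flavor of the DDL triggers the article describes, here is a minimal hypothetical sketch (the trigger name and the policy it enforces are made up) that fires when a DROP TABLE statement runs:

 -- Hypothetical DDL trigger: fires when a DROP TABLE statement runs in
 -- the current database and rolls it back. Names/policy are made up.
 CREATE TRIGGER ddl_block_drop_table
 ON DATABASE
 FOR DROP_TABLE
 AS
 BEGIN
     PRINT 'DROP TABLE is blocked by policy on this database.';
     ROLLBACK;
 END;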

Data Points: Using XQuery, New Large DataTypes, and More


SQL Server 2005 introduces a lot of new features, but it also enhances the popular and oft-used Transact-SQL (T-SQL) language. Changes include the introduction of new datatypes to store large values using the MAX indicator, the integration of enhanced XML querying and data modification with XQuery, and the new XML datatype.

John Papa

MSDN Magazine March 2006
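
A small illustrative sketch of the features mentioned, with a made-up table: varchar(max) for large values via the MAX indicator, and an XQuery call against the xml datatype:

 -- Hypothetical table showing varchar(max) and the xml datatype.
 CREATE TABLE Docs
 (
     Id   int IDENTITY PRIMARY KEY,
     Body varchar(max),   -- large-value type using the MAX indicator
     Meta xml
 );

 -- The cast keeps REPLICATE from truncating at 8000 characters.
 INSERT INTO Docs (Body, Meta)
 VALUES (REPLICATE(CAST('x' AS varchar(max)), 100000),
         '<doc><author>John</author></doc>');

 -- XQuery via the xml type's query() method:
 SELECT Meta.query('/doc/author') FROM Docs;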

Data Points: Handling Data Concurrency Using ADO.NET, Part 2


Enterprise development has been moving towards a disconnected model in recent years, and ADO.NET development is no exception. While the disconnected model of the ADO.NET DataSet offers great flexibility, that adaptability also means looser control over data updates than you get with a connected data access model.

John Papa

MSDN Magazine October 2004
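
In code, that looser control surfaces as DBConcurrencyException when a DataAdapter updates a row that changed underneath it. A minimal hedged sketch, with the adapter and table assumed to be supplied by the caller:

 // Hedged sketch: adapter and table are assumed to be configured for
 // optimistic concurrency (the UPDATE compares original column values).
 using System.Data;
 using System.Data.SqlClient;

 static class ConcurrencyDemo
 {
     public static void SaveChanges(SqlDataAdapter adapter, DataTable table)
     {
         try
         {
             adapter.Update(table);
         }
         catch (DBConcurrencyException ex)
         {
             // ex.Row is the row whose underlying data changed; reject the
             // local edit here, or re-fetch and merge before retrying.
             ex.Row.RejectChanges();
         }
     }
 }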

Data Points: Handling Data Concurrency Using ADO.NET


One of the key features of the ADO.NET DataSet is that it can be a self-contained and disconnected data store. It can contain the schema and data from several rowsets in DataTable objects, as well as information about how to relate the DataTable objects, all in memory.

John Papa

MSDN Magazine September 2004
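
A minimal sketch of that idea, with made-up table names: two DataTables and a DataRelation living entirely in one DataSet, no connection attached:

 // Hedged sketch with made-up table names: a DataSet holding two related
 // DataTables entirely in memory, with no connection attached.
 using System.Data;

 static class DataSetDemo
 {
     public static DataSet Build()
     {
         var ds = new DataSet("Shop");

         DataTable orders = ds.Tables.Add("Orders");
         orders.Columns.Add("OrderId", typeof(int));

         DataTable items = ds.Tables.Add("Items");
         items.Columns.Add("OrderId", typeof(int));
         items.Columns.Add("Product", typeof(string));

         // The relation (schema plus data) lives in the DataSet itself.
         ds.Relations.Add("OrderItems",
             orders.Columns["OrderId"], items.Columns["OrderId"]);

         return ds;
     }
 }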

XML in Yukon: New Version Showcases Native XML Type and Advanced Data Handling


The next version of Microsoft SQL Server, code-named "Yukon," represents quite a few steps forward in the evolution of XML integration. Yukon supports native storage of XML data using the XML data type, which makes it possible to run native queries on XML data using the emerging industry standard XQuery language. Data integrity of the XML data type can be enforced through schema validation and XML-based check constraints, and special indexes can be defined that help speed up queries. In addition, Yukon has the built-in ability to expose its data through Web services. This article discusses these and other XML features of Yukon.

Bob Beauchemin

MSDN Magazine February 2004

Multiprocessor Optimizations: Fine-Tuning Concurrent Access to Large Data Collections


Application performance involves more than just speed. In a Web server environment, top performance also means ensuring that the maximum number of users can be served concurrently. This can be accomplished through efficient use of multiprocessor machines and thread management. This article presents techniques that can solve a number of concurrency problems. One approach, using thread management, controls access to a database on a per-thread basis, which protects the integrity of the data. In the article, reusable thread classes are built and presented. The classes are then tested and their performance in a live environment is examined.

Ian Emmons

MSDN Magazine August 2001

WCF issues sending large data - "An existing connection was forcibly closed by the remote host"

Hi guys,

I have posted the following on www.asp.net but thought it might be productive posting here too. I have been pulling my hair out trying to fix an issue with sending data over WCF; I have read pretty much every thread on this forum regarding (or similar to) this issue, without finding a successful solution.

I have a simple data object which has a [DataMember] with a data type of byte[]. I use this to send binary data from my web app as follows:

ASP.NET MVC website --> WCF Client (has my service references) --> My Services (MyServices.svc) --> Business/Data tier

Everything is being executed from within VS 2008 Pro.

I have successfully sent data up to 2.2 MB, but it fails if I try to send a file of 2.3 MB or larger. I get the following exceptions:

General Exception
Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
   at SoftApp.WCFClient.MyServiceReference.IMyService.Document_Save(DtoDocument dtoDocument, Int32 usercode)
   at SoftApp.WCFClient.MyServiceReference.MyServiceClient.Document_Save(DtoDocument dtoDocument, Int32 usercode) in c:\dev\softapp\softapp.wcfclient\service references\myservicereference\refe
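
For a byte[] [DataMember] of a few MB, the usual suspects beyond the quotas shown earlier are the message encoding and the ASP.NET request limit. MTOM encodes the byte[] as a raw binary attachment instead of base64 text; true streaming would additionally require changing the contract to expose a Stream. A hedged sketch (the binding name and the 64 MB ceiling are arbitrary); since the WCF client runs inside the MVC site, the same binding must appear in that site's config too:

 <basicHttpBinding>
   <!-- "largeBinary" is a made-up name; 64 MB is an arbitrary ceiling. -->
   <binding name="largeBinary"
            messageEncoding="Mtom"
            maxReceivedMessageSize="67108864">
     <readerQuotas maxArrayLength="67108864" />
   </binding>
 </basicHttpBinding>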

System.Data.SqlClient.SqlError: There is insufficient free space on disk volume 'C:\' to create the database

I received the following error:

*****
System.Data.SqlClient.SqlError: There is insufficient free space on disk volume 'C:\' to create the database. The database requires 2739929088 additional free bytes, while only 801185792 bytes are available. (Microsoft.SqlServer.Smo)
*****

I have looked at the solution recommended on this forum, http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=79848, but I am still baffled as to why it says I don't have enough disk space. I ran the RESTORE FILELISTONLY command and it told me that the "size" was 1.4 GB, and I have 60 GB left. I am starting to suspect that my .BAK is corrupt. Are there any other reasons why I might be unable to restore my backup?

Thanks for any help in advance!
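
One thing worth noting: RESTORE FILELISTONLY reports a size per file, and the restore needs room for the data file and the log file at their original sizes, which is often where the "missing" space goes. A sketch with hypothetical paths and names, listing the files and then relocating them to a roomier volume:

 -- Hypothetical paths/names. First list the files (and their sizes)
 -- inside the backup, then restore while relocating them.
 RESTORE FILELISTONLY FROM DISK = 'C:\backups\MyDb.bak';

 RESTORE DATABASE MyDb
 FROM DISK = 'C:\backups\MyDb.bak'
 WITH MOVE 'MyDb_Data' TO 'D:\data\MyDb.mdf',
      MOVE 'MyDb_Log'  TO 'D:\logs\MyDb.ldf';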

Large volume of INSERT/DELETE

I support the DB for a 3rd-party application. Lately, users have been complaining about the application being sluggish. Checking into it, I found that the vendor decided to write their own locking system (much like PeopleSoft does/did).

In short, I have a very small table (it maxes out at around 3000 rows; columns of types int, varchar(30), int, varchar(30), int, int). They store locking records in the table; rows are inserted and deleted at a rate of about 10 per second (600-900 per minute). The PK is composite across the columns required to enforce uniqueness. There is no identity column.

The IO involved in these inserts and deletes results in enormous amounts of WRITELOG waits. Since the application is built around this custom record locking, and this table is the record of who has what locked, delays on these inserts and deletes translate to delays on everything.

I already have plans to address the disk subsystem. In the interim, is there anything I can do from a design perspective to relieve the IO this table generates?

SSRS 2008 Export to PDF fails for large data with System.Exception: Parameter is not valid

Hi,

We are working on a range bar chart using SSRS 2008. The report has a huge amount of data. It displays the data properly, but when we try to export to PDF it fails with the following exception. Exporting to Excel works fine, and if the report has a small or medium amount of data it exports to PDF properly. It fails only in the case of huge data (we have a custom page size of 28" x 14"). The report also has a lot of expressions we use to customize the colors/text as per our requirements.

Server Error in '/Reports_SQLDEV2008' Application.
Parameter is not valid.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Exception: Parameter is not valid.
Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

Stack Trace:
[Exception: Parameter is not valid.]
[Exception: An error occurred during rendering of the report.]
[Exception: An error occurred during rendering of the report.]
   Microsoft.Reporting.WebForms.ServerReport.ServerUrlRequest(Boolean isAbortable, String url, Stream ou

data contract code generation for large/complex schema (HR-XML/OAGIS) - is there an alternative?

Hello, and thank you for reading.

I am implementing a service based on a predefined specification (HR-XML 3.0). As such, I am starting with the schema and working my way back to code. There are a number of large schema documents (which import yet more schema documents) related to my implementation, provided by this specification.

I am able to generate code using xsd.exe by supplying the "main" and "supporting" xsd files as arguments. But there are several issues, and I am wondering if this is the right approach:

- there are literally hundreds of classes; the code file is half a meg in size
- there are duplicate classes (e.g. Type and Type1, which both represent the same type)
- there are classes declared as inheriting from a base class, but that base class is not generated/defined

I understand that there are limitations to the kinds of schema supported by svcutil.exe/xsd.exe when targeting the DataContractSerializer, and even the XmlSerializer. My question is two-fold:

1. Are code generation "issues" fairly common when dealing with larger, modular xsd files? Has anyone had success generating data contracts from OAGIS or HR-XML schema?

2. Given the above issues, are there better approaches to this task that avoid generating code and working with concrete objects? Does it make better sense to read and compose a SOAP message directly, while sti
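
For reference, a hypothetical invocation of the two generators the post mentions (the file names are made up); passing the main schema together with its imports in one call helps shared types resolve to a single class:

 rem xsd.exe: generate classes from the main schema plus its imports.
 xsd.exe main.xsd common.xsd types.xsd /classes /language:CS /namespace:HrXml

 rem svcutil.exe can emit data contracts directly, when the schema stays
 rem within what the DataContractSerializer supports:
 svcutil.exe /dconly /language:C# main.xsd common.xsd types.xsd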

ASP.NET events and Page_Load not firing when a DataGrid contains a large number of rows

Hi all,

I have a DataGrid on an .aspx page. Inside the DataGrid I am using around 15 controls, such as Button, DropDownList and TextBox controls. Once the DataGrid binds, the events on the page stop firing. The issue occurs when the number of rows is greater than 500; with fewer rows it works fine. If anybody knows the solution, please let me know.

The platform I am working on is ASP.NET 1.1.

Regards,
Hareesh

WPF Charting for Large Data

Hi,

I am currently working on a charting control. I need to display 10,000 data points using the LineChart provided by the toolkit, but the application gets slow because of this huge amount of data. I tried using Dynamic Data Display from CodePlex, but since it is in beta, is there any other way I could accomplish this without affecting application performance? Please help me.

Regards,
Raaj
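
One common workaround, independent of the charting library, is to decimate the series before binding it, so the chart only ever draws a bounded number of points. A rough sketch (the maxPoints budget is an arbitrary parameter):

 // Hedged sketch: reduce the series to a bounded number of points before
 // handing it to the chart; "maxPoints" is an arbitrary budget.
 using System.Collections.Generic;

 static class ChartDownsampling
 {
     public static List<double> Decimate(IList<double> source, int maxPoints)
     {
         if (source.Count <= maxPoints)
             return new List<double>(source);

         var result = new List<double>(maxPoints);
         double step = (double)source.Count / maxPoints;
         for (int i = 0; i < maxPoints; i++)
             result.Add(source[(int)(i * step)]);   // keep every step-th sample
         return result;
     }
 }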

WCF service fails with unexpected connection closed when returning a large amount of data

Good day all,

I'm sure this question is relatively simple, but I'm quite new to WCF (I've been on a Java project for the last few years, and it's nice to be back in MS territory even if a lot has changed!). Anyway, here are the details of my scenario; I was hoping someone could shed some light on it.

1) The WCF service is hosted as a web service and is being tested in the WCF Test Tool.
2) There is a method call on the service that, while rarely used, can return a fair amount of data (over 20,000 objects); this is a requirement of the system.
3) The object being serialized has the following signature/contract (properties changed for the example):

 [DataContract]
 public class Foo
 {
     private string prop1;
     [DataMember]
     public string Prop1
     {
         get { return prop1; }
         set { prop1 = value; }
     }

     private string prop2;
     [DataMember]
     public string Prop2
     {
         get { return prop2; }
         set { prop2 = value; }
     }

     private string prop3;
     [DataMember]
     public string Prop3
     {
         get { return prop3; }
         set { prop3 = value; }
     }

     private string prop4;
     [DataMember]
     public string FaaCode
     {
         get { return prop4; }
         set { prop4 = value; }
     }

     private string prop5;
     [DataMember]
     public string Prop5
     {
         get { return prop5; }
         set { prop5 = value; }
     }
 }
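
With 20,000+ objects, each carrying several data members, one limit worth checking besides the binding quotas is the DataContractSerializer's object-graph cap (65,536 items by default). A hedged sketch of raising it on the service; the behavior name is made up and must be referenced from the service's behaviorConfiguration attribute, and the client proxy needs the equivalent endpoint behavior:

 <behaviors>
   <serviceBehaviors>
     <!-- "largeGraph" is a made-up name; reference it from the service's
          behaviorConfiguration attribute. -->
     <behavior name="largeGraph">
       <dataContractSerializer maxItemsInObjectGraph="2147483646" />
     </behavior>
   </serviceBehaviors>
 </behaviors>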

Storing large data in Session State

Hi,

We have a scenario where we need to store large tables of data in session state on an ASP.NET page. At a high level, we have to store the following data:

1) A table (table 1) with 1000 rows and 8 to 10 columns.
2) A table (table 2) with 500 rows and 8 to 10 columns (the user can move data from table 1 to table 2, so as the user keeps adding from table 1 we remove that data and put it in table 2).
3) Another table (table 3) with 1000-1200 rows, each having 3 columns.

We are on a web farm, so we need to store session state in either StateServer or an Oracle DB; please advise which one is better. What would be the performance implications of storing such huge data on the server? One more thing: at any point in time a maximum of 70-80 people will be accessing our website, so will this cause any performance degradation?

Thanks in anticipation,
Harsha
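
For the web-farm part, out-of-process session is configured in web.config. A minimal sketch with a made-up state server host; note that the built-in SQLServer mode targets SQL Server, so an Oracle-backed store would need a custom session-state provider, and anything stored out-of-process must be serializable:

 <!-- "stateserver" is a placeholder host; 42424 is the default port. -->
 <sessionState mode="StateServer"
               stateConnectionString="tcpip=stateserver:42424"
               timeout="20" />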

Timeout issue with large data


Getting a timeout error while retrieving a large chunk of data; it works fine for smaller chunks.

We tried increasing the various timeout settings, but it didn't help. Any suggestions are appreciated.

   at System.Net.HttpWebRequest.GetResponse()
   at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)

