I want to store the data in a temp table. Which is the better way to do it: a hash table or a DataTable?
Looking forward to the reply.
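A rough sketch contrasting the two in-memory options (all names here are illustrative); which is "best" depends on whether you need keyed lookups or row/column structure:

using System.Collections.Generic;
using System.Data;

// Dictionary (the generic successor to Hashtable): fast keyed lookups,
// minimal overhead, no schema.
var byId = new Dictionary<int, string> { [1] = "alpha", [2] = "beta" };

// DataTable: rows and columns with a schema; filterable, sortable, and
// bindable to UI controls or bulk-copy APIs.
var dt = new DataTable("Temp");
dt.Columns.Add("Id", typeof(int));
dt.Columns.Add("Name", typeof(string));
dt.Rows.Add(1, "alpha");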
I have read the data of an Excel file and captured it into a DataTable within a DataSet; now that data is to be inserted into a SQL Server table using the bulk copy option. I am using:
public bool BulkEnterData(DataTable dt, string tblName) {
    using (SqlBulkCopy bulk = new SqlBulkCopy(con)) {
        bulk.DestinationTableName = tblName;
        bulk.WriteToServer(dt); // push every DataTable row to the server
        return true;
    } }
The error I am getting is: "A transport-level error has occurred when receiving results from the server. (provider: Shared Memory Provider, error: 0 - The pipe has been ended.)"
How can I resolve this problem, or is there another solution available?
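That error typically indicates the connection was cut off mid-operation, often because the copy ran longer than the default timeout. A minimal sketch that opens its own connection and sets an explicit timeout and batch size; connStr and the method name are illustrative, not from the original post:

using System.Data;
using System.Data.SqlClient;

public static void BulkInsert(string connStr, DataTable dt, string tblName)
{
    using (var con = new SqlConnection(connStr))
    using (var bulk = new SqlBulkCopy(con))
    {
        con.Open();
        bulk.DestinationTableName = tblName;
        bulk.BulkCopyTimeout = 600; // seconds; 0 means wait indefinitely
        bulk.BatchSize = 5000;      // commit in chunks rather than one huge batch
        bulk.WriteToServer(dt);     // stream the DataTable rows to the server
    }
}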
I'm making the leap from using all the wizards that are built into VS and have been doing more with code-behind. But I'm curious as to when/why I should use a reader versus a DataTable or table adapter to manage the data returned from a stored procedure. Typically, the data is to be displayed on a web page and not written back to the database. What's the difference? When should I use one over the other?
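A sketch of the two patterns for comparison, assuming a connection string connStr, a stored procedure named GetItems, and a GridView called grid; all three names are illustrative. A reader is connected and forward-only, holding one row at a time (cheapest for display-only pages), while a DataTable is a disconnected in-memory copy that supports sorting, filtering, and re-binding:

using System.Data;
using System.Data.SqlClient;

// 1) SqlDataReader: connected, forward-only, lowest memory overhead.
using (var con = new SqlConnection(connStr))
using (var cmd = new SqlCommand("GetItems", con) { CommandType = CommandType.StoredProcedure })
{
    con.Open();
    using (SqlDataReader rdr = cmd.ExecuteReader())
    {
        grid.DataSource = rdr; // bind directly; fine for read-once display
        grid.DataBind();
    }
}

// 2) DataTable filled by a SqlDataAdapter: disconnected copy of the results.
var dt = new DataTable();
using (var da = new SqlDataAdapter("GetItems", connStr))
{
    da.SelectCommand.CommandType = CommandType.StoredProcedure;
    da.Fill(dt); // the adapter opens and closes the connection itself
}
grid.DataSource = dt;
grid.DataBind();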
I'm going to use TVPs to enhance performance in my C# .NET Framework 4 application, which uses stored procs in SQL Server 2008. My first test was very successful: I inserted a batch of rows by using a TVP in a stored proc. So far so good. Now
to my question: when looking for the performance gain with SQL Profiler, I saw it was really fast, but the generated SQL code left me a little confused. My "table" (DataTable class) from the .NET app was "re-tabled" like this in SQL:
declare @table MyTvpType;
insert @table values (1, 2, 3...); ... x number of times
exec MyProc @table;
This was quite surprising, but when examining my .NET code, I think there should be some way to "import" my MyTvpType into the .NET code and avoid the extra declaration and inserts done as it is now. I'm using this code in .NET:
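For reference, the standard ADO.NET pattern for passing a DataTable as a TVP looks roughly like this sketch (reusing MyProc and dbo.MyTvpType from the trace above; connStr and dt are illustrative):

using System.Data;
using System.Data.SqlClient;

using (var con = new SqlConnection(connStr))
using (var cmd = new SqlCommand("MyProc", con) { CommandType = CommandType.StoredProcedure })
{
    SqlParameter p = cmd.Parameters.AddWithValue("@table", dt);
    p.SqlDbType = SqlDbType.Structured; // marks the parameter as a TVP
    p.TypeName = "dbo.MyTvpType";       // the server-side table type
    con.Open();
    cmd.ExecuteNonQuery();
}

Note that the declare/insert statements are reportedly just how Profiler renders the TVP contents in the trace; the rows themselves travel in the binary TDS stream rather than as per-row INSERT statements.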
I've created a WinForms C# application that connects to an Oracle database, loads a DataTable with all the data in the Oracle table, then exports it out to a delimited file, which then gets imported into a local MySQL database. This works just
fine when the Oracle table isn't so large. However, I keep getting an out-of-memory exception when I populate my DataTable from a large Oracle table. I can't manually export the data from Oracle and then manually load it into MySQL, because this has to be seamless
to the user. I'm having difficulty grasping how to solve this problem. Is there a better solution than what I have for getting from Oracle to MySQL?
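One common fix is to skip the DataTable entirely and stream rows with a data reader, so only one row is in memory at a time. A minimal sketch, assuming an Oracle ADO.NET provider (here Oracle.ManagedDataAccess.Client); connStr, the query, and the output path are illustrative:

using System.IO;
using Oracle.ManagedDataAccess.Client;

using (var con = new OracleConnection(connStr))
using (var cmd = new OracleCommand("SELECT * FROM big_table", con))
using (var writer = new StreamWriter(@"C:\temp\export.txt"))
{
    con.Open();
    using (OracleDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read()) // one row in memory at a time
        {
            var fields = new string[rdr.FieldCount];
            for (int i = 0; i < rdr.FieldCount; i++)
                fields[i] = rdr.IsDBNull(i) ? "" : rdr.GetValue(i).ToString();
            writer.WriteLine(string.Join("|", fields)); // delimited row
        }
    }
}

The exported file can then be imported on the MySQL side, e.g. with LOAD DATA INFILE, which keeps the whole transfer seamless to the user.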