.NET Tutorials, Forums, Interview Questions And Answers
OLE DB Destination issue in ForEach Data Flow

Posted By:      Posted Date: August 31, 2010    Points: 0    Category: Sql Server
Has anyone encountered the following problem? I have a ForEach container with a single task in it: a Data Flow. The Data Flow uses a Script Component source to read each file and writes out to several output streams, each of which is connected to an OLE DB Destination. The ForEach executes once per file (in my example, 3 times), but only the data from the last execution is in the DB tables.

This is from my output window:

SSIS package "PIF.dtsx" starting.
Information: 0x4004300A at ETL Audit, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Load V6 Policy, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Load V6 Policy, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Load V6 Policy, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Load V6 Policy, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DF at Load V6 Policy, stg_RiskLocation [5449]: The final commit for the data insertion has started.
Information: 0x402090DF at Load V6 Policy, stg_PDMRFeed [5277]: The final commit for the data insertion has started.
Information: 0x402090DF at Load V6 Policy, stg_Policy [5208]: The final commit for the data insertion has started.
Information: 0x402090E0 at Load V6 Policy, stg_RiskLocation [5449]: The final commit for the data insertion has ended.
Information: 0x402090DF at Load V6 Pol
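For reference, the per-file read inside such a loop normally follows the pattern sketched below (inside the designer-generated Script Component harness), assuming a ReadOnly variable User::CurrentFileName mapped by the ForEach File enumerator and an output named Output0 with a single string column Line; all names here are illustrative. If only the last file's data survives, the usual suspects are the enumerator's variable mapping and anything inside the loop that truncates the target tables on each iteration.

    public override void CreateNewOutputRows()
    {
        // Each iteration of the ForEach loop should see a fresh file name here;
        // if every iteration reads the same file, check the variable mapping.
        foreach (string line in System.IO.File.ReadAllLines(Variables.CurrentFileName))
        {
            Output0Buffer.AddRow();
            Output0Buffer.Line = line;
        }
    }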


More Related Resource Links

Data Reader Destination issue

Hi, I'm new to BI. I created a package to test the DataReader destination; the dtsx executes fine and everything is green, but what then? The question is: how do I use this DataReader destination? As a .NET programmer I know there is a class named DataReader that can be used to retrieve data in connected mode, programmatically, using connection and command objects. But in the case of the DataReader destination, how is it used? Or am I confusing the DataReader I know from ADO.NET with this one used in BIDS? Should I add a Script Task after the DataReader destination, or should I consume this DataReader in my own separate code, in a DLL or an EXE that I create as a .NET project? For instance, two of its custom properties stand out to me: DataReader and UserComponentDataType.

The complexity resides in the simplicity
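For anyone hitting the same question: the DataReader destination is meant to be consumed from outside the package, through the ADO.NET provider in Microsoft.SqlServer.Dts.DtsClient.dll. You open a DtsConnection whose connection string points at the package, execute a DtsCommand whose command text is the name of the DataReader destination, and read the rows like any IDataReader. A minimal sketch (the package path and destination name are illustrative):

    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.DtsClient; // Microsoft.SqlServer.Dts.DtsClient.dll

    class ReadFromPackage
    {
        static void Main()
        {
            using (var conn = new DtsConnection())
            {
                // "-f <path>" loads the package from a file; executing the
                // command runs the package and streams rows out of the destination.
                conn.ConnectionString = @"-f C:\Packages\MyPackage.dtsx";
                conn.Open();

                var cmd = new DtsCommand(conn);
                cmd.CommandText = "DataReaderDest"; // name of the DataReader destination
                IDataReader reader = cmd.ExecuteReader(CommandBehavior.Default);
                while (reader.Read())
                    Console.WriteLine(reader.GetValue(0));
            }
        }
    }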

SQL 2008 R2 Custom Data Connection Issue with Data Flow - Data Sources


I have a custom data connection that we used in SQL 2005 to connect to a non-standard source to pull data. We are in the process of upgrading to SQL 2008 R2, and I am running into an issue that hopefully someone can help me with.

I have upgraded my custom connection code for SQL 2008 R2, and I can set up a connection using the custom code inside an SSIS package. However, when I go to a data flow task and try to select a connection for a data source object (OLE DB or ADO.NET data sources), my connection is not available. If I select New on the connection option, my connection type is not available either. To get around this in SQL 2005 I used the DataReader source, but the DataReader source is not available in SQL 2008 R2, so I am stuck.

My custom connection is implemented as an IDbConnection.
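One possible explanation, stated as an assumption rather than a confirmed fix: the ADO.NET source in SSIS 2008 enumerates providers through System.Data.Common.DbProviderFactories, so a class that merely implements IDbConnection never appears in the list. The connection needs to derive from DbConnection, be exposed through a DbProviderFactory, and be registered under the DbProviderFactories section of machine.config. A sketch of the factory wrapper, where MyCustomConnection stands in for the existing connection class:

    using System.Data.Common;

    // Sketch only: exposes the custom connection through the DbProviderFactory
    // model that the SSIS 2008 ADO.NET source enumerates. MyCustomConnection is
    // the existing connection class and must derive from DbConnection.
    public class MyCustomProviderFactory : DbProviderFactory
    {
        // DbProviderFactories resolves the factory via this static Instance field.
        public static readonly MyCustomProviderFactory Instance = new MyCustomProviderFactory();

        public override DbConnection CreateConnection()
        {
            return new MyCustomConnection();
        }
    }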

Dynamic data flow destination?



Is there a way to dynamically change the destination table, so that I can put a Data Flow task inside a ForEach Loop and change the source table and destination table on each iteration?
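One approach, sketched under the assumption that the destination is an OLE DB Destination: set its data access mode to "Table name or view name variable" and let the ForEach loop assign the variable on each iteration (the OLE DB Source has an equivalent variable-based mode for the source table). The same switch can be made through the pipeline object model; the AccessMode values below follow the OLE DB Destination's documented custom properties, and the variable name is illustrative:

    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

    static class DynamicDestination
    {
        // Sketch: make an OLE DB Destination take its table name from a variable.
        // AccessMode 1 = "OpenRowset from variable" (4 is the fast-load variant).
        public static void UseTableVariable(IDTSComponentMetaData100 dest)
        {
            CManagedComponentWrapper inst = dest.Instantiate();
            inst.SetComponentProperty("AccessMode", 1);
            inst.SetComponentProperty("OpenRowsetVariable", "User::DestinationTable");
        }
    }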

How to programmatically execute a data flow task whose destination is a data reader


Hello everyone

There is an SSIS package containing (among other executables) a data flow task that writes a string to a DataReader destination. I need to execute just that data flow task and read the string from the DataReader destination programmatically.

I can run the whole package from C# and then read the value. Also, the data flow task in question must remain embedded in the package.

Does anyone know how to do this?
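A single task can be run by itself through the runtime object model, as in the sketch below (task name and path are illustrative). Note that the DataReader destination only exposes its rows through the Microsoft.SqlServer.Dts.DtsClient provider while the pipeline is executing (see the example further up the page), so reading the string still has to go through that route; executing the bare TaskHost runs the data flow but gives you no reader to consume.

    using Microsoft.SqlServer.Dts.Runtime;

    class RunSingleTask
    {
        static void Main()
        {
            var app = new Application();
            Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);

            // Execute only the one data flow task; the rest of the package does not run.
            var task = (TaskHost)pkg.Executables["Data Flow Task"];
            DTSExecResult result = task.Execute(pkg.Connections, task.Variables, null, null, null);
            System.Console.WriteLine(result);
        }
    }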

SSIS Object model - dynamically identifying the source and destination tables of the data flow/transformation


Hello All,

I would like to identify the source and destination tables of the data flow or transformation task.

I need to create a custom component that I can put into existing packages to do some validation.

Please suggest ways I can accomplish this.
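As a starting point, here is a sketch that walks a package's data flow tasks with the runtime and pipeline object models and prints each component's OpenRowset custom property, which is where OLE DB sources and destinations in table-access mode keep their table name. The path is illustrative, and containers nested below the top level are not walked here:

    using System;
    using Microsoft.SqlServer.Dts.Runtime;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

    class ListDataFlowTables
    {
        static void Main()
        {
            var app = new Application();
            Package pkg = app.LoadPackage(@"C:\Packages\Existing.dtsx", null);
            foreach (Executable exe in pkg.Executables)
            {
                var host = exe as TaskHost;
                var pipe = host == null ? null : host.InnerObject as MainPipe;
                if (pipe == null) continue; // not a data flow task

                foreach (IDTSComponentMetaData100 comp in pipe.ComponentMetaDataCollection)
                {
                    foreach (IDTSCustomProperty100 prop in comp.CustomPropertyCollection)
                    {
                        if (prop.Name == "OpenRowset")
                            Console.WriteLine("{0}: {1}", comp.Name, prop.Value);
                    }
                }
            }
        }
    }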




SSIS Data flow OLE DB Destination error


I am getting the following error in the data flow task; it fails at the OLE DB destination. Please help me as soon as possible. Thanks in advance.


Error: 0xC0202009 at Dataflow task, OLE DB Destination: An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available.  Source: "Microsoft SQL Native Client"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC0209029 at Dataflow task, OLE DB Destination [9478]: The "input "OLE DB Destination Input" (9491)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (9491)" specifies failure on error. An error occurred on the specified object of the specified component.
Error: 0xC0047022 at Dataflow task, DTS.Pipeline: The ProcessInput method on component "OLE DB Destination" (9478) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Dataflow task, DTS.Pipelin

Data Flow Task Hangs - Oracle destination


I am transferring 2 million records from SQL Server to Oracle using a simple OLE DB Source and Destination in SSIS.

The package hangs in yellow showing 7,000 rows passed to the destination, and nothing happens for hours; the next day I came in and saw 70,000 records processed and still in yellow.

Today the same thing is happening. It works (2 million records processed in 40 seconds) when I change the destination to SQL Server, so I am guessing it is something to do with the Oracle destination. I am really new to the Oracle environment.

So what can I do? Is it something I have to change in the SSIS package, or in the destination itself? If you need more info about this, please ask. Thanks

Sybase IQ As Destination For Data Flow Task




We are running SQL Server 2008. When I try to write data to Sybase IQ as an OLE DB destination, I get the following error:



OnError,HQDT092,OFOTO\aagrawal,Package,{FB018B82-7870-48EA-8343-AC8D669147DA},{BC315AAB-FAF7-4713-94DC-C0A582A8D0FC},10/10/2008 10:43:51 AM,10/10/2008 10:43:51 AM,-1071636471,0x,SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.


Please help ...

Data Truncation issue with Enterprise Library Logging WriteLog stored Proc


Hi,

I'm using the Enterprise Library Logging feature for logging. The issue I am facing is that when the log message is too large (more than 65,534 characters), the complete data is not logged in the FormattedMessage column, which is of data type ntext.

I can insert the complete data if I try inserting it with a SQL INSERT query from SQL Server Management Studio. Do I need to add any attributes to the database listener, or do I need to change the stored procedure?

Is there any way to increase the WriteLog stored procedure parameter size in the EnterpriseLibrary.Logging config file? Please let me know.


Thanks in advance.
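This is not Enterprise Library's actual code, but the truncation mechanism usually looks like the ADO.NET sketch below: if whatever binds the formatted-message parameter gives it an explicit Size, longer values are silently cut at that length before they reach the stored procedure, with no SQL error raised; leaving Size unset (or using -1 on the MAX types) passes the full string through. The parameter and procedure names are illustrative:

    using System.Data;
    using System.Data.SqlClient;

    static class LogSketch
    {
        // Illustrative only: demonstrates how an explicit parameter Size truncates.
        public static void Write(SqlConnection conn, string message)
        {
            using (var cmd = new SqlCommand("dbo.WriteLog", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                var p = cmd.Parameters.Add("@formattedmessage", SqlDbType.NText);
                p.Value = message; // with p.Size = 65534 this would be cut at 65,534 chars
                cmd.ExecuteNonQuery();
            }
        }
    }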

Dynamic WPF: Create Flexible UIs With Flow Documents And Data Binding


Flow documents offer enormous flexibility in arranging text layout and pagination, but they don't support data binding, so you can't dynamically change content. Here we build a component to solve that problem.

Vincent Van Den Berghe

MSDN Magazine April 2009

How do I add two XML tasks to a Data Flow task?

I am new to SSIS and am developing a package that takes input from an XML file and then populates an existing SQL table with these records. However, this is a very large XML file (210 MB!), and I keep getting an OutOfMemory exception. So what I tried was separating this XML file into 8 smaller files. Now I can successfully load data from these smaller files one at a time. But how can I do all of them in the same package? I connected two XML tasks to the Data Flow task and get the errors below. I also increased my timeout and number of errors, to no avail. Advice?

Error: 0xC002F304 at XML Task 1, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Task failed: XML Task 1
Warning: 0x80019002 at zSkywardEnrollment: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Error: 0xC002F304 at XML Task, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Task failed: XML Task
SSIS package "
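Two thoughts, offered as sketches rather than a confirmed fix. First, the usual SSIS pattern for N similar files is one XML task (or data flow) inside a ForEach Loop with a ForEach File enumerator, so the package holds a single copy of the logic and memory can be reclaimed between files. Second, if even one file is too large, a streaming XmlReader (for example inside a Script task) avoids materializing the whole document the way the XML task does:

    using System;
    using System.Xml;

    class StreamLargeXml
    {
        // Sketch: stream a large XML file node by node instead of loading it
        // whole, which is what typically triggers the OutOfMemoryException.
        // The element name "Record" is illustrative; use the file's row element.
        static void Main(string[] args)
        {
            int records = 0;
            using (XmlReader reader = XmlReader.Create(args[0]))
            {
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element && reader.Name == "Record")
                        records++;
                }
            }
            Console.WriteLine("{0} records", records);
        }
    }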

XML Source (data is not loading to destination; package runs successfully)

Hi. First I run an Execute SQL Task:

SELECT TOP(10) [Message], opportunity_Id FROM SBP FOR XML PATH ('GS_Opportunity'), ROOT('GSOpportunities')

That combines 10 different files into a single file. I assign this output to a variable, and it is output as XML. Then I add a Data Flow task; in it I put an XML Source, choose "XML data from variable" as the source, and select the XSD file location. The XML Source shows me the different source tables, and I have selected the appropriate destination table. But when I run this package, it does not move any data from source to destination. Can you suggest what the problem is? I tried putting on a data viewer, and it shows no data being transferred; whereas if I point the source directly at a file location instead of the variable, it runs fine.

How to detect that a Data Flow Task has failed (from inside a custom component)

Hi, in the PostExecute method of a custom component I would like to check whether the Data Flow task has failed. Is that possible? Also, it would be great to be able to check whether this custom component has already received all its data and has been marked as "green". Thanks, Dennis
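There is no supported flag a component can read that says "the surrounding Data Flow task failed", but the standard way to know a component has received all of its input is to watch for EndOfRowset in ProcessInput; reaching PostExecute without having seen it usually means the pipeline stopped early. A sketch against the managed PipelineComponent base class:

    using Microsoft.SqlServer.Dts.Pipeline;

    // Sketch: track end-of-input yourself; sawAllRows still false in PostExecute
    // usually means the data flow ended early (for example, because it failed).
    public class EndAwareComponent : PipelineComponent
    {
        private bool sawAllRows;

        public override void ProcessInput(int inputID, PipelineBuffer buffer)
        {
            while (buffer.NextRow())
            {
                // ... consume the row ...
            }
            if (buffer.EndOfRowset)
                sawAllRows = true; // all upstream rows have arrived
        }

        public override void PostExecute()
        {
            base.PostExecute();
            if (!sawAllRows)
            {
                // The pipeline never delivered EndOfRowset to this component.
            }
        }
    }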

Custom SSIS Data Flow Component Not Showing in Toolbox or GAC

Hello - I have created a very simple data flow component for SSIS (actually, I am following this example: http://www.microsoft.com/downloads/details.aspx?familyid=1C2A7DD2-3EC3-4641-9407-A5A337BEA7D3&displaylang=en). However, when I register the DLL to the GAC, I am unable to find the assembly in C:\Windows\Assembly, even though GACUTIL says "Assembly Registered Successfully". Furthermore, after copying the DLL to the PipelineComponents folder for SSIS (C:\Program Files (x86)\Microsoft SQL Server\100\DTS\PipelineComponents), it does not show in the "Choose Items . . ." dialog box of SSIS. I am running SQL Server 2008 Dev edition, Visual Studio 2010 with .NET 4.0, and Windows 7 Enterprise 64-bit edition. Any assistance/thoughts would be appreciated. Thanks!
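One likely culprit, offered as an assumption: the .NET 4.0 gacutil that ships with Visual Studio 2010 installs assemblies into the CLR 4 GAC (C:\Windows\Microsoft.NET\assembly), while SSIS 2008 is a CLR 2.0 host that loads from C:\Windows\Assembly, so the component would need to target .NET 3.5 or lower and be registered with the 3.5-era gacutil. A quick way to check which runtime a DLL was built against:

    using System;
    using System.Reflection;

    class CheckClrTarget
    {
        // Prints the CLR version stamped into the assembly; an SSIS 2008
        // component should report "v2.0.50727", not "v4.0.30319".
        static void Main(string[] args)
        {
            Assembly asm = Assembly.ReflectionOnlyLoadFrom(args[0]);
            Console.WriteLine(asm.ImageRuntimeVersion);
        }
    }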

Data flow task with multiple destinations randomly hangs

I have a package whose control flow consists of a single data flow task with multiple destinations, and it continues to hang at random locations. The data flow task is fairly simple:

- One OLE DB source (one query selecting about 10 fields from one table)
- 9 lookups that serve basically as left joins, to capture which data "drops off" (a row redirect on lookup errors feeds an "error flow")
- Two destinations: one that captures all the data that doesn't drop off from the lookups, and another that captures the data that does; these write to two tables unrelated to each other, and unrelated to any of the tables I'm reading from

I have tried the following:

- Changed one or both destinations from OLE DB to flat file -- still randomly hangs
- Removed both destinations altogether -- this always succeeds
- Removed destination A and left destination B -- this always succeeds
- Removed destination B and left destination A -- this always succeeds

I took great care to make sure I get no warnings when the package runs, even resolving the unneeded-column warnings and the duplicate keys on all my lookups, but it still hangs. However, once I give up and stop the package, the following error is thrown (among others that appear to be a byproduct of the root issue):

Error: 0xC02020C4 at Data Flow Task, OLE_SRC AP12 [1]: The attempt to

Importing XML data with bcp issue

I am trying to transfer one table's data from one server to another. The table structure on both servers is identical:

CREATE TABLE [dbo].[_CachedQueriesArchive](
    [id] [int] NULL,
    [statement_text] [varchar](max) NULL,
    [execution_count] [bigint] NULL,
    [avg_logical_reads] [bigint] NULL,
    [last_logical_reads] [bigint] NULL,
    [min_logical_reads] [bigint] NULL,
    [max_logical_reads] [bigint] NULL,
    [plan_handle] [varbinary](64) NULL,
    [query_plan] [xml] NULL,
    [cursor_type] [varchar](max) NULL
)

I export the data with

bcp DBNAME.._CachedQueriesArchive out e:\temp\cq.dat -N -T

and import it with

bcp DBNAME.._CachedQueriesArchive in e:\temp\cq.dat -N -T

All records were imported successfully. The issue is that the column "query_plan" (of type xml) is filled in the source DB in all 22 records, but in the target DB it is filled in only 2 (two!) records. The other columns were imported perfectly.

Things I have tried:

- bcp with the -e option: the error file was empty.
- BULK INSERT [_CachedQueriesArchive] FROM 'e:\temp\cq.dat' WITH (DATAFILETYPE='widenative') for the import: same result.
- bcp with -w instead of -N: no success.
- Importing the file on the source server itself: everything was fine (all 22 records were imported with their "query_plan" data).

The problem occurs only on the target server. The servers have different versions: source 9.0.3257, target 9.0.4053.

Converting data from TXT files and facing a length issue! Need help!

Hi all, I am trying to convert some TXT files into a SQL Server database using SSIS. The problem I am facing is that some of the TXT files have fixed-length lines of 286 characters, while the others have 296 characters per line. The reason this happened is that someone added one more field before generating those TXTs. Instead of creating two SSIS packages for converting these TXTs, is there another way to get around this?
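One way around it, as a sketch: since the two layouts differ only by the one added field, the line length itself tells you which layout a file uses, so a single package can branch, for example via a Script task that sets a package variable, or a Script Component that picks column offsets per file. The detection is trivial:

    using System;
    using System.IO;

    class LayoutProbe
    {
        // Sketch: decide from the first line whether a file carries the extra
        // field, so one package can route 286- and 296-character layouts.
        static void Main(string[] args)
        {
            string firstLine;
            using (var reader = new StreamReader(args[0]))
                firstLine = reader.ReadLine();

            bool hasExtraField = firstLine != null && firstLine.Length >= 296;
            Console.WriteLine(hasExtraField ? "296-character layout" : "286-character layout");
        }
    }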