
nvarchar(max) & SCD Transformation

Posted Date: September 27, 2010    Points: 0    Category: Sql Server

Does anyone have a workaround for using a source field with data type nvarchar(max) with the SCD transformation? I tried casting the field as nvarchar(200), but I still receive the same error, shown below.

Error at DimOpportunity [Slowly Changing Dimension [6116]]: The input column "input column "description" (6912)" cannot be mapped to external column "external column "description" (6689)" because they have different data types. The Slowly Changing Dimension transform does not allow mapping between column of different types except for DT_STR and DT_WSTR.

Thank You
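One workaround that is sometimes suggested (a sketch, not a confirmed fix for this package) is to shorten the column in the source query itself, so the column enters the data flow as DT_WSTR(200) rather than DT_NTEXT, which the SCD transform cannot map. The table and key column names below are assumptions; only "description" and DimOpportunity come from the error message:

    -- hypothetical source query for the DimOpportunity load;
    -- CAST in the source (not only in a downstream Derived Column) so the
    -- SCD wizard never sees an nvarchar(max)/DT_NTEXT column
    SELECT opportunity_id,                              -- assumed key column
           CAST(description AS nvarchar(200)) AS description
    FROM dbo.Opportunity;                               -- assumed source table

The external (dimension) column that "description" maps to would also need to be an nvarchar of a comparable, non-max length for the SCD transform to accept the mapping.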


More Related Resource Links

WPF Geometry Transformation Tool

The geometry transformer is a simple tool I wrote to scale-, translate- and rotate-transform a geometry in the path mini language.

calculation, field and map traverse adjustment, and coordinate transformation

Free Pocket PC land surveying software -- COGO calculation, field and map traverse adjustment, and coordinate transformation -- for students and professionals.

Bug: Does not change column type from nvarchar to ntext


Scenario: If you create a table with a column of type nvarchar, populate it with some content, and try to change the column from nvarchar to ntext using the WebMatrix editor, it does not display any error message or warning that this is not possible. It silently appears to save but reverts the change back to nvarchar. I wasted some hours on this issue because I thought my column was ntext when in fact it was still nvarchar.

Expected behavior: It should change to ntext, or if SQL CE doesn't support that, it should notify the user that no change was applied.

I need help about nvarchar field



How can I specify an nvarchar value without using the N'' prefix?
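For context, a string literal written without the N prefix is a varchar literal and is implicitly converted when assigned to an nvarchar column, which can lose characters outside the database collation's code page. A small illustration with a hypothetical table:

    -- hypothetical table
    CREATE TABLE dbo.Demo (txt nvarchar(50));

    -- without N: the literal is varchar, so characters outside the
    -- collation's code page may be stored as '?'
    INSERT INTO dbo.Demo (txt) VALUES ('Ωμέγα');

    -- with N: the literal is nvarchar and round-trips correctly
    INSERT INTO dbo.Demo (txt) VALUES (N'Ωμέγα');

    SELECT txt FROM dbo.Demo;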

Extreme ASP.NET: Text Template Transformation Toolkit and ASP.NET MVC


The Visual Studio T4 code generation engine lets you parse an input file and transform it into an output file. We give you a basic introduction to T4 templates and show you how ASP.NET MVC uses this technology.

Scott Allen

MSDN Magazine January 2010

SSIS Lookup Transformation Issue

I am having a strange issue. In my data flow I have a Lookup transformation that matches on the key columns of the fact table and is followed by a Conditional Split that decides whether a row should be inserted (OLE DB Destination), updated (OLE DB Command), or ignored if nothing changed. I have packages that have been running with the same logic for the last year, but in the recent packages I am experiencing a problem.

Example: the key columns for the join are type_no (varchar(16), all numeric values except one record with ' ' (a space) in it) and type_cd (decimal(18,0), with values 0, 1, 2, 3, 4, 5). The package worked fine when I tested it. After a couple of days running on a schedule I get an integrity violation and a huge file of failed records that should have been blocked at the Conditional Split because they are already in the fact table. When I add a data viewer, I can see that all of the lookup columns are null (no match found).

The workaround that is working for me for now is to select full cache in the Lookup transformation and click OK, then open it again and set it back to no cache; after that it starts working as expected. Has anyone come across this kind of issue? Is there some standard I should follow to make sure this does not happen again?
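Not a fix for the caching behaviour described above, but one general precaution is to normalise the join keys on the reference side of the Lookup so that padding or an all-space value cannot cause silent mismatches. A hedged sketch of a reference query; the dimension table and surrogate key names are made up, and the same trimming would need to be applied to the incoming type_no column (for example in a Derived Column) so both sides agree:

    SELECT NULLIF(RTRIM(LTRIM(type_no)), '') AS type_no,  -- strip padding; treat the all-space value as NULL
           type_cd,
           dim_type_key                                   -- hypothetical surrogate key column
    FROM dbo.DimType;                                     -- hypothetical dimension table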

SSIS Changing Column Order during Transformation

First let me say, I really can't believe this chain of events myself, but it is happening to me. I am upgrading several DTS packages to SSIS on what will be my new production server. These packages create tables, export them to a flat file, and FTP them off to other locations.

What is happening on the SSIS side is that the OLE DB Source is reordering some of the columns on its own, moving them to the end of the table/file. Then, when my pickup/load routines run, the data is out of place and they fail.

Can anyone please explain what is happening here with the mapping? I have checked the table and the columns are in the order that I expect. When I preview the source table in the OLE DB Source Editor the columns are in the correct order, but when I view them in the Columns section of the OLE DB Source Editor within BIDS the order is changed arbitrarily. I have been somewhat successful (2 out of 3) in re-mapping the data, but this last table just doesn't want to change. Thanks in advance for any help and/or information you can provide.
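If the OLE DB Source is currently pointed at a table, one way to take column order out of the designer's hands is to switch it to a SQL command that lists the columns explicitly in the order the flat file expects. The table and column names below are placeholders for illustration only:

    -- placeholder names; list the real columns in the exact order the
    -- downstream flat file and pickup routines expect
    SELECT account_id,
           account_name,
           balance,
           export_date
    FROM dbo.ExportTable;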

Conversion Failed when Converting the nvarchar value 'N' to data type int.

Hi, I am facing a strange situation. When I execute the following query it returns the error shown below:

    SELECT s.row_id AS shipment,
           SUM(CAST(la5.attr_value AS INT)) AS Cases
    FROM shipment s WITH (NOLOCK)
    INNER JOIN shipping_shift ss WITH (NOLOCK) ON s.shipping_shift = ss.row_id
    INNER JOIN shift WITH (NOLOCK) ON ss.shift_id = shift.shift_id
    INNER JOIN shipment_lot sl WITH (NOLOCK) ON s.po_id = sl.po_id AND s.cust_id = sl.cust_id
           AND s.so_line_no = sl.so_line_no AND s.ship_date_local = sl.ship_date_local
    INNER JOIN lot_attr la5 WITH (NOLOCK) ON sl.lot_no = la5.lot_no AND sl.item_id = la5.item_id
           AND la5.attr_id = (SELECT attr_id FROM attr WITH (NOLOCK) WHERE attr_desc = 'CaseQty' AND attr_grp = 3)
    WHERE s.spare2 = 'RELEASED'
      AND shift.shift_desc = 'c'
      AND DAY(ss.production_date) = DAY('2010-08-09 00:00:00.000')
      AND MONTH(ss.production_date) = MONTH('2010-08-09 00:00:00.000')
      AND YEAR(ss.production_date) = YEAR('2010-08-09 00:00:00.000')
    GROUP BY s.row_id

    Msg 245, Level 16, State 1, Line 1
    Conversion failed when converting the nvarchar value 'N' to data type int.

The column type of attr_value in the lot_attr table is a user-defined type.
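Since the error suggests that at least one attr_value row contains a non-numeric value such as 'N', one workaround is to cast only the values that look like integers. This is only a sketch against a hypothetical table, not the poster's resolved fix; on SQL Server 2012 or later, TRY_CAST(attr_value AS int) would be a simpler alternative:

    -- minimal, self-contained illustration with a hypothetical table variable
    DECLARE @lot_attr TABLE (attr_value nvarchar(50));
    INSERT INTO @lot_attr (attr_value) VALUES ('12'), ('7'), ('N');

    SELECT SUM(CASE WHEN attr_value NOT LIKE '%[^0-9]%' AND LEN(attr_value) > 0
                    THEN CAST(attr_value AS int)
                    ELSE 0
               END) AS Cases
    FROM @lot_attr;
    -- returns 19; a plain SUM(CAST(attr_value AS int)) would fail on the 'N' row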

How to store binary data as nvarchar(max)?

I'm trying to store a blob of binary data as nvarchar(max) and it gets truncated after just a few bytes. I wonder why, and how I can prevent it. I don't use varbinary(max) because I'm trying to keep the existing DB schema intact. The current DB contains huge chunks of text in an nvarchar(max) column, and I'm trying to compress (zip) this text and store it back to the same field to reduce the DB size. The DB is accessed using Sync Framework (and Entity Framework) on the client side, so the actual DB engine is SQL Server Compact 3.5 SP2. Thank you! Konstantin

How to use ESB Transformation Service in .Net Code

Good day everyone. I am creating a .NET application which receives different XML files. Some of them need conversion from one format to another, and I have created BizTalk maps for them, but I don't know how to use the maps from .NET code. I found some information that there is an ESB Transformation service in BizTalk which can be called from .NET code. Can someone point me to sample code or an article that explains how to use these ESB Transformation services? Thanks in advance.

can i use nvarchar(300) for a column having records as website links?

Hi guys, a small question: is it good to use nvarchar(300) for a column whose records are website links? I was told the size may exceed 255 characters, so I'm just wondering, is it possible?
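For what it's worth, nvarchar columns in SQL Server are not limited to 255 characters (the non-max form can be declared up to nvarchar(4000)), so nvarchar(300) is perfectly valid for URLs. A minimal sketch with made-up table and column names:

    CREATE TABLE dbo.WebLinks
    (
        LinkId int IDENTITY(1,1) PRIMARY KEY,
        Url    nvarchar(300) NOT NULL   -- URLs longer than 255 characters fit fine
    );

    INSERT INTO dbo.WebLinks (Url)
    VALUES (N'https://www.example.com/some/very/long/path?with=query&string=parameters');

Some schemas use nvarchar(2083) instead, a length often chosen because older versions of Internet Explorer capped URLs at 2,083 characters.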

How to recover from Derived Column Transformation Editor corrupting table metadata?

Hi there. In attempting to use the Derived Column Transformation Editor to replace the value in a field with a trimmed version (SSN = trim(SSN)), it seems that my metadata has become corrupted by the Derived Column drop-down list. As a result:

#1: I no longer see my incoming SSN field in the "columns" tree, and any reference to it is deemed "invalid", even though its value does make it out of the data transformation process. How can I get back the reference to this field without redoing this entire task?

#2: In the process, the editor created new columns with "SSN" prefixed by the task name, such as "trim character fields.SSN". How can I delete these?

It seems that one slip of the mouse in this form can lead to irreversible corruption of the metadata, which the debugger then references and uses to invalidate subsequent work. I have tried everything I can think of to refresh this, including using the Advanced Editor and reloading the entire package. Any ideas? Thanks, Karl Kaiser


In the following code, should we change the parameters from SqlDbType = SqlDbType.Int to SqlDbType.NVarChar?

    protected void Page_Load(object sender, EventArgs e)
    {
        SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["sqlapplicationConnectionString"].ToString());
        try
        {
            SqlCommand cmd = conn.CreateCommand();
            cmd.CommandText = "INSERT INTO Application (" +
                " First, Last, City, State" +
                ") VALUES (" +
                " @First, @Last, @City, @State";
            cmd.Parameters.Add(new SqlParameter() { ParameterName = "@First", SqlDbType = SqlDbType.Int, Value = Convert.ToInt32(Request["First"]) });
            cmd.Parameters.Add(new SqlParameter() { ParameterName = "@Last", SqlDbType = SqlDbType.Int, Value = Convert.ToInt32(Request["Last"]) });
            cmd.Parameters.Add(new SqlParameter() { ParameterName = "@City", SqlDbType = SqlDbType.Int, Value = Convert.ToInt32(Request["City"])

Connect to an existing connection manager from within a custom transformation?

I'm developing a custom data flow transformation and have hit a roadblock. I need to create an ADO connection to one of the connection managers that is already defined in the package. I can either hardcode the name of the connection manager or define it as a custom property; the problem I'm having is accessing it at all. What I've found is that ComponentMetaData.RuntimeConnectionCollection[0].ConnectionManager only exposes the connection managers defined by the component itself, not those that already exist in the package. How do I create a connection to a connection manager that's defined at the package level at design time? Thanks

How do I suppress the creation of nchar(), nvarchar()

Some of the wizards and import tools that create tables use nchar() and nvarchar() for character fields, and the upgrade wizard from Access does the same thing. Is there a way to prevent the creation of nchar() and nvarchar() fields and to use char() and varchar() instead? Is there some T-SQL I can use to pass over the existing tables and convert the data types?
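I'm not aware of a setting that stops the import wizards from choosing Unicode types, but for the second part of the question a common approach is to generate ALTER TABLE statements from the catalog views. A hedged sketch: review the generated statements before running them, and note that it skips (max) columns and ignores constraints, indexes, and defaults on the affected columns:

    SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
         + ' ALTER COLUMN ' + QUOTENAME(COLUMN_NAME) + ' '
         + CASE DATA_TYPE WHEN 'nchar' THEN 'char' ELSE 'varchar' END
         + '(' + CAST(CHARACTER_MAXIMUM_LENGTH AS varchar(10)) + ')'
         + CASE WHEN IS_NULLABLE = 'NO' THEN ' NOT NULL' ELSE ' NULL' END AS AlterStatement
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE DATA_TYPE IN ('nchar', 'nvarchar')
      AND CHARACTER_MAXIMUM_LENGTH > 0;   -- nvarchar(max) reports -1 and is deliberately excluded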

Can a Synchronous and Asynchronous Transformation in one data flow

Hi guys, can synchronous and asynchronous transformations exist in one data flow? I mean, in my data flow I have 5 synchronous transformations and 1 asynchronous transformation at the end, all in one flow. The output is not the way it used to be because I'm introducing an asynchronous transformation at the end for 2 columns. Any suggestions? Thanks,

Making my own CMS - What if an article is more than NVARCHAR(MAX)

Hi, I'm looking to create my own CMS and just thought of something that could be a problem. What if I write an article that exceeds what NVARCHAR(MAX) can store? What would you do? Run a quick bit of code to separate the article into smaller chunks, or is there a better way? I know I could create some sort of key system where there is an "article" primary key and, in the same table, an "article part" integer, then run a query to stitch the parts back together when I want to display the article, but is that a good idea or am I overcomplicating it? Thanks
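For perspective, nvarchar(max) can hold up to 2^31-1 bytes, roughly one billion characters, so a single article is very unlikely to outgrow it. If splitting were ever needed, a sketch of the key scheme described above might look like this (all names are illustrative):

    CREATE TABLE dbo.ArticlePart
    (
        ArticleId int           NOT NULL,
        PartNo    int           NOT NULL,   -- order of the chunk within the article
        Body      nvarchar(max) NOT NULL,
        CONSTRAINT PK_ArticlePart PRIMARY KEY (ArticleId, PartNo)
    );

    -- reassemble one article in part order
    SELECT Body
    FROM dbo.ArticlePart
    WHERE ArticleId = 42
    ORDER BY PartNo;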