Hi. One of my measures is defined with the money data type in my star schema. A single value can often go out to four decimal places, but by design the sum of this measure across the entire fact table goes to at most two decimal places. When I take the sum across a subselect in which the measure is cast as decimal(20,8), SQL returns $1,123,219,801.94000000. But when I drag the same measure into the pivot area of my cube, I get $1,123,219,801.09.
Why? What are my options for making the two results consistent?
My metadata for this measure in the cube shows a display format of "currency" and a source data type of "double".
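For what it's worth, here is a minimal Python sketch (not my actual data; the values are made up) of the kind of drift I suspect: summing exact decimal values, the way SQL's decimal(20,8) cast does, versus summing the same values as binary double-precision floats, the way a "double" source column in the cube would.

```python
from decimal import Decimal

# Hypothetical fact rows: many money values with up to 4 decimal places.
values = ["0.0001", "1234.5678", "0.1", "0.2"] * 250_000

# Exact base-10 accumulation, like summing a decimal(20,8) column in SQL.
exact = sum(Decimal(v) for v in values)

# Binary floating-point accumulation, like a measure stored as "double".
# Values such as 0.1 and 0.0001 have no exact binary representation,
# so rounding error can accumulate over many rows.
approx = sum(float(v) for v in values)

print(exact)                  # exact decimal total
print(approx)                 # float total; low-order digits may drift
print(float(exact) - approx)  # accumulated representation error, if any
```

This only illustrates ordinary float rounding, which shows up in the last few digits; it does not by itself explain a discrepancy in the cents position of a ten-digit total, which is part of why I'm asking.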