.NET Tutorials, Forums, Interview Questions And Answers
Performance Problems with calculate member in a cube

Posted By:      Posted Date: August 31, 2010    Points: 0    Category: Sql Server
Hi,

I'm having a performance issue with a calculated member (a running total) created on one dimension of the cube. The cube has three dimensions: Product Details (Product Model, Product Type), Time (Year & Month), and Qty Type (In Qty & Out Qty). Details below:

DimTime: Year (values 2000 to 2020), Month
DimProd: Product No, Product Type, Product Group
DimQty: Type (In Qty, Out Qty)
Fact table: Product No, Year, Month, Qty, Type of Qty (In Qty or Out Qty)

My MDX for the calculated member is:

CREATE MEMBER [Qty Type].[Qty Type].[All].[Total] AS
    SUM(NULL:[TimeDim].[Hierarchy].CurrentMember,
        ([Qty Type].[Qty Type].[In Qty], Measures.[Qty])
      - ([Qty Type].[Qty Type].[Out Qty], Measures.[Qty]))

The formula is: Total = previous period total qty + In Qty - Out Qty.

This works perfectly when I browse at a higher level (Qty Type on rows, Time on columns), but when I add the Product Details dimension (Product Group or Product Type) to drill through the In Qty and Out Qty, the query keeps running and hasn't finished after an hour. Can anyone help me solve this or propose a new structure that avoids the issue?
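For other readers of this thread: an open-ended SUM(NULL:CurrentMember, ...) running total is re-evaluated over the whole time range for every cell, which hurts badly at fine grain. One commonly suggested rewrite (a sketch only, not tested against the poster's cube; the hierarchy and measure names are taken from the post above) is the recursive form, which reuses the previous period's value instead of re-summing from the start:

```mdx
CREATE MEMBER [Qty Type].[Qty Type].[All].[Total] AS
    -- Recursive running total: previous period's total + this period's net.
    IIF([TimeDim].[Hierarchy].CurrentMember.PrevMember IS NULL,
        -- First period: just the net movement.
        ([Qty Type].[Qty Type].[In Qty],  Measures.[Qty])
      - ([Qty Type].[Qty Type].[Out Qty], Measures.[Qty]),
        -- Later periods: previous total plus this period's net.
        ([Qty Type].[Qty Type].[All].[Total],
         [TimeDim].[Hierarchy].CurrentMember.PrevMember)
      + ([Qty Type].[Qty Type].[In Qty],  Measures.[Qty])
      - ([Qty Type].[Qty Type].[Out Qty], Measures.[Qty]))
```

Whether this is faster depends on how well the engine can cache the recursion, so it is worth benchmarking both forms at the Product Group level before committing to either.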


More Related Resource Links

Top .NET Performance Problems and how to avoid them

Every time I work with one of our .NET customers to help them manage their application performance, I come across the same problems seen with other clients before: lots of ADO.NET queries, many hidden exceptions in core or 3rd-party .NET libraries, slow 3rd-party components, and inefficient custom code.

create a calculated member in a cube in SSAS

Hello,

I'm trying to create a calculated member in a cube in SSAS and was hoping someone could help.

What I'm looking to do is count the number of occurrences of a value in a field. I have a Status field that contains New, Renewals and Enquiries, and another field that contains a count; the problem, and the reason for this query, is that the count is set to 0 for Enquiries.

The T-SQL equivalent would be something like:

Select Date, Status, count(Status)
From aTable
Where Status = 'ENQ'
Group By Date, Status

To summarise, the source data looks like this:

Date    Status  Count  Commission
Apr-10  NEW     1      100
May-10  NEW     1      100
May-10  REN     1      50
Apr-10  ENQ     0      0
May-10  ENQ     0      0
Jun-10  REN     1      50
May-10  NEW     1      100

And what I'm looking for, just ENQ by month:

Date    Count of ENQ
Apr-10  1
May-10  1
Jun-10  0

Any help gratefully received.
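One common approach, sketched here with hypothetical names since the post doesn't give the cube structure: add a plain row-count measure to the measure group (Count of rows, not the zeroed Count column), then reference it in a tuple with the ENQ status member so the zeroed measure is bypassed entirely:

```mdx
-- Assumes a row-count measure [Fact Row Count] has been added to the
-- measure group, and a [Status] attribute with an ENQ member exists;
-- both names are placeholders for the poster's actual objects.
CREATE MEMBER CURRENTCUBE.[Measures].[Enquiry Count] AS
    ([Measures].[Fact Row Count],
     [Status].[Status].&[ENQ]);
```

Sliced by the month attribute, this counts ENQ fact rows per month regardless of the Count column's value.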

Problems deploying Adventures Works 2008 Cube

Hi,

When I try to deploy the Adventure Works 2008 cube, nothing happens (the build is fine, but the green circle just spins with no progress indicators). I had no problems installing the Adventure Works databases. I have a newly installed SQL Server 2008 box, and currently all accounts/services are running under "Local System". I tried googling this issue but haven't found anything useful. Ideas, anyone?

Excel problems browsing the cube

Hi,

I have a cube and I'm having problems accessing the SSAS 2008 cube from Excel 2003. Initially I tried from one machine (Windows XP & Excel 2003) and it worked. When I try from another machine (Windows 7 & Excel 2003) I get the error below:

"Excel was unable to get necessary information about this cube. The cube might have been reorganized or changed on the server. Contact the OLAP cube administrator and, if necessary, set up a new data source to connect to the cube."

Please help me.

How to calculate SQL Server performance of a query based upon table schema, table size, and available hardware

Hi,

What is the best way to calculate (without actual access to a SQL Server) the processing speed of a query (a self inner join) based upon table schema, table size, and hardware (CPU, RAM, drives)?

Thanks, Jeff in Seattle
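Not quite "without a server", but close: if any SQL Server instance with the same schema and statistics is available (it needs no data and never executes the query), the optimizer's own cost estimate can be obtained via SHOWPLAN. The table and join columns below are hypothetical stand-ins:

```sql
-- Returns the estimated execution plan (with optimizer cost numbers)
-- instead of running the query. Each SET statement must be alone in
-- its batch, hence the GO separators.
SET SHOWPLAN_XML ON;
GO
SELECT a.id, b.id
FROM dbo.MyTable AS a                   -- hypothetical table
JOIN dbo.MyTable AS b                   -- self inner join
  ON a.parent_id = b.id;
GO
SET SHOWPLAN_XML OFF;
GO
```

The returned plan XML includes EstimatedTotalSubtreeCost, row estimates, and the chosen join strategy, which together give a hardware-independent baseline you can then scale by CPU, RAM, and drive characteristics.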

Cube design problems

Hi. I am working on an assignment aimed at building a cube that will be used by 14 or 15 different departments of the organization. After meeting with the users about their needs, we decided on 15 dimensions and one measure group for the prototype. The users in those departments differ, and the way they see this cube varies widely. Finance wants to slice and dice data and seems to understand BI, but some departments want to see detailed data below a certain level, which is a huge challenge: they want the cube to deliver detailed information about high-cardinality members with very little processing time. We thought of designing actions, but a user can follow multiple navigation paths to reach the detail data. We thought of putting the detail in a dimension table, but it killed the cube: users constantly hit out-of-memory issues. It seems to me we lack proper design knowledge. I would appreciate your help in understanding this real-world challenge, which I'm sure someone must have come across.

Improving performance of UPDATE CUBE by executing preceding CREATE CACHE for the same cell / subcube

Hi all,

Recently I came across a suggestion from a Microsoft Advisory consultant regarding optimizing the data write-back pattern for SSAS cubes. The recommendation was to issue a CREATE CACHE command followed by an UPDATE CUBE for the same cell/subcube definition:

CREATE CACHE FOR [MYCUBE] AS
    ("Cross join of all the dimension members defining the subcube to be updated")

UPDATE CUBE [MYCUBE]
    SET ("Cell") = 100, ("Another Cell") = 200, ... etc. for every cell

This seems quite odd, but based on our benchmark the entire process (CREATE CACHE plus UPDATE CUBE) takes less time than UPDATE CUBE alone. Can someone advise on the internals of the process? Should this approach be recommended for an enterprise-level planning application?

Regards,
-h

Named Set And Calculated Member within Analysis Services Cube project


Hi there.

I have the MDX below, which works fine in SQL Server Management Studio.

I am battling to include it in my Analysis Services project.

Following the logic in the MDX query, I have created the 3 named sets and one calculated member (you will see them below), but with no luck.

Please, any guidance will do; I desperately need to include it in my cube.



//Get weeks with sales
//(the set expression in the original post was garbled; a FILTER appears to be intended)
[WeeksWithSales] as 'FILTER([Product].[Product Hierarchy].[SKU DESCR].MEMBERS,
    [Measures].[SLS R] > 0)'

//The first week
[FirstWeek] as 'HEAD([WeeksWithSales], 1)'

//The last week (definition was missing from the post; TAIL would mirror HEAD)
[LastWeek] as 'TAIL([WeeksWithSales], 1)'

[NumWeeks] as 'COUNT([FirstWeek].Item(0).Item(1) : [LastWeek].Item(0))'

SELECT [NumWeeks] ON 0
FROM (SELECT [Product].[Product Hierarchy].[SKU DESCR].&[BALL BRIDGESTONE 08 E7 +]&[1041597] ON 0
      FROM (SELECT [FinancialDate].[Financial Calendar].[FINANCIAL MONTH].&[FY2010_MONTH06].Parent.Parent ON 0
            FROM [TPSMerch]))

SSAS - performance penalty: standard dimension without custom member formulas vs. standard dimension with them


Hi all

I was wondering if there is a big performance penalty if one introduces custom member formulas to a standard dimension.


Adding more tables to the data source view causes writeback performance problems


I've got very strange behavior in one of our SSAS databases. We had performance issues, so I tried to put only the essential parts into one OLAP database.

The result: just by adding more tables to the data source view, the writeback performance drops from 0.2 sec to 3 sec. No changes to the cube, no changes to the dimensions. The T-SQL statement fired for the writeback remains the same.

Any ideas?



Web Synchronization Performance Problems


Hi all,

We have web synchronization for a merge replication system (67 tables, 55 of which should be download-only; many use dynamic filters and 10 have join filters) between SQL Server 2008 R2 (as publisher) and SQL Server Express 2008 R2 (800 pull subscriptions using only web synchronization, over 3G connections). I've googled all day for performance tips and tricks and have set our profiles, settings, etc.

My problem is with the initial synchronization process. First of all, after the replication triggers are created (the other snapshot scripts are applied before this) during the initial web sync, we wait 5-15 minutes after the last "... creating replication triggers on table 'tblOrder'" message. I think that after creating the replication triggers, the Merge Agent connects to the web sync Replisapi.dll and computes the data needed, but should this really take 5-15 minutes?

And after these operations, once all necessary data is downloaded, I believe the Merge Agent starts downloading the snapshot. Why is this necessary? Before the initial synchronization I create all the tables manually, and after that I create the merge subscription manually. Then, when the merge process starts, the snapshot is applied to the subscription database and the necessary data is downloaded, so why is the whole data set downloaded again after that transfer? It takes too long.

Tackling performance problems with processor maxing out


I am tackling some performance issues on my server. The situation: when the application runs, a C# application loops and calls a stored procedure about 2,000 times, and during this period the CPU is pegged at 100% the whole time.

All the queries are SELECT queries pulling data from the database to feed into the application. What I saw before was that the stored procedure denormalized data, put it into a temp table, then selected from the temp table out to the application. I thought at the time this wasn't efficient, so the first task was to remove the use of temp tables and query the tables directly to avoid write costs.

I then noticed that when I call the stored procedure from Management Studio once, i.e. on 1 thread, performance seems 60-70% better; but the application calls it with multiple threads and therefore multiple concurrent calls to the database, sometimes 6-10 at once, and then the proc that normally runs within a second takes about a minute. CPU is still pegged at 100%. Checking waits, I see a lot of CXPACKET and ASYNC_NETWORK_IO waits. I don't think I should worry too much about CXPACKET, because the app returns between 3k and 30k rows per call, and parallelism might not be a bad thing with constant reads (please correct me if I'm wrong here).
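One mitigation commonly tried for CXPACKET waits under heavy concurrency (a sketch, not a recommendation for this specific workload) is to cap the degree of parallelism for the hot query, so that 6-10 concurrent calls don't each try to grab every scheduler. The object names below are placeholders for the poster's actual procedure:

```sql
-- Cap parallelism for this statement only; @Key, the view, and the
-- columns are hypothetical stand-ins for the real stored procedure body.
SELECT d.Col1, d.Col2
FROM dbo.SomeDenormalizedView AS d
WHERE d.[Key] = @Key
OPTION (MAXDOP 4);
-- Alternatively, tune server-wide via:
--   sp_configure 'max degree of parallelism'
```

The trade-off: a lower MAXDOP can slow a single isolated call while improving total throughput under concurrency, so it's worth measuring both the 1-thread and 10-thread cases.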

Cube Browsing in Performance Point 2010 as in SSAS and BIDS cube browser


Hi All,

Can we get the cube browsing experience of the SSAS and BIDS cube browsers in SharePoint 2010 using PerformancePoint or Excel Services?

I am most interested in drag-and-drop features for dimensions and measures.

More or less the same browsing features were available in the BIP viewer web part, but it is no longer available.

So, is there any way we can drag and drop dimension members and measures in PerformancePoint dashboards?




SSRS 2008 R2 and Sharepoint 2010 - performance problems


Hi all,

I'm currently on a DWH project using SSRS 2008 R2 x64 in SharePoint integrated mode along with SharePoint 2010, and we're seeing massive performance problems even though traffic on the server is at a bare minimum. We're in development, with at most a few users online at any given time, and the server itself isn't doing much: CPU is usually < 20%. We have timed the database queries, and 90% of them take less than a second to execute, while reports take anywhere from 5 to 30 seconds to load (some heavier reports can occasionally take a lot longer). I am simply unable to get anything to load in less than 5 seconds; even the simplest grid from a SELECT TOP 5 * FROM table query takes this long.

The server setup is suboptimal, as SQL Server, Reporting Services, and SharePoint are all running on the same virtual machine, but with the minimal amount of traffic we have this shouldn't really be a problem. The server is Windows 2008 x64 Datacenter with 4 CPUs and 16 GB RAM.

A standard SharePoint page that gets data from the database is returned in milliseconds; why doesn't Reporting Services do the same? What can I do to increase performance?


Max "workflows in progress" - performance problems?



A site in our farm has a workflow that is set to pause until a certain date (based on some calculations). Currently there are about 1,350 workflow instances in progress ("paused").

Our farm is also experiencing some performance issues: both web front ends are operating at 90-100% CPU. Do you think the number of workflow instances running could cause this kind of performance problem?

Thanks for your time.

Edit: I should add that the workflow was created with SharePoint Designer.

Is there a limit on the number of calculated members in a cube?



I am developing a new application in which users can create new calculated members, so can someone tell me how many calculated members a cube can contain?

Many thanks.
