I would like to know whether there are any specific space optimization techniques available. I have read a white paper that advises against going beyond 2 million rows or 2 GB per partition.

I have a cube with 19 dimensions, 5 measures, and 13 calculated columns.

The fact table receives 12 million rows per day on average, and the database table size averages 650 MB per day.

Building the cube for one day produces 1.5 GB. I need at least 40 days of data in the cube.
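A back-of-envelope calculation from the numbers above (the 2-million-row-per-partition limit is the guideline quoted from the white paper, not a hard rule) shows the scale involved:

```python
# Illustrative sizing sketch using the figures from this post.
ROWS_PER_DAY = 12_000_000           # average fact rows per day
CUBE_GB_PER_DAY = 1.5               # processed cube size per day
DAYS = 40                           # required retention window
MAX_ROWS_PER_PARTITION = 2_000_000  # guideline cited from the white paper

# Ceiling division: partitions needed per day to stay under the row limit.
partitions_per_day = -(-ROWS_PER_DAY // MAX_ROWS_PER_PARTITION)
total_partitions = partitions_per_day * DAYS
total_cube_gb = CUBE_GB_PER_DAY * DAYS

print(partitions_per_day)  # 6 partitions per day
print(total_partitions)    # 240 partitions over 40 days
print(total_cube_gb)       # 60.0 GB of cube storage
```

So honoring the guideline would mean roughly 6 partitions per day, 240 partitions total, and about 60 GB of cube storage for the 40-day window.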

I would appreciate any pointers in this regard.