.NET Tutorials, Forums, Interview Questions And Answers

Data missing when processing a partition

Posted By:      Posted Date: September 07, 2010    Points: 0   Category :Sql Server
Hi, I have created two partitions in an SSAS cube: one contains data from >= getdate()-10 to < getdate()-2, while the other contains >= getdate()-2 to < getdate() [[getdate() converted to its date part only]]. I created an SSIS package in which I used an "Analysis Services Processing Task" and processed the partition that contains >= getdate()-2 to < getdate() in Full Process mode, but I got the error I explain below. Suppose on 20 Aug I have data from 11 Aug to 18 Aug; when I run the SSIS package, it runs fine and adds the 19 August data, but it also eliminates the data for 17 August. Kindly let me know if I have made a mistake somewhere, or please suggest a fix. Regards, Shraddha
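The symptom can be reproduced by computing the two partition windows for consecutive run dates. Because both boundaries are derived from getdate(), the windows slide forward every day, so fully processing only the "recent" partition drops the day that has just slid into the historical window unless that partition is reprocessed too. A minimal sketch (dates only; the window arithmetic mirrors the ranges described above):

```python
from datetime import date, timedelta

def partition_windows(today):
    """Return the [start, end) date windows the two partitions cover
    when their getdate()-based WHERE clauses are evaluated on `today`."""
    hist = (today - timedelta(days=10), today - timedelta(days=2))  # >= d-10, < d-2
    recent = (today - timedelta(days=2), today)                     # >= d-2,  < d
    return hist, recent

hist_19, recent_19 = partition_windows(date(2010, 8, 19))
hist_20, recent_20 = partition_windows(date(2010, 8, 20))

# On 19 Aug the recent partition covered 17-18 Aug; on 20 Aug it covers
# 18-19 Aug. 17 Aug has slid into the historical window, so fully
# processing only the recent partition on 20 Aug drops 17 Aug unless the
# historical partition is processed as well.
print(recent_19, recent_20)
```

This matches the reported behavior: 19 August appears, 17 August disappears, because each day's run redefines both partitions' boundaries.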


More Related Resource Links

Is there any way to process only one partition of a cube while the data of the other partition remains available?

Hi all. Please let me know whether there is any way to process only one partition, with the help of SSIS, while the data of the other partition remains available in the cube.

processing a particular partition dynamically using the AS Processing Task and Property Expressions

Hi, does anyone have an example/screenshots/article of processing a particular partition dynamically using the AS Processing Task and property expressions? I have 4 reports and 12 partitions. Report 1 uses partitions 1-4; report 2 uses partitions 5-8. Is there any way to process the partitions dynamically, so that for report 1 it processes partitions 1-4, and for report 4 it processes partitions 9-12? Thanks


Requested Conversion is not supported, error while processing the partition

I have read the other topics related to this issue, and I thought my issue was related as well until I checked the datatypes of all my measures and made sure that they could indeed be inherited properly. I even went as far as deleting all measures from the measure group, and I still received this error. Is there a way to get more detailed information on what is not being converted properly? The error is vague; I have thoroughly checked my datatypes and they are fine, so why would the partition not process, throwing the exact same error even after I took all measures out of the measure group? Here is the second strange thing: I know it has to be data related, because I also ran Process Structure and that finished successfully. Please let me know if there is a better way to find the problem child (or children) in my measure group when processing data. A last thought: could it have something to do with the data size? I know SSAS automatically converts the data size, and I see several of my measures with a size of zero. I am speculating that could be wrong, but is there a way to check the data size by running a script on my fact table? Thanks, Network Analyst

Finding table data partition bounds key values using SQL

I have a table that has a few million rows, and I need the front-end application to partition all the data for this table (or any other large table) using SQL WHERE clauses, so I can retrieve all the data for the table in parallel, with a separate thread for each "partition" (I can't, and don't need to, know whether the table data is actually partitioned on any specific column at the server). Assuming each "partition" can hold n rows (chosen dynamically), is there an easy way to find the bounds of such partitions in terms of the table's PK/unique index?

Example:

create table table_test (col1 char(8) primary key);

Example table data, col1 (PK): a, b, c, d, e, f, g

The bounds for table_test with two rows per partition would give four partitions, with the following col1 bound values: c, e, g. The front-end app would then use these bounds to generate SQL such as:

select * from test_table where col1 < 'c';                  -- first partition
select * from test_table where col1 >= 'c' and col1 < 'e';  -- second partition
select * from test_table where col1 >= 'e' and col1 < 'g';  -- third partition
select * from test_table where col1 >= 'g';                 -- fourth partition

So the question is whether the bound values 'c', 'e', 'g' can easily be determined by the client front end using SQL.

Farid Zidan
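One way to answer the question is with a window function: ROW_NUMBER() over the key column lets the client select every n-th key as a partition bound in a single query. A sketch using SQLite from Python purely for illustration (the same ROW_NUMBER() query shape should work on SQL Server 2005 and later):

```python
import sqlite3

# In-memory stand-in for the table described in the post.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_test (col1 TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO table_test VALUES (?)",
                 [(c,) for c in "abcdefg"])

n = 2  # desired rows per partition
# Number the rows by key order; every n-th key after the first
# partition becomes a lower bound for the next WHERE-clause range.
bounds = [row[0] for row in conn.execute("""
    SELECT col1 FROM (
        SELECT col1, ROW_NUMBER() OVER (ORDER BY col1) AS rn
        FROM table_test
    ) WHERE (rn - 1) % ? = 0 AND rn > 1
""", (n,))]
print(bounds)  # ['c', 'e', 'g']
```

The returned bound values are exactly the 'c', 'e', 'g' keys from the example, ready to be spliced into the four range queries shown above.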

master page data processing

Hi... I have added my code below. The problem is that my database insert works fine on a normal ASP.NET page, but when the page is a content page of a master page, the insert into the database fails. One expert suggested I take advantage of SQL Server Profiler to track this down, but I don't know anything about SQL Profiler. I only know about stored procedures; what is SQL Server Profiler?

*************************************
My Code Behind
*************************************

public string GetConnectionString()
{
    return System.Configuration.ConfigurationManager.ConnectionStrings["myDbConnectionString1"].ConnectionString;
}

private void execution(string name, string username, string password, string emailid)
{
    SqlConnection conn = new SqlConnection(GetConnectionString());
    string sql = "INSERT INTO myTb (name, username, password, emailid) VALUES " +
                 " (@name, @username, @password, @emailid)";
    try
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand(sql, conn);
        SqlParameter[] pram = new SqlParameter[4];
        pram[0] = new SqlParameter("@name", SqlDbType.VarChar, 50);
        pram[1] = new SqlParameter("@username", SqlDbType.VarChar, 50);
        pram[2] = new SqlParameter("@password", SqlDbType.VarChar, 50);
        pram[3] = new SqlParameter("@emailid", SqlDbType.VarChar, 50);
        pram[0].Value = name;
        pram[1].Value = username;
        pram[2].Value = password;
        pram[3].Value = emailid;
        cmd.Parameters.AddRange(pram);
        cmd.ExecuteNonQuery();
    }
    finally
    {
        conn.Close();
    }
}

copy partition data from 2000 cube to 2005 cube



I am upgrading a 2000 OLAP cube to 2005. I built an SSAS project and deployed it to the server.

The 2000 OLAP cube stores its data in partitions, one for each month.

The fact table deletes its data and keeps only the last 13 months, so most of the data is stored only in the 2000 cube.

While upgrading and processing the cube, a lot of information was lost.

I know that I can't restore the fact data from the partition data, but is there a way to copy the partitions as-is from OLAP 2000 to 2005?


Problem with displaying and processing dynamic data


I need to dynamically generate a list of images in table format. The images can be found in either art or picts, in temp, or in temp and in one of the other two (a duplicate image).

The idea is to display any duplicates prior to copying anything over from temp to either art or picts, so I'm thinking of adding radio buttons when the same image is found in two folders. I also want to add a submit button. The user selects the image they want and clicks the button. If they want the new file, it is copied over from temp and the temp copy is deleted. If they want to keep the existing file, the new one is just deleted. If they select nothing, both versions of the file are deleted. So the table will look something like this:

temp         art          picts        process
z-123                                  delete button
() z-234                  () z-234
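The duplicate-detection step described above can be sketched by intersecting the temp folder's file names with each destination folder's. A minimal sketch (the folder layout built below is hypothetical, mirroring the example table):

```python
import os
import tempfile

def find_duplicates(temp_dir, art_dir, picts_dir):
    """Return {filename: other_folder} for images that exist both in
    temp and in art or picts -- the rows that need radio buttons."""
    temp_files = set(os.listdir(temp_dir))
    dups = {}
    for label, folder in (("art", art_dir), ("picts", picts_dir)):
        dups.update({name: label
                     for name in set(os.listdir(folder)) & temp_files})
    return dups

# Hypothetical layout: z-123 only in temp, z-234 in both temp and picts.
root = tempfile.mkdtemp()
for sub, names in {"temp": ["z-123", "z-234"],
                   "art": [],
                   "picts": ["z-234"]}.items():
    os.makedirs(os.path.join(root, sub))
    for name in names:
        open(os.path.join(root, sub, name), "w").close()

dups = find_duplicates(os.path.join(root, "temp"),
                       os.path.join(root, "art"),
                       os.path.join(root, "picts"))
print(dups)  # {'z-234': 'picts'}
```

Files found only in temp (like z-123) fall through to the plain delete-button row; the returned dictionary drives the radio-button rows.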

Optimizing Data Mining processing time for a Time Series model in SQL Server 2008 R2


Does anyone have any references on optimizing the initial processing time for a data mining model? Books, blogs, etc. I have the Wiley "Data Mining with SQL Server 2008" book, and while I've learned a lot from it, it doesn't seem to cover much about troubleshooting things like processing time. I also have a few other books with a chapter or two on data mining, but again just the basic "here's what it is and how to set it up"; nothing that really covers troubleshooting or optimization. I've also checked out the various data mining blogs/sites.


I've got a Microsoft Time Series model based on a cube. Very simple: I'm trying to forecast sales. I have one dimension that is the list of products (about 1,600), a second that is the time dimension, and finally the measure is daily sales figures for each, about 3.2 million rows in total. On a brand-new server with two quad-core processors and 16 GB of RAM, it took 40 hours to process. That seems rather high?

SqlClient access in Data Processing Extension


Hi there,

I think it's a common situation when writing a data processing extension; I'm wondering whether something is wrong with my server.

I deployed the DLL to "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin" and updated rsreportserver.config and rssrvpolicy.config as:

<Extension Name="FSI"

Best approach for processing data from a website for use in a web application


I'm a part-time programming enthusiast and have developed several web applications on earlier versions of ASP. I'm preparing to get back into it and have been searching for ways to accomplish my goals. One goal is to acquire data from a website feed and break it down so I can use specific pieces of data in my application. Specifically, I want to grab scores from NFL games so that I can use them in my application. What I believe I need to do is write ASP.NET MVC code to capture an RSS feed, then parse the feed (I'm guessing using C# string methods to break the feed down into variables that I can cast to integers and then use in my application). What I am uncertain of is whether this is the best approach. Is what I am attempting best done with ASP.NET, XML, and C#, or should I consider another approach? Any pointers or suggestions for good examples/tutorials/books would be greatly appreciated.
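Since an RSS feed is XML, an XML parser is usually a better fit than raw string manipulation: pull each item title out of the feed, then split only the title text into team/score fields. A sketch in Python with a made-up feed snippet (a real scores feed will have a different structure, so the "Team1 score1 Team2 score2" title format here is an assumption):

```python
import xml.etree.ElementTree as ET

# Hypothetical feed snippet for illustration only.
FEED = """<rss version="2.0"><channel>
  <item><title>Bears 24 Packers 17</title></item>
  <item><title>Giants 10 Eagles 31</title></item>
</channel></rss>"""

def parse_scores(feed_xml):
    """Parse 'Team1 score1 Team2 score2' item titles into tuples,
    casting the score fields to integers."""
    games = []
    for title in ET.fromstring(feed_xml).iterfind(".//item/title"):
        parts = title.text.split()
        games.append((parts[0], int(parts[1]), parts[2], int(parts[3])))
    return games

print(parse_scores(FEED))
```

The same two-step shape carries over to C#: load the feed with an XML API (XDocument/SyndicationFeed), then do the small string split on each title.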

lengthy processing (getting data in to 3 different temp files)



I need to do lengthy processing (getting data into 3 different temp files) for a final report.

I have the SQL script ready, and it works in SQL Studio; however, the same query run from ASP gets a timeout error.

I am looking for ideas on how to handle this and make it work. Where can I place this processing?



data shaping question - can this much conditional processing be done within the linq query?



I have a scenario where I need to map the value of a field to a different value; it's easier explained with a simple example:

when the actual field data is "RED" I need to return "some other data"
when the actual field data is "BLUE" I need to return "something else"

So I've written LINQ to SQL statements that project and shape, but I've never had to include any 'conditional' processing. Can you?
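Yes: LINQ to SQL lets you put a conditional operator (a ternary, or nested ternaries for more cases) directly inside the select projection, and it is translated to a SQL CASE expression. The mapping logic itself is just a lookup table applied during projection, sketched here in Python purely to illustrate the shape (all field and value names hypothetical):

```python
# Translation table for the projected field; unmapped values pass through.
COLOR_MAP = {"RED": "some other data", "BLUE": "something else"}

def project(rows):
    """Shape rows, mapping the color field to its display value --
    the analogue of a conditional inside a LINQ select projection."""
    return [{"id": r["id"],
             "display": COLOR_MAP.get(r["color"], r["color"])}
            for r in rows]

rows = [{"id": 1, "color": "RED"}, {"id": 2, "color": "GREEN"}]
print(project(rows))
```

In C# the equivalent inside the query would be a conditional in the projection, e.g. `select new { display = r.Color == "RED" ? "some other data" : ... }`.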

Custom Data Processing Extension doesn't show but should be registered correctly.



I have a custom data processing extension that I can't seem to get to work. I followed the steps in http://msdn.microsoft.com/en-us/library/ms155086.aspx to register the extension, yet it does not show up in the data source list.

It's an extension made for SSRS 2008, and I want to use it on an SSRS 2008 R2 instance. I have no reason to believe I did not configure the extension correctly; the configuration is the same as in the article and the same as on my 2008 instance.

I am using Reporting Services Standard Edition 2008 R2 with cumulative update 4, so extensibility should be supported.

The data sources that I migrated from 2008 that still refer to the extension give a message saying that the extension is not registered or configured correctly. But I swear I did it right.

Every other extension that I use registered fine; none of them are DPEs, though.


My config:


  <Extension Name="SecurityDataProce

Partition processing using AMO



I'm trying to process a partition using AMO. I have a table containing only the new data, so I created a table binding for it and tried to call the overload of the Partition class's Process method that takes the process type and the binding, but it threw a NotImplementedException. Does anybody know what the problem might be? Also, I want to specify the writebackOption parameter when I call the function, but I don't see an overload that takes both the binding and the writebackOption. Is there another way to specify both for the processing operation, maybe by using properties?


Binding WPF Controls to an Entity Data Model

In this interview, programming writer, McLean Schofield, demonstrates how to bind WPF controls to an entity data model, using Visual Studio 2010 Beta 1. You can also learn more in the topic: Walkthrough: Binding WPF Controls to an Entity Data Model.

Surrogate vs Natural Primary Keys - Data Modeling Mistake 2 of 10

In case you're new to the series I've compiled a list of ten data modeling mistakes that I see over and over that I'm tackling one by one. I'll be speaking about these topics at the upcoming IASA conference in October, so I'm hoping to generate some discussion to at least confirm I have well founded arguments.

The last post in this series Referential Integrity was probably less controversial than this one. After all, who can argue against enforcing referential integrity? But as obvious as surrogate keys may be to some, there is a good deal of diversity of opinion as evidenced by the fact that people continue to not use them.