The query below takes about 30 seconds to execute in Management Studio. I looked at the estimated execution plan, and everything is a clustered index scan. The fact_dailycost table has 384,962 rows, and all the dimension tables
have fewer than 1,000 rows.
Please let me know how I should start tuning this query. I'm creating a view from it, and in the SSRS report I have a Contract Number parameter that is applied on top of this view. It takes the report approximately 2½ minutes to display the results.
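For what it's worth, the first thing I was planning to try is an index that lets the Contract Number filter seek rather than scan; the column names below are placeholders, since I haven't posted the real schema:

-- Placeholder columns: key the index on the column the report parameter
-- filters on, and cover the measures the view returns, so the optimizer
-- can seek instead of scanning the clustered index of the fact table.
CREATE NONCLUSTERED INDEX IX_fact_dailycost_ContractNumber
ON dbo.fact_dailycost (ContractNumberKey)
INCLUDE (DateKey, CostAmount);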
WCF Service call routing to COM+ application.
netTCPBinding, throttling set pretty high.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single, ConcurrencyMode=ConcurrencyMode.Multiple)]
The service has two methods: one simply returns the string that was input; the other calls the COM+ component and, FOR TESTING, returns the string "success" for every call. So both methods return a simple string.
The client starts 20 threads, each calling the WCF service twice. (NOTE: a "first call" is made beforehand to prime the singleton service and the COM+ component.)
I'm using log4net and a Stopwatch to log the timings on the client.
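For reference, the client-side timing harness is essentially the following; MyServiceClient, Echo, and CallComPlus are placeholder names for the generated proxy and the two service methods:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;
using log4net;

class TimingClient
{
    static readonly ILog log = LogManager.GetLogger(typeof(TimingClient));

    static void Main()
    {
        var threads = new List<Thread>();
        for (int i = 0; i < 20; i++)
        {
            threads.Add(new Thread(() =>
            {
                // MyServiceClient: generated proxy for the netTCPBinding endpoint
                var proxy = new MyServiceClient();
                var sw = Stopwatch.StartNew();
                proxy.Echo("ping");          // method 1: echoes the input string
                proxy.CallComPlus("ping");   // method 2: COM+ call, returns "success"
                sw.Stop();
                log.InfoFormat("Two calls took {0} ms", sw.ElapsedMilliseconds);
                proxy.Close();
            }));
        }
        threads.ForEach(t => t.Start());
        threads.ForEach(t => t.Join());
    }
}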
I am exporting a 12,000-row, 20-column report to Excel from SSRS 2008. Once opened in Excel, everything is fine. But if I then filter the data within Excel (no external data connections), performance degrades. If I filter down to, say, 20
rows, it nearly bogs down my entire PC. I've emailed this spreadsheet to several other people, who then experience the same issue.
If I export the same set of data from a different BI tool, such as Microstrategy, there are no performance issues whatsoever.
Hi, I am changing the view of my PerformancePoint Services report and want to save it, as was possible in ProClarity. I have been told that all ProClarity features are present in SharePoint PerformancePoint Services.
I was able to find the decomposition chart but not the option to save the modified view. Has this been removed? If so, can I use the object model and write code for it?
I developed a report in BIDS that uses a stored procedure on a SQL Server 2008 database to populate the dataset used as its data source. The report runs very slowly (10 minutes), yet if the stored procedure is executed separately in SQL Server Management Studio
(SSMS) with the same parameters, it takes only a few seconds. I'd appreciate any help with this issue.
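From searching around, I gather this symptom (fast in SSMS, slow from the report) is often attributed to parameter sniffing, since SSRS runs the procedure as a parametrized call. One suggested workaround I'm considering looks like this; the procedure, table, and column names are placeholders, not my real ones:

-- Placeholder names; the point is the statement-level recompile hint,
-- which makes the optimizer build a plan for the actual parameter values
-- on each execution instead of reusing a plan sniffed from older values.
ALTER PROCEDURE dbo.usp_ReportData
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    SELECT OrderID, Amount, OrderDate
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @StartDate AND @EndDate
    OPTION (RECOMPILE);
END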
We recently upgraded a report server and its reports from SQL Server 2000 (reports 2003) to SQL Server 2005. After the upgrade, one of our reports, which uses two subreports, is very, very slow.
The stored procedures used by all of these reports are fast; together they take about 40 seconds.
The report is very slow both in the BIDS environment and on the report site.
We are using a database into which data is inserted through replication. In all our stored procedures we have added WITH (NOLOCK) along with the table names.
Can anyone please help me here?
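For context, the hint pattern in our procedures looks like this; the table names here are made up, but the real queries have the same shape:

-- Every table reference carries a NOLOCK hint because the data arrives
-- via replication and we don't want readers blocked by the inserts.
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders o WITH (NOLOCK)
JOIN dbo.Customers c WITH (NOLOCK)
    ON c.CustomerID = o.CustomerID;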
We have a slow-performing query and, after some analysis, have narrowed the cause down to using parametrized queries in conjunction with LIKE and aggregate comparisons in the WHERE clause. If we use a non-parametrized query (but keep the WHERE
clause the same) the query performs much faster and, in fact, executes a different plan. Using OPTIMIZE FOR UNKNOWN is not an option for us, as we are on SQL Server 2005. Obviously, we'd like to use parametrized queries to prevent SQL injection,
but the slow performance is unacceptable. We need the LIKE to support wildcard scenarios. If we have to, we'll use dynamic SQL (scrubbed as much as we can) in lieu of parametrized queries. We're wondering, however: is there another option that would
give us the performance without sacrificing safety? I can provide a sample database, query, plans, etc. if necessary. I'm assuming SQL Server generates a less efficient plan for the parametrized query because it can't make any assumptions about the parameter values.
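A stripped-down version of the two forms we compared, with placeholder table and column names (in the real app the value arrives as a query parameter; a local variable stands in for it in this ad hoc sketch):

-- Parametrized form: slow for us.
DECLARE @pattern nvarchar(50);
SET @pattern = N'%smith%';
SELECT CustomerID, CustomerName
FROM dbo.Customers
WHERE CustomerName LIKE @pattern;

-- Literal (non-parametrized) form, same WHERE shape: much faster, different plan.
SELECT CustomerID, CustomerName
FROM dbo.Customers
WHERE CustomerName LIKE N'%smith%';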
I understand how to upload files to the media Web Part and how to create an asset library; however, I'm trying to figure out where these files are stored and what the performance will be for users. I read the digital asset management library information on
TechNet and couldn't find the exact answer I'm looking for.
So here's my question: if I upload 10 videos to the asset library, each around 50 MB or less, and we want 200-plus users to access them, will this affect performance when those users hit the site?
We want to display the videos without having to buy a media server. The main questions that will be asked are: who will host the content, and where does it reside? My guess is that it will be hosted in SharePoint 2010.
I'm building a new intranet portal site for my company that will have a video played on the home page; the links will be located on the left and right sides of the page, with the video Web Part in the center.
Will using the BLOB cache and bit rate throttling be sufficient for this? I think it will be; I just want to make sure before we roll this out. Any help will be greatly appreciated. Thanks.
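For reference, my understanding is that the disk-based BLOB cache is enabled per web application in its web.config; the element looks roughly like this (the location, extension list, and 10 GB maxSize are just values I plan to try, not recommendations):

<!-- SharePoint 2010 web.config: serve images, CSS, JS, and video files from
     the web front end's disk instead of pulling them from the content
     database on every request. maxSize is in gigabytes. -->
<BlobCache location="C:\BlobCache\14"
           path="\.(gif|jpg|jpeg|png|css|js|wmv|mp4|asf|avi)$"
           maxSize="10" enabled="true" />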
Performance in our program is stuck on the Model3DCollection.Add method.
We frequently add and remove GeometryModel3D objects on screen, and adding models to Model3DGroup.Children (a Model3DCollection) takes 20% of our execution time.
Reducing the number of models added to the Model3DCollection is impossible. All objects and materials are frozen.
Can someone help?
It would be very useful to have a method like Model3DCollection.AddModels for adding a set of models without having to fire changed events for each one.
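In the absence of such a method, the workaround we are experimenting with is to build the batch detached and assign Children once, so only a single change propagates to the live scene; a sketch, assuming the Model3DGroup itself is not frozen (Children must stay writable):

using System.Collections.Generic;
using System.Windows.Media.Media3D;

static class Model3DBatch
{
    // Adds many models with one Children assignment instead of N Add calls.
    // The new Model3DCollection is built while detached, so its per-item
    // adds don't notify the rendered tree; only the final assignment does.
    public static void AddRange(Model3DGroup group, IEnumerable<Model3D> models)
    {
        var merged = new List<Model3D>(group.Children); // snapshot current children
        merged.AddRange(models);                        // append the new (frozen) models
        group.Children = new Model3DCollection(merged); // single change notification
    }
}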