Which table design is good for performance?

Posted Date: October 14, 2010    Points: 0    Category: Sql Server
 

Hi all,

I need to create a table with a date/time column for an OLAP-style database (mostly SELECT statements), under the following conditions:

1. Only the date needs to be stored (no time portion).

2. Only SQL Server 2005 or earlier can be used.

3. The table must hold more than 10 million records.

----------------------------------------------------------------------------

use tempdb
go
if object_id('T1') is not null drop table T1;
if object_id('T2') is not null drop table T2;
go
-- Design 1: the date split across day/month/year columns
create table T1(col1 int, colD tinyint, colM tinyint, colY smallint);
-- Design 2: a single datetime column (time portion left at midnight)
create table T2(col1 int, colDate datetime);
go
-- The original post is truncated here ("insert in..."); representative inserts
-- into each design might look like this (values are illustrative):
insert into T1(col1, colD, colM, colY) values (1, 14, 10, 2010);
insert into T2(col1, colDate) values (1, '20101014');
go
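
For what it's worth, SQL Server 2005 has no DATE type, so date-only values are usually kept in a datetime (or the 4-byte smalldatetime) column with the time portion zeroed at insert time, rather than split across day/month/year columns; a minimal sketch against the T2 design above (variable and values are illustrative):

-- Zero out the time portion so every row stores midnight of the given day;
-- this keeps a single, easily indexed and range-filtered column.
declare @now datetime;
set @now = getdate();

insert into T2(col1, colDate)
values (2, dateadd(day, datediff(day, 0, @now), 0));  -- DATEADD/DATEDIFF trick strips the time

A check constraint such as CHECK (colDate = dateadd(day, datediff(day, 0, colDate), 0)) can guarantee that no stray time values slip in.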


View Complete Post


More Related Resource Links

CURSOR performance on multiple delete/insert and maybe poor table design

  

We have a documents table and a documents_additional_data table, joined on the key (docnum, doctype). The important point is that documents_additional_data just has two more fields, valuetype and value (varchar(MAX)), and every record in documents can have a variable number (0-n) of valuetypes, so no schema is enforced in any way.

Now I've written a cursor that needs to transfer history data from an older table into documents. It must be a cursor with an ORDER BY clause because of transaction_id ordering, some business logic, etc. In short, it takes the values, applies some IF logic to the cursor variables, checks whether a record already exists for that document, applies some more IF logic (with a few simple selects), and calls an insert procedure.

About the insert procedure: it doesn't update values. Instead it reads the current row, stores the old version as XML (built with a SELECT) in a storage table, deletes the current row, and inserts the new data. The reason is probably the multiple indexes (documents is primarily a 'read from' table) and the millions of records; an UPDATE was slower, probably because an update to additional_data usually needs to delete, insert, and update at the same time. The insert procedure takes an XML parameter for additional_data, which is then shredded and inserted via exec sp_xml_preparedocument...
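
For context, the documents / documents_additional_data layout described above is essentially a key-value (EAV) design; a rough sketch of its shape (column types other than value are assumptions):

-- One row per document
create table documents (
    docnum  int not null,
    doctype int not null,
    -- ... other document columns ...
    constraint PK_documents primary key (docnum, doctype)
);

-- 0..n extra attributes per document; nothing constrains which valuetypes appear
create table documents_additional_data (
    docnum    int          not null,
    doctype   int          not null,
    valuetype varchar(50)  not null,   -- assumed type
    value     varchar(max) null,
    constraint FK_add_data_documents
        foreign key (docnum, doctype) references documents (docnum, doctype)
);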

Under the Table: How Data Access Code Affects Database Performance

  

In this article, the author delves into some commonly used ways of writing data access code and looks at the effect they can have on performance.

Bob Beauchemin

MSDN Magazine August 2009


Good html design software

  

Hi,


I've been using Visual Studio for ages and find it quite cumbersome to produce snappy .aspx pages.


I'm wondering if anyone can recommend, from a developer's perspective, a nice product to handle the graphical HTML generation?


Thanks!


Query Performance & Overall Design - SSAS MDX WCF

  
Hi All,

We have a cube that is queried by an online system through a WCF service. Performance of the queries running on the cube is not very good, as we have to calculate various percentiles (all calculated members) on the cube. This works well within a limit of 5-6 seconds for small sample sizes, but it goes beyond the 30-second threshold when the record count increases.

What we have is our SSAS cubes, with a WCF service querying the cube using ADOMD.NET. This may not be the best way to meet this requirement, but development is mostly complete and it may not be feasible to switch to another approach. What we are looking for is to optimize this design and make it work within the expected time limit of 5-8 seconds.

Kindly let me know if I am not clear or if you need any more information to suggest something. Thanks a lot for your help!

Visual Studio 2010 Design Editor Performance

  
Are there recommended settings or updates that address the performance of the Visual Studio 2010 design editor? The smallest changes max out CPU usage for literally minutes. Are there third-party editors available?

How to calculate a SQL Server performance of a query based upon table schema, table size, and availa

  
Hi,

What is the best way to estimate (without actual access to a SQL Server) the processing speed of a query (a self inner join) based upon table schema, table size, and hardware (CPU, RAM, drives)?

Thanks,
Jeff in Seattle

Best table design for modelling a 1:0..1 relationship

  

Hi Gurus,

 

Apologies if this isn't the right place for this kind of question but I felt sure you guys would know the answer to this fairly simple design question.

 

I would like to model an item and a subject entity.  

An item can be assigned to zero or at most one subject.  

A subject may consume zero or one item.

 

How would it be best to model this kind of relationship in the database? I was originally thinking of having an item FK in the subject table, but that would potentially allow multiple subjects to be assigned the same item. Then I was thinking of having a subject FK in the item table, which would force only one subject to be assigned to an item.

I think I'm just going round in circles here and it may be that my business logic should take care of these issues.  

 

My question is how is it best to model this kind of relationship?

 

Kind regards
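
One common way to enforce a strict 1:0..1 relationship is a nullable foreign key on the consuming side; a minimal sketch, with all table and column names assumed for illustration:

create table item (
    item_id int not null primary key
    -- ... other item columns ...
);

create table subject (
    subject_id int not null primary key,
    item_id    int null,   -- NULL means the subject consumes no item
    constraint FK_subject_item foreign key (item_id) references item (item_id)
);

-- Each item may be consumed by at most one subject. A plain UNIQUE constraint
-- would allow only a single NULL, so on SQL Server 2008+ a filtered unique index
-- is the usual choice; on earlier versions a trigger or indexed view can enforce it.
create unique nonclustered index UQ_subject_item
    on subject (item_id)
    where item_id is not null;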


Performance Drop when switching from a #temp table to a @temp table variable

  

I am trying to convert a stored procedure to a table-valued function and the performance has taken a HUGE hit; I was wondering if there is anything that can be done about it. Since a table-valued function cannot use #temp tables, it must be converted to use a @temp table variable.

Here are some steps I have already taken...

The original stored proc starts off by populating a #temp table via "Select x Into #temp ..."

Leaving it a stored proc for now, I explicitly created the #temp table and did an "Insert Into ... Select From" to more closely model how it must work when using a @temp table variable. There was no discernible performance difference.

Still leaving it as a stored proc, I then swapped out the #temp table for the @temp table variable and now, all of a sudden, the performance drops from sub-second to over a minute!

The temp table only has one field defined as an int and it is distinct, so I tried making the field the Primary Key to see if that would help and it did not.

The temp table is created by scanning a table with around 11,000 rows and the temp table itself has about 4400 rows in it (if it makes a difference to anyone).

Does anyone have any suggestions (or hope) for me?

Thanks,
Jim
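
For reference, the conversion being described ends up as a multi-statement table-valued function whose @temp table variable replaces the #temp table; a minimal skeleton under assumed names (the real source query is not shown in the post). Table variables carry no statistics, which is a common reason the optimizer picks a much worse plan than it does for an equivalent #temp table:

-- Assumed source table: dbo.SourceTable(id int)
create function dbo.fn_GetDistinctIds ()
returns @temp table (id int not null primary key)   -- primary key, as the poster tried
as
begin
    insert into @temp (id)
    select distinct s.id
    from dbo.SourceTable as s;

    return;
end
go

-- Usage:
-- select id from dbo.fn_GetDistinctIds();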



How to create a table of contents in an .rdl design in BIDS

  

Hello,

I am having a problem with how to create a table of contents in an .rdl design in BIDS. Either it should be dynamic, or there should be a bookmark for each section in the TOC.

Can anyone help me out?

Thanks 

Anil


Help with design choice -- selecting parts for ordering table

  

The next part of what I'm working on will involve adding parts to an order, and will be more complex than probably anything I have done.

The amount of parts to choose from will be very extensive.

Anyone care to plant the seed, share an example, or offer helpful input on easy ways to select the parts, add them to a temp table/DataTable/DataSet until all parts are selected, and insert them into the database when done?

Dropdowns would, in some cases, have a large number of items. Select items from a GridView, master/detail style? Checkbox lists?

I'm not looking for code, just a point in the right direction. I've looked at some shopping cart/e-commerce type designs, but they just don't fit what I'm trying to do very well.


Thanks,


Jay









How to create a table of contents in an .rdl design in BIDS dynamically, by page number?

  

Hello,

I am able to create the TOC statically by using the Document Map properties, as described in my older post.

But now I want to generate the table of contents dynamically, by page number. Is it possible? How can we do it while designing the .rdl in BIDS?

Thanks 

Anil


Updating the design of an SQL table

  

Guys, I would like to update the design of a table.

When trying to save the design I am getting an error that the operation isn't permitted.

A constraint exists between this table and another table.

Any ideas? Thanks.
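
If the blocking constraint is a foreign key, a common workaround is to make the change in T-SQL instead of the table designer: drop the constraint, alter the table, then re-create it. A minimal sketch, with constraint, table, and column names assumed for illustration:

-- List the constraints involving the table you want to change
exec sp_helpconstraint 'dbo.MyTable';

-- Drop the blocking foreign key, make the change, then put the constraint back
alter table dbo.ChildTable drop constraint FK_ChildTable_MyTable;

alter table dbo.MyTable alter column SomeColumn varchar(100) not null;  -- example change

alter table dbo.ChildTable
    add constraint FK_ChildTable_MyTable
        foreign key (MyTableId) references dbo.MyTable (MyTableId);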

 


Updating multiple Indexes on a table - Performance issue

  

Hi

I have a performance issue when trying to update a table with multiple indexes. The table itself has about 280 million rows. The selection of the records is fast, about 160 ms, as it has a suitable non-clustered index. However, the update itself takes over 10 minutes.

When I look at the execution plan, it shows the update to the clustered index as well as five other non-clustered indexes that are affected by the statement. Unfortunately it doesn't show me how those indexes are being accessed. I suspect that the update statement is performing a full index scan against each of the non-clustered indexes in order to do the update.

So my question is this: if I add the key columns of the other non-clustered indexes as included columns on the index used to select the records for the update, will SQL Server use them to access the additional non-clustered indexes?

any advice greatly appreciated

Conan
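
For reference, carrying the other indexes' key columns as included columns on the selecting index is done by rebuilding it with an INCLUDE list; a minimal sketch, with all index, table, and column names assumed for illustration:

-- Rebuild the selecting index in place, adding the other indexes' key columns
-- as included (leaf-level only) columns.
create nonclustered index IX_BigTable_FilterCol
    on dbo.BigTable (FilterCol)
    include (OtherKeyCol1, OtherKeyCol2)
    with (drop_existing = on, online = on);   -- ONLINE requires Enterprise Edition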


Thoughts on table design?

  

We are building a new OLTP database that has a design consideration I am looking for feedback on: tracking the history of changes on every row of every table in the design. CDC has been ruled out because it is missing a feature that is needed.

My first thought for tracking history is to use triggers to capture changes to an audit table. However, the current design has the stored procedures that insert/update rows also write to the audit tables: basically, the current row is queried and stored in a temp table, a hash is created on that row, a hash is created on the new row, and if they differ a row goes into the history table and the row is updated.

(I think this is bad design, as the same code would have to be repeated for each procedure that updates a table, and you end up with about 20-30 lines of code cluttering up every stored proc. For some reason some developers think triggers are evil, though, because they are "hidden".)

The second thought on these history tables is that a row should be entered into the history table on insert of a row AND on update; the update should insert the new value. I find that most people only want to store the old value of the changed record and not the new value. The reason I would store the new value is so you don't have to look in two tables to find what a recor...
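
For comparison with the stored-procedure approach above, a trigger-based audit keeps the history logic in one place per table and also captures the new values on insert and update; a minimal sketch, with table and column names assumed for illustration:

-- History table mirrors the base table plus audit metadata
create table dbo.Customer_History (
    CustomerId int          not null,
    Name       varchar(100) null,
    ChangeType char(1)      not null,             -- 'I' = insert, 'U' = update
    ChangedAt  datetime     not null default getdate()
);
go

create trigger trg_Customer_Audit
on dbo.Customer
after insert, update
as
begin
    set nocount on;
    insert into dbo.Customer_History (CustomerId, Name, ChangeType, ChangedAt)
    select i.CustomerId,
           i.Name,
           case when exists (select 1 from deleted) then 'U' else 'I' end,
           getdate()
    from inserted as i;
end
go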

Slow performance with a single table

  

Hi,

We have a SQL Server 2005 instance with a 400 GB SAP database in our test environment (the production machine is DB2 on iSeries). We have a very strange problem with just a single table. The table has about 56 million rows, with 11 GB of data and 25 GB of indexes. We have similar tables in the database (even bigger than this one) with correct performance.

The performance of bulk deletes or inserts in this single table is extremely slow. For instance, deleting 25 million rows can take many hours (more than 20). CHECKDB and other maintenance plans have already been run without any issues.

Any ideas?
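
One common mitigation when bulk deletes of this size crawl is to delete in batches, which keeps each transaction, and therefore the log and lock footprint, small; a minimal sketch (table name and predicate are illustrative):

-- Remove the rows 10,000 at a time instead of in one huge transaction
declare @rows int;
set @rows = 1;

while @rows > 0
begin
    delete top (10000)
    from dbo.BigSapTable
    where SomeFilterColumn = 'X';   -- illustrative predicate

    set @rows = @@rowcount;
end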

 


Performance issue populating fact table

  

I have an ETL job that runs fine until it gets to the step to populate the fact table. It runs for ages, without adding the new data. It used to run in 2 minutes, but I have not changed the definition of the fact table or the dimension tables referenced in the joins.

If I copy the FROM statement from the table definition and use it to perform a SELECT COUNT(*) I get the total back in 5 seconds. There are only 1.3 million rows at present.

Does anyone have any ideas as to why the count runs so quickly yet the actual data upload seems to stall?

Thanks in advance.

