.NET Tutorials, Forums, Interview Questions And Answers

SMO Table script generation performance

Posted Date: May 22, 2011    Points: 0
 

Hello,

I'm currently working on a program for database comparison. In this program I need to create scripts for database objects. I found that script generation through SMO is quite slow, while generating the same scripts with SSMS is fast. In one thread I read that SSMS itself uses SMO for script generation, but I couldn't find any suggestions for improving performance.

 

Server srv = new Server(new ServerConnection("SERVER", "USERID", "PASSWORD"));
srv.SetDefaultInitFields(typeof(Table), true);   // initialize all Table properties up front
Database db = srv.Databases[database];

ScriptingOptions so = new ScriptingOptions();
so.AllowSystemObjects = false;
// ...

db.PrefetchObjects(typeof(Table), so);
// ...

foreach (Table tb in db.Tables)
{
    StringCollection sc = tb.Script();
    // ...
}

 

What is interesting is that db.PrefetchObjects(typeof(Table), so) has almost no effect on script generation time.
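
For what it's worth, one approach that is often suggested is to limit the eagerly initialized fields and to hand all tables to a single Scripter call, so SMO can batch its metadata queries instead of issuing them per table. The following is only a minimal, untested sketch with placeholder connection details and a hypothetical database name:

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

// Minimal sketch (untested, placeholder connection details): script all user
// tables through one Scripter call instead of calling tb.Script() per table,
// and only default-initialize the fields that are actually needed.
class BatchTableScripting
{
    static void Main()
    {
        Server srv = new Server(new ServerConnection("SERVER", "USERID", "PASSWORD"));
        // Eagerly load only IsSystemObject; other properties are fetched on demand.
        srv.SetDefaultInitFields(typeof(Table), "IsSystemObject");

        Database db = srv.Databases["MyDatabase"];   // hypothetical database name

        Scripter scripter = new Scripter(srv);
        scripter.Options.AllowSystemObjects = false;
        scripter.Options.DriAll = true;              // include keys and constraints

        // Collect every user table and script them in a single batch.
        List<SqlSmoObject> tables = new List<SqlSmoObject>();
        foreach (Table tb in db.Tables)
        {
            if (!tb.IsSystemObject)
                tables.Add(tb);
        }

        StringCollection script = scripter.Script(tables.ToArray());
        foreach (string statement in script)
            Console.WriteLine(statement);
    }
}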




More Related Resource Links

Under the Table: How Data Access Code Affects Database Performance

  

In this article, the author delves into some commonly used ways of writing data access code and looks at the effect they can have on performance.

Bob Beauchemin

MSDN Magazine August 2009


generate create script of table using c# and SQL server 7.0, can anyone help me?

  


I want to generate a CREATE TABLE script using C#. I need to connect to SQL Server 7.0 and generate the table creation script. It is urgent; any help is appreciated.
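
SMO does not target SQL Server 7.0 (it supports SQL Server 2000 and later; 7.0 is usually scripted with the older SQL-DMO COM library), so one alternative is to build the CREATE TABLE text yourself from the INFORMATION_SCHEMA views. The sketch below is untested, uses a placeholder connection string and table name, and only covers column names, base types, and nullability; keys, defaults, and precision/scale are left out:

using System;
using System.Data.SqlClient;
using System.Text;

// Minimal sketch (untested): build a CREATE TABLE statement from
// INFORMATION_SCHEMA.COLUMNS using plain ADO.NET. Connection string and
// table name below are placeholders.
class CreateTableScripter
{
    static void Main()
    {
        string table = "MyTable";   // hypothetical table name
        string connStr = "Server=SERVER;Database=MyDb;User Id=USERID;Password=PASSWORD;";

        var sb = new StringBuilder();
        sb.AppendLine("CREATE TABLE [" + table + "] (");

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE " +
            "FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @t ORDER BY ORDINAL_POSITION", conn))
        {
            cmd.Parameters.AddWithValue("@t", table);
            conn.Open();
            bool first = true;
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    string type = rdr.GetString(1);
                    string length = rdr.IsDBNull(2) ? "" : "(" + rdr.GetInt32(2) + ")";
                    string nullable = rdr.GetString(3) == "YES" ? "NULL" : "NOT NULL";
                    sb.AppendLine((first ? "    " : "   ,") + "[" + rdr.GetString(0) + "] " +
                                  type + length + " " + nullable);
                    first = false;
                }
            }
        }

        sb.AppendLine(")");
        Console.WriteLine(sb.ToString());
    }
}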


SMO classes to generate alter table script to add columns

  
How to generate ALTER TABLE script to add columns using SMO and c#.
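
One pattern that is often suggested for this (not confirmed here) is SMO's capture mode: put the connection into CaptureSql, add the Column objects to the table, call Alter(), and read the captured ALTER TABLE batch instead of letting it execute. A minimal, untested sketch with placeholder server, database, table, and column names:

using System;
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

// Minimal sketch (untested): capture the ALTER TABLE script that SMO would run
// when adding a column, without executing it against the server.
class AlterTableScripter
{
    static void Main()
    {
        ServerConnection conn = new ServerConnection("SERVER", "USERID", "PASSWORD");
        conn.SqlExecutionModes = SqlExecutionModes.CaptureSql;   // record T-SQL instead of executing it

        Server srv = new Server(conn);
        Table tb = srv.Databases["MyDatabase"].Tables["MyTable", "dbo"];   // hypothetical names

        // Define the new column and attach it to the table's column collection.
        Column col = new Column(tb, "NewColumn", DataType.Int);
        col.Nullable = true;
        tb.Columns.Add(col);

        // In CaptureSql mode this generates the ALTER TABLE batch into CapturedSql
        // rather than running it.
        tb.Alter();

        foreach (string statement in conn.CapturedSql.Text)
            Console.WriteLine(statement);
    }
}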

Right click script alter table disabled

  
Does anyone know how to enable the "Script Table As" -> "Alter To" option in SQL Server 2008 SSMS?

John M. Couch, Insight Enterprises Inc., 6820 S. Harl Ave, Tempe, AZ 85283. Email: john.couch@insight.com

Need help with dynamic table generation

  
I am building a test script to help myself figure out dynamic SQL. My goal is to generate column names from an existing table query. To work in this direction I made the following test:

declare @sql varchar(1000)
declare @int int
set @int = 1
set @sql = 'declare @test table ('
set @sql = @sql + 'testfield varchar(10),'
while @int < 5
begin
  if @int < 4
  begin
    set @sql = @sql + ' field' + CAST(@int as varchar(10)) + ' varchar(10),'
  end
  else if @int = 4
  begin
    set @sql = @sql + ' field' + CAST(@int as varchar(10)) + ' varchar(10))'
  end
  set @int = @int + 1
end
select (@sql)
exec (@sql)

insert into @test (testfield, field1, field2, field3, field4)
values (1,2,3,4,5)

When I run this query I get the following from the select (@sql) query:

declare @test table (testfield varchar(10), field1 varchar(10), field2 varchar(10), field3 varchar(10), field4 varchar(10))

This looks good. When I run the exec statement, it returns "1 Row affected", so that seems to work. However, when I run the insert I get this error:

Server: Msg 137, Level 15, State 2, Line 23
Must declare the variable '@test'.

I know that GO statements will effectively create batch dividers between statement groups, but, I haven't yet learned

How to calculate SQL Server performance of a query based upon table schema, table size, and available hardware

  
Hi,

What is the best way to calculate (without actual access to a SQL Server) the processing speed of a query (a self inner join) based upon table schema, table size, and hardware (CPU, RAM, drives)?

Thanks,
Jeff in Seattle

Performance Drop when switching from a #temp table to a @temp table variable

  

I am trying to convert a stored procedure to a table-valued function, and the performance has taken a HUGE hit; I was wondering if there is anything that can be done about it. Since a table-valued function cannot use #temp tables, it must be converted to use a @temp table variable.

Here are some steps I have already taken...

The original stored proc starts off by populating a #temp table via "Select x Into #temp ..."

Leaving it a stored proc for now, I explicitly created the #temp table and did an "Insert Into ... Select From" to more closely model how it must work when using a @temp table variable. There was no discernible performance difference.

Still leaving it as a stored proc, I then swapped out the #temp table for the @temp table variable and now, all of a sudden, the performance drops from sub-second to over a minute!

The temp table only has one field defined as an int and it is distinct, so I tried making the field the Primary Key to see if that would help and it did not.

The temp table is created by scanning a table with around 11,000 rows and the temp table itself has about 4400 rows in it (if it makes a difference to anyone).

Does anyone have any suggestions (or hope) for me?

Thanks,
Jim


Write-Back table generation and Measure Expressions

  

Hi Everyone,

It would be interesting to see if anyone can tell me a couple of things about how SSAS works. If not, at least it would be very nice if anyone could hook me up with a SSAS product team member, who can shed some light…

1. Write-Back Tables

How does SSAS create a write-back table for a partition? More specifically, if you have a role-playing dimension (I'll use an example with a Date), you will get two Date keys with _ suffixes. How do these suffixes get generated? Is there any logic which SSAS applies consistently? This would be very helpful when moving data from the write-back table to the fact table.

2. Measure Expressions

There are two conflicting blog posts by Christian Wade and Teo Lachev about how Measure Expressions work. According to Teo Lachev, Measure Expressions get calculated at runtime, while Christian Wade writes that the calculation results are stored on disk (assuming during processing). It would be good to confirm this as it may matter in some specific cases (Currency Conversions with heaps of data).

Thanks!


Boyan Penev --- http://www.bp-msbi.com

script in a table

  

HI,

Since <marquee> is not supported in XHTML, I had to write a JavaScript replacement, and I need it to sit at the bottom of my page.

I'm unsure how to achieve that.

Can I have my script inside a table just above </body>?

Also, how do I align it to the center of the page?

Advance thanks,

Neo

 


Database script generation

  


Dear All,

This post is basically a continuation of my earlier post titled "auto code generation" in this link http://forums.asp.net/t/1598255.aspx.

Now I want to know the possibilities of template-based generation of database scripts in Oracle, SQL Server, and PostgreSQL.

In other words, what templates are available for generating PL/SQL procedures in Oracle, T-SQL procedures in SQL Server, and PG-SQL/OpenSQL functions in PostgreSQL?


Thanks for your replies.
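
As an illustration of what template-based generation can look like (independent of any particular tool), the sketch below fills a hypothetical T-SQL stored procedure template from table and column names supplied in code; the same placeholder-substitution idea applies to PL/SQL or PostgreSQL templates, only the template text changes:

using System;

// Minimal sketch (hypothetical template and metadata): fill a T-SQL stored
// procedure template from table/column names. In practice the metadata would
// come from the database catalog views rather than being hard-coded.
class TemplateGenerator
{
    const string Template =
        "CREATE PROCEDURE usp_Get{TABLE}ByKey\n" +
        "    @{KEY} int\n" +
        "AS\n" +
        "BEGIN\n" +
        "    SELECT {COLUMNS}\n" +
        "    FROM dbo.{TABLE}\n" +
        "    WHERE {KEY} = @{KEY};\n" +
        "END";

    static string Generate(string table, string key, string[] columns)
    {
        // Simple placeholder substitution; real template engines add loops and conditionals.
        return Template
            .Replace("{COLUMNS}", string.Join(", ", columns))
            .Replace("{TABLE}", table)
            .Replace("{KEY}", key);
    }

    static void Main()
    {
        // Hypothetical table metadata.
        string script = Generate("Customer", "CustomerId",
                                 new[] { "CustomerId", "Name", "City" });
        Console.WriteLine(script);
    }
}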



Which table design is good for performance?

  

hi All,

I have a situation where I need to create a table with a date/time column for an OLAP database (mostly SELECT statements), under the following conditions:

1. Only the date needs to be inserted (no time component).

2. Only SQL Server 2005 or earlier can be used.

3. The table must hold more than 10 million records.

----------------------------------------------------------------------------

use tempdb
go
drop table T1;
drop table T2;
go
create table T1(col1 int, colD tinyint, colM tinyint, colY smallint);
create table T2(col1 int, colDate datetime);
go
insert in

Updating multiple Indexes on a table - Performance issue

  

Hi

I have a performance issue when trying to update a table with multiple indexes. The table itself has about 280 million rows. The selection of the records is fast, about 160 ms, as it has a suitable non-clustered index. However the update itself takes over 10 minutes.

When I look at the execution plan, it shows the update to the clustered index as well as to 5 other non-clustered indexes that are affected by the statement. Unfortunately, it doesn't show me how those indexes are being accessed. I suspect that the update statement is performing a full index scan against each of the non-clustered indexes in order to do the update.

So my question is this: if I add the key columns of the other non-clustered indexes as included columns on the index used to select the records for the update, will SQL Server use them to access the additional non-clustered indexes?

any advice greatly appreciated

Conan


CURSOR performance on multiple delete/insert and maybe poor table design

  

We have a documents table and a documents_additonal_data table, joined on the key (docnum, doctype). It is important to know that documents_additonal_data has just two more fields, valuetype and value (varchar(MAX)), and every record in documents can have a variable number (0-n) of valuetypes, so no schema is enforced in any way.

Now I've written a cursor that needs to transfer history data from an older table into documents. It must be a cursor with an ORDER BY clause because of transaction_id, some business logic, etc. In short, it reads values, applies some IF logic to the cursor variables, checks whether there is already a record for that document, applies some more IF logic to the variables (with a few simple selects), and calls an insert procedure.

About the insert procedure: it doesn't update values. Instead it reads the current record, stores the old data as an XML value (built with a SELECT) in a storage table, deletes the current record, and inserts the new data. The reason for that is probably the multiple indexes (documents is primarily a 'read from' table) and the millions of records; an UPDATE was slower, probably because an update in documents_additonal_data usually needs a delete, insert, and update at the same time. The insert procedure takes an XML parameter for the additional data, which is then parsed and inserted via exec sp_xml_preparedocum

Slow Performance with a single table

  

Hi,

We have a SQL Server 2005 instance with a 400 GB SAP database in our test environment (the production machine is DB2 on iSeries). We have a very strange problem with just a single table. The table has about 56 million rows, with 11 GB of data and 25 GB spent on indexes. We have similar tables in the database (even bigger than this one) that perform correctly.

Bulk deletes or inserts on this single table are extremely slow. For instance, deleting 25 million rows can take many hours (more than 20). CHECKDB and other maintenance plans have already been applied without any issues.

Any ideas?

 


any compare tool which will cater table rename script

  
Is there any compare tool that will handle table rename scripts?
In the new version of my application I have renamed about 6 tables.
I usually run the AdeptSQL Diff tool to get the upgrade script.
Now I am stuck with the table renaming; I have also referenced those table names in a number of stored procedures.
Is there any tool with which I can generate the upgrade script including the table renames?

Kamran Shahid Senior Software Engineer/Analyst (MCP,MCAD,MCSD.NET,MCTS,MCPD.net[web])

Performance issue populating fact table

  

I have an ETL job that runs fine until it gets to the step to populate the fact table. It runs for ages, without adding the new data. It used to run in 2 minutes, but I have not changed the definition of the fact table or the dimension tables referenced in the joins.

If I copy the FROM statement from the table definition and use it to perform a SELECT COUNT(*) I get the total back in 5 seconds. There are only 1.3 million rows at present.

Does anyone have any ideas as to why the count runs so quickly yet the actual data upload seems to stall?

Thanks in advance.

