.NET Tutorials, Forums, Interview Questions And Answers

Normalizing Large Database

Posted Date: September 05, 2010    Points: 0    Category: Sql Server
I need to normalize (if that's the right word) a very wide table with more than 100 million records, turning it into a pair of relational tables. For many variables in the table, there are 20 fields that could be filled in. For simplicity's sake, the table looks like this:

JoinID varchar(10)
Value1 varchar(8)
Value2 varchar(8)
Value3 varchar(8)
(and so on, up to Value20)

The way the table is currently set up, if I want to find records where "Value" equals 'x', I need to search across 20 fields. I want to create a table that has one record for each of the 20 Value fields that are filled in, ignoring those that are null. The only way I know to do that is something like the loop below, which can take 20 hours on my table. Is there a more efficient (faster) way to do this? Thanks in advance for any advice. Tom

USE SANDBOX

DECLARE @sql varchar(max)
DECLARE @table varchar(500)
SET @table = 'Receptacle'
DECLARE @step int
SET @step = 1

WHILE (@step < 21)
BEGIN
    -- Build and run one INSERT per ValueN column (20 passes over the source)
    SET @sql = '
        INSERT INTO ' + @table + ' (JoinID, Step, Value)
        SELECT JoinID, ' + CONVERT(varchar(2), @step) + ' AS Step,
               Value' + CONVERT(varchar(2), @step) + ' AS Value
        FROM SANDBOX.dbo.TABLE
        WHERE Value' + CONVERT(varchar(2), @step) + ' IS NOT NULL'
    EXEC(@sql)
    SET @step += 1
END
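
A set-based alternative worth trying (a sketch, not tested against the real schema; dbo.WideTable stands in for the source table, which the post leaves as a placeholder): UNPIVOT reads the wide table once instead of twenty times, and it drops NULL values automatically, which is exactly the filter the loop above applies.

USE SANDBOX

-- One pass over the source instead of 20 separate scans; UNPIVOT
-- discards rows where the unpivoted value is NULL by design.
INSERT INTO dbo.Receptacle (JoinID, Step, Value)
SELECT JoinID,
       CONVERT(int, SUBSTRING(ValueCol, 6, 2)) AS Step,  -- 'Value7' -> 7, 'Value20' -> 20
       Value
FROM dbo.WideTable
UNPIVOT (Value FOR ValueCol IN
    (Value1,  Value2,  Value3,  Value4,  Value5,
     Value6,  Value7,  Value8,  Value9,  Value10,
     Value11, Value12, Value13, Value14, Value15,
     Value16, Value17, Value18, Value19, Value20)) AS u

On a 100-million-row table it may also help to load in batches and leave the target table unindexed until the load finishes.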


More Related Resource Links

Persist large dataset across ASP.NET pages. No database


Can anyone give me suggestions on how to implement the following requirement?

1. An online wizard (a step-by-step operation) processes user requests across several ASP.NET pages.  The session data is stored in memory until the user submits the request.  Once the request is submitted, an XML file is created and the data is written to it.

Case Study: The parent page has a "select" button.  Clicking "select" pops up a child page.  The user can select 1000+ items from a datagrid in the child page.  Once the user clicks "confirm", the selected items in the child page are populated into the selected-items gridview in the parent page.  The user can keep selecting more in the child page, and the selected items are appended to the gridview in the parent page.


- What's the best way to extract strings from a text file, convert them to an in-memory dataset, and then present it in a datagrid in the UI?

- What's the best way to persist and manipulate a large dataset across pages?  Session or ViewState?  Can it be accomplished with LINQ?

The app doesn't interact with a DB.  Once the request is submitted, it will convert the in-memory data to an XML file.  What technology can be used to accomplish this?

retrieve large file from database?


I have a database table that works as a file repository.  Currently there are binaries stored in there, and I want to pull the "large" ones out in chunks.  Some of these files are in excess of 500 MB.  My business rules dictate that files over 5 MB are transmitted in chunks; under 5 MB, I can load the whole file into memory and send it out.  I got the uploading in chunks to work, but how do I get it to pull the data out of the DB in chunks?

Right now I'm getting hit with a 'System.OutOfMemoryException'.  But when I recreate a byte array of the SAME size (empty though) it doesn't break.

Download Chunks (DAL)

public byte[] getBytesByDataID(int chunkSize, string dataID)
{
    // WHERE clause is assumed from the dataID parameter; the post truncates here.
    string query = "SELECT data.data " +
                   " FROM data " +
                   " WHERE data.dataID = @dataID";

Importing large dat file into sql server database

I have a .dat file consisting of thousands of columns that I have to insert into a database, spread across multiple tables. I am looking for the best possible way to do that. I have looked into bulk insert, but is there a better way? Something like normalizing the data in the data access layer and inserting into the tables, or creating SSIS packages? Any help is highly appreciated.
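
One common pattern, as a sketch (the file path, terminators, and all table and column names here are invented): bulk-load the raw file into a single wide staging table, then fan it out into the normalized tables with set-based INSERT ... SELECT statements, which keeps the row-by-row work out of the data access layer.

-- Stage the raw file in one fast pass; the terminators are
-- assumptions about the .dat format.
BULK INSERT dbo.Staging
FROM 'C:\import\data.dat'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK)

-- Then one set-based INSERT per target table:
INSERT INTO dbo.Customers (CustomerName, City)
SELECT DISTINCT CustomerName, City
FROM dbo.Staging

SSIS buys the same staging-plus-transform flow with restartability and logging; for a one-off load, BULK INSERT into a staging table is usually enough.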

Large geo database (not spatial), index problems




I am having trouble getting my database selects fast. The SQL force is not strong with me.

The data is provided by geonames.org and contains about 8 million locations.

The main problem is my ORDER BY population DESC. When covering a large location, where there are lots of populated places (pp), it's fast, but when covering the sea, with 0 pp, it's really, really slow.


Here is the table:

CREATE TABLE [dbo].[GeoNames](
  [geonameid] [int] NOT NULL,
  [name] [nvarchar](200) NULL,
  [asciiname] [nvarchar](200) NULL,
  [alternatenames] [nvarchar](max) NULL,
  [latitude] [float] NULL,
  [longitude] [float] NULL,
  [feature_class] [char](2) NULL,
  [feature_code] [nvarchar](10) NULL,
  [country_code] [char](3) NULL,
  [cc2] [char](60) NULL,
  [admin1_code] [nvarchar](20) NULL,
  [admin2_code] [nvarchar](80) NULL,
  [admin3_code] [nvarchar](20) NULL,
  [admin4_code] [nvarchar](20) NULL,
  [population] [bigint] NULL,
  [elevation] [int] NULL,
  [gtopo30] [int] NULL,
  [timezone] [char](31) NULL,
  [modification_date] [datetime] NULL
)


My clustered index:

CREATE UNIQUE CLUSTERED INDEX [pk_lat_lng_id] ON [dbo].[GeoNames]
(
  [latitude] ASC,
  [longitude] ASC,
  [feature_class] ASC,
  [geonameid] ASC,
  [population] DESC
)
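
A guess at the culprit, sketched below (the index name, TOP value, and parameter names are illustrative): with ORDER BY population DESC the optimizer can choose to walk rows in population order and test each against the bounding box, which finishes quickly over dense land and crawls over empty sea. Seeking the box first and sorting only the matches behaves consistently:

-- Covering index so the bounding-box seek never touches the base table
-- ([name] is the only column not already in the clustered key):
CREATE NONCLUSTERED INDEX ix_geonames_lat_lng
ON [dbo].[GeoNames] ([latitude], [longitude])
INCLUDE ([population], [name], [feature_class])

-- Seek the box, then sort just the matching rows:
SELECT TOP (50) [geonameid], [name], [population]
FROM [dbo].[GeoNames]
WHERE [latitude]  BETWEEN @latMin AND @latMax
  AND [longitude] BETWEEN @lngMin AND @lngMax
ORDER BY [population] DESC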

Master Data Deletion in Large Database Applications


I am working on a School MIS application, and it has many master profiles, like School, Student, Class, Division, Employee, etc. In order to keep my database integrated and to maintain the consistency of the data, whenever someone deletes a master record, everything related to that record also gets deleted.

However, as the application progresses I am wondering whether this is the right way to do it, because when someone deletes a master record there is a hell of a lot of data that would also get deleted. And the related data that gets deleted would in turn have other related data.

So in my application, if someone deletes a school, then all the information related to the school, like Students, Classes, Sections, Divisions, and Employees, should also get deleted, and the Students in turn are associated with other data which should also get deleted.

I was wondering what would be the best approach. What if someone deletes a master record by mistake? Though we would ask for confirmation before deleting, if by any chance someone does it by mistake it would cost the organization a lot :)

1) Should we really provide a feature to delete data, or when the user tries to delete data, should we instead deactivate it and all related data by setting a flag in the table (as sketched below)?

2) If we have to delete the data, should we just take confirmation
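
A minimal soft-delete sketch for option 1 (all table, column, and parameter names here are invented): every table carries an IsActive flag, a "delete" clears the flag down the chain inside one transaction, and everyday queries filter on it, so a mistaken delete is reversible by flipping the flag back.

-- 'Deleting' a school deactivates it and everything beneath it.
BEGIN TRANSACTION
UPDATE Schools   SET IsActive = 0 WHERE SchoolID = @SchoolID
UPDATE Students  SET IsActive = 0 WHERE SchoolID = @SchoolID
UPDATE Classes   SET IsActive = 0 WHERE SchoolID = @SchoolID
UPDATE Employees SET IsActive = 0 WHERE SchoolID = @SchoolID
COMMIT TRANSACTION

-- Normal reads only ever see active rows:
SELECT StudentID, StudentName
FROM Students
WHERE SchoolID = @SchoolID AND IsActive = 1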

Large log file and unable to connect to database

I'm running SBS 2008 and my SharePoint Config log is taking up 11.27 GB. I've run into this issue before and was able to resolve it, but I cannot connect to this database from the SQL Server Management Studio Express console. When I open the console, I have to connect to the database by entering \\.\pipe\mssql$microsoft##ssee\sql\query for the Server name. From there, if I try to go to the properties of the SharePoint_Config database, I get the following error:

TITLE: Microsoft SQL Server Management Studio Express
Cannot show requested dialog.
Cannot show requested dialog. (Microsoft.SqlServer.Express.SqlMgmt)
Property Owner is not available for Database
'[SharePoint_Config_29c26fca-17b8-48c1-9704-b869932abcb6]'. This property
may not exist for this object, or may not be retrievable due to insufficient
access rights. (Microsoft.SqlServer.Express.Smo)

Thanks for any help you can provide.

Robert Ramin
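
The usual fix for a runaway SharePoint_Config log, as a sketch (run it while connected to the ##SSEE named pipe above; on SBS the database and logical log file names carry a GUID suffix, so substitute yours from the first query): switch the database to SIMPLE recovery so the log stops accumulating, then shrink the log file.

-- 'SharePoint_Config' here is shorthand; on SBS the real name
-- is SharePoint_Config_<GUID>, as in the error above.
USE [SharePoint_Config]

-- Find the logical name and current size of the log file first:
SELECT name, size FROM sys.database_files

ALTER DATABASE [SharePoint_Config] SET RECOVERY SIMPLE
DBCC SHRINKFILE (N'SharePoint_Config_log', 100)  -- logical log name is an assumption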

Need a large free sql server database for my thesis


Hello everyone,

It's my first post here, so sorry if I don't know the rules yet, and sorry for my not-so-good English :D.

I'm doing my thesis on data mining with SQL Server and I need a database with a huge amount of data so I can present some of the techniques. Is there any free SQL Server database with those characteristics?

Thank You.

SQL 2008 R2: MDW Database grows large, Purge doesn't work


Hi All,

I have a SQL 2008 R2 Enterprise instance (named SQLUTILITY) which is monitoring six other SQL 2008 R2 instances.  This instance has a single database: Sysutility_MDW.  The database was created when I ran the wizard to make it a Utility Control Point.  I then ran the UCP wizard on each monitored instance to point to SQLUTILITY.

I then also enabled Data Collection on each of the monitored instances to gather additional statistics (Server, Query, Disk).  The collectors and UCP have been running on each instance just fine since I set them up on January 18th, 2011.

However, today is February 1st, 2011, and the Sysutility_MDW database is now 300 GB in size.  In SSMS I can see that the data file is really full, too.  There is only 1% of slack space
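
For the Data Collection half of that database, the warehouse ships a purge procedure that the scheduled purge job normally calls; running it by hand shows whether retention is being applied at all. A sketch, assuming the standard collector schema is present in Sysutility_MDW and noting that the @retention_days value here is illustrative and overrides the per-collection-set retention for this run:

USE [Sysutility_MDW]
-- Purge collector data older than 14 days:
EXEC core.sp_purge_data @retention_days = 14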

Auto update stats for large database



We have a large database (90 GB) which is very slow during busy hours. Auto update and auto create stats are enabled for this database. Would disabling them and running UPDATE STATISTICS manually twice a week help improve performance? I can't use Profiler to check the impact of the auto operations.
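
A gentler option than turning auto-stats off entirely, sketched below (the database name is a placeholder; asynchronous auto-update needs SQL Server 2005 or later): let queries compile with slightly stale statistics instead of blocking on the update, and refresh manually off-hours.

-- Queries no longer wait for a statistics update at compile time:
ALTER DATABASE MyDB SET AUTO_UPDATE_STATISTICS_ASYNC ON

-- Scheduled off-hours refresh; sp_updatestats skips statistics
-- whose tables have had no modifications.
USE MyDB
EXEC sp_updatestats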


Multiple database support with Entity Framework

One of the features of Entity Framework is database independence, which means each database vendor can support Entity Framework by implementing its own provider.

This feature lets you build applications independent of the underlying database provider. In this post I'm going to show how to build an application that supports multiple databases using the same conceptual model made with Entity Framework.

Export Visio Database Table Names to Excel

If you use the Enterprise Architect edition of Microsoft Visio for data modeling regularly, then there is a good chance that at some point you've wanted to export just the table names into Excel. You might want to do this to map logical ERD entities to physical data model tables, track project status by entity, or track overlap between database versions.

MS SQL Server: Disconnect Users From Database - Kill User Session

If you ever wanted to restore your database from a SQL backup file (.bak) while there are still users connected to it, the restore operation will fail with the error: Exclusive access could not be obtained because the database is in use.
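
The standard sequence, as a sketch (the database name and backup path are placeholders): take the database to single-user mode while rolling back open transactions, restore, then reopen it.

-- Disconnect everyone, rolling back their open transactions immediately:
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE

RESTORE DATABASE MyDB FROM DISK = N'C:\Backups\MyDB.bak' WITH REPLACE

ALTER DATABASE MyDB SET MULTI_USER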

Publishing SQL Server Database using Publishing Wizard : Tips & Tricks

We can use the SQL Server Publishing Wizard to deploy our local database to a remote hosting server or production server. This feature is available in SQL Server 2008/VWD 2008.

Below is a complete step-by-step guide to this process.

Adding Rows in GridView without using a Database

This article describes how to add rows to a GridView without using a database. Basically, the GridView will be populated with data based on the values entered in the TextBox on Button Click, and the GridView data is retained on postback.

Adding Multiple Columns and Rows in GridView without using a Database

This article describes how to add multiple columns and rows to a GridView without using a database. Basically, the GridView will be populated with data based on the values entered in each TextBox on Button Click, and the GridView data is retained on postback.

ASP.NET Database Tutorial

This tutorial will show you how to display data using the .NET GridView Control, ASP.NET 2.0 and VB.NET

The GridView control is a powerful tool and is simple to implement.

First, you will need to import the System.Data.SqlClient namespace.

Connection string for any database server

This link explains how to create a connection string for any database server.