.NET Tutorials, Forums, Interview Questions And Answers

Critique this strategy

Posted Date: August 25, 2010 | Points: 0 | Category: ASP.NET

I want to populate a GridView using jQuery and AJAX. From my calling page, jQuery will call a handler (.ashx) that will deliver either XML data or HTML markup for the GridView. As I see it, I have two choices: 1) deliver XML, which is then bound to the design-time GridView on the calling page, or 2) deliver the HTML for the GridView directly. My first question is: which method is easier?

Now there are two factors which complicate things. First, the GridView must be sortable on all columns. Second, the data will be filtered (some columns will be hidden) according to per-user configuration options, which are also stored in the database. Knowing this, does your answer to the first question change?

Any comments, insights or gotchas are appreciated.
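For what it's worth, the moving parts are small either way. Here is a minimal sketch of option 2 (the handler returns ready-made HTML); the handler name, the markup, and the row content are all illustrative, not from the post:

```csharp
// Hedged sketch of the .ashx approach, option 2: return HTML directly.
// GridHandler and the table markup are invented names for illustration.
using System.Web;

public class GridHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Content type must match what the jQuery caller expects.
        context.Response.ContentType = "text/html";

        // In a real handler, rows would come from the database, with columns
        // filtered by the user's stored configuration options.
        context.Response.Write("<table id='grid'><tr><th>Name</th></tr>");
        context.Response.Write("<tr><td>Example row</td></tr>");
        context.Response.Write("</table>");
    }
}
```

On the calling page, something like `$('#container').load('GridHandler.ashx');` would inject the markup. For option 1 you would instead set ContentType to "text/xml" and parse the response client-side before binding.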



More Related Resource Links

What strategy exists to deploy an SSIS package and my own data flow components onto an enterprise server?

I created an SSIS package and several data flow components for this package. What strategy exists to deploy the SSIS package and the data flow components onto an enterprise server? Thanks in advance.
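One common deployment path for SSIS 2005, sketched under the assumption of default install locations (the package and assembly names are illustrative):

```bat
rem Deploy the package itself to the server's msdb package store:
dtutil /FILE MyPackage.dtsx /DestServer MYSERVER /COPY SQL;MyPackage

rem Custom data flow components: install the assembly to the GAC and copy it
rem into the pipeline components folder so the SSIS designer can find it.
gacutil /i MyCustomComponent.dll
copy MyCustomComponent.dll "C:\Program Files\Microsoft SQL Server\90\DTS\PipelineComponents\"
```

File-system deployment (dtutil /COPY FILE;...) is the other standard option; which is better depends on how the packages will be scheduled and secured.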

Package Strategy Question

I am developing a rather large ETL solution using SSIS. So far I have been building everything inside a single package, but it occurred to me that perhaps I ought to check whether that is wise. Are there performance or scalability reasons to split an ETL across multiple packages, with a master package that calls each one and keeps them in sync? Or is building everything in one package OK? Thanks in advance... - Charles

Text Search Strategy Question for the SSIS Gurus...

BACKGROUND: As I have mentioned in some of my other posts, I am using SSIS 2005 to replace an existing MS Access 2003 / VBA-based ETL engine which I developed some years back. Part of the existing Access-based ETL performs a text search of the source records, and I am now attempting to replicate that functionality in SSIS (replicate the end result, not necessarily the methods used to get there). I have an idea of how I plan to go about this, but since I am relatively new to SSIS, I would greatly appreciate feedback from those more experienced...

DETAILED DESCRIPTION: In the source (Sybase ASE 15) database there is an "object" table (not the actual table name, but it will suffice for illustration). Within the object table there is a "description" column, a char(60), which simply holds a description of the object as entered by the source-system end user.

My ETL solution lets the ETL administrator define one or many (1...n) keywords or phrases as search criteria. These criteria are stored in a reference table in my target SQL Server 2005 database (the same database to which my ETL writes its results). My objective is as follows: for each of the 1...n search criteria defined, try and find th

Which high-availability strategy for a multi-tenant architecture?

Hi, I run a multi-tenant application: each client runs the same code, but each one has its own database for personal parameters and data. That makes about 40-50 databases and growing (I hope so).

I want to set up a high-availability strategy. For that, I have two VMs, each running SQL Server 2008 Enterprise. I had started to configure log shipping, but I was told that it was not a good idea when many databases are involved.

How can I choose between HA strategies? What tests should I run to see whether I am still eligible for log shipping (number of transactions per hour?)? If log shipping is not an option, what should I choose? Failover clustering?

Thanks! Don't hesitate to ask if you need more information.

Advice needed on backup strategy and implementation

This is my plan: a full database backup is performed at 5 AM every day, then log backups are taken every 30 minutes from 5:30 AM until 11:30 PM.

What I want to achieve is to have all backups for a given day contained in the same file on disk. That is, on Feb 18 2010 the file c:\db_backup_02_18_10.BAK is created at 5 AM with the full database backup, and all log backups taken on Feb 18 are appended to the same file. The next day, c:\db_backup_02_19_10.BAK is created at 5 AM. Is this a "correct" approach? The reason I like it is that everything needed to restore to any point in time for a specific day is contained in one file and can easily be copied to a different computer.

I tried to achieve this via the Maintenance Plan wizard by creating one task for the full backup (scheduled once a day at 5 AM) and another task for the log backups (scheduled every 30 minutes from 5:30 AM until 11:30 PM), both tasks in the same maintenance plan. This, however, saves the result of each job in a separate file named $DBNAME_backup_YYYY_MM_DD_HHMMSS_NNNNNNN.trn (for a transaction log backup; presumably with extension .BAK for a full backup).

Can this even be done via a maintenance plan? I think I can achieve the desired result via SSIS, where I can generate the file name programmatically and pass it as a parameter to a "BACKUP DATABASE ..." or "BACKUP LOG ..." statement
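The per-day single-file layout is expressible in plain T-SQL with the INIT/NOINIT options, which the Maintenance Plan wizard does not expose; a hedged sketch (path and database name are illustrative), runnable from two SQL Agent job steps:

```sql
-- Hedged sketch: append full and log backups into one per-day file.
-- Build the day's file name once per statement.
DECLARE @file nvarchar(260);
SET @file = N'C:\Backups\db_backup_' + CONVERT(char(8), GETDATE(), 112) + N'.BAK';

-- 5 AM job step: INIT starts a fresh file for the day.
BACKUP DATABASE MyDb TO DISK = @file WITH INIT;

-- Half-hourly job step: NOINIT appends each log backup to the same file
-- as an additional backup set.
BACKUP LOG MyDb TO DISK = @file WITH NOINIT;
```

When restoring, each backup set in the file is addressed with RESTORE ... WITH FILE = n, so a point-in-time restore for a given day really does need only that day's file plus the knowledge of which set is which (RESTORE HEADERONLY lists them).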

Strategy for creating indexes

Hi Team, I need help with a strategy for creating indexes. I have a query which returns more than 4 billion rows, which I am trying to insert into a table. The query involves both views and tables, and I am working on SQL Server 2000. My thought was to create indexes on the primary key columns of the tables (which are used in the views), and on the columns used in the WHERE condition of the main query. Would a clustered index on a table with a multi-column primary key be helpful? Is my strategy correct? Any suggestions will be of great help! Regards, Eshwar.
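As a hedged illustration of the distinction (table and column names are invented): a table gets at most one clustered index, and a multi-column key becomes a single composite index rather than "more than one primary key":

```sql
-- Hedged sketch, SQL Server 2000 syntax; names are illustrative.
-- One clustered index per table; a two-column key is one composite index.
CREATE CLUSTERED INDEX IX_BigTable_Key
    ON dbo.BigTable (KeyCol1, KeyCol2);

-- Nonclustered index on a column used in the WHERE clause of the main query:
CREATE NONCLUSTERED INDEX IX_BigTable_Filter
    ON dbo.BigTable (FilterCol);
```

For a 4-billion-row insert it is often cheaper to load into a heap and build the indexes afterwards, since maintaining indexes during the load adds per-row cost; that trade-off is worth testing on a sample.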

MVVM strategy help

This is more of a guidance question than a how to question, and some reflections on the whole MVVM pattern in the WPF framework.

We built a test UI to exercise a service that we are working on. The service is highly threaded, and when building the WPF UI we ran into some issues. At the time of writing this post the UI works fine, but I am a little concerned that I have not read any forum posts or articles that resemble our MVVM implementation. I will use a database of egg timers to design my example.

Example Setup
Imagine that we have a class EggTimer which simply counts down until its alarm should sound. These timers run on their own threads because (in our stupidity) our implementation of the egg timer is a class that sleeps for 1 second at a time until the countdown reaches zero. (P.S. this is just a fake example to illustrate our MVVM design, so please don't suggest how to make a better egg timer.) Our service model (ServiceModel) manages an ObservableCollection&lt;EggTimer&gt; and allows new EggTimers to be added to the list. Once an egg timer has sounded its alarm (an EggTimerElapsed event?) it is marked for cleanup. Our service model only allows 10 egg timers to run at a time
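A common way to reconcile worker threads with an ObservableCollection (sketch only; every member beyond the names mentioned in the post is invented) is to marshal each collection mutation through the UI thread's Dispatcher, since WPF requires collection-change notifications to arrive on the thread that owns the bound view:

```csharp
// Hedged sketch: worker threads never touch the collection directly.
using System;
using System.Collections.ObjectModel;
using System.Windows.Threading;

// Illustrative stub; the real EggTimer comes from the post's example.
public class EggTimer { }

public class ServiceModel
{
    private readonly Dispatcher _uiDispatcher;

    public ObservableCollection<EggTimer> Timers { get; private set; }

    public ServiceModel(Dispatcher uiDispatcher)
    {
        // Capture the UI thread's dispatcher at construction time.
        _uiDispatcher = uiDispatcher;
        Timers = new ObservableCollection<EggTimer>();
    }

    // Called from a timer's worker thread when its alarm sounds.
    private void OnEggTimerElapsed(EggTimer timer)
    {
        // ObservableCollection raises change notifications on the calling
        // thread, so the mutation is marshaled to the UI thread instead.
        _uiDispatcher.BeginInvoke(new Action(() => Timers.Remove(timer)));
    }
}
```

The alternative is to keep the collection UI-thread-only and have workers communicate through events that the view model subscribes to; either way, the dispatcher boundary is the piece most MVVM articles gloss over.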

Please Critique My First Trigger


Although I'd prefer to avoid one, it appears a trigger is the only way to accomplish what I need.

Below is my first attempt at writing a trigger. It prevents the total RSVPs for an activity (RsvpGuests) from exceeding the value in a column in the Activity table (ActCapacity).

It appears to work. Any suggestions? In particular, any way to make it more efficient?


ALTER TRIGGER dbo.LimitRsvps
ON dbo.Rsvps
AFTER INSERT, UPDATE
AS
BEGIN
	-- Nothing to do if no rows have changed
	IF @@ROWCOUNT = 0 RETURN;

	DECLARE @ActivityID int;
	DECLARE @Capacity int;
	DECLARE @Rsvps int;
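For comparison, a complete, set-based version of the trigger being described might look like the sketch below. The column names (ActivityID, RsvpGuests, ActCapacity) are inferred from the post and may differ; the set-based EXISTS check is one way to stay correct when a single statement inserts or updates multiple rows, which scalar variables cannot handle:

```sql
-- Hedged sketch; table and column names inferred from the post.
ALTER TRIGGER dbo.LimitRsvps
ON dbo.Rsvps
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Reject the change if any affected activity would exceed its capacity.
    IF EXISTS (
        SELECT 1
        FROM dbo.Activity a
        WHERE a.ActivityID IN (SELECT ActivityID FROM inserted)
          AND a.ActCapacity < (SELECT SUM(r.RsvpGuests)
                               FROM dbo.Rsvps r
                               WHERE r.ActivityID = a.ActivityID)
    )
    BEGIN
        RAISERROR('RSVP total would exceed activity capacity.', 16, 1);
        ROLLBACK TRANSACTION;
    END
END
```

Rolling back inside the trigger aborts the offending statement; the application then sees the RAISERROR message and can surface it to the user.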


Code Critique


I'm hoping this is the appropriate part of the forum for this post. I've been rewriting a page in ASP.NET and C# for a project I took over, switching it from a wizard to a single, dynamic page design as I was asked to do. I'm new to both ASP.NET and C#, so I could really use a good critique and a second set of eyes on what I've done so far.

The page is a form for adding a household to the database and allows any number of family members, each of whom can have any number of income sources. I still need to write some code that updates a few of the text boxes as data is entered (for example, an age field when the date of birth is entered), but this is the majority of the code.

Thanks for any insight and help. If it matters it's for a non-profit project.

using System;
using System.Collections;
using System.Configuration;
using System.Data;
using System.Linq;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Linq;
using System.Data.Sql;
using System.Collections.Generic;
using System.Data.SqlClient;
using Microsoft.Reporting.WebForms;

//Page and Code Created By Scott Mark 2010
//Modified By Levi Kemp October 20

[.Net 4, Named Pipes, Single-instance app] Critique the code, please


The code below is a prototype of a single-instance component. Of course there's a lot of work left to finish it: separating it into a component, adding different pipe names per user session, running the server on a background thread, and so on.

Currently, I'm interested in finding wrongly supplied options, invalid usages, security holes, or anything else I missed. For example, I'm almost sure there's a bug: the code allows remote connections. I'll verify it on Monday, as I have no access to a local network right now.

using System;
using System.IO;
using System.IO.Pipes;
using System.Security.AccessControl;
using System.Security.Principal;

namespace ProcessSingleInstance
{
    class Program
    {
        static void Main(string[] args)
        {
            string[] data = new[] { "a", "aaaaa", "qweqeqweqeqweqwert" };
            // ...
        }
    }
}
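On the remote-connection worry: one known mitigation (sketched here against the .NET 4 APIs; the factory name is invented) is to attach a PipeSecurity ACL that denies the NETWORK logon SID, since remote clients authenticate with a network logon:

```csharp
// Hedged sketch: a pipe ACL that denies remote (network-logon) clients.
using System.IO.Pipes;
using System.Security.AccessControl;
using System.Security.Principal;

static class LocalPipeFactory
{
    public static NamedPipeServerStream Create(string pipeName)
    {
        var security = new PipeSecurity();

        // Allow the current user full control of the pipe.
        security.AddAccessRule(new PipeAccessRule(
            WindowsIdentity.GetCurrent().User,
            PipeAccessRights.FullControl,
            AccessControlType.Allow));

        // Deny the NETWORK logon group: remote clients connect with a
        // network logon, so this deny ACE blocks remote connections.
        security.AddAccessRule(new PipeAccessRule(
            new SecurityIdentifier(WellKnownSidType.NetworkSid, null),
            PipeAccessRights.FullControl,
            AccessControlType.Deny));

        return new NamedPipeServerStream(
            pipeName, PipeDirection.InOut, 1,
            PipeTransmissionMode.Message, PipeOptions.None,
            0, 0, security);
    }
}
```

Per-session uniqueness (another item on the to-do list) is usually handled by appending the session ID or user SID to the pipe name.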


Transactional Replication Log Backup Strategy


I have log backups set up to run every 15 minutes on the production server, using the command below.


The reason I use this command is that we don't need point-in-time recovery, and on top of that we don't have storage for log backups.

My question now is: can I set up transactional replication with this setup in place, or do I need to physically retain the T-log backups?
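One thing worth checking before deciding: with transactional replication, log records cannot be discarded until the log reader agent has harvested them, regardless of the backup strategy. Two standard commands for seeing what is pinning the log (MyDb is a placeholder database name):

```sql
-- Shows the oldest active transaction, including the oldest
-- non-distributed replicated transaction when replication is enabled.
DBCC OPENTRAN ('MyDb');

-- SQL Server 2005 and later: why the log cannot truncate right now
-- (a value of REPLICATION means the log reader is behind).
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDb';
```

So replication can coexist with minimal log retention, but the log file itself will grow whenever the log reader falls behind, which is worth monitoring given the storage constraint.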




Partition Strategy



We currently have a large cube with quite a large amount of data. We store the last 2 complete years plus the current year. The current partitioning strategy is one partition per year for the first 2 years, one for the current year excluding the last 21 days, and one for the last 21 days. This last partition builds up with current data and is processed daily. Once a week we do a full cube rebuild, and the last partition resets to just 21 days again.

Current size (table of row counts per year; values not preserved in the post)

We don't have aggregations, because we use measure expressions (exchange rates) and have Distinct Count measure groups.

This is working OK, but obviously not optimally. I was going to start by splitting these partitions down smaller (say, by quarter) and, for the Distinct Count measure groups, applying the SQLCat-recommended optimizations. Then I got to wondering: would it be better to split the non-DC measure groups by the DC optimizations too? My thinking is that because we do not have aggregations, we are probably putting more strain on the storage engine due to repeated fe

Is detaching DB, and copying it to another location a viable backup strategy?


Hi all.

Instead of doing a backup of my database, I've found that I'm more comfortable detaching the database (sp_detach_db), manually copying the data files over to my backup location, reattaching, and then attaching the copied version of the database and running DBCC CHECKDB against it to ensure everything is okay.

I have about 120 GB of data, and running DBCC CHECKDB makes me feel better that at least some sort of validation has been done against the backup. When I do a SQL Server backup, unless I restore the database I can't tell whether an underlying error at the backup location could cause problems.

My question is: is this an acceptable strategy, or are there benefits of backup/restore, or deficiencies of copying the detached database, that I don't know about?
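If the concern is mainly trust in the backup file, then on SQL Server 2005 and later, checksummed backups plus RESTORE VERIFYONLY give much of the same reassurance without taking the database offline (detaching means downtime, and a failed attach can leave the database inaccessible). Paths and names below are illustrative:

```sql
-- Verify page checksums as pages are read during the backup.
BACKUP DATABASE MyDb
    TO DISK = N'C:\Backups\MyDb.bak'
    WITH CHECKSUM;

-- Validate the backup file afterwards without restoring it.
RESTORE VERIFYONLY
    FROM DISK = N'C:\Backups\MyDb.bak'
    WITH CHECKSUM;
```

VERIFYONLY is not as thorough as a real restore plus DBCC CHECKDB, but a periodic test restore to a scratch server covers that, and the day-to-day backups stay online.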

Thanks in advance,


Strategy for saving information from thousands of users in a short time



I am working on an application where users work offline (most likely in a Windows app) all day, saving 10-50 rows of data to a text file, and at the end of the day, when they come online, they upload the data to the main SQL Server database through the app. There may be hundreds of users saving information at a time within a short span of 20-30 minutes.

I am totally new to developing an application with that amount of activity. Can anyone suggest the best strategy to avoid problems with concurrent saves when there are many users? Will SQL Server 2005 cope, or is my app likely to run into issues? What errors might I encounter?

Any help is appreciated!
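One standard way to keep the end-of-day burst cheap is to upload each client's rows as a single set-based operation rather than 10-50 individual INSERTs; a hedged sketch using SqlBulkCopy (the connection string, table, and type names are illustrative):

```csharp
// Hedged sketch: one bulk upload per client instead of row-by-row INSERTs.
using System.Data;
using System.Data.SqlClient;

static class DailyUploader
{
    public static void UploadRows(DataTable rows, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.DailyEntries"; // illustrative name
            bulk.BulkCopyTimeout = 60;  // seconds; tune for the upload window
            bulk.WriteToServer(rows);   // one round trip for the whole day's rows
        }
    }
}
```

With each upload reduced to one short transaction, a few hundred concurrent clients is a modest load for SQL Server 2005; the remaining design questions are retry handling for failed uploads and de-duplication if a client retries after a timeout.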

Workflow Strategy -- External Task Change


I'm using MOSS 2007. I have a document library (BuyerFeedback) that accepts Excel workbooks. The first workflow checks that the creator has submitted a valid .xlsx file, and then either sets up a task or sends the creator an error email and deletes the item.

The task item triggers an outside program, running periodically, that uses SharePoint Web Services. The outside program (in C#) sees the new task item and processes the workbook (as input to an Access application). It records any errors in a multi-line text field ("Errors") on the task item and changes the status to either "Processed" or "Processed With Errors". Since the status has changed, it never touches that item again.

I planned a second workflow, triggered by the creation of the task item, to watch for a status of "Processed" or "Processed With Errors" and then email the creator with the status of the workbook. I also want to make sure nothing goes wrong, so I want to know when a task item has been sitting too long, and send a warning to the sysadmin telling him the C# process must not be running....

This second workflow is befuddling me (I am new at this....). It can "wait" for a due date, or it can "wait" for a change of status, but I can't figure out how to make it wait for one or the other. I don't think I can

Design strategy to overcome a server-side control that can be manipulated by the client (security)


OK, so we have a .NET ASPX app with some server-side button controls which, in some states, may be disabled.

However, using Internet Explorer's developer tools, a user could change or delete the disabled attribute of the button and then click the button to fire its action event.

What would be the best recommended strategy to prevent this?

Many thanks.
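The usual principle is that the rendered disabled attribute is advisory only: the server must re-evaluate the authorization rule inside the event handler itself before acting. A hedged sketch (CanUserSubmit and PerformAction are illustrative placeholders for the app's real business rule and action):

```csharp
// Hedged sketch: never trust the client-side disabled state.
using System;
using System.Web.UI;

public partial class OrderPage : Page
{
    protected void SubmitButton_Click(object sender, EventArgs e)
    {
        // Re-check the same condition that disabled the button at render
        // time; a tampered request re-enables the button but can't pass this.
        if (!CanUserSubmit(User.Identity.Name))
        {
            // Reject silently, or log the attempt and show an error.
            return;
        }

        PerformAction();
    }

    // Illustrative placeholders; the real app supplies these.
    private bool CanUserSubmit(string userName) { return false; }
    private void PerformAction() { }
}
```

The same rule applies to any state the browser can edit: hidden fields, query strings, and view-enabled/visible flags all need a server-side check before the action runs.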
