Huge tempdb and msdb data files after Cleanup Task

Posted: October 06, 2010 · Points: 0 · Category: SQL Server

I've run the History Cleanup Task as a maintenance job.

After the job ran successfully, I noticed that the data files of msdb and tempdb are over 2 GB in size. I tried to shrink both databases, but that didn't reclaim much space. Is it normal for tempdb to be over 3 GB after running the History Cleanup Task, even though tempdb was not that size before the task ran?

Best regards
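
A minimal T-SQL sketch for checking what is actually using the space and shrinking a specific file. The logical file name is an assumption (MSDBData is the default for msdb); take it from the first query:

-- Inspect current file sizes (size is reported in 8 KB pages)
USE msdb;
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files;

-- Shrink a specific data file to a 512 MB target
-- (file name is an assumption; use the name column from the query above)
DBCC SHRINKFILE (MSDBData, 512);

-- tempdb is rebuilt at its configured initial size on every instance restart,
-- so a large tempdb usually just reflects the sort/work space the cleanup needed
USE tempdb;
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files;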


More Related Resource Links

Populate a Data Flow Task's variables with values from a .dtsConfig file?

I have an SSIS project to import exchange rates from an XML file into SQL. The project works when certain values are hard-coded into variables, such as the URL of the XML file, the proxy username and password, etc. I decided to add a .dtsConfig file and have it update the values of the variables in the data flow task.

One of the SSIS packages in this solution is called ECBDailyRates.dtsx. It contains one control flow, which is a data flow task. The data flow task is a script task which does some XPath on an XML file and then provides outputs to an OLE DB destination, which in turn puts the data into a table in SQL. That much worked before I put in the dtsConfig file.

There are a few variables declared inside the data flow task, such as PricesXMLUriDaily (String), ProxyAddress (String), ProxyAddressPort (Int32), etc. These are now being updated from the dtsConfig, which is below. It was my understanding that SSIS would run through the config, update any of the variables as required, and then run my data tasks. However, when SSIS runs my script task, none of the variables have been populated, and so it falls over... as it were :)

<?xml version="1.0"?>
<DTSConfiguration>
<DTSConfigurationHeading>
<DTSConfigurationFileInfo GeneratedBy="...." GeneratedFromPackageName="dim_Institution" GeneratedFromPackageID="
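For reference, a complete variable entry in a .dtsConfig normally has the shape sketched below (the URL value is hypothetical). One thing worth checking against this shape: the Path must match the scope at which the variable is declared, so a variable declared on the data flow task rather than on the package has a longer path than \Package.Variables[...].

<?xml version="1.0"?>
<DTSConfiguration>
  <DTSConfigurationHeading>
    <DTSConfigurationFileInfo GeneratedBy="..." GeneratedFromPackageName="ECBDailyRates" GeneratedFromPackageID="..." GeneratedDate="..." />
  </DTSConfigurationHeading>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::PricesXMLUriDaily].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>http://example.com/rates/daily.xml</ConfiguredValue>
  </Configuration>
</DTSConfiguration>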

Delimiter File Read Task installed but does not show in SSIS Data Flow Items


I downloaded the Delimiter File Read component and followed the instructions to install it. The file is in the correct directory (>:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents); the file DelimitedFileReader.dll is there.

However, when I go to BI Development Studio and try to add the new task, I can't see it in the SSIS Data Flow Items; it is not visible. Does anyone know why I can't add/see it?



Reading a huge file from the database


Hi all.

I have a 1.5 GB file stored in a table. I want to read it from the server and create a file stream with it so it can be saved.

When I use DataAdapter.Fill(DataSet), an OutOfMemoryException occurs.

What should I do to read a huge file from the database?
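
A minimal C# sketch of the usual fix, assuming a table named Files with a VARBINARY(MAX) column named Content (table, column, and paths are hypothetical): stream the column with SqlDataReader and CommandBehavior.SequentialAccess instead of materializing it in a DataSet.

using System.Data;
using System.Data.SqlClient;
using System.IO;

class BlobExport
{
    static void Main()
    {
        using (var conn = new SqlConnection("connection string here"))
        using (var cmd = new SqlCommand(
            "SELECT Content FROM Files WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", 1);
            conn.Open();

            // SequentialAccess streams the BLOB instead of buffering it all
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            using (var file = File.Create(@"C:\temp\output.bin"))
            {
                if (reader.Read())
                {
                    var buffer = new byte[8192];
                    long offset = 0;
                    long read;
                    // GetBytes copies one chunk at a time, so memory use stays flat
                    while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        file.Write(buffer, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }
}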

Which file stores Task List data?


Create a website. Open the Task List window from View > Task List. Write some tasks.

Now I can't find which file stores the <Task List> data. Is it in website1's directory, or somewhere else? In the Windows registry? If I move the project to another PC, how can I be sure I also have the task list items?

Any clue which file stores the <Task List> data?

Huge amount of data from database: how to minimize load time


Hi,

I have used the above method to configure my Crystal Report. It's working fine, but I have a huge amount of data in the database, so the report takes a long time to load.

So I have decided to show only the last 50 records inserted into the database, not to load all 500 records every time.

Please help me with this issue.

Also, please explain the index legend in Crystal Reports.
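
For the last-50-records requirement, a minimal T-SQL sketch of the query the report's data source could use instead of selecting the whole table (table and key names are hypothetical):

SELECT TOP 50 *
FROM dbo.Orders          -- hypothetical table name
ORDER BY OrderID DESC;   -- newest rows first, assuming an incrementing key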

Need Oracle Data Provider .cs file for Oracle 10g database connection!



I need a 'Wrapper.cs' file which takes care of the database connection (Oracle 10g), so that I can just call a method with my SQL query:

Gridview1.DataSource = SampleWrapper.ExecuteDatatable("THE SQL QUERY");

Please post the link if there is any open source!
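
A minimal sketch of the kind of wrapper being asked for, not a known library: the class name matches the snippet above, but the connection string is an assumption, and it uses ODP.NET (System.Data.OracleClient code would look almost identical).

using System.Data;
using Oracle.DataAccess.Client; // ODP.NET

public static class SampleWrapper
{
    // hypothetical connection string; in practice this comes from config
    private const string ConnectionString =
        "Data Source=ORCL;User Id=scott;Password=tiger;";

    public static DataTable ExecuteDatatable(string sql)
    {
        using (var conn = new OracleConnection(ConnectionString))
        using (var cmd = new OracleCommand(sql, conn))
        using (var adapter = new OracleDataAdapter(cmd))
        {
            var table = new DataTable();
            adapter.Fill(table); // Fill opens and closes the connection itself
            return table;
        }
    }
}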

Convert SQL data to an EDI 835 file


I was assigned to create an app to convert SQL data to an EDI 835 file (Electronic Data Interchange file for Health Care Claim Payment/Advice, outbound process only).

Can someone help me with where to start?
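
As a starting point, an 835 is a plain-text X12 document built from delimited segments, so one approach is to read the payment rows with a SqlDataReader and emit segments line by line. A heavily simplified C# sketch; the segment content is illustrative, not a complete or valid 835, and the table and column names are hypothetical:

using System.Data.SqlClient;
using System.IO;
using System.Text;

class Edi835Sketch
{
    static void Main()
    {
        var sb = new StringBuilder();
        // ST marks the start of the 835 transaction set
        sb.Append("ST*835*0001~\n");

        using (var conn = new SqlConnection("connection string here"))
        using (var cmd = new SqlCommand(
            "SELECT ClaimId, ChargeAmount, PaidAmount FROM ClaimPayments", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // CLP = claim payment information; '*' separates elements,
                    // '~' terminates the segment
                    sb.AppendFormat("CLP*{0}*1*{1}*{2}~\n",
                        reader["ClaimId"], reader["ChargeAmount"], reader["PaidAmount"]);
                }
            }
        }

        sb.Append("SE*...*0001~\n"); // SE closes the set; segment counts omitted here
        File.WriteAllText(@"C:\temp\claims.835", sb.ToString());
    }
}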

Create Excel file from binary data on SQL Server



My requirement is that I upload an Excel file and then validate its data:

1. Upload the file using an upload control, saving the binary in SQL Server.
2. Create a file from the binary data on the SQL Server using the command below, then use OPENROWSET to dump the data into a SQL table.
3. Read the SQL table row by row and validate the data.


Alter Procedure spUploadExcelFile
  @PKID int,
  @BatchID int
AS
  DECLARE @VarBin varbinary(max)
  DECLARE @FileName varchar(100)
  DECLARE @sql nvarchar(MAX)
  DECLARE @errMsg nvarchar(MAX)
  DECLARE @ObjectReturn INT
  DECLARE @ErrorSource VARCHAR(255)
  DECLARE @ErrorDesc VARCHAR(255)
  DECLARE @ObjectToken INT

  -- Fetch the stored binary and build the target path on the server
  SELECT @VarBin = Content, @FileName = [FileName]
  FROM MultilangBinaryData WHERE PKID = @PKID
  SET @FileName = 'C:\' + @FileName

  -- Write the binary back out to disk through OLE Automation (ADODB.Stream)
  EXEC sp_OACreate 'ADODB.Stream', @ObjectToken OUTPUT
  EXEC sp_OASetProperty @ObjectToken, 'Type', 1                    -- 1 = binary stream
  EXEC sp_OAMethod @ObjectToken, 'Open'
  EXEC sp_OAMethod @ObjectToken, 'Write', NULL, @VarBin
  EXEC sp_OAMethod @ObjectToken, 'SaveToFile', NULL, @FileName, 2  -- 2 = overwrite
  EXEC sp_OAMethod @ObjectToken, 'Close'
  EXEC sp_OADestroy @ObjectToken

  SET @sql = 'INSERT INTO dbo.UserBulkData SELECT 0,NULL,'+ @BatchI
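
The post cuts off mid-statement. For reference, the OPENROWSET step described in point 2 typically looks something like the sketch below; the provider, sheet name, and column list are assumptions:

  -- Hypothetical continuation: build dynamic SQL that reads the saved
  -- workbook with OPENROWSET and inserts it into the staging table
  SET @sql = N'INSERT INTO dbo.UserBulkData
      SELECT 0, NULL, ' + CAST(@BatchID AS nvarchar(10)) + N', src.*
      FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
                      ''Excel 8.0;Database=' + @FileName + N''',
                      ''SELECT * FROM [Sheet1$]'') AS src'
  EXEC sp_executesql @sql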

Using a single *.rpt file with multiple data sources


I've created a set of Crystal Reports (*.rpt files) for an ASP.NET web app on a development server. I call each report using the following code:

protected void BTN_RunReport_Click(object sender, ImageClickEventArgs e)
{
    CrystalReportViewer_ClientLetter.Visible = true;

    ConnectionInfo con = new ConnectionInfo();
    con.ServerName = Constants.ServerIP;
    con.DatabaseName = Constants.DatabaseName;
    con.UserID = Constants.UserID;
    con.Password = Constants.Password;

    CrystalReportViewer_ClientLetter.ReportSource = Server.MapPath(Constants.ClientLetters);

    ParameterFields parameter = CrystalReportViewer_ClientLetter.ParameterFieldInfo;
    ParameterField batchdate = new ParameterField();
    batchdate.Name = "@BatchDate";
    ParameterDiscreteValue batchdate_value = new ParameterDiscreteValue();
    batchdate_value.Value = Convert.ToDateTime(txtBatchDate.Text);
    // the discrete value has to be attached to the field, and the field to the collection
    batchdate.CurrentValues.Add(batchdate_value);
    parameter.Add(batchdate);

    // push the connection info to every table the report touches
    foreach (TableLogOnInfo tlf in CrystalReportViewer_ClientLetter.LogOnInfo)
    {
        tlf.ConnectionInfo = con;
    }
}

Writing Windows Forms data to an HTML file

I have a Windows Forms application ready. Now I need to transfer data from the Windows Forms to the web. To do this I need to write an HTML file that contains the form data: the data contained in the form should be written to an HTML file to be submitted, and this HTML file should be generated on the local PC. I have the code for the HTML file, but how can I write this HTML file with the data values contained in my Windows Forms? Please give your suggestions.

Best regards, Syed
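
A minimal C# sketch of one way to do this, assuming a form with txtName and txtEmail text boxes (control names, the form action URL, and the output path are hypothetical): build the HTML from the form's values and write it to disk, HTML-encoding the user input. WebUtility is .NET 4+; on older versions HttpUtility.HtmlEncode from System.Web does the same job.

using System.IO;
using System.Net; // WebUtility.HtmlEncode

// inside the form class, e.g. wired to a button click
private void btnExport_Click(object sender, System.EventArgs e)
{
    string html =
        "<html><body>" +
        "<form action=\"http://example.com/submit\" method=\"post\">" +
        "<input type=\"hidden\" name=\"name\" value=\"" +
            WebUtility.HtmlEncode(txtName.Text) + "\" />" +
        "<input type=\"hidden\" name=\"email\" value=\"" +
            WebUtility.HtmlEncode(txtEmail.Text) + "\" />" +
        "<input type=\"submit\" value=\"Send\" />" +
        "</form></body></html>";

    File.WriteAllText(@"C:\temp\formdata.html", html);
}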

Insert Excel data via file upload into SQL Server database?

Hi all,

I have a requirement that the user can upload Excel sheet data to a SQL Server database at once. How do I do that? Any article or hint is appreciated.

Regards
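
A common approach, sketched below under assumptions (a sheet named Sheet1, a target table dbo.ImportedData with matching columns, and the ACE OLE DB provider installed): read the uploaded workbook with OleDb and push the rows across with SqlBulkCopy.

using System.Data.OleDb;
using System.Data.SqlClient;

class ExcelImportSketch
{
    static void Import(string excelPath)
    {
        // ACE provider reads .xlsx; for old .xls files the Jet 4.0 provider is used
        string excelConn =
            "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + excelPath +
            ";Extended Properties=\"Excel 12.0;HDR=YES\"";

        using (var src = new OleDbConnection(excelConn))
        using (var cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", src))
        {
            src.Open();
            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy("sql connection string here"))
            {
                bulk.DestinationTableName = "dbo.ImportedData"; // hypothetical table
                bulk.WriteToServer(reader); // streams rows straight into SQL Server
            }
        }
    }
}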

How to send data from WPF UI to an Excel file?

How to send data from WPF UI to an Excel file? Thanks.
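
One lightweight option, sketched under assumptions (an ObservableCollection of a hypothetical Customer class bound to the UI): write the rows out as CSV, which Excel opens directly. Office interop or a library such as EPPlus would be the route for real .xlsx output.

using System.Collections.ObjectModel;
using System.IO;
using System.Linq;

public class Customer
{
    public string Name { get; set; }
    public decimal Balance { get; set; }
}

public static class CsvExport
{
    // write the bound collection as CSV; Excel opens .csv files directly
    public static void Save(ObservableCollection<Customer> rows, string path)
    {
        var lines = new[] { "Name,Balance" }
            .Concat(rows.Select(r =>
                string.Format("\"{0}\",{1}", r.Name.Replace("\"", "\"\""), r.Balance)));
        File.WriteAllLines(path, lines);
    }
}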

Join 2 flat file data flows - retain unmatched rows

I have two data flows from two separate flat files. They may contain matching IDs (account numbers); in this case, specific data from each flow should be used to create one row. When there is no match, the rows stand on their own. At the end of the flow, I need both flows combined into one, with one record for each key (account number). If I were able to use a lookup, I could easily union the no-match data flow back into the match data flow and have the desired result. I cannot use a lookup, since the source is flat files, but this is exactly the functionality I am trying to achieve. Solutions I want to avoid: staging tables and cache transformations. Any ideas are appreciated.

How do I add two XML tasks to a Data Flow task?

I am new to SSIS and am developing a package which takes input from an XML file and then populates an existing SQL table with these records. However, this is a very large XML file (210 MB!), and I keep getting an OutOfMemory exception. So what I tried was separating the XML file into 8 smaller files. Now I can successfully load data from these smaller files one at a time, but how can I do all of them in the same package? I connected two XML tasks to the Data Flow task and get the errors below. I also increased my timeout and number of errors, to no avail. Advice?

Error: 0xC002F304 at XML Task 1, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Task failed: XML Task 1
Warning: 0x80019002 at zSkywardEnrollment: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Error: 0xC002F304 at XML Task, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Task failed: XML Task
SSIS package "

How to detect that a Data Flow Task has failed (from inside a custom component)

Hi, in the PostExecute method of a custom component, I would like to check whether the Data Flow Task has failed. Is that possible? It would also be great to be able to check whether this custom component has already received all its data or has been marked as "green".

Thanks, Dennis

Data flow task with multiple destinations randomly hangs

I have a package whose control flow consists of a single data flow task with multiple destinations, and it continues to hang at random locations. The data flow task is fairly simple:

- One OLE DB source (one query selecting about 10 fields from one table)
- 9 lookups that serve basically as left joins to capture which data "drops off" (a row redirect on lookup errors to an "error flow")
- Two destinations: one that captures all the data that doesn't drop off from the lookups, and another to capture the data that does drop off; these write to two tables unrelated to each other, and unrelated to any of the tables I'm reading from

I have tried the following:

- Changed one or both destinations from OLE DB to flat file -- still randomly hangs
- Removed both destinations altogether -- this always succeeds
- Removed destination A and left destination B -- this always succeeds
- Removed destination B and left destination A -- this always succeeds

I took great care in making sure I get no warnings when the package runs, even resolving the unneeded-column warnings and the duplicate keys on all my lookups, but it still hangs. However, once I give up and stop the package, the following error is thrown (among others that appear to be a byproduct of the root issue):

Error: 0xC02020C4 at Data Flow Task, OLE_SRC AP12 [1]: The attempt to

Shrinking the data file

Hi, I have a few doubts regarding database shrinks. I have automated jobs that purge data from 3 or 4 tables every month, and I would like to understand the best way to reclaim the disk space. I know shrinking the data file is not the best approach; what I want to know is whether shrinking the data file once a month is acceptable. I know shrinking the data file leads to index fragmentation, so can I say that rebuilding/reorganizing indexes after the shrink operation is a safe approach? Are there any safe approaches to avoid disk space issues? Thanks
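
A minimal T-SQL sketch of the shrink-then-rebuild sequence the poster describes (database, file, and table names are hypothetical):

-- Shrink the data file to a 10 GB target, leaving free space for growth
USE MyDatabase;
DBCC SHRINKFILE (MyDatabase_Data, 10240);  -- target size in MB

-- Shrinking moves pages and fragments indexes, so rebuild afterwards
ALTER INDEX ALL ON dbo.BigPurgedTable REBUILD;

-- note: the rebuild itself needs free space and will grow the file again,
-- which is one reason monthly shrink/rebuild cycles are usually discouraged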