Large transaction log file won't shrink

Posted Date: April 14, 2011

In my SQL 2005 database, the log file is 12 GB and won't shrink. The initial size is also 12 GB. Does that mean 12 GB is the size of the active log, and is that why the file cannot shrink below it? The initial size used to be around 1 MB or so. I understand why the file grows, but not why the initial size increases.

The transaction log grew even though the recovery model was "Simple". I switched it to "Full" and backed up the transaction log (with the "truncate" option), but doing that didn't allow the log file to shrink.

Running DBCC SHRINKFILE generated the message "Cannot shrink log file ... because all logical log files are in use".

This database has been in use for many years without this problem. It might have started around the time I set up replication for this database; this server is both the publisher and the distributor. Could replication be the cause? Do I need to remove replication before shrinking the log file?

I've read numerous other postings on similar problems, but haven't found the answer.
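
A minimal diagnostic sketch in T-SQL for this situation, using placeholder names MyDb and MyDb_log (substitute your own database name, logical log file name, and backup path): first check what is blocking log truncation; if the cause is replication and the Log Reader Agent cannot be made to catch up (or replication is being abandoned), sp_repldone marks the pending transactions as distributed so the log can finally be truncated and shrunk.

    -- 1. See why the log cannot be truncated (REPLICATION points at the Log Reader Agent).
    SELECT name, log_reuse_wait_desc
    FROM sys.databases
    WHERE name = N'MyDb';

    -- 2. Only if replication on this publisher is being abandoned or rebuilt:
    --    discard the replication backlog by marking it as distributed.
    USE [MyDb];
    EXEC sp_repldone @xactid = NULL, @xact_segno = NULL, @numtrans = 0, @time = 0, @reset = 1;

    -- 3. Back up the log, then shrink the file to a target size in MB.
    BACKUP LOG [MyDb] TO DISK = N'D:\Backups\MyDb_log.trn';
    DBCC SHRINKFILE (N'MyDb_log', 1024);

Note that sp_repldone throws away anything the distributor has not yet delivered, so existing subscriptions would need to be reinitialized afterwards.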



More Related Resource Links

how to shrink transaction log file in log shipping


I have five databases of about 200 GB each on one server. I changed their recovery model to Full on Monday and took a full backup that day, then a differential backup on Friday, with no log backups in between, and configured log shipping. Because no log backups ran for five days, each transaction log grew to about 70 GB. Now that the transaction log backup job is running, the used portion has come down to about 20 GB, but the files will not release the unused space: they are about 98% unused.

DBCC SHRINKFILE does not help.
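
A small T-SQL sketch for checking how much of each log is really in use and then releasing the space, with MyDb and MyDb_log as placeholder names:

    -- Log size and percent used for every database on the instance.
    DBCC SQLPERF (LOGSPACE);

    -- Once the log backup job has freed most of the log internally,
    -- shrink the physical file to a target size in MB (20 GB here).
    USE [MyDb];
    DBCC SHRINKFILE (N'MyDb_log', 20480);

    -- If the file refuses to shrink, see what is still holding the log.
    SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = N'MyDb';

Shrinking the log file on a log shipping primary does not break the shipping chain, since it is the log backups, not the file itself, that are restored on the secondary.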

Upload large file via webservice problem in vb.net 1.1

Hi All, I am uploading a large file via a web service using the code below in VB.NET 1.1. The problem is that VB.NET 1.1 does not support the Using block. I would like to know what construct I can use in its place in VB.NET 1.1.

Public Function UploadLargeFile(ByVal FileName As String, ByVal buffer As Byte(), ByVal Offset As Long) As Boolean
    Dim retVal As Boolean = False
    Try
        Dim FilePath As String = Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory.ToString() & "\fpath\", FileName)
        If Offset = 0 Then
            File.Create(FilePath).Close()
        End If
        Using fs As New FileStream(FilePath, FileMode.Open, FileAccess.ReadWrite, FileShare.Read)
            fs.Seek(Offset, SeekOrigin.Begin)
            fs.Write(buffer, 0, buffer.Length)
        End Using
        retVal = True
    Catch ex As Exception
        Throw ex
    End Try
    Return retVal
End Function

Uploading Large File (40mb) fails... (webhttpbinding) (Azure)

I'm using JavaScript to upload a 40 MB file to my WCF service hosted on Azure. Here is the relevant snippet from the service's web.config:

<system.serviceModel>
  <bindings>
    <webHttpBinding>
      <binding transferMode="Streamed" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647"
               openTimeout="00:25:00" closeTimeout="00:25:00" sendTimeout="00:25:00" receiveTimeout="00:25:00"
               name="WebConfiguration">
      </binding>
    </webHttpBinding>
  </bindings>
  <behaviors>
    <endpointBehaviors>
      <behavior name="RestfulEndpointBehavior">
        <webHttp/>
      </behavior>
    </endpointBehaviors>
    <serviceBehaviors>
      <behavior name="CPUploadServiceBehavior">
        <serviceMetadata httpGetEnabled="true" httpGetUrl=""/>
        <serviceDebug includeExceptionDetailInFaults="false"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
  <services>
    <service name="CPUpload" behaviorConfiguration="CPUploadServiceBehavior">
      <endpoint add

Shrink database log file sql server 2005

Hi, I am trying to shrink the log file for a testing database. This database is restored from the production database periodically. When I tried to back up the log file, I got:

Processed 0 pages for database 'hcbeta', file 'HC_log' on file 6.
The log was not truncated because records at the beginning of the log are pending replication. Ensure the Log Reader Agent is running or use sp_repldone to mark transactions as distributed.
BACKUP LOG successfully processed 0 pages in 0.423 seconds (0.000 MB/sec).

I tried:

EXEC sp_repldone @xactid = NULL, @xact_segno = NULL, @numtrans = 0, @time = 0, @reset = 1

which resulted in the error "Unable to execute procedure. The database is not published. Execute the procedure in a database that is published for replication." So I tried to publish the database using:

EXEC sp_dboption 'hcbeta', 'Publish', 'true'

which resulted in "Msg 15242, Level 16, State 1, Procedure sp_dboption, Line 138: Database option 'Publish' is not unique." and returned the duplicate options "merge publish" and "publish".

Please help me shrink my database log file. Thanks, Adi.
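
For the specific situation above, where the log reports pending replication but the database is not actually published (common when a test copy is restored from a replicated production database), one hedged sketch is to strip the leftover replication settings and then back up and shrink the log. The database name hcbeta and logical log name HC_log come from the post; the backup path is a placeholder.

    -- Remove all replication metadata from the restored copy; this also clears
    -- the "pending replication" mark that is blocking log truncation.
    EXEC sp_removedbreplication N'hcbeta';

    -- Now the log backup can truncate the log and the file can be shrunk.
    BACKUP LOG [hcbeta] TO DISK = N'D:\Backups\hcbeta_log.trn';
    USE [hcbeta];
    DBCC SHRINKFILE (N'HC_log');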

Why doesn't SQL Server 2005 Enterprise always shrink a log file after a backup?

I have several databases that, despite being backed up regularly, do not release space to the operating system.  In one extreme case, I have a log file that is less than 1% utilized (and has been for months.) I know how to manually force a shrink, but I should not need to do that on any sort of regular basis.  In fact, it is against best practices to manually shrink a log file (or any database file for that matter) because of the very high risk of an unrecoverable error. It is my understanding that SQL Server will eventually handle these things on its own.  However, it does not appear to be doing so.  Any thoughts as to why? Thanks, Dave
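
For context, SQL Server only shrinks files on its own when the AUTO_SHRINK database option is enabled, which is generally discouraged because of the fragmentation and constant grow/shrink churn it causes; a backup frees space inside the log for reuse but never returns it to the operating system. A quick check, with MyDb as a placeholder name:

    -- Which databases have automatic shrinking enabled?
    SELECT name, is_auto_shrink_on FROM sys.databases;

    -- Turning it on is possible, but rarely a good idea on a busy database.
    ALTER DATABASE [MyDb] SET AUTO_SHRINK ON;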

How to read and analyze a very large log file with high efficiency and performance?

It is a very large .txt file (more than 3M), produced every day. The content is a user's system log, like below:

2007-11-01 18:20:42,983 [4520] INFO GetXXX() SERVICE START
2007-11-01 18:21:42,983 [4520] WARNING USER ACCESS DENIED
2007-11-01 18:22:42,983 [4520] ERROR INPUT PARAMETER IS NULL CAN NOT CONVERT TO INT32
2007-11-01 18:23:59,968 [4520] INFO USER LOGOUT

Any idea? Thanks.

Large transaction log files after backups

I have noticed that the transaction logs on my databases have grown dramatically lately. I do have full and differential backups in place for the databases in question. Is there any reason for the sudden increase in size, and is there anything I can do to mitigate it? Some of the databases use the simple recovery model, in which case transaction logs shouldn't accumulate, but I note that the sizes are still huge, and this only started when I began taking full backups and differentials. Thanks.

Very high transaction log file growth in SQL server

Hi, I am using the following version of SQL Server: Microsoft SQL Server 2005 - 9.00.3042.00 (Intel X86), Feb 9 2007 22:47:07, Enterprise Edition on Windows NT 5.2 (Build 3790: Service Pack 2).

The transaction log file of one of my PROD databases fills up very frequently. I truncated and shrank it once and took a full backup immediately afterwards, but the log file has again grown very large. We take a transaction log backup every hour, and sys.databases shows log_reuse_wait = 2 and log_reuse_wait_desc = LOG_BACKUP for this database. Can you please suggest how to free the transaction log space without truncating the log file?

Thanks, Gangadhar
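
Since log_reuse_wait_desc is LOG_BACKUP, only a log backup will free space inside the file, so the practical options are more frequent log backups or accepting a larger log. A small sketch for inspecting the log, with YourProdDb as a placeholder name:

    USE [YourProdDb];

    -- Each row is one virtual log file (VLF): Status = 2 means it is still active,
    -- Status = 0 means a log backup has freed it for reuse.
    DBCC LOGINFO;

    -- Overall log size and percentage in use, per database.
    DBCC SQLPERF (LOGSPACE);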

400 Bad Request when trying to send large file to service using basichttp streaming.


Hi All,


I am trying to send a large file (approx. 1.5 GB) to the WCF service for upload using basic HTTP streaming.

I have increased all the timeouts and quotas in the server's and client's config files to quite large values, but I am still getting a Bad Request error on the client.

Also, the Bad Request response always comes back 1 minute 30 seconds after the request is sent to the server.

The most unusual thing is that when I use Fiddler or Charles to debug the communication, everything works fine. Could it be that Fiddler keeps the connection to IIS or the service alive, so the request doesn't time out?

Following are the details:

Server config:

 <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />

retrieve large file from database ?


I have a database table that works as a file repository. Currently there are binaries stored in there, and I want to pull the "large" ones out in chunks. Some of these files are in excess of 500 MB. I have business rules that dictate that a file >5 MB must be transmitted in chunks; a file <5 MB I can load into memory and send out. I got uploading in chunks to work, but how do I get it to pull the file out of the DB in chunks?

Right now I'm getting hit with a System.OutOfMemory exception, but when I recreate a byte array of the SAME size (empty though) it doesn't break.

Download Chunks (DAL)

public byte[] getBytesByDataID(int chunkSize, string dataID)
            string query = "SELECT data.data " +
                " FROM data " +

Multiple databases backup/restore in a transaction, if possible, into only one file


Hi dear,

I have three databases named XDW, XOLTP, and XOLTPSchema. All of them belong to a single application, and in the application the user should be able to do backup/restore through the GUI.

1. How should I do backup/restore in one transaction? That is, all three actions should succeed, or all of them should fail.

2. Is it possible to back them up into only one file, e.g. X.bak?

3. Or, what is the standard solution in this situation?

Thanks in advance.
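
Regarding question 2: SQL Server cannot wrap several BACKUP DATABASE statements in a single transaction, but it can append multiple backup sets to one media file, which gets close to the "one file" goal. A sketch using the database names from the post (the backup path is a placeholder):

    -- The first backup initializes the media file; the others append to it.
    BACKUP DATABASE [XDW]         TO DISK = N'C:\Backups\X.bak' WITH INIT;
    BACKUP DATABASE [XOLTP]       TO DISK = N'C:\Backups\X.bak' WITH NOINIT;
    BACKUP DATABASE [XOLTPSchema] TO DISK = N'C:\Backups\X.bak' WITH NOINIT;

    -- List the backup sets in the file (the FILE = n position is needed for RESTORE).
    RESTORE HEADERONLY FROM DISK = N'C:\Backups\X.bak';

If one of the backups fails, the earlier sets remain in the file, so the application's GUI still has to check each result and decide how to treat the set as a whole.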

How To: Joining multiple DataTable using “LINQ to DataSet”

Large File upload Problem.


I have a problem with large file uploads. A few users are complaining that their files either fail to upload or take 20 to 30 minutes.

I set everything up correctly and checked on different machines, where it worked fine, but a few customers out of 5000 still complain about it. Any idea what I can check on the client machine?

I set my upload limit to 100 MB and configured it properly, and I tested it. But one strange thing happens: when I try to upload a group of files totalling more than 100 MB (with no individual file in the group over 100 MB), SharePoint lets it through, and I can see a 300 MB group of files uploaded at once through explorer view.

My understanding is that when you set the limit, it should apply to an individual file or to a group of files. Any thoughts?

Thanks in advance.

I am a SharePoint administrator.

How to shrink the sharepoint database log file



I have a SharePoint content database of 70 GB, but the log file is 450 GB. I want to shrink the log file. Can anyone tell me how to do that? It would be great if anyone could provide the complete procedure.

  1. Do I have to shrink the log file while the database is offline?
  2. What if we just take a backup and create a new log file?
  3. How do we restrict the size of the log file so that it doesn't grow too big but just auto-shrinks by itself?

This is all on the production server; I need to fix this ASAP. Any help would be appreciated.

Thanks in Advance.
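
A hedged sketch of the usual approaches, assuming the content database is named WSS_Content and its logical log file name is WSS_Content_log (check the real names with SELECT name FROM sys.database_files inside the database). The shrink can be done while the database is online, so it does not need to be taken offline.

    USE [WSS_Content];

    -- Option A: stay in FULL recovery, start regular log backups, then shrink
    -- the physical file to a target size in MB (2 GB here).
    BACKUP LOG [WSS_Content] TO DISK = N'D:\Backups\WSS_Content_log.trn';
    DBCC SHRINKFILE (N'WSS_Content_log', 2048);

    -- Option B: if point-in-time recovery of this database is not required,
    -- switch to SIMPLE so the log stops accumulating (this breaks the log
    -- backup chain; take a full backup afterwards).
    ALTER DATABASE [WSS_Content] SET RECOVERY SIMPLE;
    DBCC SHRINKFILE (N'WSS_Content_log', 2048);

    -- To cap future growth (question 3), set a maximum size for the log file.
    ALTER DATABASE [WSS_Content] MODIFY FILE (NAME = N'WSS_Content_log', MAXSIZE = 50GB);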

How can I shrink the size of LDF file after BACKUP LOG?


I've just run BACKUP LOG against a particular database and found no change in the size of the corresponding LDF file. It shrinks only after I issue:



But that approach breaks the log backup chain. Is there any way to shrink the LDF file under the FULL recovery model without destroying the whole sequence of log backups?
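
Under the FULL recovery model the backup chain stays intact as long as the log is truncated by a routine log backup rather than by a truncate-only backup; DBCC SHRINKFILE afterwards releases the freed space to the operating system. A sketch with placeholder names MyDb and MyDb_log:

    -- A normal log backup truncates the inactive portion without breaking the chain.
    BACKUP LOG [MyDb] TO DISK = N'D:\Backups\MyDb_log.trn';

    USE [MyDb];
    DBCC SHRINKFILE (N'MyDb_log', 1024);   -- logical log name, target size in MB

    -- If the active part of the log sits at the end of the file, the shrink stops
    -- short; run another log backup and repeat the shrink.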


Transaction logs will not shrink


We have two databases on our SQL Server 2000 installation with huge transaction logs: one is at 32 GB and the other is around 48 GB. When I go into Enterprise Manager, select Shrink, and then select the log file, it shows that the actual used size is much smaller, around 2-3 GB. After choosing to shrink the file to that size (2-3 GB) and waiting, I see no change in the file size when it finishes. I've gone back into Shrink and the file size still remains the same, as if it is not shrinking properly. We also used Backup Exec and backed up the logs with truncate. What can we do to shrink these log files?
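
On SQL Server 2000 one hedged sketch (MyDb and MyDb_log are placeholders) is to truncate the log explicitly and then shrink it from a query window; TRUNCATE_ONLY breaks the log backup chain, so follow it with a full backup. This option was removed in SQL Server 2008, where a regular log backup serves the same purpose.

    BACKUP LOG MyDb WITH TRUNCATE_ONLY;

    USE MyDb;
    DBCC SHRINKFILE (MyDb_log, 2048);   -- logical log file name, target size in MB

    BACKUP DATABASE MyDb TO DISK = 'E:\Backups\MyDb_full.bak';

    -- If the file still does not shrink, the active portion of the log is probably
    -- at the end of the file; generate some log activity and repeat the shrink.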

Using response.outputstream.write to stream large file fails in firefox



I have some code to stream files (for security of the files). The code works for all files in IE, and for small files in Firefox, but if the file is large (e.g. 750 MB) Firefox hangs before it shows the open/save dialog box.

However, if I take out Response.ClearHeaders and the Response.Flush in the code, it does download, but all the files downloaded by Firefox are corrupt because the file size is too large. Filenames are GUIDs, but I took the "-" out in case that was an issue.

If anyone can shed some light on this I would be very grateful.

The function is below.


Private Sub StreamFile(ByRef response As HttpResponse, ByVal sFile As String)
    Dim fs As FileStream = Nothing
    Dim sFileName As String = Path.GetFileName(sFile)

    Dim oFile As New FileInfo(sFile)

    Try
        fs = New FileStream(sFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
    Catch ex As Exception

    End Try

    'response.ClearHeaders() 'makes file size wrong in firefox
    response.AddHeader("Content-Length", oFile.Length.ToString())
    'To forcefully download, even for Excel, PDF files, reg