Is there a size limit for the .dat backup that can be taken using STSADM -o backup,
and for the .dat file that can be restored using STSADM?
Thanks in Advance
I would be grateful if someone could let me know what size I should restrict my XML files to on a new website.
I am designing a cart for an existing site and am looking to store basic product records in XML, which can be fetched via HttpRequest etc. The site currently has between two and three hundred concurrent users, and I need to clarify the performance limitations and how they may apply before I proceed. Any assistance would be gratefully received.
Hi, I have a TOP N (N = 15, 25, 50) report which is parameterized: when the user selects the 'N' value (e.g. 15) from the drop-down list, the query pulls the TOP 15 records and displays them on the report as a bar chart. The report works well at this point. But
when the user selects N = 15 and there is only one record, that single bar occupies the whole chart area and is rendered as a square. Can anyone tell me how to limit the size of that bar so it does not appear as a square on the report?
Thanks in advance.
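One common way to cap bar width in SSRS 2008-style charts is the MaxPixelPointWidth custom attribute on the chart series. A hedged sketch (40 is an arbitrary example value; set it via the series properties in the report designer):

```
Select the chart series → Properties → CustomAttributes:
    MaxPixelPointWidth = 40
```

With this set, a bar is never drawn wider than 40 pixels, even when only one data point is returned.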
I've just done a BACKUP LOG against a particular database and found no change in size for the corresponding LDF file. It is shrunk only after I issue:
But this approach breaks the whole log chain. Is there any way to shrink the LDF file under the FULL recovery model without destroying the whole sequence of log backups?
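For reference, BACKUP LOG by itself only truncates the logical log; to reclaim disk space without breaking the chain, the physical file can be shrunk afterwards. A minimal sketch, assuming a database named MyDb with the logical log file name MyDb_log (both placeholders):

```sql
-- Back up the log as usual so the backup chain stays intact.
BACKUP LOG [MyDb] TO DISK = N'D:\Backups\MyDb_log.trn';

-- Then shrink the physical LDF file (target size in MB).
-- Unlike switching to SIMPLE recovery, DBCC SHRINKFILE does not
-- invalidate the existing sequence of log backups.
DBCC SHRINKFILE (N'MyDb_log', 512);
```

The shrink only reclaims space up to the last active portion of the log, so it is normally run right after a log backup.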
I just want to know the difference between backup/restore and export/import using stsadm.exe.
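Roughly: stsadm -o backup/restore makes a full-fidelity copy of a whole site collection (GUIDs, workflows, and alerts are preserved), while stsadm -o export/import uses the content deployment API, can target an individual site or subsite, regenerates GUIDs on import, and drops items such as workflows and alerts. A command sketch (URLs and file paths are placeholders):

```
stsadm -o backup  -url http://server/sites/site -filename C:\bak\site.dat
stsadm -o restore -url http://server/sites/site -filename C:\bak\site.dat

stsadm -o export  -url http://server/sites/site/web -filename C:\bak\web.cmp
stsadm -o import  -url http://server/sites/other/web -filename C:\bak\web.cmp
```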
For the client object model I found two methods for uploading files:
The first method takes a stream and the second takes a byte array as a parameter. This morning I noticed that with the Add method of the FileCollection class I can only upload files smaller than 3 MB. With larger files I always get the exception (400 -
The SaveBinaryDirect method seems to have no upload size limit; I can upload files as large as 50 MB. Unfortunately it gives me no return value, but I would like to get an object with an id/GUID for later operations (update/delete).
Is it possible to change the upload size limit for the Add method?
Thanks for the help!
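The roughly 3 MB ceiling on FileCollection.Add comes from the WCF message size limit on the server's client.svc endpoint, and it can be raised farm-wide. A hedged sketch (the 50 MB value is an example; run in a SharePoint 2010 Management Shell on the server):

```powershell
# Raise MaxReceivedMessageSize for the Client Object Model endpoint.
$service = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$service.ClientRequestServiceSettings.MaxReceivedMessageSize = 52428800  # 50 MB
$service.Update()
```

SaveBinaryDirect bypasses this limit because it uses a plain HTTP PUT rather than the client.svc endpoint, which is why it accepts larger files.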
How do I control the size of my usage database (the Usage and Health
Data Collection database) in SP2010 so that it never exceeds a certain size?
It's not clear how to do this - does anyone know? I want to cap it at 100 GB while still logging pretty much everything.
Thanks in advance,
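There is no direct size cap to set on the logging database itself; its growth is usually bounded indirectly by shortening how long each usage category is retained. A hedged sketch (7 days is an arbitrary example; run in the SharePoint 2010 Management Shell):

```powershell
# Trim retention for every usage category in the logging database.
Get-SPUsageDefinition | ForEach-Object {
    Set-SPUsageDefinition -Identity $_ -DaysRetained 7
}
```

Shorter retention keeps the database smaller while everything is still logged; sizing it to stay under a target like 100 GB takes some trial with your farm's actual traffic.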
I've created a job to schedule a backup every day; here is the script used:
BACKUP DATABASE [ON_KEY_4] TO DISK = N'D:\BACKUP\E-DRV\Backups Archive\ON_KEY_4_Backup' WITH NOFORMAT, NOINIT,
NAME = N'ON_KEY_4-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
This backup file is too big, about 120 GB, but when I do a manual backup the database is only 3 GB. I think the backup file is not being overwritten; each run appends a new backup set on top of the previous ones.
How can I make it overwrite the file each time the script executes?
Thanks in advance
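The growth comes from NOINIT, which appends a new backup set to the same file on every run; replacing it with INIT overwrites the existing sets instead. A sketch using the same path and names as the job above:

```sql
BACKUP DATABASE [ON_KEY_4]
TO DISK = N'D:\BACKUP\E-DRV\Backups Archive\ON_KEY_4_Backup'
WITH INIT,  -- overwrite existing backup sets instead of appending
     NAME = N'ON_KEY_4-Full Database Backup',
     SKIP, NOREWIND, NOUNLOAD, STATS = 10
```

With INIT the file is reinitialized on each run, so its size stays close to that of a single full backup.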
The database is in the Full Recovery Model. Backups are done as follows:
1) Full backup every Sunday at 00:00
2) Differential backup every day at 00:00
3) Transaction log backup every 15 minutes.
I have a problem: the transaction log has suddenly grown to 22 GB (the data file is only 100 MB), and the transaction log backup runs for a long time but never completes.
Why is this problem occurring? How do I solve it?
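A first diagnostic step is to ask SQL Server why the log cannot be truncated; the log_reuse_wait_desc column names the blocker (e.g. LOG_BACKUP, ACTIVE_TRANSACTION, REPLICATION). A sketch, with YourDatabase as a placeholder name:

```sql
-- Why is log truncation being prevented for this database?
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'YourDatabase';

-- Is a long-running open transaction keeping the log active?
DBCC OPENTRAN ('YourDatabase');
```

If a long-running open transaction is the cause, committing or killing it lets the next log backup truncate the log again.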