I have a database where I need to change one of the table columns from INT to BIGINT because it can no longer count high enough. When I back the database up with the indexes etc. removed, it is 109 gigabytes.
I changed the column to BIGINT and, after it churned for hours, it ran out of disk space. There was no drive space available.
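Roughly what I am running is the statement below (a minimal sketch, assuming SQL Server's T-SQL syntax; the table and column names are placeholders, not the real ones):

    -- Placeholder table and column names, not the real schema
    ALTER TABLE dbo.Orders
        ALTER COLUMN OrderId BIGINT NOT NULL;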
So, I removed stuff, had 300 gigs free, and once again tried to change the datatype to BIGINT. It ran out of disk space. How can a 109 gig database require 300 gigs just to change one table's column to BIGINT?
So, I transferred it to another computer, restored it, and tried again. Again it failed after using all the disk space. This happened twice, starting with 400 and then 500 gigs of free space.
So, I cleared 700 gigs, and it is down to 42 gigs of free space and still churning. What could possibly be going on that a database of only 109 gigs could require 600+ gigs just to change one table's column from INT to BIGINT? Whatever is going on is definitely not something that should be happening with a world-class database.
While I wait to see if 700 gigs of free space is enough, or if it is once again going to use all of it, I am starting to get irritated and to think that perhaps this is not a world-class database after all. It shouldn't take so much disk space just to change one column's datatype.
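Is there anything I can check while it runs to see where the space is actually going? I am guessing something like the commands below (assuming the standard SQL Server commands apply here):

    DBCC SQLPERF(LOGSPACE);   -- transaction log size and percent used, per database
    EXEC sp_spaceused;        -- size and unallocated space for the current database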