Where did all the disk space go?

I seem to suffer from a recurring event. Disk space just disappears.

We have monitoring in place which alerts once free disk space passes a set threshold (generally 20% left). A Service Ticket is raised, the issue is resolved and the ticket is closed. Recently, a Service Ticket was re-opened a week later with a note saying “not fixed” because the disk had filled once more. There are many reasons why a disk may fill, but one thing that's certain is that once you are low on disk space, you need to fix it quickly.
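As an aside, if you want to see where each drive stands before the alert fires, the check is straightforward. Below is a minimal sketch using sys.dm_os_volume_stats (available from SQL Server 2008 R2 SP1) that reports free space on every volume hosting a Database file; the 20% threshold is our monitoring convention, not anything SQL Server enforces.

-- Free space per volume hosting Database files (a sketch, not our actual monitoring)
SELECT DISTINCT
       vs.volume_mount_point
     , vs.total_bytes / 1073741824 AS [TotalGb]
     , vs.available_bytes / 1073741824 AS [FreeGb]
     , CONVERT(DECIMAL(5,2), vs.available_bytes * 100.0 / vs.total_bytes) AS [FreePct]  -- alert when this drops below 20
  FROM sys.master_files AS mf
 CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs
 ORDER BY [FreePct];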

There are numerous scenarios for disks filling. I’m just going to work through some of my more common experiences.

File Auto-Growths

A big clue to where the disk space went comes from examining recent auto-growth events. File auto-growth is a good thing – when a Data or Log file fills, in most cases it is better for it to grow automatically and accept the Data than to reject it and cause a failure. SQL Server keeps a basic record of files that have grown automatically, held in a short-term internal trace mechanism (the default trace).

This information can normally be seen in the “Disk Usage” report, accessible in SSMS. This just covers a single Database though.

The following query identifies recent auto-growth events that have occurred across all Databases / Drives on a SQL Server. This is incredibly useful when you get that call to say that a Drive has filled up unexpectedly.

The query can also identify some other issues that would normally be picked up by other tools – sp_blitz, for example, would give a gentle nudge to say that having auto-growth set to 1Mb is not too clever.
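If you'd rather not wait for sp_blitz, the settings can be checked directly. The following is a minimal sketch against sys.master_files; the 1Mb and 20% cut-offs are my own arbitrary choices, so adjust to taste. Note that growth is recorded in 8Kb pages when is_percent_growth = 0.

-- Flag suspect auto-growth settings (a sketch - the thresholds are arbitrary)
SELECT DB_NAME(database_id) AS [DatabaseName]
     , [name] AS [FileName]
     , [physical_name]
     , CASE WHEN is_percent_growth = 1
            THEN CONVERT(VARCHAR(10), growth) + '%'
            ELSE CONVERT(VARCHAR(10), growth / 128) + ' Mb'  -- growth is held in 8Kb pages
       END AS [GrowthSetting]
  FROM sys.master_files
 WHERE (is_percent_growth = 0 AND growth > 0 AND growth <= 128)  -- fixed growth of 1Mb or less
    OR (is_percent_growth = 1 AND growth >= 20);                 -- percentage growths of 20% or more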

Be aware that not every scenario is covered – manual Database creation, file movements or manual Database growths, for example. These events should be more controlled, so they are less likely to cause a surprise “Disk full” event.

-- Identify AutoGrowth events

-- Find the Trace file location from the latest default Trace file
DECLARE @trcFileName NVARCHAR(2000);

SELECT @trcFileName = [path]
  FROM sys.traces
 WHERE is_default = 1;

-- By using log.trc rather than log_nnn.trc, all Trace files in the rollover set will be scanned
SELECT @trcFileName = LEFT(@trcFileName, CHARINDEX('log_', @trcFileName) - 1) + 'log.trc';

-- Return details of all files that have had AutoGrowth events
-- (rather crude method of getting physical_name, may be improved in future)
SELECT (SELECT UPPER(LEFT(MAX(mf.[physical_name]), 1))
          FROM sys.master_files AS mf
         WHERE mf.[database_id] = trc.[DatabaseID]
           AND mf.[physical_name] LIKE '%' + trc.[FileName] + '%') AS [Drive]
     , trc.[DatabaseName]
     , trc.[FileName]
     , (SELECT MAX(mf.[physical_name])
          FROM sys.master_files AS mf
         WHERE mf.[database_id] = trc.[DatabaseID]
           AND mf.[physical_name] LIKE '%' + trc.[FileName] + '%') AS [FileNameAndPath]
     , CASE trc.[EventClass]
         WHEN 92 THEN 'Data'   -- Event 92: Data File Auto Grow
         WHEN 93 THEN 'Log'    -- Event 93: Log File Auto Grow
       END AS [FileType]
     , trc.[LoginName]
     , trc.[StartTime]
     , trc.[EndTime]
     , trc.[IntegerData] * 8 / 1024 AS [FileGrowthSizeMb]  -- IntegerData is the growth in 8Kb pages
--   , trc.[Duration] / 1000000 / 60 AS [DurationMin]      -- Duration is held in microseconds
  FROM fn_trace_gettable(@trcFileName, DEFAULT) AS trc
 WHERE trc.[EventClass] IN (92, 93)
 ORDER BY trc.[StartTime] DESC;
GO
Example Results – Auto-Growth

Log File explosion
The most common cause of this scenario is Log Backup failure. Database Log files are cleared down internally when a Log backup is performed, meaning that the space within the Log file will be re-used. If Log backups are not performed then the Log file will just grow… until the disk that the Log file occupies is filled. If you experience disk space issues caused by a growing Log file then check your Log backups. Either the Log backups are failing, or perhaps they were never set up in the first place.
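To confirm whether Log backups are the issue, a quick check pairs each Database's log_reuse_wait_desc with its most recent Log backup. A minimal sketch, assuming your msdb backup history hasn't been pruned:

-- What is each Log waiting on, and when was it last backed up?
SELECT d.[name] AS [DatabaseName]
     , d.recovery_model_desc
     , d.log_reuse_wait_desc          -- LOG_BACKUP means the Log is waiting on a Log backup
     , MAX(b.backup_finish_date) AS [LastLogBackup]
  FROM sys.databases AS d
  LEFT JOIN msdb.dbo.backupset AS b
         ON b.database_name = d.[name]
        AND b.[type] = 'L'            -- 'L' = Log backup
 GROUP BY d.[name], d.recovery_model_desc, d.log_reuse_wait_desc
 ORDER BY [LastLogBackup];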

Log File explosions can be seen using the above “Recent Auto-Growths” script. They are characterised by a series of growths at the Database's auto-growth setting, followed by a final, smaller auto-growth as SQL Server fills the last bit of disk – boom!

Example Results – Default Log file growth then disk fill

Natural Data Growth
Ideally, this should be identified through Capacity Planning and storage should be added in advance, but you can get caught out at times. This one's simple: confirm that the existing storage has been used correctly, then add more storage.
Thresholds need to be agreed – for example, a desire for Data Files to be between 50% and 90% full. It makes sense to leave 10% of space free within a Database file, but you rarely need 50%. Exceptions to this rule would be Staging / Import Databases that are populated as part of a Data Load process – space should be left for them on a permanent basis. Also, if the Data File is significantly large or small, then those percentages should be reviewed.

You may be working with a fixed-size environment (i.e. the hardware / storage cannot be expanded) – either due to hardware limitations, or by choice. This could be a development environment that's meant to be “self-managed” by the developers (a misnomer if ever I heard one). Often, data will be created and dropped within a Database, leaving plenty of free space inside the Database files which never gets reclaimed. Developers don't clean up after themselves because they're generally too busy with other tasks, such as Source Control and Documentation.

The Disk is full – but is it really full?
The Database files may have grown, but are they actually using the space that they consumed? I use the following script to return space usage details for all Databases on the Server. It returns the Database name, Database file and filename, together with details of how large each file is and how much is actually used.

As an example, running this script on a Development Server will immediately identify Database Files that can be reduced in size to free up space to the operating system.

-- SQL Data / Log files listed by File Space available

DECLARE @dynSQL VARCHAR(2000) -- Dynamic SQL to get File usage (sp_msforeachdb caps the command at 2000 characters)

IF OBJECT_ID('tempdb..#FileDetails', 'U') IS NOT NULL
    DROP TABLE #FileDetails -- Drop the Temp Table if it already exists

CREATE TABLE #FileDetails
(
      Drive        CHAR(1)
    , DatabaseName VARCHAR(255)
    , DataFile     VARCHAR(255)
    , [FileName]   VARCHAR(255)
    , FileSizeGb   DECIMAL(12,2)
    , SpaceUsedGb  DECIMAL(12,2)
    , SpaceUsedPct DECIMAL(12,2)
    , SpaceFreeGb  DECIMAL(12,2)
    , SpaceFreePct DECIMAL(12,2)
)

SET @dynSQL = 
'USE [?]
INSERT
  INTO #FileDetails (Drive, DatabaseName, DataFile, [FileName], FileSizeGb, SpaceUsedGb, SpaceUsedPct, SpaceFreeGb, SpaceFreePct)
SELECT UPPER(SUBSTRING([filename], 1, 1)) AS [Drive]
     , DB_NAME() AS [DatabaseName]
     , [name] AS [DataFile]
     , [filename] AS [FileName]
     , CONVERT(DECIMAL(12,2), size / 128.000 / 1024) AS [FileSizeGb]  -- size is held in 8Kb pages
     , CONVERT(DECIMAL(12,2), FILEPROPERTY([name], ''SpaceUsed'') / 128.000 / 1024) AS [SpaceUsedGb]
     , CONVERT(DECIMAL(12,2), FILEPROPERTY([name], ''SpaceUsed'') * 100.0 / NULLIF(size, 0)) AS [SpaceUsedPct]
     , CONVERT(DECIMAL(12,2), (size - FILEPROPERTY([name], ''SpaceUsed'')) / 128.000 / 1024) AS [SpaceFreeGb]
     , CONVERT(DECIMAL(12,2), (size - FILEPROPERTY([name], ''SpaceUsed'')) * 100.0 / NULLIF(size, 0)) AS [SpaceFreePct]
  FROM dbo.sysfiles -- deprecated, but still the simplest per-Database list of files
'

--SELECT @dynSQL -- Debug - View DynamicSQL

-- Execute the Dynamic SQL for every Database
  EXEC sp_msforeachdb @dynSQL

  -- Report Results
SELECT * 
  FROM #FileDetails
 WHERE [FileSizeGb] > 1       -- Files greater than 1Gb
   AND [SpaceFreePct] > 40    -- At least 40% of free space
 ORDER BY [Drive], [SpaceFreeGb] DESC

Database files identified by the above script should be examined to see if they can be shrunk to reclaim disk space.
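If you do decide to reclaim the space, DBCC SHRINKFILE is the tool for the job. A sketch only – MyDatabase and MyDatabase_Data below are placeholders for whatever the script flagged, and the target size is given in Mb. Bear in mind that shrinking fragments indexes and the file may simply grow again, so treat it as a one-off fix rather than routine maintenance.

-- Shrink a flagged file (placeholder names - substitute your own)
USE [MyDatabase];
GO
DBCC SHRINKFILE (N'MyDatabase_Data', 10240);  -- target size of 10Gb, specified in Mb
GO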

Example Results – File Usage

Learnings
Try these methods and scripts out in your Dev or Test environments now, because it's likely that at some point you'll need them for Live – and Live will need fixing fast!
