(Fresh Scripter)
2009-01-07 11:59 AM
Deleting files after X Days

Hi Everyone

Does anyone know where I can find a script that will look in a directory (and any folders within that directory) and delete any files that are more than x days old, based on today's date?


d:\myfiles\ contains directories called 01, 02, 03, etc., which in turn contain files that need to be deleted if older than x days.

Thanks in advance everyone


(Seasoned Scripter)
2009-01-07 12:37 PM
Re: Deleting files after X Days

Hi Ste,

This really should be in the basic scripting forum, but I would suggest you use Dir() to enumerate the files in the directory and GetFileTime() to retrieve the date of each file. You could then compare that with @Date and @Time to see if it is old enough to delete (GetFileTime returns "YYYY/MM/DD HH:MM:SS", so you will have to do some string manipulation as well).
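The comparison described above is straightforward once the timestamp string is parsed; here is a minimal Python sketch of the same idea (the "YYYY/MM/DD HH:MM:SS" format is taken from the post above, everything else is illustrative):

```python
from datetime import datetime, timedelta

def is_older_than(file_time_str, days, now=None):
    """Parse a GetFileTime-style timestamp ("YYYY/MM/DD HH:MM:SS")
    and report whether it is more than `days` days before `now`."""
    stamp = datetime.strptime(file_time_str, "%Y/%m/%d %H:%M:%S")
    now = now or datetime.now()
    return (now - stamp) > timedelta(days=days)
```

A file stamped 2008/12/01 is older than 30 days when checked on 2009-01-07, while one stamped 2009/01/01 is not.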



(MM club member)
2009-01-07 12:57 PM
Re: Deleting files after X Days

There is a UDF called SerialDate() that will make your life much easier.

Here is an example of how I used it to do exactly what you are looking for.


Global $array[0], $array2[0], $c, $Value
$rc = SetOption("NoVarsInStrings", "On")
$rc = SetOption("WrapAtEOL", "On")

If Open(1, "c:\ScriptHome\Support\iplist.txt") = 0
	Use * "\\server\share" /user:"user" /password:"password"
	$Drive = @Result

	; Read the device,site pairs into two parallel arrays
	$c = 0
	$line = ReadLine(1)
	While @ERROR = 0
		If $line <> ""
			ReDim Preserve $array[$c]
			ReDim Preserve $array2[$c]
			$Value = Split($line, ",")
			$array[$c] = $Value[0]
			$array2[$c] = $Value[1]
			$c = $c + 1
		EndIf
		$line = ReadLine(1)
	Loop
	$rc = Close(1)

	; Delete "backup config" files older than 30 days for each device
	$CUDate = SerialDate(@Date)
	$c = 0
	While $c <= UBound($array)
		$Compare = Dir($Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\")
		While $Compare <> "" And @ERROR = 0
			$FileDate = GetFileTime($Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\" + $Compare)
			$FileDate = Split($FileDate, " ")
			$FileDate = SerialDate($FileDate[0])
			If $CUDate - $FileDate > 30
				Del $Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\" + $Compare
			EndIf
			$Compare = Dir()
		Loop
		$c = $c + 1
	Loop

	; Same 30-day cleanup for the local tftp logs
	$Compare = Dir("c:\ScriptHome\Logs\tftp*.*")
	While $Compare <> "" And @ERROR = 0
		$FileDate = GetFileTime("c:\ScriptHome\Logs\" + $Compare)
		$FileDate = Split($FileDate, " ")
		$FileDate = SerialDate($FileDate[0])
		If $CUDate - $FileDate > 30
			Del "c:\ScriptHome\Logs\" + $Compare
		EndIf
		$Compare = Dir()
	Loop

	Use $Drive /delete
EndIf

; The original SerialDate() body was truncated in this post; this is a
; reconstructed one-way sketch that turns "YYYY/MM/DD" into a day count
; (integer arithmetic, proleptic Gregorian calendar)
Function SerialDate($ExpD)
	Dim $y, $m, $d
	$y = Val(SubStr($ExpD, 1, 4))
	$m = Val(SubStr($ExpD, 6, 2))
	$d = Val(SubStr($ExpD, 9, 2))
	If $m < 3
		$m = $m + 12
		$y = $y - 1
	EndIf
	$SerialDate = 365 * $y + $y / 4 - $y / 100 + $y / 400 + (153 * $m - 457) / 5 + $d - 306
EndFunction
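SerialDate() reduces a calendar date to a plain day count, so "older than 30 days" becomes a simple subtraction. The standard civil-calendar formula for that count can be checked in Python (this is an illustrative sketch, not the forum UDF itself; the formula happens to agree with Python's `date.toordinal`):

```python
from datetime import date

def serial_date(y, m, d):
    """Day count for a Gregorian date, using integer arithmetic only.
    Months Jan/Feb are treated as months 13/14 of the previous year,
    which makes the month-length term a single linear expression."""
    if m < 3:
        m += 12
        y -= 1
    return 365*y + y//4 - y//100 + y//400 + (153*m - 457)//5 + d - 306
```

With this, a file dated 2008/12/01 compared against 2009/01/07 gives a difference of 37 days.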

Glenn Barnas (Administrator)
(KiX Supporter)
2009-01-07 03:56 PM
Re: Deleting files after X Days

I use two different tools in the environments that I support.

The first is a general cleanup process that removes all files older than 3 days from specific locations. The age and paths can be specified by parameters in the script, but generally are not changed once defined. I've attached "cleanup.kix" in a zip file to this post.

The second tool is a log maintenance script that can archive/cleanup files from locations based on a config file. The same script can be used with different parameter settings to manage files from dozens of applications on a system, each with different archive, retention, and filename parameters. This tool can archive files (rename, to folder, or to ZIP), delete files, delete archived files, control services, and run external tasks before and after archiving. All actions are logged, and logs are removed on the same schedule as the files. A summary log is also created to interface with enterprise monitoring tools like Tivoli, Patrol, or HP OpenView.

The LogMaint tool will be available on our web site for download in the near future - I'm developing a GUI management console for the final package. If you'd like the current package, just place a request for LOGMAINT through our web site contact page and we'll email you the zip file with full docs. You'll need to manually create the INI file containing the configurations, but it's not difficult and the docs provide plenty of examples.

Using LogMaint allows you to employ one standard script for all log and file cleanup tasks without any custom coding. It's in use at several enterprise environments.


(Fresh Scripter)
2009-01-07 07:24 PM
Re: Deleting files after X Days

I usually do most of my scripts in KiX, but in this case you could also use the Resource Kit tool 'forfiles'.

forfiles /p c:\temp /s /m *.log /d -30 /c "cmd /c echo @file"

/p = is the path to start searching
/s = do all subdirectories as well
/m = the file mask (*.bak, *.log, *.txt, etc)
/d = date calculation (-30 is 30 days ago, +30 is 30 days from today, -7 is 7 days ago, etc)
/c = command to execute (change echo @file to del @file to delete the files)
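The age filter forfiles applies is easy to express in code as well; here is a minimal Python sketch of the equivalent of `/s /m *.log /d -30` (function name and parameters are my own, not part of forfiles):

```python
import os
import time

def files_older_than(root, days, ext=".log"):
    """Walk `root` recursively and yield files matching `ext` whose
    last-modified time is more than `days` days ago - roughly what
    forfiles /p root /s /m *.log /d -days enumerates."""
    cutoff = time.time() - days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(ext):
                path = os.path.join(dirpath, name)
                if os.path.getmtime(path) < cutoff:
                    yield path  # replace yield with os.remove(path) to delete
```

Printing the matches first (as the echo example above does) before switching to deletion is a sensible dry run.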

(Just in Town)
2009-01-13 12:20 PM
Re: Deleting files after X Days

Yeah, I've found forfiles to be a rather useful tool too.

I use it to do cleanup of logs and backups on my fileservers and Exchange.

Like for cleaning up backup RAR's on the NAS:

forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C del @FILE"

This cleans up all backup RARs in E:\Backups\SQL older than 60 days.
You could of course, instead of deleting the files, move them to a long-term storage NAS:
forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C xcopy @FILE \\NAS\Backups"

And then scrub them from there after a longer period of time.

Or for a webserver you can use it to compress and cleanup log files:

forfiles -p "E:\WWWApps\Logs" -s -m *.log -d -60 -c "rar a -rr -rv -t -ag E:\WWWApps\Logs\ -m2 -mt4 @FILE"
forfiles -p "E:\WWWApps\Logs" -s -m *.rar -d -365 -c "cmd /C del @FILE"
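The two commands above implement a two-stage retention policy: compress logs after 60 days, drop the resulting archives after 365. Ignoring the *.log vs *.rar distinction, the policy decision itself boils down to two cutoffs; a hypothetical Python sketch:

```python
from datetime import datetime, timedelta

def retention_action(mtime, now=None, compress_after=60, delete_after=365):
    """Return which stage of a two-step retention policy applies to a
    file with last-modified time `mtime`: 'compress' past the first
    cutoff, 'delete' past the second, 'keep' otherwise."""
    now = now or datetime.now()
    age = now - mtime
    if age > timedelta(days=delete_after):
        return "delete"
    if age > timedelta(days=compress_after):
        return "compress"
    return "keep"
```

For example, against a reference date of 2009-01-15, a file from 2008-10-01 falls in the compress stage, while one from 2007 is due for deletion.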

In the -c part of forfiles you can execute just about anything you want, so you could start another script from there, or even do a "net send" to the admins to tell them there are files older than x days in the folder.

(KiX Master)
2009-01-14 03:55 AM
Re: Deleting files after X Days

Or use CleanDirectory()

(KiX Master)
2009-01-14 07:50 PM
Re: Deleting files after X Days

I too use forfiles to delete some of the week-old (or older) files in temp. I also use it in conjunction with robocopy to move folders that are a week old or older. I kick it off every day with a scheduled task.

(KiX Master)
2009-01-15 08:00 PM
Re: Deleting files after X Days

Yes, too bad RoboCopy no longer seems to be under active development. A great tool, but it could use a few more enhancements.