#191689 - 2009-01-07 11:59 AM Deleting files after X Days
Ste Offline
Fresh Scripter

Registered: 2003-05-28
Posts: 10
Hi Everyone

Does anyone know where I can find a script that will look in a directory (and any folders within that directory) and delete any files that are over X days old, based on today's date?


d:\myfiles\ contains directories called 01, 02, 03 etc which in turn contain files that need to be deleted if older than x days

Thanks in advance everyone


Edited by Ste (2009-01-07 12:00 PM)

#191690 - 2009-01-07 12:37 PM Re: Deleting files after X Days [Re: Ste]
BradV Offline
Seasoned Scripter

Registered: 2006-08-16
Posts: 686
Loc: Maryland, USA
Hi Ste,

This really should be in the basic scripting forum, but I would suggest you use Dir() to enumerate the files in the directory and GetFileTime() to retrieve each file's date. You could then compare that with @DATE and @TIME to see if it is old enough to delete (GetFileTime() returns "YYYY/MM/DD HH:MM:SS", so you will have to do some string manipulation as well).



#191692 - 2009-01-07 12:57 PM Re: Deleting files after X Days [Re: BradV]
Gargoyle Offline
MM club member

Registered: 2004-03-09
Posts: 1597
Loc: Valley of the Sun (Arizona, US...
There is a UDF called SerialDate() that will make your life much easier.

Here is an example of how I used it to do exactly what you are looking for.


Global $array[], $array2[], $c, $Value[]
SetOption("NoVarsInStrings", "On")
SetOption("WrapAtEOL", "On")

Use * "\\server\share" /user:"user" /password:"password"
$Drive = @RESULT

If Open(1, "c:\ScriptHome\Support\iplist.txt") = 0
	$c = 0
	$line = ReadLine(1)
	While @ERROR = 0
		If $line <> ""
			ReDim Preserve $array[$c]
			ReDim Preserve $array2[$c]
			$Value = Split($line, ",")
			$array[$c] = $Value[0]
			$array2[$c] = $Value[1]
			$c = $c + 1
		EndIf
		$line = ReadLine(1)
	Loop
	Close(1)
EndIf

$CUDate = SerialDate(@DATE)
$c = 0
While $c <= UBound($array)
	$Compare = Dir($Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\")
	While $Compare <> "" And @ERROR = 0
		$FileDate = GetFileTime($Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\" + $Compare)
		$FileDate = Split($FileDate, " ")
		$FileDate = SerialDate($FileDate[0])
		If $CUDate - $FileDate > 30
			Del $Drive + "\" + $array2[$c] + "\" + $array[$c] + "\backup config\" + $Compare
		EndIf
		$Compare = Dir()
	Loop
	$c = $c + 1
Loop

$Compare = Dir("c:\ScriptHome\Logs\tftp*.*")
While $Compare <> "" And @ERROR = 0
	; GetFileTime needs the full path, not just the name returned by Dir()
	$FileDate = GetFileTime("c:\ScriptHome\Logs\" + $Compare)
	$FileDate = Split($FileDate, " ")
	$FileDate = SerialDate($FileDate[0])
	If $CUDate - $FileDate > 30
		Del "c:\ScriptHome\Logs\" + $Compare
	EndIf
	$Compare = Dir()
Loop

Use $Drive /delete

; Converts a "YYYY/MM/DD" date string to a serial day number, so two
; dates can be aged against each other by simple subtraction. This is
; the standard civil-calendar day count; the epoch is arbitrary, since
; only differences between two serial dates are ever used.
Function SerialDate($ExpD)
	Dim $y, $m, $d
	If InStr($ExpD, '/')
		$y = Val(SubStr($ExpD, 1, 4))
		$m = Val(SubStr($ExpD, 6, 2))
		$d = Val(SubStr($ExpD, 9, 2))
		If $m < 3
			$m = $m + 12
			$y = $y - 1
		EndIf
		$SerialDate = 365 * $y + $y / 4 - $y / 100 + $y / 400 + (153 * $m - 457) / 5 + $d
	EndIf
EndFunction
Today is the tomorrow you worried about yesterday.

#191696 - 2009-01-07 03:56 PM Re: Deleting files after X Days [Re: Ste]
Glenn Barnas Administrator Offline
KiX Supporter

Registered: 2003-01-28
Posts: 4395
Loc: New Jersey
I use two different tools in the environments that I support.

The first is a general cleanup process that removes all files older than 3 days from specific locations. The age and paths can be specified by parameters in the script, but generally are not changed once defined. I've attached "cleanup.kix" in a zip file to this post.

The second tool is a log maintenance script that can archive/cleanup files from locations based on a config file. The same script can be used with different parameter settings to manage files from dozens of applications on a system, each with different archive, retention, and filename parameters. This tool can archive files (rename, to folder, or to ZIP), delete files, delete archived files, control services, and run external tasks before and after archiving. All actions are logged, and logs are removed on the same schedule as the files. A summary log is also created to interface with enterprise monitoring tools like Tivoli, Patrol, or HP OpenView.

The LogMaint tool will be available on our web site for download in the near future - I'm developing a GUI management console for the final package. If you'd like the current package, just place a request for LOGMAINT through our web site contact page and we'll email you the zip file with full docs. You'll need to manually create the INI file containing the configurations, but it's not difficult and the docs provide plenty of examples.

Using LogMaint allows you to employ one standard script for all log and file cleanup tasks without any custom coding. It's in use at several enterprise environments.



Actually I am a Rocket Scientist! :D

#191704 - 2009-01-07 07:24 PM Re: Deleting files after X Days [Re: Glenn Barnas]
bpogue99 Offline
Fresh Scripter

Registered: 2004-04-19
Posts: 15
Loc: USA
I usually do most of my scripts using Kix, but in this case, you could also use the resource kit tool called 'forfiles'.

forfiles /p c:\temp /s /m *.log /d -30 /c "cmd /c echo @file"

/p = the path to start searching in
/s = include all subdirectories as well
/m = the file mask (*.bak, *.log, *.txt, etc.)
/d = date calculation (-30 is 30 days ago, +30 is 30 days from today, -7 is 7 days ago, etc.)
/c = command to execute (change echo @file to del @file to actually delete the files)
KiXing to the limits!

#191771 - 2009-01-13 12:20 PM Re: Deleting files after X Days [Re: bpogue99]
JanSchotsmans Offline
Just in Town

Registered: 2009-01-12
Posts: 1
Yeah, I found forfiles to be a rather useful tool too.

I use it to do cleanup of logs and backups on my fileservers and Exchange.

Like for cleaning up backup RAR's on the NAS:

forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C del @FILE"

This cleans up all backup RARs in E:\Backups\SQL older than 60 days.
You could, of course, move the files to a long-term storage NAS instead of deleting them:
forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C xcopy @FILE \\NAS\Backups"

And then scrub them from there after a longer period of time.

Or, for a web server, you can use it to compress and clean up log files:

forfiles -p "E:\WWWApps\Logs" -s -m *.log -d -60 -c "rar a -rr -rv -t -ag E:\WWWApps\Logs\ -m2 -mt4 @FILE"
forfiles -p "E:\WWWApps\Logs" -s -m *.rar -d -365 -c "cmd /C del @FILE"

In the -c part of forfiles you can execute just about anything you want, so you could start another script from there, or even do a "net send" to the admins to tell them there are files older than X days in the folder.

#191777 - 2009-01-14 03:55 AM Re: Deleting files after X Days [Re: JanSchotsmans]
Sealeopard Offline
KiX Master

Registered: 2001-04-25
Posts: 11163
Loc: Boston, MA, USA
Or use CleanDirectory()
There are two types of vessels, submarines and targets.

#191792 - 2009-01-14 07:50 PM Re: Deleting files after X Days [Re: Sealeopard]
Les Offline
KiX Master

Registered: 2001-06-11
Posts: 12734
I too use forfiles to delete some of the week old (or older) files in temp. I also use it in conjunction with robocopy to move folders that are a week old or older. I kick it off every day with a scheduled task.
Give a man a fish and he will be back for more. Slap him with a fish and he will go away forever.

#191811 - 2009-01-15 08:00 PM Re: Deleting files after X Days [Re: Les]
NTDOC Administrator Online

Registered: 2000-07-28
Posts: 11622
Loc: CA
Yes, it's too bad RoboCopy no longer seems to be under active development. A great tool, but it could use a few more enhancements.