#191689 - 2009-01-07 11:59 AM
Deleting files after X Days
Ste
Fresh Scripter
Registered: 2003-05-28
Posts: 10
Hi Everyone
Does anyone know where I can find a script that will look in a directory and any folders within that directory and delete any files that are over x days old, based on today's date?
e.g.:
d:\myfiles\ contains directories called 01, 02, 03, etc., which in turn contain files that need to be deleted if older than x days.
Thanks in advance everyone
S
Edited by Ste (2009-01-07 12:00 PM)
#191692 - 2009-01-07 12:57 PM
Re: Deleting files after X Days
[Re: BradV]
Gargoyle
MM club member
Registered: 2004-03-09
Posts: 1597
Loc: Valley of the Sun (Arizona, US...
There is a UDF called SerialDate() that will make your life much easier. http://www.kixtart.org/UDF/UDF_lister.php?what=post&code=82573
Here is an example of how I used it to do exactly what you are looking for.
; Options: don't expand variables or macros inside strings, and wrap long lines.
Global $array[], $array2[], $c, $Value[]
SetOption(NoVarsInStrings, On)
SetOption(WrapAtEOL, On)
SetOption(NoMacrosInStrings, On)

; Read the device list - each line is expected to hold two comma-separated
; values (device, site) - into two parallel arrays.
If Open(1, "c:\ScriptHome\Support\iplist.txt") = 0
  $c = 0
  ; Map the share that holds the backup folders and note which drive letter was used.
  Use * "\\server\share" /user:"user" /password:"password"
  $Drive = @Result
  While @ERROR = 0
    ReDim Preserve $array[$c]
    ReDim Preserve $array2[$c]
    $line = ReadLine(1)
    If $line <> ""
      $Value = Split($line, ",")
      $array[$c] = $Value[0]
      $array2[$c] = $Value[1]
    EndIf
    $c = $c + 1
  Loop
  Close(1)

  ; Today's date as a serial day number, so the age check is simple arithmetic.
  $CUDate = SerialDate(@Date)

  ; For each device, delete "backup config" files more than 30 days old.
  $C = 0
  While $C <= UBound($Array)
    $Compare = Dir($Drive + "\" + $Array2[$c] + "\" + $Array[$c] + "\backup config\")
    While $Compare <> "" And @ERROR = 0
      $FileDate = GetFileTime($Drive + "\" + $Array2[$c] + "\" + $Array[$c] + "\backup config\" + $Compare)
      $FileDate = Split($FileDate, " ")
      $FileDate = SerialDate($FileDate[0])
      If $CUDate - $FileDate > 30
        Del $Drive + "\" + $Array2[$c] + "\" + $Array[$c] + "\backup config\" + $Compare
      EndIf
      $Compare = Dir()
    Loop
    $C = $C + 1
  Loop

  ; Clean up the local TFTP logs older than 30 days as well.
  $Compare = Dir("c:\ScriptHome\Logs\tftp*.*")
  While $Compare <> "" And @ERROR = 0
    $FileDate = GetFileTime("c:\ScriptHome\Logs\" + $Compare)
    $FileDate = Split($FileDate, " ")
    $FileDate = SerialDate($FileDate[0])
    If $CUDate - $FileDate > 30
      Del "c:\ScriptHome\Logs\" + $Compare
    EndIf
    $Compare = Dir()
  Loop

  ; Drop the drive mapping again.
  Use $Drive /delete
EndIf  ; close the Open() check

; SerialDate() UDF: converts "yyyy/mm/dd" to a serial day number, or a
; serial day number back to "yyyy/mm/dd".
Function SerialDate($ExpD)
  Dim $z, $h, $a, $b, $c, $y, $m, $d
  If InStr($ExpD, '/')
    $ExpD = Split($ExpD, '/')
    $y = Val($ExpD[0])
    $m = Val($ExpD[1])
    $d = Val($ExpD[2])
    If $m < 3
      $m = $m + 12
      $y = $y - 1
    EndIf
    $SerialDate = $d + (153 * $m - 457) / 5 + 365 * $y + $y / 4 - $y / 100 + $y / 400 - 306
  Else
    $z = 0 + $ExpD + 306
    $h = 100 * $z - 25
    $a = $h / 3652425
    $b = $a - $a / 4
    $y = (100 * $b + $h) / 36525
    $c = $b + $z - 365 * $y - $y / 4
    $m = (5 * $c + 456) / 153
    $d = $c - (153 * $m - 457) / 5
    If $m > 12
      $y = $y + 1
      $m = $m - 12
    EndIf
    $SerialDate = Right('0000' + $y, 4) + '/' + Right('00' + $m, 2) + '/' + Right('00' + $d, 2)
  EndIf
EndFunction
_________________________
Today is the tomorrow you worried about yesterday.
#191696 - 2009-01-07 03:56 PM
Re: Deleting files after X Days
[Re: Ste]
Glenn Barnas
KiX Supporter
Registered: 2003-01-28
Posts: 4394
Loc: New Jersey
I use two different tools in the environments that I support.
The first is a general cleanup process that removes all files older than 3 days from specific locations. The age and paths can be specified by parameters in the script, but generally are not changed once defined. I've attached "cleanup.kix" in a zip file to this post.
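Just to illustrate the core idea, the age check itself only takes a handful of lines of KiX. The sketch below is not the attached script, just a rough outline that reuses the SerialDate() UDF posted above; the path and the 3-day cutoff are only placeholders:
; delete files older than $MaxAge days from one folder
$MaxAge = 3                      ; retention in days (placeholder)
$Path   = 'D:\MyFiles\01\'       ; folder to clean (placeholder)
$Today  = SerialDate(@DATE)
$File   = Dir($Path + '*.*')
While $File <> '' And @ERROR = 0
  If $File <> '.' And $File <> '..'
    $FTime = Split(GetFileTime($Path + $File), ' ')
    If $Today - SerialDate($FTime[0]) > $MaxAge
      Del $Path + $File
    EndIf
  EndIf
  $File = Dir()
Loop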
The second tool is a log maintenance script that can archive/cleanup files from locations based on a config file. The same script can be used with different parameter settings to manage files from dozens of applications on a system, each with different archive, retention, and filename parameters. This tool can archive files (rename, to folder, or to ZIP), delete files, delete archived files, control services, and run external tasks before and after archiving. All actions are logged, and logs are removed on the same schedule as the files. A summary log is also created to interface with enterprise monitoring tools like Tivoli, Patrol, or HP OpenView.
The LogMaint tool will be available on our web site for download in the near future - I'm developing a GUI management console for the final package. If you'd like the current package, just place a request for LOGMAINT through our web site contact page and we'll email you the zip file with full docs. You'll need to manually create the INI file containing the configurations, but it's not difficult and the docs provide plenty of examples.
Using LogMaint allows you to employ one standard script for all log and file cleanup tasks without any custom coding. It's in use in several enterprise environments.
Glenn
Attachments
cleanup.zip (553 downloads)
_________________________
Actually I am a Rocket Scientist!
#191704 - 2009-01-07 07:24 PM
Re: Deleting files after X Days
[Re: Glenn Barnas]
bpogue99
Fresh Scripter
Registered: 2004-04-19
Posts: 15
Loc: USA
I usually do most of my scripts using Kix, but in this case, you could also use the resource kit tool called 'forfiles'.
forfiles /p c:\temp /s /m *.log /d -30 /c "cmd /c echo @file"
/p = the path to start searching in
/s = process all subdirectories as well
/m = the file mask (*.bak, *.log, *.txt, etc.)
/d = the date calculation (-30 is 30 days ago, +30 is 30 days from today, -7 is 7 days ago, etc.)
/c = the command to execute (change echo @file to del @file to actually delete the files)
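For the layout in the original post it might look something like this once you switch to the delete form (the 30-day cutoff is just an example; run it with echo first to confirm what it would remove):
forfiles /p d:\myfiles /s /m *.* /d -30 /c "cmd /c del @file"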
_________________________
KiXing to the limits!
#191771 - 2009-01-13 12:20 PM
Re: Deleting files after X Days
[Re: bpogue99]
JanSchotsmans
Just in Town
Registered: 2009-01-12
Posts: 1
Yeah, I found forfiles to be a rather useful tool too.
I use it to do cleanup of logs and backups on my file servers and Exchange.
For example, to clean up backup RARs on the NAS:
forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C del @FILE"
This cleans up all backup RARs in E:\Backups\SQL older than 60 days. You could, of course, move the files to long-term storage on a NAS instead of deleting them:
forfiles -p "E:\Backups\SQL" -s -m *.rar -d -60 -c "cmd /C xcopy @FILE \\NAS\Backups"
And then scrub them from there after a longer period of time.
Or, for a web server, you can use it to compress and clean up log files:
forfiles -p "E:\WWWApps\Logs" -s -m *.log -d -60 -c "rar a -rr -rv -t -ag E:\WWWApps\Logs\ -m2 -mt4 @FILE"
forfiles -p "E:\WWWApps\Logs" -s -m *.rar -d -365 -c "cmd /C del @FILE"
In the -c part of forfiles you can execute just about anything you want, so you could start another script from there, or even do a "net send" to the admins to tell them there are files older than x days in the folder.
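As a rough sketch of that last idea (the "Admins" alias and the 7-day window are just placeholders here):
forfiles -p "D:\MyFiles" -s -m *.* -d -7 -c "cmd /C net send Admins Found a file older than 7 days: @FILE"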