Do you have 15 applications writing to log files? Do you have 15 different scripts to manage those log files? If not, are your log files taking over your system? NOT ANY MORE! Here's a KiXtart script package that will do all the work for you just by maintaining a single configuration file. Now, to make Lonk happy, the script comes first, and the documentation follows!

Caveat: You may want to change the "\usr\local" and "\Temp" references before using this code. These are standards in our environment. I prefer NOT to use %TEMP% because it is different for each user on 2K and higher (and I've had higher priorities than to change it on 300+ servers).

```
; Utility to archive and remove log files based on age
; Glenn Barnas / FRIT-EROC
; 3-4-2002
;
; VERSION 1.3
;
; Obtains settings from \usr\local\logmaint.ini, based on the SECTION argument
;
; Must be called as "kix32 \path\to\this\script\logcleanup $SECTION=section_name"
; See the LogMaint.BAT file for more information.
;
; see readme.txt or "Common Log Cleanup utility.doc" for more info
;
; 06/11/02 - Corrected error in debug code, added DEBUG option to the ini file,
;            Added test code - MD command executes only if target doesn't exist
;            Enclosed path parameters in quotes to support paths with spaces
;            when used by shell commands (ugh!)
; 06/13/02 - Added code to write errors to log file and log errors to LMSTATUS.ini
;            to facilitate central log status collection.

Break On                ; allow terminating the script without logging off
                        ; this basically turns off "login script mode"

; NOTE - throughout the script, the $RTN variable is used to collect the return code
;        of various functions. It is tested only when necessary. Its primary function
;        is to prevent the return code from being displayed on the console (as a
;        series of zeros or other numeric values)

;=================================================================================
; If the section argument isn't defined, carp and die
If $SECTION = ""
  ? "Required parameter to identify logs to archive was not specified." + @CRLF + "Can't continue!"
  Exit
EndIf

; format today's date for use in file names
$DATE = SubStr(@DATE,1,4) + SubStr(@DATE,6,2) + SubStr(@DATE,9,2)

; Read the parameters from the LOGMAINT.INI file
; Wrapping the READ statement in the EXPANDENVIRONMENTVARS function allows
; environment variables to be defined in the INI file for the path values
; (ie SRCDIR=%SYSTEMROOT%\logs)
$INIFILE  = "%SystemDrive%\usr\local\logmaint.ini"
$SRCDIR   = ExpandEnvironmentVars(ReadProfileString($INIFILE, $SECTION, "SRCDIR"))
$DSTDIR   = ExpandEnvironmentVars(ReadProfileString($INIFILE, $SECTION, "DSTDIR"))
$DEBUG    = ReadProfileString($INIFILE, "COMMON", "DEBUG")
$SFLIST   = ReadProfileString($INIFILE, $SECTION, "FILES")
$BACKUP   = ReadProfileString($INIFILE, $SECTION, "BACKUP")
$DELETE   = ReadProfileString($INIFILE, $SECTION, "DELETE")
$MAXAGE   = ReadProfileString($INIFILE, $SECTION, "AGE")
$SERVICE  = ReadProfileString($INIFILE, $SECTION, "SERVICE")
$PREARCH  = ExpandEnvironmentVars(ReadProfileString($INIFILE, $SECTION, "PREARCH"))
$POSTARCH = ExpandEnvironmentVars(ReadProfileString($INIFILE, $SECTION, "POSTARCH"))

; define the archive log name and location
$AFILE = "_" + $DATE + "_archive.wri"
$ARCHIVELOG = $DSTDIR + "\" + $AFILE

$FIRSTYEAR = 2001       ; anything older than this year is not calculated, just deleted

;=================================================================================
; Make sure that the DSTDIR exists before we start
$P = $DSTDIR                    ; use a working VAR
$MP = ""                        ; Make this Path
If InStr($P, ":") = 2           ; If the second char is a ":"
  $P = SubStr($P, 3, 252)       ; drop the leading drive definition
  $MP = Left($DSTDIR, 2)
EndIf
If Left($P, 1) = "\"
  $P = SubStr($P,2,99)          ; remove the leading dir delimiter for now
EndIf
$DIRS = Split($P,"\",-1)        ; put the individual dirs of the path into an array
For Each $D in $DIRS
  $MP = $MP + "\" + $D          ; add the next dir to the root path
  If Not Exist($MP)             ; make sure each dir in the path exists
    MD $MP                      ; create it if necessary
  EndIf
  If $DEBUG = "T"
    ? "processing $D"
    ? "$MP"
    ? "@SERROR"
  EndIf
Next

; Open the archiving log file
$RTN = Open(1,$ARCHIVELOG,5)
If @ERROR
  ? "Error opening archive log file - exiting @CRLF FILE: $ARCHIVELOG"
  Exit
EndIf

; Append today's date and a message to the log file
$RTN = WriteLine(1,"Log cleanup executed on @DATE using [" + $SECTION + "] parameters." + @CRLF)

; Update the LMStatus file
$TMP = ReadProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "LastRunD")
If $TMP = ""
  $TMP = "Never"                ; LastRunD doesn't exist, set PriorRun to "Never"
EndIf
$RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "PriorRun", $TMP)
$RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "LastRunD", @DATE)
$RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "LastRunT", @TIME)

;=================================================================================
; If BACKUP <> 0, then MOVE the file(s) from the SRCDIR to the DSTDIR, adding the
; current date to the front of the file name (test.log becomes 20020301_test.log)
; Using MOVE in the same directory results in a simple RENAME operation!
If $BACKUP <> 0
  $RTN = WriteLine(1,"Archive process starting." + @CRLF)
  $CNT = 0
  $ECNT = 0
  If $SERVICE <> ""             ; Stop any service defined prior to archiving
    If $DEBUG = "T"             ; to ensure the files can be moved/renamed
      ? "xnet stop " + Chr(34) + $SERVICE + Chr(34)
    Else
      Shell "xnet stop " + Chr(34) + $SERVICE + Chr(34)
    EndIf
  EndIf

  ; PREARCH/POSTARCH - commands to run prior to archiving. An alternative method
  ; for stopping but not starting services that require manual password entry, or
  ; procedures that require multiple steps to prepare the files for archiving.
  ; It is assumed that the creator of the config file will include any quotes or
  ; other escape sequences to handle proper argument passing!!!
  If $PREARCH <> ""             ; If a PREARCH command is defined
    If $DEBUG = "T"
      ? "running $PREARCH"
    Else
      Shell "$PREARCH"          ; run it!
    EndIf
  EndIf

  ; Loop through all files in the SRCDIR to see what should be MOVED
  $FILES = Split($SFLIST, ",", -1)      ; split the list into an array
  For Each $FILE in $FILES              ; process each arg in the array
    $SPATH = $SRCDIR + "\" + $FILE
    $FNAME = Dir($SPATH)                ; find the matching files in the SRC directory
    While $FNAME <> "" And @ERROR = 0   ; process all non-null names
      ; skip parent paths ("." & "..") and previously archived files
      If Left($FNAME,3) <> "A-!" And $FNAME <> "." And $FNAME <> ".."
        $MSG = " " + $SRCDIR + "\" + $FNAME + " -> " + $DSTDIR + "\A-!" + $DATE + "_" + $FNAME + @CRLF
        $RTN = WriteLine(1,$MSG)
        $CMD = "cmd.exe /c move " + Chr(34) + $SRCDIR + "\" + $FNAME + Chr(34) + " " + Chr(34) + $DSTDIR + "\A-!" + $DATE + "_" + $FNAME + Chr(34)
        If $DEBUG = "T"
          ? "$CMD"
        Else
          Shell $CMD                    ; move/rename the file
          If @ERROR <> 0                ; if an error occurred,
            ?
            ? "@SERROR"                 ; display the error and write it to the log
            $RTN = WriteLine(1," @SERROR")
            $ECNT = $ECNT + 1           ; increase the error count
          EndIf
        EndIf
        $CNT = $CNT + 1
      EndIf
      $FNAME = Dir()                    ; get the next file
    Loop
  Next
  $RTN = WriteLine(1,"Archive process complete, " + $CNT + " files archived." + @CRLF + @CRLF)
  ; write to the common log showing # of files archived and number of errors encountered
  $RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "Archive", $CNT)
  $RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "ArchErr", $ECNT)
  If $POSTARCH <> ""            ; run a post-archiving command if defined
    If $DEBUG = "T"
      ? "$POSTARCH"
    Else
      Shell "$POSTARCH"
    EndIf
  EndIf
  If $SERVICE <> ""             ; start a service if defined
    If $DEBUG = "T"
      ? "xnet start " + Chr(34) + $SERVICE + Chr(34)
    Else
      Shell "xnet start " + Chr(34) + $SERVICE + Chr(34)
    EndIf
  EndIf
Else
  ; rather than write nothing (and leave the viewer guessing), say that we did nothing!
  $RTN = WriteLine(1,"Archive process not performed, 0 files archived." + @CRLF + @CRLF)
  $RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "Archive", 0)
EndIf   ; BACKUP PROCESS

;=================================================================================
; If DELETE <> 0 then DELETE all files in DSTDIR older than MAXAGE days that have
; the auto-archive filename prefix of "A-!"
; Include "*_archive.wri" (our own log file) in the list of files that we check
If $DELETE <> 0
  $RTN = WriteLine(1,"Cleanup process starting." + @CRLF)
  $CNT = 0
  ; Create an array of monthly day values (cumulative days, 13 values including the ZERO array cell)
  ; To keep the logic simple, the array starts with a ZERO position, but we use only positions 1-12
  ; which correspond to the month integer values. Position ZERO holds "0" as a placeholder only.
  $MV = Split("0,0,31,59,90,120,151,181,212,243,273,304,334", ",", -1)
  ; calculate today's date in days from 1/1/2001 (first day of $FIRSTYEAR)
  $CDATE = Split(@DATE, "/", -1)
  $X = Val($CDATE[1])
  $CDV = ((Val($CDATE[0]) - $FIRSTYEAR) * 365) + $MV[$X] + $CDATE[2]
  ; archived files begin with A-! or are our own log files
  ; if we don't do archiving (BACKUP=0) then archived files are defined
  ; in the SFLIST variable. The SFLIST value is included when we DO archiving
  ; ONLY when the SRCDIR and DSTDIR values are different.
  If $BACKUP = 0
    $ALLFILES = $SFLIST + ",_????????_archive.wri"
  Else
    $ALLFILES = "A-!*,_????????_archive.wri"
    If $SRCDIR <> $DSTDIR       ; Add SFLIST if SRC and DST dirs are different
      $ALLFILES = $SFLIST + "," + $ALLFILES
    EndIf
  EndIf
  ; Loop through all files in the DSTDIR to see what should be DELETED
  $FILES = Split($ALLFILES, ",", -1)    ; split file list into array
  For Each $FILE in $FILES              ; process the files in the array
    $DPATH = $DSTDIR + "\" + $FILE
    $FNAME = Dir($DPATH)                ; read the directory
    While $FNAME <> "" And @ERROR = 0
      If $FNAME <> $AFILE
        ; get timestamp
        $FD = Left(GetFileTime($DSTDIR + "\" + $FNAME),10)
        $FDATE = Split($FD, "/", -1)
        If Val($FDATE[0]) < $FIRSTYEAR  ; prior to FIRSTYEAR? ANCIENT FILE!!
          If $DEBUG = "T"
            ? "Del " + $DSTDIR + "\" + $FNAME
          Else
            Del $DSTDIR + "\" + $FNAME
          EndIf
          $CNT = $CNT + 1
          ; Append the deleted file to the log file
          $RTN = WriteLine(1," " + $DSTDIR + "\" + $FNAME + " - " + $FD + " - ANCIENT!" + @CRLF)
        Else
          ; convert to Days since 1/1/2001 if year is 2001 or later
          $X = Val($FDATE[1])
          $FDV = ((Val($FDATE[0]) - $FIRSTYEAR) * 365) + $MV[$X] + $FDATE[2]
          ; find number of days between file date and today
          ; delete $FNAME if difference is greater than $MAXAGE
          If ($CDV - $FDV) > $MAXAGE
            If $DEBUG = "T"
              ? "Del " + $DSTDIR + "\" + $FNAME
            Else
              Del $DSTDIR + "\" + $FNAME
            EndIf
            $CNT = $CNT + 1
            ; Append the deleted file to the log file
            $RTN = WriteLine(1," " + $DSTDIR + "\" + $FNAME + " - " + $FD + @CRLF)
          EndIf
        EndIf
      EndIf
      $FNAME = Dir()
    Loop
  Next
  $RTN = WriteLine(1,"Cleanup process complete, " + $CNT + " files deleted." + @CRLF + @CRLF)
  $RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "Cleanup", $CNT)
Else
  $RTN = WriteLine(1,"Cleanup process not performed, 0 files deleted." + @CRLF + @CRLF)
  $RTN = WriteProfileString("%SystemDrive%\LMStatus.ini", $SECTION, "Cleanup", 0)
EndIf   ; DELETE

; Close the log file
$RTN = Close(1)
```

Here's a batch file that I use to invoke the script.
It makes it easier to create a scheduled task with "logmaint LogID" instead of "kix32 logcleanup.kix $SECTION=LogID":

```
@echo off
```

And here's a sample INI file used to manage the event logs (daily) and the iPlanet LDAP logs (weekly):

```
[COMMON]
```

And, as an added bonus if you order in the next 30 minutes, the ELDUMP.KIX script referenced in the sample above:

```
; ELdump.kix - dump all event logs prior to archiving
; Glenn Barnas / Network Management component
; FRIT - 6/10/2002
;
; Remove any prior .EVT files from the temp location
; Dump (backup) the requested event log
; If no errors occur while dumping, Clear the requested event log
; log maintenance moves the newly dumped .EVT files to an archive location
; using a name that identifies the date and data source (A-!20020913_APP.EVT)

Break On

; make sure the TEMP folder exists!
MD "%SystemDrive%\temp"

; make sure that any previous .EVT output files don't exist...
Del "%SystemDrive%\temp\*.EVT"

$RTN = BackupEventLog("Application", "%SystemDrive%\Temp\APP.EVT")
If @ERROR = 0
  $RTN = ClearEventLog("Application")
EndIf

$RTN = BackupEventLog("Security", "%SystemDrive%\Temp\SEC.EVT")
If @ERROR = 0
  $RTN = ClearEventLog("Security")
EndIf

$RTN = BackupEventLog("System", "%SystemDrive%\Temp\SYS.EVT")
If @ERROR = 0
  $RTN = ClearEventLog("System")
EndIf
```

OK - now some documentation:

Common Log Cleanup utility

The goal of the Common Log Cleanup utility is to provide a single tool to perform log management. Using a single tool consolidates development effort and minimizes post-deployment troubleshooting. The basic operational parameters of this script are maintained in a configuration INI file. Specifying the section name of the INI file causes the parameters from that section to be read and processed. This allows one script and one configuration file to manage dozens of application log files. To properly invoke the script, the following syntax must be used to pass the SECTION argument:
A batch file – logmaint.bat – is included to simplify calling this script. It takes one argument, the name of the section from the INI file.
In the above example, any file in the C:\logs directory that ends with “.LOG”, and the specific file “events.txt” will be moved to the C:\Logs\Backup directory as specified by the DSTDIR value. The files that are moved will be renamed with the current date, using the format “A-!YYYYMMDD_originalname”. The use of the “A-!” filename prefix indicates that it was automatically archived and aids in the later cleanup process. The file archive process is performed because the BACKUP value is not zero. If it were zero, then no archiving would be performed. This behavior may be desired when the application itself creates a new log file on a regular basis, but does not remove the old files. (This is the case in many web server applications such as IIS.)
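The rename rule described above is simple enough to show in a few lines. Here is an illustrative cross-language sketch in Python (the function name is invented for illustration and is not part of the package):

```python
from datetime import date

# Sketch of the archive-name rule described above: the "A-!" prefix marks a
# file as auto-archived, and the embedded date records when it was rolled.
def archive_name(filename, today):
    return "A-!" + today.strftime("%Y%m%d") + "_" + filename
```

For example, `archive_name("test.log", date(2002, 3, 1))` yields "A-!20020301_test.log", which is the format the cleanup pass later matches with the "A-!*" pattern.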
Setup Scripts

The installation is performed via the standard SWDIST tool install. The installation script accepts two optional parameters. One is used to specify a remote system name, while the other indicates that a cleanup task should be added.
INI File Format

SRCDIR and DSTDIR are standard path statements, including drive letters, but without a trailing slash (ie: d:\archive\logs). Both values may be the same if archiving within the same directory is desired. SRCDIR defines the location of the original log files, while DSTDIR defines the folder that should hold the archived copies.
WARNINGS!

If services are holding log files open, they must be stopped by external means prior to running this script, and restarted again after this script completes, or by specifying the SERVICE parameter.
PREARCH=%COMSPEC% /c yourcommand.bat
PREARCH=kix32.exe c:\usr\local\bin\ELdump.kix

The task should be scheduled for 23:59 each night so that all events of the current day are in the resulting log files. While it is possible for up to 50 seconds of events to be logged into the next day's file, this is significantly less troublesome than having the resulting log files dated a day later than the data they actually contain.

Keywords: Log cleanup, Log Archiving, Log Rotation, Log Management
ooh, sounds like trouble. Heh, I had logrotate stuff fill up the HD on my Linux box until I got mad and disabled it. It's like, instead of not looking at your 15 logs, you can skip looking at one.
Actually, we're legally bound by our audit department to maintain copies of our computer logs for a minimum of 2 years, and 7 years in certain cases. We keep them on tape (via the mainframe), but it was a nightmare to maintain individual scripts to stop a service, copy the log to an archive location, give it a unique name, and restart the service. We had - in one case - 4 copies of the same script in 4 locations to back up logs for 4 instances of a web server on the same machine. Each script was just slightly different in structure, the result of different people troubleshooting various problems.

This script was the solution. 1 script in 1 location, managing any number of logs. One of our servers has 11 logs managed with this script - Event Log, PerfMon Log, 3 web instance logs, 2 LDAP logs, 2 Cold Fusion logs, and 2 SilverStream logs!

Oh - we don't ignore the logs, either. We have scripts (perl, sorry!) that parse the logs, strip off the info messages, then summarize the warnings and errors in an email each morning to the admin staff. This lets us get a quick view of the kind & number of warnings on each of the 200+ servers. With 300 more being deployed this year, we rely heavily on automation.

Glenn
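The summary pass described above (the real scripts are perl; the log format and names below are invented for illustration) boils down to filtering out the info lines and tallying what remains:

```python
import re
from collections import Counter

# Hypothetical log line shape: "2002/06/13 09:15:02 WARN disk space low".
# The real logs differ; this only sketches the strip-info-and-summarize idea.
LINE_RE = re.compile(r"^\S+ \S+ (?P<level>INFO|WARN|ERROR) (?P<msg>.*)$")

def summarize(lines):
    """Count WARN/ERROR lines, dropping INFO, for a daily summary mail."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group("level") != "INFO":
            counts[m.group("level")] += 1
    return counts
```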
Nice utility. We want to archive off the log files at a preset date (eg end of month) and then delete the files in the source directory that have been archived (to clean up). At the mo yours removes all files from the source. We will then delete the archive about a week later using a separate script. This will allow time to ensure we have had a good backup. Do you have a variation of your utility to do this, or know of one that can? Failing that, do you mind if I hack your code apart, unless you fancy doing it?

EDIT: Thinking aloud... We could use another variable in the .ini file for a "start date".
Um - how about two tasks calling this script with two definitions in the ini file. First scheduled task archives the logs. Specify the same source and dest dirs and it will just rename the files in place. No deletion will occur. Second task just deletes files more than 1 day old that begin with the archive identifier "A-!*" - this will clean up your archived files. Here's an example config file: Code:
Keep in mind that you might need to add the definition to stop a service prior to archiving (Event1). No need to recode anything, if I understand you correctly. Finally, no need for a start date in this script, as you can define start dates in the scheduler.

I use this exact process for one application. The scheduled task references LogMaint.bat with the name of the INI file section. The INI file path is hard-coded to \usr\local - this might be the only mod you really need to make, aside from the script path reference in the bat file.

Glenn
Thanks for a speedy reply, which was cunning, but not what we need. Our plan is to archive all log files, for servers that are not part of a backup regime, to a single repository (server). I was gonna use PREARCH to call a job to map the drive. That server will then be backed up. The archive directory will then be deleted about 7 days later, allowing for a working backup to have been taken (this can be done by a simple scheduled deltree). For servers that are backed up, it is pretty much the same, except they will be archived to a local drive.

Thing is, the end of month is not always the end of month, if you know what I mean, hence the requirement to specify the date to archive from. I'm just trying to work out how to add a date calculation into your code for the file move, based on a date supplied in the .ini file. When I come up with some code I will post it, though I am a bit snowed under at the mo.
The end of the month is always the beginning of the next month minus one day. There are date manipulation UDFs in the UDF forum and KiXtart macros that will enable you to calculate the last day of any given month.
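The "beginning of the next month minus one day" rule is easy to express directly; a minimal Python sketch of the same idea:

```python
from datetime import date, timedelta

def last_day_of_month(year, month):
    """End of month = first day of the following month minus one day."""
    if month == 12:
        first_of_next = date(year + 1, 1, 1)
    else:
        first_of_next = date(year, month + 1, 1)
    return first_of_next - timedelta(days=1)
```

This handles 30- and 31-day months and leap-year Februaries without any per-month tables.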
Still, I think you have a scheduling issue and not a script issue. You need to schedule an event with 4 triggers: one for the months with 30 days, one for those with 31, and one each for Feb 28 & 29. The trigger for 2/28 should run a special script that continues only if "tomorrow" isn't 2/29; it then runs the normal cleanup script. Of course, you could just schedule it for midnight on the first of every month...

As far as central logging, here's what we do - we archive the files to a local folder with a short retention period. In our case it's 30 days, but it could be 1 or 2 days. (Note - a -1 value removes all files regardless of age, while a zero value means 0-23:59 - less than a full day.) We use the PostArch command to copy the archived files to a central server. Using Robocopy, only newly generated files are copied. We maintain the central files for 90 days and then offline for 2 years (audit requirement), but a global delete would work. BTW - the postarch script uses a UNC copy so no drive mapping is required.

Glenn
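For anyone tracing the retention values through the script: the cleanup pass converts each date to a day number using a cumulative days-per-month table (leap years deliberately ignored, a simplification the script accepts), then deletes a file when its age strictly exceeds AGE. A Python sketch of that arithmetic, which also shows where the -1 and 0 semantics come from:

```python
FIRSTYEAR = 2001  # mirrors $FIRSTYEAR in the script

# Cumulative days before each month; index 0 is a placeholder so that
# indexes 1-12 line up with month numbers, as in the script's $MV array.
MV = [0, 0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334]

def day_value(date_str):
    """Convert 'YYYY/MM/DD' to days since 1/1/FIRSTYEAR (no leap years)."""
    year, month, day = (int(p) for p in date_str.split("/"))
    return (year - FIRSTYEAR) * 365 + MV[month] + day

def should_delete(file_date, today, maxage):
    """Delete when the file's age strictly exceeds MAXAGE days.
    MAXAGE=-1 deletes everything (even age 0 > -1), and MAXAGE=0
    keeps only files less than a full day old."""
    return (day_value(today) - day_value(file_date)) > maxage
```

With AGE=-1 even a file dated today is removed, matching the "-1 removes all files regardless of age" note above.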
Quote: In an ideal world - Yes. Our end of month, especially in relation to "end of month backups", seems to fall anytime within 4-5 days after the previous month has actually ended. But, sod's law, one month the end of month will actually be the end of the month, in which case we would like to specify the day before the end of month. If you see what I mean.

EDIT: I guess what I am trying to say is that it would be nice to have a date calculation before the Move. At the mo ALL files are moved from the source directory regardless of age.
Quote: That's what the -1 value in the AGE parameter will do. <YOU> need to figure out how to handle your own "strange" month-end definitions! Good luck!

Glenn
Ahhhhh, it all makes sense now. It would still be nice to use a defined date, and have that then calculate the "age" from the date it actually runs. Anyways, thank you for all your help, and a great utility.