Thanks NTDOC, I need to do a little more work on it before it's ready to be packaged.
Lonkero, actually speed does become an issue as the number of computers grows. Reading from the DB into an array is still fairly fast, but adding to the DB gets slower and slower as the DB grows.
If I knew more about ADO and SQL commands I could reformat the DB to make everything faster. As is, there's one table in the DB, and that one table has two columns: one for the computer name and another for the corresponding software name.
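For example, something like indexing the table and batching the inserts might be the kind of reformatting I need. Here's a rough sketch of what I mean, using SQLite in Python purely as a stand-in for my ADO setup (the table and column names are made up):

```python
# Rough sketch only -- my real script goes through ADO, but the SQL is the part
# that matters. Table/column names (Inventory, Computer, Software) are made up.
import sqlite3

conn = sqlite3.connect("software.db")
cur = conn.cursor()

# One table, two columns, same layout as mine.
cur.execute("CREATE TABLE IF NOT EXISTS Inventory (Computer TEXT, Software TEXT)")

# An index on Computer should keep lookups fast even as the table grows.
cur.execute("CREATE INDEX IF NOT EXISTS idx_computer ON Inventory (Computer)")

# Batching all the inserts for one log file into a single transaction
# should be much faster than committing row by row.
rows = [("PC001", "Adobe Reader"), ("PC001", "WinZip")]  # parsed from a log file
cur.executemany("INSERT INTO Inventory (Computer, Software) VALUES (?, ?)", rows)
conn.commit()
conn.close()
```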
You can probably imagine how quickly the DB grows; currently I have about 600 computer log files that I'm parsing into the DB. Figuring about 0.5-2 seconds apiece, that's about 10 minutes.
Stripping the duplicates out of each array also takes some time. I just looked in my DB and there are over 15,000 records, which means the script has to pull out one instance of each entry from an array that can contain tens of copies of the same string (if that makes any sense). That takes quite a while, currently about 7 minutes.
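From what I gather, SELECT DISTINCT might be the other piece, letting the database hand back only the unique entries instead of the script stripping them out of an array itself. Again, a rough sketch with SQLite in Python as a stand-in and made-up names:

```python
# Rough sketch again (SQLite in Python as a stand-in, made-up names).
# DISTINCT makes the database return one row per unique entry,
# so the script doesn't have to de-duplicate an array afterward.
import sqlite3

conn = sqlite3.connect("software.db")
cur = conn.cursor()

# Unique software list for one computer...
cur.execute("SELECT DISTINCT Software FROM Inventory WHERE Computer = ?", ("PC001",))
for (software,) in cur.fetchall():
    print(software)

# ...or the whole unique computer/software list in one shot.
cur.execute("SELECT DISTINCT Computer, Software FROM Inventory")
unique_rows = cur.fetchall()
print(len(unique_rows), "unique records")

conn.close()
```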
So, yes, speed is an issue, but I'm pretty patient.