How do you parse your log files?

I’ve been playing around with different ways to parse log files and generate server reports. I currently have Webalizer, Analog, Omniture, and mod_log_sql in place. Unfortunately, I can’t find one package that provides everything I want. Anyone have any suggestions?
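
In case it helps frame the question, this is roughly the kind of hand-rolled parsing I fall back on when none of those packages slice the data the way I want: a minimal Python sketch (nothing authoritative) that reads an Apache combined-format access log and counts hits per path and per status code. The log path is just an example from my setup.

```python
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # example path, adjust to your setup

# One line of Apache's "combined" format:
# host ident user [time] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

hits_by_path = Counter()
hits_by_status = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match:
            continue  # skip lines that don't match the expected format
        request = match.group("request").split()
        path = request[1] if len(request) > 1 else "-"
        hits_by_path[path] += 1
        hits_by_status[match.group("status")] += 1

print("Top 10 requested paths:")
for path, count in hits_by_path.most_common(10):
    print(f"{count:8d}  {path}")

print("\nHits by status code:")
for status, count in sorted(hits_by_status.items()):
    print(f"  {status}: {count}")
```

Maintaining this sort of script myself is exactly what I'd like to avoid, hence the question.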

9 Responses to How do you parse your log files?

  1. phil says:

    awstats is pretty

  2. Mike Bianco says:

    I use awstats too, and I like it 🙂

  3. william says:

    I've been a big AWStats fan for many years now. But if you want to get crazy, there's always http://www.visitorville.com/

  4. We use WebTrends for reporting and I like it. It has a lot of functionality. Because it is licensed by pageviews, it's not exactly cheap, though. It only makes sense if you have business users with a demand for good, easy-to-use reports.

  5. Kelvin Luck says:

    I use Visitors: http://www.hping.org/visitors/ I have a cron task which runs it daily and archives each month's stats as a separate HTML page once the month is over (there's a sketch of that kind of setup after the comments)… It basically shows what I am interested in without too much noise from stuff that I don't care about…

  6. zwetan says:

    I like Deep Log Analyzer: www.deep-software.com. Surely not the best one out there, but the UI is great :).

  7. Gorka says:

    I prefer speed over features, and use Mach5's Fast Stats Analyzer. Very fast, and it has just the stats you mostly need. Great for parsing huge logfiles (one of our websites produces an average of 3GB of logs per month).

  8. Mike Dawson says:

    We used to have WebTrends (WT), but gave it up when they changed their license scheme, as mentioned earlier. More popular sites are “punished” by having to pay more. We found this alternative that does just about everything WT does, but without the cost, without the complexity, and without the need for a dedicated server to run WT adequately: http://www.weblogexpert.com/ This product is around $80.

  9. Christian says:

    I used Summary for a long time; it was one of those tools that are fast but lack an easy-to-use interface and features. Then I used WebTrends for a long time, which has lots of features and a nice interface, but the deeper I got into my data, the more spider and proxy server problems I found in my reports. WebTrends tries to keep its list of spiders and robots up to date, but it never catches all of them, so my data was always skewed.

    If you want data that's accurate, I strongly suggest not using a web log analyzer, because no matter how up to date the apps are, it's almost impossible to filter out everything; almost every day someone creates some new spider or robot.

    I found a company that integrates email tracking with website tracking and have been using them ever since. It's called Manticore: http://www.manticoretechnology.com/
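
A quick follow-up on Christian's point about spiders: below is a minimal sketch of the kind of user-agent filtering he's describing, and it shows why that approach never quite keeps up, since it can only exclude patterns that are already on the list. The substrings and the log path are examples, not anything complete or official.

```python
import re

# Assumption: a tiny, deliberately incomplete list of bot fingerprints.
KNOWN_BOT_PATTERNS = re.compile(r"bot|crawler|spider|slurp|archiver", re.IGNORECASE)

def is_probable_bot(user_agent: str) -> bool:
    """Return True if the user-agent matches a known bot pattern."""
    return bool(KNOWN_BOT_PATTERNS.search(user_agent))

def human_lines(log_lines):
    """Yield only lines whose user-agent doesn't look like a known bot.

    Assumes the user-agent is the last quoted field, as in Apache's
    combined log format.
    """
    for line in log_lines:
        quoted = re.findall(r'"([^"]*)"', line)
        agent = quoted[-1] if quoted else ""
        if not is_probable_bot(agent):
            yield line

# Example: filter before feeding the log to whatever does the reporting.
# with open("/var/log/apache2/access.log") as log:  # example path
#     for line in human_lines(log):
#         process(line)  # hypothetical downstream step
```

Any spider that doesn't announce itself in its user-agent string, or that appeared after the list was written, slips straight through, which is exactly the skew Christian describes.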
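
And for reference, here is a rough sketch of the cron-driven setup Kelvin describes in comment 5: regenerate the current month's report every day and keep one HTML page per month. The log path, the output directory, and the exact `visitors` invocation are assumptions on my part; check the Visitors documentation for the flags your version supports.

```python
import subprocess
from datetime import date
from pathlib import Path

LOG_PATH = "/var/log/apache2/access.log"  # assumption: your access log
ARCHIVE_DIR = Path("/var/www/stats")      # assumption: where the pages live

def archive_monthly_report() -> Path:
    """Regenerate this month's page; the last daily run of the month
    leaves the finished page behind as the archive."""
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    out_file = ARCHIVE_DIR / f"{date.today():%Y-%m}.html"
    # Assumes `visitors <logfile>` prints an HTML report to stdout; check
    # the Visitors docs for the flags your version needs.
    with open(out_file, "w", encoding="utf-8") as out:
        subprocess.run(["visitors", LOG_PATH], stdout=out, check=True)
    return out_file

if __name__ == "__main__":
    print(f"Wrote {archive_monthly_report()}")
```

A crontab entry along the lines of `0 4 * * * python3 /usr/local/bin/archive_stats.py` (path and schedule are just examples) would handle the daily run.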