Geeks With Blogs

I needed to find out how many people hit and stayed on an internal company site per day on average. So, I looked through the many MANY IIS log analyzers available on the nets to help me with the task.

While I was looking for a decent one I found the LogParser tool by Microsoft, which caught my eye because it's free. I ended up using a 30-day eval copy of something else, but I kept monkeying with LogParser because it survived the 30 days.

INFO: - The Unofficial Log Parser Support Site. Not really all that helpful directly, but if you do a Google search for how to use the tool you'll land there a lot, and that's helpful. So I give the site a B+: great content (A) but terrible layout and search (C). (Content weighs heavier, for you academia nuts out there.)

The tool is great but not all that. If you know SQL, it's similar enough that the learning curve won't leave you AS winded. But the functions are different enough that it's not a snap either.

Get it here:

IIS logs track the following fields (pulled right from the log headers): date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status. Wanna know more about those? Google that yourself, lazy.
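Those field names come from the `#Fields:` directive at the top of each W3C extended log, which is what makes the format self-describing. As a rough illustration (not how LogParser does it internally), here's a minimal Python sketch that reads that directive and turns each entry into a dict; the sample log text and field subset are made up for the example:

```python
# Hypothetical sketch: parse IIS W3C extended log lines into dicts,
# keyed by whatever column names the "#Fields:" directive declares.
def parse_w3c(lines):
    fields = []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("#Fields:"):
            fields = line.split()[1:]       # e.g. date time cs-method ...
        elif line.startswith("#") or not line:
            continue                        # other directives (#Software, #Date)
        else:
            yield dict(zip(fields, line.split()))

# Made-up sample with a subset of the real fields:
sample = """#Software: Microsoft Internet Information Services 6.0
#Fields: date time cs-method cs-uri-stem cs-username sc-status
2007-01-01 08:00:01 GET /MyPopularSite/MyPopularPage.aspx DOMAIN\\jsmith 200
2007-01-01 08:06:01 GET /MyPopularSite/MyPopularPage.aspx - 200
""".splitlines()

rows = list(parse_w3c(sample))
```

Note that IIS writes a bare `-` when a field (like the username for an anonymous request) has no value.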

Because the IIS logs track the username of the requestor, I figured I could take a look at exactly who visits my site each month. Here's the query:
logparser -i:iisw3c "select distinct to_uppercase(cs-username) as USER from R:\W3SVC1\ex070101.log where cs-method = 'GET' and cs-uri-stem = '/MyPopularSite/MyPopularPage.aspx'" -q:ON

  • The -i switch tells logparser that it'll be looking at not just IIS logs but IIS logs in the W3C Extended format. Woohoo!
  • The query basically says "Give me a roster of the individual users that loaded my page on this date (from the log file name)".
  • I had to use the to_uppercase command because some NT users authenticated as lowercase and then later as UPPERCASE, which tricked the logs into thinking they were 2 distinct users. Sneaky, sneaky.
  • MyPopularPage shares a log with many other LessPopularPages on our intranet, so the where clause separates the chaff.
  • Oh, one more thing. If you look very carefully you'll see a single quote followed by a double quote just after MyPopularPage.aspx. The single quote finishes the string literal, and the double quote finishes the SQL-like statement.
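If the SELECT DISTINCT / TO_UPPERCASE combination isn't clicking, here's a small Python sketch of the same idea, assuming each log entry has already been reduced to a (method, uri-stem, username) tuple; the sample usernames are invented:

```python
# Sketch of what the query does: a case-insensitive roster of users
# who issued a GET for one particular page.
def distinct_users(entries, page):
    """entries: iterable of (cs-method, cs-uri-stem, cs-username) tuples."""
    users = set()
    for method, stem, user in entries:
        if method == "GET" and stem == page:
            users.add(user.upper())   # TO_UPPERCASE: fold case variants together
    return sorted(users)

hits = [
    ("GET", "/MyPopularSite/MyPopularPage.aspx", "domain\\jsmith"),
    ("GET", "/MyPopularSite/MyPopularPage.aspx", "DOMAIN\\JSMITH"),
    ("GET", "/LessPopularPage.aspx", "domain\\other"),
]
roster = distinct_users(hits, "/MyPopularSite/MyPopularPage.aspx")
# The two case variants of jsmith collapse into a single user.
```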

My IIS admins reminded me that MyPopularSite is hosted on a web farm (that's how popular I am) and that I needed to check the logs of all three servers. Despair not! Here's the command:
logparser -i:iisw3c "select distinct to_uppercase(cs-username) as USER from R:\W3SVC1\ex070101.log, S:\W3SVC1\ex070101.log, T:\W3SVC1\ex070101.log where cs-method = 'GET' and cs-uri-stem = '/MyPopularSite/MyPopularPage.aspx'" -q:ON
  • Just like I would query from multiple tables, I can query from multiple logs. I don't really care about joining them, so don't criticize my SQL. I want this thing out of my hair.
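Conceptually, listing several logs in the FROM clause just concatenates their entries before the same filter-and-dedupe pass runs. A quick Python sketch of that, with made-up per-server data standing in for the three log files:

```python
from itertools import chain

# Hypothetical entries from three web-farm servers (R:, S:, T:),
# each a (cs-method, cs-uri-stem, cs-username) tuple.
server_r = [("GET", "/MyPopularSite/MyPopularPage.aspx", "DOMAIN\\ANNE")]
server_s = [("GET", "/MyPopularSite/MyPopularPage.aspx", "domain\\anne")]
server_t = [("GET", "/MyPopularSite/MyPopularPage.aspx", "DOMAIN\\BOB")]

# Chain the logs together, then apply the same case-folded DISTINCT.
users = {u.upper() for m, s, u in chain(server_r, server_s, server_t)
         if m == "GET"}
```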

Yes, I know it sounds pretty silly to dump one log into another. But I'm not doing that. Well, not exactly. What I want is a roster, one that I can view, sort, print, and post on my cube wall as My Fan Club. After all, these people love me! They deserve to have their names in 30-point font on my cube wall. They are my adoring public, and I make them great...
Here's the command:
logparser -i:iisw3c -o:CSV "select distinct to_uppercase(cs-username) as USER, date into C:\temp\logger.csv from R:\W3SVC1\ex0701*.log, S:\W3SVC1\ex0701*.log, T:\W3SVC1\ex0701*.log where cs-method = 'GET' and cs-uri-stem = '/MyPopularSite/MyPopularPage.aspx'" -q:ON
  • The file logger.csv is created or overwritten by the results of this command.
  • I made another change, in case you didn't notice. I wrote the log name as "ex0701*.log" because the logs for the month of January are named ex070101.log, ex070102.log, and so on. The * catches all of those logs, so this command captures the user IDs of the people who visited MyPopularPage for the entire month of January. I could also have written "ex07*.log", which would have captured my popularity for the whole year.
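The wildcard is ordinary shell-style globbing, and -o:CSV just writes the result rows as comma-separated values. A minimal Python sketch of both pieces, using an invented list of file names and an in-memory buffer in place of C:\temp\logger.csv:

```python
import csv
import fnmatch
import io

# Hypothetical directory listing; only the January 2007 logs should match.
logs = ["ex070101.log", "ex070102.log", "ex070201.log", "notes.txt"]
january = fnmatch.filter(logs, "ex0701*.log")   # the * catches every day in January

# Write (USER, date) rows the way -o:CSV would.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["USER", "date"])
for name in january:
    # A real script would parse each matched file here; this emits one
    # stand-in row per log, deriving the date from the file name (exYYMMDD).
    writer.writerow(["DOMAIN\\JSMITH", "2007-01-" + name[6:8]])
```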

The LogParser tool is pretty handy and very customizable, allowing you to analyze your logs however you wish. But it also requires a great deal of knowledge of the logs beforehand. I picked a different tool because I didn't know what I was looking at right away. The other tool gave me the big picture, and now I can wile away the hours in the minutiae of the IIS logs for my site, or any other logs for that matter. Yay!

MyPopularSite manages to attract the attention of a pitiful 12 users a day, on average.

My page auto-refreshes every 6 minutes, so I also counted the number of times the page was requested without a user name in the month of January. Here's that query:
logparser -i:iisw3c "select count(*) as count, date, cs-username as user from R:\W3SVC1\ex0701*.log where cs-method='GET' and cs-uri-stem='/MyPopularSite/MyPopularPage.aspx' and cs-username is null group by date, cs-username"

The good news is that there are lots of these. So although there are only 12 distinct users per day, people are leaving my page open for a very long time (days, usually).

Posted on Monday, October 1, 2007 8:01 AM

Comments on this post: LogParser is powerful and feels familiar

Copyright © baileyrt