Tips / programs for manual log analysis

Any troubleshooting techie will tell you that most of their time is spent analysing megabytes of log files trying to figure out what happened, what went wrong and so on. Log files are usually about the only thing left in the aftermath of an incident. In the mad rush to bring systems back into production, any troubleshooting that could gather evidence or artefacts for problem analysis is usually forgone, leaving only the persistent logs for techies to play with. In light of this, it’s extremely important that a company sets up some sort of central logging server (usually an FTP and syslog server) to which devices and servers can send their logs. Logs tend to have a very short life span, since they age out quickly due to the sheer volume of logging data. A separate server provides better persistence and storage, and adds redundancy, since it is harder for an attacker to wipe out logs on another server to cover his tracks.
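As a small illustration, here is a hedged sketch of pointing an application’s logs at such a central syslog server using Python’s standard logging library. The logger name `myapp` and the server address `192.0.2.10` are placeholders, not anything from a real setup:

```python
import logging
import logging.handlers

# Send this application's log records to a central syslog server.
# 192.0.2.10 is a placeholder address; 514/udp is the traditional
# syslog port.
handler = logging.handlers.SysLogHandler(address=("192.0.2.10", 514))

logger = logging.getLogger("myapp")
logger.addHandler(handler)

# This record now also ends up on the central server, where it
# survives even if the local box is wiped.
logger.warning("Administrator login from 10.0.0.5")
```

The same idea applies to network devices: point their syslog export at the central server so every unit’s logs outlive the unit itself.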

Another small point that is often overlooked is the need for all devices to share the same time, usually by synchronising them with NTP. Without a common time baseline it is very difficult for techies to correlate information across different log files.

Initially the volume of logging can be daunting to a techie who has to go through it. I’ve found that two programs are invaluable tools for log dissection. The first is Notepad++ (available here), an open source “notepad on steroids”. The second is Microsoft Excel.

I cannot go through a whole discussion of log analysis here (that would fill an entire book), but I want to highlight some of these programs’ features that are useful in the context of log analysis.

Notepad++

  • By far the handiest feature of Notepad++ is the concept of bookmarks. This enables you to select, copy or cut all lines containing an instance of a string. For example, say you would like to find all entries in a log containing “Administrator login” to see at what times an admin logged into the unit. CTRL+F brings up the find function. Type in the search term, tick the “Mark line” option and hit the “Find All” button. This will bookmark all the lines containing that term. Once completed, you can copy, delete or cut all these bookmarked lines from the Search > Bookmark menu. It’s useful for cutting down on the information you need to process.

[screenshot: Notepad++ find dialogue with “Mark line” selected]
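The same “mark and extract” workflow is easy to sketch in a few lines of Python. A hedged example, with a made-up log and search term:

```python
# Minimal sketch of Notepad++'s "Mark line" + copy-bookmarked-lines
# workflow: keep only the lines containing a search term.
# The log contents and the term are made-up examples.

def mark_lines(lines, term):
    """Return only the lines that contain the given term."""
    return [line for line in lines if term in line]

log = [
    "08:01:02 Service started",
    "08:05:10 Administrator login from 10.0.0.5",
    "08:06:33 Backup job finished",
    "09:12:44 Administrator login from 10.0.0.9",
]

hits = mark_lines(log, "Administrator login")
# hits now holds just the two admin-login lines
```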

  • Another handy feature is “Find in files”. This is extremely useful when you are given a folder or zip file full of access logs and you’re looking for something in particular. In the same find dialogue described above, you’ll see a “Find in files” tab which allows you to specify a search term and the folder / directory to search in. Notepad++ will then list all the matches along with their filenames, allowing you to quickly home in on the relevant files.
  • Last but not least, if you highlight a single word (not a string of words) in Notepad++, the viewer will highlight all other instances of that word in green. I find this extremely handy because most log files include a process ID, so highlighting a process ID will highlight all other lines with the same ID, making it easier to track (the example below tracks PID E000A).

[screenshot: word highlighting tracking PID E000A]

Excel

This program is surprisingly useful for analysing logs when you’re not exactly sure what you’re looking for (as opposed to Notepad++, where most of the time I use a search term, so I already have a rough idea of what I’m looking for). Most syslog servers can export logs in CSV format, which is ideal for importing into Excel. Otherwise, you can use the search and replace function of Notepad++ to insert commas after pre-determined strings, or simply use Excel’s import function and specify space as the delimiter, which allows you to set the fields manually.
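Splitting a raw log line into columns works the same way in a script. A hedged sketch, with a made-up syslog-style line (real formats vary, so the field positions are an assumption):

```python
# Space-delimited splitting, like Excel's import wizard: the first
# five whitespace runs become field boundaries, the rest stays as
# one message column. The example line is invented.

raw = "Jan 12 08:05:10 fw01 sshd[4242]: Accepted password for admin"
fields = raw.split(None, 5)
# fields -> ['Jan', '12', '08:05:10', 'fw01', 'sshd[4242]:',
#            'Accepted password for admin']
```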

  • Remove duplicates: accessible from the Data menu (Excel 2007), this is useful when you would like to see all the distinct events that happened within the log timeframe. Almost all logs contain a very large amount of repetition, so highlighting the whole sheet and hitting “Remove Duplicates” strips out the repetition and leaves you with only unique events, making it easier to spot critical events or events of interest.
  • Check the number of occurrences of an event: this needs some playing around, but it will give you a quick overview of the most common events in the log:
    1. Highlight the whole sheet, then Home > Sort & Filter > Custom Sort. Sort on the column that contains the events; in my case this is column B. The sheet should now be ordered, with similar events listed under one another.
    2. Pick an empty column (e.g. column N in my case) and in its first row enter the number 1. In the second row enter the formula “=IF(B2=B1,N1+1,1)” (remember, column B is my event column and column N is my empty column). Breaking it down, this checks whether the event matches the one above it: if it does, it increments the running count in column N; if not, it resets the count to 1. Copy and paste the formula down the whole of column N.
    3. You should end up with a column that has incrementing numbers for each run of similar events. Copy this column, select another column, right click, choose “Paste Special” and paste only the values, not the formulas.
    4. You can now use steps similar to (1) to sort on this column from largest to smallest, and then remove duplicate entries. You’ll end up with a sheet of the most common entries in the log, which gives you an idea of where to start looking (click on the picture below for a full view; most resolutions will truncate the image unfortunately).

[screenshot: sorted event counts in Excel]
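Both Excel tricks above (removing duplicates, then counting occurrences) boil down to a frequency count, which a script can do in one pass. A hedged sketch with made-up event names:

```python
from collections import Counter

# Counter's keys are the unique events ("Remove Duplicates") and its
# values are the occurrence counts (the sort + running-count formula).
events = ["auth failure", "link down", "auth failure",
          "auth failure", "link down", "config change"]

counts = Counter(events)
unique_events = list(counts)    # like "Remove Duplicates"
ranked = counts.most_common()   # most frequent events first
# ranked -> [('auth failure', 3), ('link down', 2), ('config change', 1)]
```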

  • In relation to the above point, you can also plot a graph of time versus the number of times an event happened.
    1. Choose the specific event you would like to plot.
    2. Open the CSV file in Notepad++ and hit CTRL+F to bring up the search function, enter the string to search for, tick the “Mark line” option and hit the “Find All” button. Copy and paste all the bookmarked lines into a separate CSV file and load that into Excel (I’m sure there’s an easier way to do this, but anyway) to isolate the event you’re after.
    3. Sort the sheet by time.
    4. Use a similar formula to the one above, but this time column B should be the time column, not the event column.
    5. Plot the resulting number of occurrences against time, giving a nice graph:

[screenshot: graph of event occurrences over time]
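The counting part of those steps can also be sketched in Python: isolate one event type, then count occurrences per time bucket (here, per hour), which gives the series you would chart. The timestamps and event names are invented examples:

```python
from collections import Counter

# (time, event) rows as they might come out of a CSV export.
rows = [
    ("08:05", "Administrator login"),
    ("08:40", "Administrator login"),
    ("09:12", "Administrator login"),
    ("09:59", "Administrator login"),
    ("09:59", "link down"),
]

wanted = "Administrator login"
# Bucket by hour: the first two characters of an HH:MM timestamp.
per_hour = Counter(t[:2] for t, e in rows if e == wanted)
# per_hour -> Counter({'08': 2, '09': 2})
```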

I apologise for the sketchiness of these notes; they are primarily my own reference. But if you have questions or need clarification, just let me know in the comments below 🙂