Log Parser and PowerShell – Part II

Welcome back!

Last post I talked about using the Log Parser executable from PowerShell. I also briefly mentioned the Log Parser COM component. In this post I will go into more depth on using the COM component from PowerShell.

The COM component exposes a simple object model consisting of only three main objects.

  • LogQuery – Object used to execute queries and batches
  • LogRecordSet – Object returned by LogQuery.Execute method
  • LogRecord – Child object of LogRecordSet. Object returned by the LogRecordSet.getRecord method

There are also objects representing the input and output formats. There are several of these, and they are well documented in the Log Parser documentation. I will give examples as we progress.
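For instance, the format objects are plain COM classes. Assuming Log Parser 2.2 is installed, they can be created directly from their documented ProgIDs:

```powershell
# Create format COM objects directly (requires Log Parser 2.2)
$evtInput  = New-Object -ComObject MSUtil.LogQuery.EventLogInputFormat   # Windows event logs
$csvOutput = New-Object -ComObject MSUtil.LogQuery.CSVOutputFormat       # CSV files

# Format-specific options are exposed as properties, for example:
$evtInput.fullText = $true   # retrieve the full text message of each event
```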

The first step in my task of integrating with the Log Parser COM object was to put together a function library of the basic building blocks.

The library consists of the following functions.

  • Get-LPInputFormat – Returns a Log Parser input format object
  • Get-LPOutputFormat – Returns a Log Parser output format object
  • Invoke-LPExecute – Executes the LogQuery.Execute method and returns a LogRecordSet
  • Invoke-LPExecuteBatch – Executes the LogQuery.ExecuteBatch method to output the query in the requested format
  • Get-LPRecordSet – Executes a Log Parser Query and returns a LogRecordSet as an array of PowerShell objects
  • Get-LPRecord – Returns LogRecord object as a PowerShell object from the current record of a LogRecordSet object

With these functions we can support almost all of the Log Parser functionality in PowerShell. I did not build in support for the custom COM input type or the NAT and Datagrid output types. The NAT and Datagrid output types can be handled in a different way in PowerShell. The COM input format is a challenge I left for another day.

Here is the function library.
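A minimal sketch of such a library follows. The ProgIDs are the documented Log Parser 2.2 COM classes, but the option strings and parameter names here are my assumptions rather than the verbatim original listing:

```powershell
# Minimal sketch of the function library (assumed shape, not the verbatim original).
# Requires Log Parser 2.2 for the MSUtil.LogQuery COM classes.

function Get-LPInputFormat {
    param([string]$InputType)
    switch ($InputType.ToLower()) {
        "evt"      { New-Object -ComObject MSUtil.LogQuery.EventLogInputFormat }
        "iisw3c"   { New-Object -ComObject MSUtil.LogQuery.IISW3CInputFormat }
        "csv"      { New-Object -ComObject MSUtil.LogQuery.CSVInputFormat }
        "tsv"      { New-Object -ComObject MSUtil.LogQuery.TSVInputFormat }
        "textline" { New-Object -ComObject MSUtil.LogQuery.TextLineInputFormat }
        default    { throw "Unknown input type: $InputType" }
    }
}

function Get-LPOutputFormat {
    param([string]$OutputType)
    switch ($OutputType.ToLower()) {
        "csv"   { New-Object -ComObject MSUtil.LogQuery.CSVOutputFormat }
        "tsv"   { New-Object -ComObject MSUtil.LogQuery.TSVOutputFormat }
        "sql"   { New-Object -ComObject MSUtil.LogQuery.SQLOutputFormat }
        "chart" { New-Object -ComObject MSUtil.LogQuery.ChartOutputFormat }
        default { throw "Unknown output type: $OutputType" }
    }
}

function Invoke-LPExecute {
    # Run a query and return the live LogRecordSet COM object
    param([string]$query, $inputtype)
    $LPQuery = New-Object -ComObject MSUtil.LogQuery
    return $LPQuery.Execute($query, $inputtype)
}

function Invoke-LPExecuteBatch {
    # Run a query end-to-end: input format in, output format out
    param([string]$query, $inputtype, $outputtype)
    $LPQuery = New-Object -ComObject MSUtil.LogQuery
    return $LPQuery.ExecuteBatch($query, $inputtype, $outputtype)
}

function Get-LPRecord {
    # Convert the current LogRecord into a PSObject with one property per column
    param($LPRecordSet)
    $LPRecord = New-Object System.Management.Automation.PSObject
    if (-not $LPRecordSet.atEnd()) {
        $Record = $LPRecordSet.getRecord()
        for ($i = 0; $i -lt $LPRecordSet.getColumnCount(); $i++) {
            $LPRecord | Add-Member NoteProperty $LPRecordSet.getColumnName($i) $Record.getValue($i)
        }
    }
    return $LPRecord
}

function Get-LPRecordSet {
    # Run a query and return all records as an array of PSObjects
    param([string]$query, $inputtype)
    $LPRecordSet = Invoke-LPExecute $query $inputtype
    $LPRecords = @()
    for (; -not $LPRecordSet.atEnd(); $LPRecordSet.moveNext()) {
        $LPRecords += Get-LPRecord $LPRecordSet
    }
    $LPRecordSet.close()
    return $LPRecords
}
```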

This library provides the basic functionality needed for Log Parser. We can use it in two basic scenarios.

One – We want to execute a Log Parser batch. This mode works exactly like the Log Parser command-line tool: it queries an input file of a given type and writes the results to an output file of a given type.

Two – We want the results of a Log Parser query returned as PowerShell objects. This allows us to further process the results in PowerShell, or simply use its output facilities to display them.

Example of Scenario One:
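A sketch of such a batch, using the library functions above (the paths, query fields, and chart output type are illustrative assumptions, not the original example):

```powershell
# Illustrative batch: chart the top 10 client IPs from an IIS W3C log
. .\Muegge_LogParser_Lib.ps1

$query = @"
SELECT TOP 10 c-ip, COUNT(*) AS Hits
INTO C:\Reports\TopClients.gif
FROM C:\Logs\ex*.log
GROUP BY c-ip
ORDER BY Hits DESC
"@

$inputformat  = Get-LPInputFormat "iisw3c"
$outputformat = Get-LPOutputFormat "chart"
Invoke-LPExecuteBatch $query $inputformat $outputformat
```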

The code above provides a good way to create scheduled reports. The syntax is easier to follow than the command-line switches, at least for me.

Here is the sample output chart.

Yes, I took this log from a box with nothing but hacker traffic :)

Now let’s look at the second scenario. We will return the results of a Log Parser query in a PSObject.

Here we are querying for application hang events in the application event log. We will use a Log Parser query to retrieve just the events we want, then use PowerShell to filter down to just the events for IE. Then we can easily display the output any way we like.
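A sketch of that query, assuming the library above (Event ID 1002 is the Application Hang event; the field list and filter string are illustrative):

```powershell
# Illustrative query: application hang events (EventID 1002), filtered to IE in PowerShell
. .\Muegge_LogParser_Lib.ps1

$query = @"
SELECT TimeGenerated, EventID, Strings
FROM Application
WHERE EventID = 1002
"@

$inputformat = Get-LPInputFormat "evt"
$records = Get-LPRecordSet $query $inputformat

# Filter to IE hangs and display with PowerShell's own formatting
$records | Where-Object { $_.Strings -like "*iexplore.exe*" } |
    Format-Table TimeGenerated, EventID -AutoSize
```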

Here is the output.

To me this really seems like the best of both worlds utilizing each tool for its strength.

Now I just have to figure out why IE hangs :)

Best Regards,

Dave


16 comments

  1. Ivo Looser says:

    Hi Dave,

    I’m really impressed with your library. Basically it supports exactly the features that I need. Thanks, it saved me a lot of time.

    For a personal project I added the following method to the library.

    # ---------------------------------------------------------------------------
    ###
    ### Creates a DataTable object from a LogRecordSet
    ###
    ### Get-LPTableResults -query -inputtype
    ###
    # ---------------------------------------------------------------------------
    function Get-LPTableResults
    {
        param([string]$query, $inputtype)

        $rsLP = Invoke-LPExecute $query $inputtype
        $tab = New-Object System.Data.DataTable("Results")

        # Build typed columns from the record set metadata
        for ($i = 0; $i -lt $rsLP.getColumnCount(); $i++)
        {
            $col = New-Object System.Data.DataColumn
            $col.ColumnName = $rsLP.getColumnName($i)
            switch ($rsLP.getColumnType($i))
            {
                1       { $col.DataType = [System.Type]::GetType("System.Int32"); break }
                2       { $col.DataType = [System.Type]::GetType("System.Double"); break }
                4       { $col.DataType = [System.Type]::GetType("System.DateTime"); break }
                default { $col.DataType = [System.Type]::GetType("System.String"); break }
            }
            $tab.Columns.Add($col)
        }

        # Copy each LogRecord into a DataRow
        while (-not $rsLP.atEnd())
        {
            $rowLP = $rsLP.getRecord()
            $row = $tab.NewRow()
            for ($i = 0; $i -lt $rsLP.getColumnCount(); $i++)
            {
                $columnName = $rsLP.getColumnName($i)
                $row.$columnName = $rowLP.getValue($i)
            }
            $tab.Rows.Add($row)
            $rsLP.moveNext()
        }

        $ds = New-Object System.Data.DataSet
        $ds.Tables.Add($tab)
        return $ds.Tables["Results"]
    }

    Maybe it is useful to others too.

    Best Regards Ivo

  2. admin says:

    Ivo,

    I am glad you found it useful!

    Thanks for the addition.

    Dave

  3. steve says:

    Thx much for the contribution…would be very interesting to see the additions and refinements since the original post. What sorts of lessons have you learned since?

  4. Aaron says:

    This looks great. One question: I would love to use PowerShell and Log Parser to grab the event logs from my domain controllers and send them to a SQL server for further analysis. How would you recommend doing something like that? I am just getting started with PowerShell and Log Parser so any assistance would be greatly appreciated.

    Thanks

    app

  5. dmuegge says:

    Aaron,
    Thanks for the feedback.

    Log Parser has the ability to output data to SQL inherently. This is done via the SQL output format. The Log Parser help file has information and examples under the output formats section.

    To use my library to do this in PowerShell you would do something like this:

    . .\Muegge_LogParser_Lib.ps1

    $query = @"
    SELECT *
    FROM Application
    WHERE EventID = 1002
    "@

    $inputformat = Get-LPInputFormat "evt"
    $outputformat = Get-LPOutputFormat "sql"
    $outputformat.server = "SVRNAME"
    $outputformat.database = "DBNAME"
    $outputformat.username = "USRNAME"
    $outputformat.password = "PASSWORD"
    Invoke-LPExecuteBatch $query $inputformat $outputformat

    You could also just return objects and use a PowerShell library such as SQLPSX ( http://sqlpsx.codeplex.com ) to add the data to SQL. This would give a greater degree of control but would also be more complex.

    Regards,

    Dave

  6. Ptolemy02 says:

    Hi Dave,

    I’m new to PowerShell and your library was exactly what I needed. This library is awesome. I was able to implement your first scenario on my own data [input: TSV file, output: TSV file] (I’m doing this against SharePoint ULS log files in TSV format). Commands below:

    $infile = "C:\abc\Machine1-*.log"
    $outfile = "C:\abc\My_Error_Results.tsv"
    $inputformat = Get-LPInputFormat "tsv"
    $outputformat = Get-LPOutputFormat "tsv"

    $query = @"
    SELECT Timestamp, Area, Category, Level, Message
    INTO $outfile
    FROM $infile
    WHERE Message LIKE '%abc%'
    ORDER BY Timestamp
    "@

    Invoke-LPExecuteBatch $query $inputformat $outputformat | Out-Null

    Although not very useful, ultra cool :).

    However, when I tried to implement your second scenario on my own data [input:TSV file, output:PSObject], I always get exceptions stating that the columns that I asked for in the query do not exist. Commands below:

    $infile = "C:\abc\Machine1-*.log"
    $inputformat = Get-LPInputFormat "tsv"

    $query2 = @"
    SELECT Timestamp, Area, Category, Level, Message
    FROM $infile
    WHERE Message LIKE '%abc%'
    ORDER BY Timestamp
    "@

    $records = Get-LPRecordSet $query2 $inputformat   # Exception occurs here
    $records | where { $_.Message -like "*bsc*" }

    I’m sure I’m missing something incredibly simple, please help… :)

  7. George says:

    How do you include the -checkpoint value in the Invoke-LPExecute command in the PowerShell script if you want to use your logparser library to monitor a log file, so it only queries the new log file lines since the last iteration?

    I tried:
    # Execute batch
    Invoke-LPExecuteBatch $query $inputformat -checkpoint:
    but this is not working.

    Here is my PS code snippet:

    $infile =
    $checkpoint =
    $query = @"
    SELECT Text FROM $infile WHERE Text LIKE '%ERROR:%'
    "@
    $inputformat = Get-LPInputFormat "TEXTLINE"
    Invoke-LPExecute $query $inputformat -checkpoint:$checkpoint

    Script runs fine but no check file is created

  8. AJ says:

    This saved me a lot of time on a recent project that required some log-digging across multiple servers. Now I have something reusable and clean enough that other folks will be able to make sense of it down the line. Thanks a lot, great stuff.

  9. Pete says:

    For those interested, I was able to get the Checkpoint file to work. Here is the code I added to the Invoke-LPExecuteBatch function.

    function Invoke-LPExecuteBatch
    {
        param([string]$query, $inputtype, $outputtype, $checkpoint)

        $LPQuery = New-Object -ComObject MSUtil.LogQuery
        $inputtype.iCheckpoint = $checkpoint
        $result = $LPQuery.ExecuteBatch($query, $inputtype, $outputtype)

        return $result
    }

  10. Dave says:

    This is a really excellent and reusable code library that works straight off. Thank you very much. Saved us a lot of time.

  11. Mark Easton says:

    This post has really helped me nail a tough and late requirement, so thanks David. I did find a couple of bugs in the code that I thought I’d share, as follows:

    1) The Get-LPRecordSet function requires a 2nd parameter, $inputtype, that needs passing to the Invoke-LPExecute function like this:

    $LPRecordSet = Invoke-LPExecute $query $inputtype

    2) In the Get-LPRecordSet function there’s a line that reads:
    $RecordCount = $LPQueryResult.length-1

    The $LPQueryResult variable doesn’t exist; the line should read:
    $RecordCount = $LPRecordSet.length-1

    • John Delisle says:

      Actually, I think it should just be like this, using a "-1" index to hit the last item in the array. The fact that it previously evaluated to "-1" anyhow (due to the bug) was just a bit of luck.

      for (; -not $logParserRecordSet.atEnd(); $logParserRecordSet.moveNext()) {
          # Add record
          $logParserRecord = Get-LogParserRecord($logParserRecordSet)
          $logParserRecords += New-Object System.Management.Automation.PSObject
          $logParserRecords[-1] = $logParserRecord
      }

  12. Greg says:

    I’ve tried following Pete’s advice above for enabling checkpoint functionality.

    It looks like $inputobj does not even have an “iCheckpoint” property.

    formatMessage
    fullEventCode
    fullText
    ignoreMessageErrors
    resolveSIDs

    I tried setting “iCheckpoint” as a note property without success.

  13. Yean says:

    Hi,

    Want to thank you for this great post. With the COM components exposed this way, is there any way I can specify the day parameter? For example, logparser.exe file:test.sql?source=""+day="2" -o:NAT will look back 2 days of log files. I am just not sure how to specify that in this case.

    • dmuegge says:

      Yean,

      Thanks for the comment, but honestly I did this post in 2008 and have not worked with this in years. The only recommendation I can give you is to look at the .chm help file that is installed with logparser. If I remember correctly you would use the datetime functions provided by logparser in your sql query to accomplish your goal. Sorry I could not provide a more specific answer, but I would need to go back and learn it again myself.

      Regards,

      Dave