HDS AMS 2000 Performance Analysis with PowerShell and PowerGadgets

Welcome!

It has been a while since my last post due to a very busy schedule with a SAN and virtualization project.

I have been working on an implementation of an HDS AMS 2500 midrange array for a VMware vSphere 4 environment. So far everything has been working and performing well. The management software included with the HDS AMS 2000 series arrays is Storage Navigator Modular 2 (SNM2), a Java-based web application. SNM2 also has a command line version, which appears to be fairly comprehensive; it consists of a series of DOS executables that can be run from PowerShell. I have been working on a series of scripts for viewing and creating storage resources on the array, and I will share many of them in future posts. In this post I want to share some scripts I have written to extend the functionality of the performance monitoring utility in SNM2.

The base functionality of the array allows you to capture performance statistics to a text file. Captures can be run manually, or automatically for a specified time period at an interval as short as one minute, and the results can be written as one text file per capture or appended to a single file. Also, based on what I read in the SNM2 manual, you can do some graphing with the web interface, but it requires an additional license, and personally I think the PowerGadgets charts are better.

The four scripts I have started with are get-performance_processor.ps1, get-performance_ports.ps1, get-performance_raidgroups.ps1, and get-performance_luns.ps1. They do pretty much what their names say and produce the following PowerGadgets charts.

Chart

The chart group is a tabbed interface which allows you to tab through the controllers and the ports, RAID groups, LUs, or processors, depending on the script being used. Each script generates different groups of charts for different performance counters. I have not implemented all of the performance counters, just the ones which are most important to me now; I will be improving these scripts over time and implementing more counters. Here is an example of how the script works.

Script

After the script is executed, it asks whether or not to collect data. If yes, it prompts for the interval in minutes and the time period; if no, it uses previously collected data in the default output directory. Next it asks whether to list the data as text output. Then it prompts for generation of each group of charts for ports, RAID groups, LUNs, or processors, depending on the script run.

Now to the scripts. All of the scripts rely on the start-session.ps1 script and also require that a password file be set for logging into the array. Additionally, an array has to be registered.

Example 1 shows a PowerShell script which registers an array and sets the admin password.
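In outline it amounts to something like this (a trimmed sketch: auunitadd is the documented registration command, but the account switches vary between SNM2 CLI versions, so treat them as placeholders and check the CLI reference):

# register-array.ps1 - sketch only; replace the names and IPs
$unit = "ARRAYNAME"    # name to register the array under
$ctl0 = "10.0.0.16"    # controller 0 management IP
$ctl1 = "10.0.0.17"    # controller 1 management IP

# Register the array with the SNM2 CLI
& auunitadd -unit $unit -LAN -ctl0 $ctl0 -ctl1 $ctl1

# Set up the account used for CLI logins
# ASSUMPTION: switch names differ between SNM2 versions; see the CLI reference
& auaccountenv -set -uid USERNAME -authentication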

You will need to replace ARRAYNAME, USERNAME and the IP Addresses for your environment.

Example 2 shows the start-session PowerShell script, which defines environment information.
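The heart of it looks something like this (a sketch; the paths are placeholders, and the STONAVM_* environment variables are what the SNM2 manual uses to let the CLI run unattended):

# start-session.ps1 - environment setup for the other scripts
$global:ARRAY  = "ARRAYNAME"                                    # registered unit name
$global:CLIDIR = "C:\Program Files\Storage Navigator Modular 2 CLI"
$global:OUTDIR = "C:\Scripts\PerfData"                          # performance capture directory

$env:STONAVM_HOME     = $CLIDIR   # home directory for the SNM2 CLI
$env:STONAVM_ACT      = "on"      # allow commands to run without confirmation prompts
$env:STONAVM_RSP_PASS = "on"      # auto-respond to password prompts (see the manual)

$env:Path = $env:Path + ";" + $CLIDIR   # put the CLI executables on the path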

You will need to change the paths and ARRAYNAME for your environment.

Example 3 shows the get-performance_processor.ps1 script.

This script collects the data from the array into separate files, reads the pertinent data from those files, and transforms it into objects which are fed to the PowerGadgets out-chart cmdlet. The other three scripts are longer, as they digest more information.
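Condensed, the flow looks like this (the auperform switches and the capture file parsing are illustrative; check the CLI reference and your own capture files for the real details):

. .\start-session.ps1

if ((Read-Host "Collect performance data? (y/n)") -eq "y") {
    $interval = Read-Host "Interval in minutes"
    $count    = Read-Host "Number of captures"
    # ASSUMPTION: illustrative switches; see auperform in the CLI reference
    & auperform -unit $ARRAY -auto $interval -count $count -path $OUTDIR
}

# Turn the processor lines of each capture file into objects
$stats = Get-ChildItem "$OUTDIR\*.txt" | ForEach-Object {
    Get-Content $_.FullName | Where-Object { $_ -match "Processor" } | ForEach-Object {
        # ASSUMPTION: field layout of the capture file; adjust the indexes to match
        $f = $_ -split "\s+" | Where-Object { $_ }
        $obj = New-Object PSObject
        $obj | Add-Member NoteProperty Controller $f[0]
        $obj | Add-Member NoteProperty UsagePct ([int]$f[1])
        $obj
    }
}

# Chart with PowerGadgets; the real script builds a tabbed chart group per controller
$stats | Out-Chart -Label Controller -Values UsagePct -Title "Processor Usage %"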

To use these scripts you will need PowerShell, PowerGadgets (a pay product with a free trial), the SNM2 CLI, and the script files attached to this post. Oh, and an HDS AMS 2000 array.

Here are the script downloads:
start-Session.txt
get-performance_processor.txt
get-performance_ports.txt
get-performance_raidgroups.txt
get-performance_luns.txt

Save the files to your script directory and change the extensions to .ps1.

I hope someone finds this useful.

Regards,

Dave

PowerShell, Log Parser, PowerGadgets, and GeoIP what fun!

Welcome!

One day I was pondering how I might use log parser to map visitors to a website by state. I am aware this is easily done with tools like Google Analytics, but I was interested in using existing logs for the info.

Using the PowerShell and Log Parser functions from the library listed in a previous post, Log Parser can easily get the visitors by IP address from an IIS log.
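For example, with the library functions (Get-LPInputFormat and Get-LPRecordSet; the library file name and log path here are placeholders):

. .\LogParserFunctions.ps1   # the function library from the earlier post

# Visitor IPs and hit counts from an IIS W3C log
$query = "SELECT c-ip, COUNT(*) AS Hits FROM D:\Logs\ex*.log GROUP BY c-ip"
$visitors = Get-LPRecordSet $query (Get-LPInputFormat "iisw3c")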

The next task is to get the location of each IP address. The tool I chose for this task was the free GeoLite City database from MaxMind (http://www.maxmind.com/app/geolitecity). Here is an example:
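Something along these lines (the ProgID and member names are from memory and may not match your version of the COM API, so treat them as illustrative and check the documentation in the MaxMind download):

# Look up a location for each visitor IP
# ASSUMPTION: ProgID and members are illustrative; see the MaxMind COM docs
$geo = New-Object -ComObject "GeoIPCOMEx.GeoIPEx"
$geo.set_db_path("C:\GeoIP\")                 # directory holding GeoLiteCity.dat

$located = $visitors | ForEach-Object {
    $geo.find_by_addr($_."c-ip") | Out-Null   # populates the location properties
    $_ | Add-Member NoteProperty State $geo.region
    $_
}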

There are a couple of ways to use the MaxMind GeoIP database: it can be used in its native binary format, or it can be imported into SQL from CSV files. MaxMind recommends using the binary format, which is what I chose to do. MaxMind also provides APIs for a variety of platforms; I chose to use the COM version.

After the locations are determined, the counts are calculated, which brings us to the point where we need to chart the results. The tool I chose for this operation is PowerGadgets, a charting tool made for use with PowerShell, and it can be quite handy. Here is an example:
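Charting the counts is then a short pipeline with the out-chart cmdlet ($located is the object set built above):

# Count visitors per state and hand the totals to PowerGadgets
$located | Group-Object State | Sort-Object Count -Descending |
    Out-Chart -Label Name -Values Count -Title "Visitors by State"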

And here is our final Chart.


This works pretty well. The only drawback to this solution is that PowerGadgets is a pay tool, but since I own a copy it suits my needs.

Regards,

Dave

Automated Machine Builds with PowerShell and AutoIT

I have been using PowerShell recently to create scripted builds for virtual machines. This method provides some benefits over imaging:

  • A single base image per OS can be used
  • Applications which have issues with imaging, such as SQL Server, BizTalk, and Exchange, can be easily installed
  • Scripted installs can be changed or updated more easily than images
  • Additional configuration changes, such as registry settings and/or configuration files, can be easily made

One challenge I faced with this process was automating the installation of applications which do not have unattended install capabilities.

A great tool I found for this is AutoIT. It provides a way to script the Windows GUI, which makes it possible to automate those installs. AutoIT consists of a COM object, a scripting language, an editor, a compiler, and a very handy window info tool. I will not get into the details of AutoIT here; it has good documentation. I found PowerShell to be much stronger at the scripting part, but the COM object brings excellent window control functionality to PowerShell. Here is a simple example:
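A minimal version looks like this (AutoItX3.Control is the ProgID the AutoIT v3 installer registers for the COM object):

$au = New-Object -ComObject "AutoItX3.Control"
$au.Run("notepad.exe")                    # launch Notepad
$au.WinWaitActive("Untitled - Notepad")   # wait until its window has focus
$au.Send("This is a test")                # type into the edit area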

To run the code above you will need to install AutoIT or just register the AutoItX COM object (AutoItX3.dll). The code launches Notepad and types "This is a test". While this is a pretty useless example, it shows the basic purpose of AutoIT and how it is used from PowerShell.

The next piece I looked at was creating a function library to make it easy to use AutoIT from PowerShell.

Example: AutoIT PowerShell function library
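A trimmed sketch of the idea (the function names here are illustrative, not the exact ones from the attachment):

# One shared AutoItX COM object for the helper functions
function Get-AutoIt {
    New-Object -ComObject "AutoItX3.Control"
}

# Launch a program (typically an installer)
function Start-Install($au, $path) {
    $au.Run($path)
}

# Wait until a window with the given title is active; returns 0 on timeout
function Wait-Window($au, $title, $timeout = 60) {
    $au.WinWaitActive($title, "", $timeout)
}

# Activate a window and send it keystrokes ({ENTER}, {TAB}, !a for Alt+A, ...)
function Send-ToWindow($au, $title, $keys) {
    $au.WinActivate($title)
    $au.Send($keys)
}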

Then, using the functions from the library above, I created application installation functions for various packages.

Example: PowerShell Application Installation Function
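An install function for a hypothetical package looks something like this (the window titles and key sequences are placeholders for a real installer's dialogs):

function Install-SampleApp {
    $au = Get-AutoIt
    Start-Install $au "\\server\installs\SampleApp\setup.exe"
    Wait-Window   $au "SampleApp Setup"
    Send-ToWindow $au "SampleApp Setup" "{ENTER}"           # Welcome page
    Wait-Window   $au "License Agreement"
    Send-ToWindow $au "License Agreement" "!a{ENTER}"       # accept, then Next
    Wait-Window   $au "Installation Complete"
    Send-ToWindow $au "Installation Complete" "{ENTER}"     # Finish
}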

Using the method above I created a library of PowerShell installation functions to be called from automated build scripts.

This seems to be working pretty well so far, and it allows automated builds to be scripted quickly. PowerShell and AutoIT work well together: the AutoIT COM object provides excellent GUI control, and PowerShell provides strong scripting and debugging features.

Best Regards,

Dave

Log Parser and PowerShell – Part II

Welcome back!

Last post I talked about using the Log Parser executable from PowerShell. I also briefly mentioned the Log Parser COM component. In this post I will go into more depth on using the COM component from PowerShell.

The COM component exposes a simple object model consisting of only three main objects:

  • LogQuery – Object used to execute queries and batches
  • LogRecordSet – Object returned by the LogQuery.Execute method
  • LogRecord – Child object of LogRecordSet; object returned by the LogRecordSet.getRecord method

There are also objects containing the input and output formats. There are several of them and they are well documented in the Log Parser documentation. I will give examples as we progress.

The first step on my task to integrate with the Log Parser COM object was to put together a function library of the basic building blocks.

The library consists of the following functions:

  • Get-LPInputFormat – Returns a Log Parser input format object
  • Get-LPOutputFormat – Returns a Log Parser output format object
  • Invoke-LPExecute – Executes the LogQuery.Execute method and returns a LogRecordSet
  • Invoke-LPExecuteBatch – Executes the LogQuery.ExecuteBatch method to output the query in the requested format
  • Get-LPRecordSet – Executes a Log Parser query and returns the results as an array of PowerShell objects
  • Get-LPRecord – Returns the current record of a LogRecordSet object as a PowerShell object

With these functions we can support almost all of the Log Parser functionality in PowerShell. I did not build in support for the custom COM input type or the NAT and Datagrid output types. The NAT and Datagrid output types can be handled in a different way in PowerShell. The COM input format is a challenge I left for another day.

Here is the function library.
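Here is a condensed version (the full attachment covers more input and output formats, but each function follows the same shape; the MSUtil.LogQuery ProgIDs are the ones Log Parser registers):

# Map a short name to a Log Parser COM input format object
function Get-LPInputFormat($type) {
    switch ($type) {
        "iisw3c" { New-Object -ComObject "MSUtil.LogQuery.IISW3CInputFormat" }
        "w3c"    { New-Object -ComObject "MSUtil.LogQuery.W3CInputFormat" }
        "evt"    { New-Object -ComObject "MSUtil.LogQuery.EventLogInputFormat" }
        "csv"    { New-Object -ComObject "MSUtil.LogQuery.CSVInputFormat" }
    }
}

# Map a short name to a Log Parser COM output format object
function Get-LPOutputFormat($type) {
    switch ($type) {
        "csv"   { New-Object -ComObject "MSUtil.LogQuery.CSVOutputFormat" }
        "chart" { New-Object -ComObject "MSUtil.LogQuery.ChartOutputFormat" }
    }
}

# Execute a query and return the raw LogRecordSet
function Invoke-LPExecute($query, $inputFormat) {
    $lq = New-Object -ComObject "MSUtil.LogQuery"
    $lq.Execute($query, $inputFormat)
}

# Execute a batch: query the input and write results in the output format
function Invoke-LPExecuteBatch($query, $inputFormat, $outputFormat) {
    $lq = New-Object -ComObject "MSUtil.LogQuery"
    $lq.ExecuteBatch($query, $inputFormat, $outputFormat)
}

# Turn the current LogRecord into a PSObject, one property per column
function Get-LPRecord($rs) {
    $record = $rs.getRecord()
    $obj = New-Object PSObject
    for ($i = 0; $i -lt $rs.getColumnCount(); $i++) {
        $obj | Add-Member NoteProperty $rs.getColumnName($i) $record.getValue($i)
    }
    $obj
}

# Execute a query and return every row as a PowerShell object
function Get-LPRecordSet($query, $inputFormat) {
    $rs = Invoke-LPExecute $query $inputFormat
    $rows = @()
    while (-not $rs.atEnd()) {
        $rows += Get-LPRecord $rs
        $rs.moveNext()
    }
    $rs.close()
    $rows
}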

This library provides the basic functionality needed for Log Parser. We can use it in two basic scenarios.

One – We want to execute a Log Parser batch. This mode works exactly like the Log Parser command line tool: it queries an input file of a given type and writes the results to an output file of a given type.

Two – We want the results of a Log Parser query returned to a PowerShell object. This will allow us to further process the results using PowerShell or simply utilize the output facilities to display the results.

Example of Scenario One:
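A sketch of the batch mode (the paths and chart options are placeholders, and chart output requires the Office Web Components to be installed):

$query = "SELECT TOP 10 cs-uri-stem AS Page, COUNT(*) AS Hits " +
         "INTO C:\Reports\TopPages.gif " +
         "FROM D:\Logs\ex*.log " +
         "GROUP BY cs-uri-stem ORDER BY Hits DESC"

$inFormat  = Get-LPInputFormat "iisw3c"
$outFormat = Get-LPOutputFormat "chart"
$outFormat.chartType = "Bar3D"    # see the ChartOutputFormat docs for the type list
Invoke-LPExecuteBatch $query $inFormat $outFormat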

The code above provides a good way to create scheduled reports. The syntax is easier to follow than the command line switches, at least for me.

Here is the sample output chart.

Yes, I took this log from a box with nothing but hacker traffic. :)

Now let’s look at the second scenario. We will return the results of a Log Parser query in a PSObject.

Here we are querying for application hang events in the Application event log. We will use a Log Parser query to retrieve just the events we want, then use PowerShell to filter out just the events for IE. Then we can easily display the output any way we like.
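In sketch form (event ID 1002 is the application hang event; the match string is a placeholder for however IE shows up in your event data):

# Application hang events (ID 1002) from the Application event log
$query = "SELECT TimeGenerated, SourceName, Strings " +
         "FROM Application WHERE EventID = 1002"
$events = Get-LPRecordSet $query (Get-LPInputFormat "evt")

# PowerShell takes it from here: keep the IE hangs and shape the output
$events | Where-Object { $_.Strings -match "iexplore" } |
    Format-Table TimeGenerated, SourceName -AutoSize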

Here is the output.

To me this really seems like the best of both worlds utilizing each tool for its strength.

Now I just have to figure out why IE hangs. :)

Best Regards,

Dave

Log Parser and PowerShell – Part I

Welcome!

Log Parser and PowerShell are both great tools, and they work well when used together. Yes, you can do pretty much everything Log Parser does with PowerShell alone, but part of PowerShell's mission is to better leverage existing tools, and I believe this is an excellent example. It has also been my experience that Log Parser performs better at this task. Steve Schofield also blogged about the performance of Log Parser and PowerShell for querying logs here.

There are two ways to interact with Log Parser from PowerShell. The first and the easiest to get started with is the command line version logparser.exe.

Here is the same command line query example from the last post but with PowerShell.

& ./logparser.exe "SELECT Top 10 cs-uri-stem, Count(*) FROM D:\Logs\ex081110.log Group By cs-uri-stem Order by cs-uri-stem desc" -i:w3c

Big difference, isn't it? :)

Here is a simple PowerShell script using Log Parser. This is a little easier to follow than the batch file example in the last post.
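Something like this (the paths are placeholders):

# Prompt for the log and report paths, then let logparser.exe do the work
$log = Read-Host "Path to the IIS log file"
$csv = Read-Host "Path for the CSV report"

$query = "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits " +
         "INTO $csv FROM $log GROUP BY cs-uri-stem ORDER BY Hits DESC"

& "C:\LogParser\logparser.exe" $query -i:W3C -o:CSV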

The drawback to these examples is we are not really gaining the full benefit from PowerShell. We could write some functions in PowerShell and convert the text output from Log Parser into PowerShell objects, but there is an easier way to do this.

OK, now for the second way to interact with Log Parser from PowerShell: the Log Parser COM component (logparser.dll). This component installs with Log Parser and should be registered and ready to use if you have installed Log Parser.

The COM component exposes a simple object model consisting of only three main objects: LogQuery, LogRecordSet, and LogRecord. There are also a series of objects for the input and output formats.

The COM component is more difficult to use, but it has the advantage that data can be returned in object form. We will look at this in detail starting in Log Parser and PowerShell – Part II.

Best Regards,

Dave

Log Parser Basics

Welcome!

Log Parser is a great tool for analyzing many types of text data. It is a free download from Microsoft that installs a command line version and a COM component, and it lets you use the SQL language to query most types of fixed or delimited text data.

I recommend this article as an excellent place to begin learning to use Log Parser. It was written by the author of Log Parser, Gabriele Giuseppini, and is a good overview. After reading it, a great next step is to download Log Parser 2.2 here, install it, and take a look at the compiled help file.

The help file for the product is pretty good; the first two sections have enough information to get a good start. General use of Log Parser is covered there and in other places, so I will only give a brief overview here.

I believe most people probably start using Log Parser directly on the command line.

Example: C:\LogParser\logparser.exe "SELECT Top 10 cs-uri-stem, Count(*) FROM D:\Logs\ex081110.log Group By cs-uri-stem Order by cs-uri-stem desc" -i:w3c

Another good way to experiment with Log Parser is to use a batch file and a SQL file.

Example – Top 10 Web Pages:

Batch File

This batch file prompts for the input, output, and SQL query files and executes Log Parser to query a W3C log file. Log Parser allows the use of variables inside the SQL query; the example below inserts the values for the input and output files entered at the prompts into the query.
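A sketch of the batch file (the logparser.exe path is a placeholder; the file:query.sql?name=value+name=value form is how Log Parser substitutes %name% variables inside a SQL file):

@echo off
set /p INFILE=Enter input log file: 
set /p OUTFILE=Enter output file: 
set /p SQLFILE=Enter SQL query file: 
"C:\LogParser\logparser.exe" file:%SQLFILE%?infile=%INFILE%+outfile=%OUTFILE% -i:W3C -o:CSV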


Log Parser SQL File

This SQL file gets the top ten pages by hits from a w3c log file.
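It looks something like this (%infile% and %outfile% are the variables the batch file fills in at run time):

SELECT TOP 10 cs-uri-stem AS Page, COUNT(*) AS Hits
INTO %outfile%
FROM %infile%
GROUP BY cs-uri-stem
ORDER BY Hits DESC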

These are basic examples of how Log Parser works. Next time I will talk about using Log Parser from PowerShell.

Best Regards,

Dave