SQL Server, PowerShell, and XtremIO Snapshots Part 1

XtremIO is becoming a popular platform for SQL Server. It performs excellently and offers great space savings for database copies when using XtremIO snapshots. I have had a few customers asking questions about scripting snapshots for SQL Server on XtremIO using PowerShell.

Most Windows administrators these days are using PowerShell, and many SQL DBAs are also starting to use it. I decided to set up EMC AppSync with XtremIO in our lab, do a little testing, and create some PowerShell scripts to help our customers get started. My plan was to test two different scenarios, one for crash-consistent and one for application-consistent snapshots with EMC AppSync, but EMC announced a new native VSS provider with XtremIO 4.0, so there will also be another way to do application-consistent snapshots when 4.0 is released. This post will cover creating a crash-consistent copy of a SQL database volume using XtremIO snapshots and mounting the snapshot to create a secondary QA database. This method will use the XtremIO REST API to create a snapshot of the source volume. PowerShell will be used to execute the required steps on XtremIO, VMware, and Windows.

The test environment is a SQL Server virtual machine on vSphere. The SnapTest01 volume is on a 50GB RDM on XtremIO and the SnapTest01_QA volume is a snapshot of the SnapTest01 volume.

 

[Screenshot: SQL Server test environment]

The example script will show the process to refresh the QA volume with a new snapshot copy. Here is the basic logical flow of the process.

[Diagram: snapshot refresh process flow]

The first step is to load a few PowerShell modules, define some constants, and connect to vCenter and XtremIO. This is done by using PowerCLI and a function from my MTSXtremIO module, which you can read about here. This function uses the XtremIO REST API to create the snapshot. I also use a couple of other modules with some of my common functions and an NTFS Security module which I did not write. I will put links to those at the end of the post.
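The code looks roughly like this (server names are examples from my lab, and the Set-XIOAPIConnectionInfo parameter names are approximate, so check the module help):

# Load module dependencies
Add-PSSnapin VMware.VimAutomation.Core          # VMware PowerCLI
Import-Module MTSXtremIO                        # XtremIO REST API functions
Import-Module MTSAuthentication                 # common functions, password file handling
Import-Module NTFSSecurity                      # NTFS permissions module

# Load SQL Server Management Objects (SMO)
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null

# Constants for the environment
$VIServer = 'vcenter01.vlab.local'
$XMSName  = 'xms01.vlab.local'
$SQLHost  = 'SQL2012-01.vlab.local'

# Connect to vCenter and the XtremIO XMS
Connect-VIServer -Server $VIServer
Set-XIOAPIConnectionInfo -Hostname $XMSName -Username 'admin' -PasswordFile 'C:\Scripts\XtremIO.pwd'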

The example above loads module dependencies and connects to vCenter and XtremIO. The SQL Management Objects are loaded to provide SQL Server functionality.

The next step is to detach the current QA database copy, remove the virtual hard disk, and delete the old snapshots, with a couple of HBA rescans along the way.
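In simplified form (the $OldSnapNAA variable would hold the NAA ID of the existing snapshot, and the XIO delete cmdlet name is approximate):

# Detach the QA database using SQL Management Objects
$smo = New-Object Microsoft.SqlServer.Management.Smo.Server $SQLHost
$smo.KillAllProcesses('SnapTest01_QA')              # close open connections first
$smo.DetachDatabase('SnapTest01_QA', $false)

# Remove the RDM virtual hard disk from the VM
$vm = Get-VM -Name 'SQL2012-01'
Get-HardDisk -VM $vm -DiskType RawPhysical |
    Where-Object {$_.ScsiCanonicalName -eq $OldSnapNAA} |
    Remove-HardDisk -DeletePermanently -Confirm:$false

# Rescan HBAs so the host lets go of the old device
Get-VMHost | Get-VMHostStorage -RescanAllHba | Out-Null

# Delete the old snapshot via the XtremIO REST API
Remove-XIOVolume -Name 'SnapTest01_QA'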

The example above uses SQL Management Objects to access SQL and detach the database. It then uses the VMware PowerCLI to remove the RDM from the virtual machine. Finally, it connects to the XtremIO via the REST API and deletes the old snapshots.

Now we are ready to create a new snapshot, add it to the LUN map, add the disk to the VM, and attach the database.
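Simplified, the rebuild looks like this (the XIO cmdlet and parameter names and the file paths are approximate):

# Create a new snapshot of the source volume via the REST API
New-XIOSnapshot -Source 'SnapTest01' -Name 'SnapTest01_QA'

# Map the snapshot to the host's initiator group
New-XIOLunMap -Volume 'SnapTest01_QA' -InitiatorGroup 'ESXCluster01'

# Rescan, then add the snapshot back to the VM as a physical-mode RDM
Get-VMHost | Get-VMHostStorage -RescanAllHba | Out-Null
$lun = Get-ScsiLun -VmHost (Get-VMHost | Select-Object -First 1) -LunType disk |
    Where-Object {$_.CanonicalName -eq $NewSnapNAA}
New-HardDisk -VM $vm -DiskType RawPhysical -DeviceName $lun.ConsoleDeviceName | Out-Null

# Attach the database from the files on the mounted snapshot
$files = New-Object System.Collections.Specialized.StringCollection
[void]$files.Add('S:\SQLData\SnapTest01.mdf')
[void]$files.Add('S:\SQLData\SnapTest01_log.ldf')
$smo.AttachDatabase('SnapTest01_QA', $files)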

The above example creates a snapshot and maps the volume to the host using the XtremIO REST API. It also rescans the disks and then adds the RDM to the virtual machine. Then the database is attached using SQL SMO.

This completes the database refresh. I hope someone finds this helpful. The next post will introduce AppSync into the mix for application consistency and some additional benefits.

Regards,

Dave

MTSXtremIO Module
NTFSSecurity Module
MTSAuthentication Module

EMC Knowledge Sharing Competition – Article Selected for Publishing

In the fall of last year I learned about the 2015 EMC Knowledge Sharing Competition. This is a writing competition hosted by EMC to promote the sharing of information about technology topics. All EMC certified professionals are welcome to submit abstracts for consideration in the contest. I thought the work I had done with Data Domain Boost (DD Boost) for SQL Server would be a good topic and decided to give the contest a shot.

The previous work I have done on this topic can be found at the links below.

Data Domain Boost for SQL – Deployment Planning Considerations for DBAs

SQL Server Data Domain Boost Scripting Toolkit

Toolkit Project on GitHub

In November I was notified my abstract was accepted and it was time to get started writing the article. It was a lot of work, especially because I was in the middle of writing and publishing the SQL Server Data Domain Boost Scripting Toolkit. It turned out to be a great experience, and last week I was notified my article, How to Boost SQL Server Backups with Data Domain, was selected for publishing. It will be included in the 2015 book of abstracts and published at some point during the year.

Find out more about the EMC Knowledge Sharing Competition here.

While I was not selected as a finalist, the experience was enjoyable and interesting. I think this is a very cool event EMC does, and I appreciate being part of it; I'm sure it takes a lot of work internally. I will be looking forward to the awards session at EMC World 2015 to hear the winners announced and congratulate them.

Regards,

Dave

SQL Server Data Domain Boost Scripting Toolkit

Over the past several months I have worked with customers who have been implementing EMC Data Domain Boost for SQL Server. A common challenge for these customers is automating backups and restores using SQL Server tools such as T-SQL, maintenance plans, and agent jobs. I have posted about this previously here but found some organizations still needed additional help getting started. This led me to create the SQL Server Data Domain Boost Scripting Toolkit to give DBAs a head start. Currently the toolkit provides T-SQL stored procedures and a table definition that add functionality on top of EMC's Data Domain Boost for SQL Server.

The toolkit is published on GitHub at this address https://github.com/dmuegge/ddb-sql-toolkit. The toolkit is licensed under the MIT open source license agreement.

With this post I felt a demo video would provide the most value, and you can also find documentation on using the toolkit in the scripts and files included with it. The video is approximately 24 minutes. Enjoy!
 

 

The goal of the SQL Server Data Domain Boost Scripting Toolkit is to give DBAs a head start on T-SQL scripting and automation using DD Boost for SQL Server, which is part of Data Domain Boost for Microsoft Applications. The initial toolkit utilizes the xp_cmdshell system stored procedure in a set of T-SQL stored procedures to execute the DD Boost for SQL executables. I hope to extend this toolkit in the future with additional options for utilizing SQL DD Boost.

See the previous post I wrote which led me to create the SQL Server Data Domain Boost Scripting Toolkit. I hope someone finds this toolkit useful. Please provide any feedback and updates to the toolkit.

Regards,

Dave

 

EMC XtremIO and PowerShell

I have had the opportunity to work with XtremIO quite a bit lately. One of the benefits of working for an EMC partner is lab gear. :) XtremIO has a REST-based API, and I wondered what others in the community had done with the API for XtremIO and PowerShell. I started searching for the available PowerShell functionality and found a couple of different modules.

The first one I ran across is here; a module created by Matt Boren. It has some great functionality and I explored it quite a bit. It is written to take advantage of PowerShell features such as the pipeline. The primary challenge I had was understanding the code well enough to be able to extend it and have a good handle on how it works. It has some complex decision and looping structures which I had difficulty following. The next module I found was here; a module created by Brandon Kvarda. This module also has good functionality and although I did not explore it as much as the module mentioned above, it had some tidbits to leverage. The primary reason I did not want to take this module and run with it was some of the development paradigms. I personally like to use a more “PowerShelly” type of approach utilizing PowerShell objects and leveraging the pipeline differently than this module.

I decided to learn some things from the code and create my own module. The articles and code provided some helpful insight and shortened the learning process. My end goal was to create a module which I better understood and could easily extend, while keeping things as simple as possible. I will provide some examples on using the PowerShell module for XtremIO and you can find it here on GitHub.

The module provides standard CRUD functionality from the XtremIO REST API. It has full coverage of all the HTTP GET functions as well as PUT, POST, and DELETE for the most common objects. To start with I am just ignoring certificate errors; adding certificate support to the login process, for both validation and authentication, is on my list. I am currently using a simple method to store password information in a file to enable creation of scripts that can be run automatically. So let's go through the process of setting the module up, preparing a password file, and executing some commands.

The first thing that is required is to place the module in your modules folder. I will not go into specifics here as there are many sources of that information. Once the module is loaded, the first step is to create a password file. The great thing about this method is the password file is encrypted based on the credentials of the currently logged on user and machine. It cannot be used on another machine or by another account on the same machine. So if you are going to set up a password file to be used in a scheduled task, make sure you create the file using the account that will run the scheduled task.
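Under the covers the password file function is just the standard DPAPI pattern, something like this minimal sketch:

function New-PasswordFile {
    param([string]$Path)
    # ConvertFrom-SecureString uses DPAPI, so the file is tied to this user and machine
    Read-Host -Prompt 'Password' -AsSecureString |
        ConvertFrom-SecureString |
        Out-File -FilePath $Path
}
New-PasswordFile -Path 'C:\Scripts\XtremIO.pwd'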

Here is a screenshot of the password file creation process.

Here we see the contents of the encrypted password file.

Now that the authentication is set up and ready to use I will talk about the functionality of the module. If we issue a Get-Command -Module MTSXtremIO it will return a list of all the cmdlets currently included in the module.

The first step is to connect to the XtremIO XMS management server. This is done by using the Set-XIOAPIConnectionInfo cmdlet. The example below shows the connection process and an example of the Get-XIOCluster cmdlet returning some basic information about the cluster.
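Something along these lines (parameter names are approximate):

# Connect to the XMS and pull basic cluster info
Set-XIOAPIConnectionInfo -Hostname 'xms01.vlab.local' -Username 'admin' -PasswordFile 'C:\Scripts\XtremIO.pwd'
Get-XIOCluster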

Now I will show an example of creating a volume folder and some volumes. First we will list the names of the current volume folders.

The next step is to create a new volume folder to hold the volumes. The example below shows this along with a listing of the new folder. Now the folder is ready for the new volumes.
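In sketch form (folder cmdlet names are approximate):

# List current volume folder names
Get-XIOVolumeFolder | Select-Object -ExpandProperty name

# Create a folder for the new volumes and list it
New-XIOVolumeFolder -Name 'SQLVolumes' -ParentFolder '/'
Get-XIOVolumeFolder | Where-Object {$_.name -like '*SQLVolumes*'}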

The example below shows how PowerShell can help to create 8 volumes very quickly.
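A rough version (the size parameter semantics depend on the module):

# Create eight volumes with one pipeline
1..8 | ForEach-Object {
    New-XIOVolume -Name ('SQLVOL{0:D2}' -f $_) -Size 1TB -ParentFolder '/SQLVolumes'
}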

The screenshot below shows an example of how we can view only the resulting volumes. Also in this example I show the VAAI thin provisioning alert setting. One thing to point out: the REST API does not actually set the various alert settings on volume creation; this is done as an additional task after the volumes are created.

The example below shows how this is accomplished using the Update-XIOVolume cmdlet and lists the updated volumes.
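Approximately (the alert parameter name here is a stand-in for the underlying REST property):

# Set the VAAI TP alerts on the new volumes after creation
Get-XIOVolume | Where-Object {$_.name -like 'SQLVOL*'} | ForEach-Object {
    Update-XIOVolume -Name $_.name -VaaiTpAlerts 'enabled'
}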

One note here: the New-XIOVolume and Update-XIOVolume cmdlets will be updated soon to accept pipeline input and not require the use of the ForEach-Object cmdlet. I discovered this flaw while creating this post. Doh! That update will also enable putting New-XIOVolume and Update-XIOVolume together on the pipeline so that creating volumes and setting options can happen in a simple one-liner.

The other common tasks which can be managed with the module today are LUN mapping and snapshots. I will be working on other functionality and adding it to the module as time permits. I hope folks find this useful, and I always appreciate feedback and comments. One last thing: if you have not had a chance to do any testing with an XtremIO all-flash array, I highly recommend it. It is quite fun.

MTSXtremIO PowerShell Module Download https://github.com/dmuegge/MTSXtremIO/releases

Regards,

Dave

Data Domain Boost for SQL – Deployment Planning Considerations for DBAs

Update: This work has been expanded upon in the following post – DD Boost T-SQL Scripting Toolkit

EMC recently released Data Domain Boost for Microsoft Applications. This tool includes a SQL Management Studio plugin that allows SQL backups via a familiar SQL GUI or a CLI application.

EMC whitepaper – The business value of Data Domain Boost

This tool does a good job of keeping control of backups in the hands of a DBA and taking advantage of the benefits of Data Domain.

While helping a client recently, I had a chance to dig into DD Boost for SQL and do some testing. This post outlines some key things I learned about the architectural aspects of the software, as well as some potential operational impacts organizations should be aware of when planning an implementation. The amount of impact on your particular organization will vary depending on how SQL backups are managed today. In most cases the backups are currently managed by the SQL DBAs, which is a common reason organizations choose DD Boost for SQL.

In this scenario there are some important questions to ask.

  1. Are backups managed by T-SQL in agent jobs?
  2. Are backups managed with maintenance plans?
  3. What automation is done in jobs? Particularly automation in restores.
  4. What logging and reporting is done today?
  5. Can T-SQL backup routines be modified?
  6. Can xp_cmdshell be enabled on servers?

 

Many, if not most, DBAs have T-SQL procedures running the backups. These procedures commonly have additional logic for other operations, logging, and reporting. The Data Domain Boost plugin for SQL Server runs as a command-line executable, with a GUI front end to help create the commands or run the process directly.

The backup program is ddbmsqlsv.exe and the restore program is ddbmsqlrc.exe. The DD Boost administrator guide recommends using Windows Task Scheduler to execute these commands and manage backup and restore jobs. This presents an issue for many DBAs because they have a much greater set of functionality in the SQL Agent job engine. Additionally, agent jobs and T-SQL procedures have easy access to all relevant data in the SQL environment needed to drive the logic of the DBAs' operational processes.

This hurdle can be overcome, but there is tradeoff and compromise involved. The DD Boost plugin can be run from T-SQL scripts using the xp_cmdshell extended stored procedure. This allows the DBAs to keep the backup jobs and processes running in T-SQL and the agent. The trade-off is the increased attack surface for SQL Server. The following command examples show the use of xp_cmdshell with the DD Boost applications.
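For example, the backup and restore commands from later in this post wrapped in xp_cmdshell:

-- Full backup via DD Boost from T-SQL
EXEC master..xp_cmdshell 'ddbmsqlsv.exe -c SQL2012-01.vlab.local -l full -a "NSR_DFA_SI=TRUE" -a "NSR_DFA_SI_USE_DD=TRUE" -a "NSR_DFA_SI_DD_HOST=dd-01.vlab.local" -a "NSR_DFA_SI_DD_USER=ddboost" -a "NSR_DFA_SI_DEVICE_PATH=/SQL" "MSSQL$INST1:DB1"';

-- Restore of a specific backup image identified by timestamp
EXEC master..xp_cmdshell 'ddbmsqlrc.exe -c sql2012-01.vlab.local -f -t "08/06/2014 10:55:45" -S normal -a "NSR_DFA_SI=TRUE" -a "NSR_DFA_SI_USE_DD=TRUE" -a "NSR_DFA_SI_DD_HOST=dd-01.vlab.local" -a "NSR_DFA_SI_DD_USER=ddboost" -a "NSR_DFA_SI_DEVICE_PATH=/SQL" "MSSQL$INST1:DB1"';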

This method should work for most organizations that are OK with the trade-offs. One of the trade-offs will likely be some logic changes in the SQL Agent jobs for backup. One question a DBA posed to me was how to run the backup from a SQL Agent job and be able to report on success while keeping a log. Here is the quick solution I came up with. This should be a good start for most DBAs, and I am sure many can improve on this greatly.

The first step is to enable xp_cmdshell on the server; there are many places on the internet that show how to do this. Then we need to create a job for the backup.
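For reference, enabling it comes down to:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;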

The job has a single step to run the T-SQL.

The following T-SQL code runs the job.
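A simplified sketch of that code (the success string to match is an assumption, so check the actual ddbmsqlsv.exe output):

-- Capture the DD Boost output, scan for a success string,
-- and raise an error to the job engine if it is not found.
DECLARE @Output TABLE (Line NVARCHAR(4000));
DECLARE @Line NVARCHAR(4000);
DECLARE @Success BIT = 0;

INSERT INTO @Output (Line)
EXEC master..xp_cmdshell 'ddbmsqlsv.exe -c SQL2012-01.vlab.local -l full -a "NSR_DFA_SI=TRUE" -a "NSR_DFA_SI_USE_DD=TRUE" -a "NSR_DFA_SI_DD_HOST=dd-01.vlab.local" -a "NSR_DFA_SI_DD_USER=ddboost" -a "NSR_DFA_SI_DEVICE_PATH=/SQL" "MSSQL$INST1:DB1"';

DECLARE OutputCursor CURSOR FOR
    SELECT Line FROM @Output WHERE Line IS NOT NULL;
OPEN OutputCursor;
FETCH NEXT FROM OutputCursor INTO @Line;
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @Line;  -- echoed into job history when step output logging is enabled
    IF @Line LIKE '%backup%complete%' SET @Success = 1;  -- success string assumed
    FETCH NEXT FROM OutputCursor INTO @Line;
END;
CLOSE OutputCursor;
DEALLOCATE OutputCursor;

IF @Success = 0
    RAISERROR('DD Boost backup did not report success.', 16, 1);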

This code uses an in-memory table and a cursor to loop through the output of the DD Boost command. This allows the detection of success strings; if they are not found, an error is raised to the job engine. The script also prints the command output, which can be captured in the history logs. I am sure there are many SQL gurus out there who can improve this code. Please post your improvements; they are welcome.

The screenshot below shows the job step advanced properties page. The "Include step output in history" option will allow the output to be seen in the job activity history.

The screenshot below shows the job activity monitor log viewer, which has both successful and failed jobs.

The output of the backup command is stored in the message field in the history. This provides a way to go back and find out what happened to the backup.

I believe this method should be sufficient for most organizations except those with strict security requirements and a high level of automation in database restores. The security requirement is due to the use of xp_cmdshell, which some organizations may not want to use. The level of automation in restores could pose an issue because of how specific backup sets are accessed via the DD Boost application.

There is at least one situation I know of that could present issues. If automated database restores to particular points in time are required, this may be very difficult to script reliably. With a normal SQL backup, the information about that backup is recorded in tables in the MSDB database. When DD Boost for SQL does a backup, it does invoke a SQL backup which logs the job information to the MSDB database. However, the DD Boost plugin GUI appears to retrieve the backup image data from the Data Domain file system.

This causes an issue in scripting a DD Boost restore of a specific image. The command-line tool allows for two modes to identify the backup image used for restore: it will restore the last backup by default, or if the -t switch is supplied with a timestamp of the backup image it will use that particular image. The problem is the timestamp comes from the Data Domain file system timestamp for the backup image (I think; someone from EMC correct me if I am wrong). There may also be a way to query this, but it would be via SSH and I have not gone down that road.

In many environments DBAs will use backup set or file naming schemes in their processes. Since the DD Boost app only allows a timestamp to identify the backup, the lack of timestamp correlation between Data Domain and SQL causes an issue. The command and output below illustrate the issue.

ddbmsqlsv.exe -c SQL2012-01.vlab.local -l full -a "NSR_DFA_SI=TRUE" -a "NSR_DFA_SI_USE_DD=TRUE" -a "NSR_DFA_SI_DD_HOST=dd-01.vlab.local" -a "NSR_DFA_SI_DD_USER=ddboost" -a "NSR_DFA_SI_DEVICE_PATH=/SQL" "MSSQL$INST1:DB1"

Abbreviated Log Output

43708:(pid 3208):Start time: Wed Aug 06 10:55:44 2014
43621:(pid 3208):Computer Name: SQL2012-01 User Name: administrator
….
43709:(pid 3208):Stop time: Wed Aug 06 10:56:05 2014

This job does get logged in the MSDB database like other SQL jobs, but unfortunately the timestamps do not match up to the DD Boost data.

When looking at the DD Boost restore for the above backup, the time shown on the restore screen does not show the same backup start and end times as the msdb.backupset table. This is an issue because a backup could be found by description in a query, but DD Boost save sets cannot be referenced properly from DBAs' T-SQL maintenance scripts as the timestamps do not correlate.



ddbmsqlrc.exe -c sql2012-01.vlab.local -f -t "08/06/2014 10:55:45" -S normal -a "NSR_DFA_SI=TRUE" -a "NSR_DFA_SI_USE_DD=TRUE" -a "NSR_DFA_SI_DD_HOST=dd-01.vlab.local" -a "NSR_DFA_SI_DD_USER=ddboost" -a "NSR_DFA_SI_DEVICE_PATH=/SQL" "MSSQL$INST1:DB1"

It is not likely many organizations will have a level of automation in their SQL database restores that runs into this, but it is something to be aware of when considering a DD Boost for SQL deployment. I am sure features will be added in future releases that will correct or eliminate this issue. In the meantime, EMC, could you please provide a way for DBAs to reference the DD Boost jobs/image files from T-SQL?

Regards,

Dave

PowerShell Meets Xplorer2 for an ESXTOP Relog

This is a topic I have been meaning to write about for a long time. I was recently working with this scenario and thought it would be a good example. First I am going to get a little nostalgic to provide some context. Back in the late 80's and early 90's, in my first days of computing in the DOS world, a favorite utility of mine was a file manager called Norton Commander. This was a very feature-rich, text-based, dual-pane file manager. Here is a screenshot; I hope it brings back some good memories. If this does not look familiar, then hopefully there is some historic or comic value.

[Screenshot: Norton Commander]

When Windows 3.0 was introduced and the primary interface became Program Manager, I could not believe it. Who wanted to run a computer using pictures? How absurd. :) So I surrendered my beloved Norton Commander and was forced to use the wonderful Windows File Manager. Here is a screenshot so you too can relive the feature deficits.

[Screenshot: Windows File Manager]

Of course this became Windows Explorer, which we all know and settle on using. I always wanted a file manager with that familiar feel of the Norton Commander dual-pane interface, but always settled for Windows Explorer. When I started working with PowerShell several years ago, my need for a better file manager became apparent. I searched and found an application called xplorer2, which had the dual-pane look and feel I was looking for along with a lot of customizability. It turned out to be an excellent complement to PowerShell. OK, so there's the point of the nostalgia.

I am going to talk about a few different topics in this post, but my goal is to provide a real-world example of using PowerShell with xplorer2. Here is the xplorer2 interface in dual-pane configuration as I use it. It can be customized extensively, and I will not go into many of the features and options. This is not meant to be an xplorer2 advertisement; I am just a satisfied customer. Check it out at http://zabkat.com.

[Screenshot: xplorer2 dual-pane interface]

This application can be used to enhance the navigation and launching of scripts, and PowerShell is a great example. The application has the ability to create bookmarks with keyboard shortcuts, custom columns, folder groupings, and various other helpful file and folder features. IMHO, the best features of the application which complement PowerShell are user commands coupled with keyboard shortcuts and $-tokens. These provide a powerful way to launch PowerShell scripts and feed data into them.

Here is an example. I have some ESXTOP CSV performance files that I need to merge. There are certainly several ways to do this, and it could be done by manipulating the text files. The ESXTOP file is a standard PDH-format .csv file and can be read and manipulated by many tools, including a Windows command-line tool called relog.exe. This tool is found on Windows XP systems and above and is used to manipulate any standard PDH-format performance files. It can be used to perform a variety of tasks on the files. Here is the help text.

This command will be used in PowerShell scripts to create an easy tool for converting and merging performance logs. The first step to make this work in the xplorer2 environment is to setup the user commands. The screenshot below shows the user commands menu and functionality of the application.

[Screenshot: xplorer2 user commands menu]

The organize dialog lets you create and customize the commands and define keyboard shortcuts.

[Screenshots: xplorer2 user command organize dialog and keyboard shortcut settings]

Here are some examples of commands I use all the time.

C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -noexit $F

The above command runs the currently selected PowerShell script. The $F is a token in the xplorer2 environment which represents the currently selected file on the left pane. Simply select a PowerShell script and use the alt-0 keyboard shortcut.

C:\Elevation\elevate.cmd C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -noexit $F $R

The above command runs the currently selected PowerShell script with the right visible directory path as an argument. The $R is a token in the xplorer2 environment which represents the right-side visible directory path. The command also uses the old elevate VBScript to get an admin window. I welcome someone to clue me in on a better way to do this.

C:\Elevation\elevate.cmd C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -noexit $F $G

The above command runs the currently selected PowerShell script on the left with the inactive highlighted file on the right as the argument.

C:\Elevation\elevate.cmd C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -noexit $G $A

The above command runs the inactive highlighted PowerShell script on the left with the currently selected files on the right as an argument.

I will go back to our ESXTOP example to help make things clearer. In the example below I have multiple esxtop files from a host I would like to merge. The relog application merges binary logs very easily, so our first step is to convert to binary.

[Screenshot: conversion script and CSV files selected in xplorer2]

The screenshot above shows we have the PowerShell script to do the conversion highlighted on the left and the files to be converted selected on the right. We just press the alt-4 keyboard shortcut, which launches the script, and it converts our files for us using relog.

Here is an example of the script and the output.
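The script is essentially a thin wrapper around relog; a minimal version looks like this (xplorer2 passes the selected files in as arguments, so they land in $args):

# Convert-PerfLog.ps1 - convert each selected CSV file to binary format
foreach ($file in $args) {
    $out = [System.IO.Path]::ChangeExtension($file, 'blg')
    relog.exe $file -f BIN -o $out
}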

[Screenshot: conversion script output]

Here are all of our converted files ready to be merged. The files are sorted by extension and the binary files are selected to be run against the merge script highlighted on the left.

[Screenshot: converted binary files selected against the merge script]

The alt-4 keyboard shortcut is selected to run the script; relog merges the files and outputs in CSV format ready for further analysis.
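The merge script follows the same pattern, handing relog all of the selected binary files at once (the output file name here is just an example):

# Merge-PerfLogs.ps1 - merge all selected binary logs into a single CSV
$out = Join-Path (Split-Path $args[0] -Parent) 'esxtop_merged.csv'
relog.exe $args -f CSV -o $out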

[Screenshot: relog merge output]

[Screenshot: merged CSV file in xplorer2]

We now have a merged file ready for further analysis in Windows perfmon or other tools.

One item worth mentioning is the use of the $args variable. In most cases it would be recommended to use PowerShell parameters rather than $args. However, in this case it provides a simple method to utilize the $-tokens functionality of xplorer2.

I have found PowerShell and xplorer2 used together to be a very useful combination. I hope others will find this concept useful.

Regards,

Dave

 

EMC Isilon Platform API and PowerShell Part II

In my last post I talked about the Isilon REST-based platform API. I have been experimenting more with the Isilon API and PowerShell and thought I would share the progress so far.

I decided to create a PowerShell module to leverage the functionality exposed by the Isilon platform API. The module provides some advanced functions, script cmdlets if you prefer, for API access. Right now I have some 'get' cmdlets and a lot of work to do before I have full coverage of the API. Maybe I can catch up by the time there is full coverage of the Isilon functionality in the platform API. :)

The module uses basic authentication at this time as I am still working on other authentication options. The first step in using the module is to download it and place it in your modules directory. Then load the module and create a password file for logging on to the Isilon cluster. The New-PasswordFile command creates an encrypted file containing the supplied password. This file is used to supply authentication automatically. The file can only be used by the user who was logged on when the file was created.

Once this is complete, the following code will load a console with the cmdlets and set up authentication.
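The launch script boils down to something like this (the connection cmdlet and parameter names are placeholders; the module may name them differently):

# Start-IsilonPlatform.ps1 - load the module and set up authentication
Import-Module IsilonPlatform
Set-isiAPIConnectionInfo -Cluster 'isilon01.vlab.local' -Username 'root' -PasswordFile 'C:\Scripts\Isilon.pwd'
Get-Command -Module IsilonPlatform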

This code uses the password file we created earlier. Once it is executed we see the available Isilon cmdlets and we have a console to execute Isilon commands and scripts. I put this code into a script to launch an Isilon management console.

Now that we have a console with the commands loaded we can use them to get information from the Isilon system. Here is an example of a command to return all Isilon groups.
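For example (the cmdlet name is illustrative):

Get-isiAuthGroups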

Here we can see all groups are returned with all properties. We also see the results are returned as PowerShell objects. This gives us all of the PowerShell goodness when working with the Isilon platform API. The next example uses PowerShell to display just the information we want in an easier to read format.

The following example shows how we can query the groups, filter by type and control the display format. This example retrieves all users with the domain type of BUILTIN and displays the name and provider in a results table.

The cmdlets also allow using the pipeline, although I do not have pipeline functionality in all cmdlets yet; the user, group, and provider cmdlets do. The following example shows a listing of all Isilon ADS BUILTIN groups and the members of each group.
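Something like this (again, cmdlet names are illustrative):

# List ADS BUILTIN groups and the members of each
Get-isiAuthGroups |
    Where-Object {$_.name -like 'BUILTIN*'} |
    Get-isiAuthGroupMembers |
    Format-Table name, type, provider -AutoSize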

Hopefully someone finds this interesting, and I will try to provide more useful examples as I add functionality to the module. Here are the download links for the module and the console launch script. You will need to modify the console launch script (Start-IsilonPlatform.ps1) for your environment.

IsilonPlatform
Start-IsilonPlatform

This module is a work in progress so use at your own risk. I hope to provide more functionality soon and I hope to see EMC add more coverage to the API. I would really like to see more around cluster configuration and performance statistics. I think this is great functionality provided by EMC, more please!

Feedback on the module is welcome.

Regards,

Dave

Using the Isilon 7.0 ReST API with PowerShell

EMC recently released the Isilon 7.0 "Mavericks" version of the OneFS operating system. This release has many great new features, which you can read all about here and here. One of these great new Isilon features is the ReST API, which allows programmatic access to the platform. If you are not familiar with ReST, it stands for Representational State Transfer. It is a lightweight, platform-independent, and stateless method of programming web services.

PowerShell provides an easy method to access the Isilon ReST API. Working with ReST is new for me, but I thought it might be useful for some to follow along while I am learning. Also, if anyone has tips for me on this process, I welcome the knowledge.

The Isilon ReST API is not enabled by default. Enabling the functionality requires changing options on the HTTP settings page in the protocols section; see below.

[Screenshot: Isilon HTTP settings page]

The HTTP interface can use active directory authentication, but in this post I will use basic authentication and show examples of reading data from the cluster. I hope to show more advanced examples as I learn.

PowerShell v3 has some great built-in functionality for working with ReST APIs. The Invoke-RestMethod cmdlet is exactly the functionality required to leverage the Isilon ReST API. The first challenges when working with the API will be related to authentication and certificates. The Isilon cluster uses a self-signed certificate by default. This results in a certificate error when connecting via HTTPS, which can be seen when connecting to the Isilon cluster via a browser. The following code works around the problem by ignoring the error.
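It is the familiar ServicePointManager callback trick:

# Ignore self-signed certificate errors (fine for a lab, not for production)
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }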

In a production environment the correct way to handle this would be to install a certificate issued by a trusted certificate authority. The next step is to set up a proper HTTP header for basic authentication.
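Basic authentication just needs a Base64-encoded user:password pair in an Authorization header:

# Build a basic authentication header (credentials are illustrative)
$user = 'root'
$pass = 'P@ssw0rd'
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("${user}:${pass}"))
$headers = @{ Authorization = "Basic $encoded" }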

Once this is complete all we have to do is build the proper URL and issue the request. The code below will retrieve and display the SMB and NFS settings of the cluster.
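Something like the following (the exact settings URIs may vary by OneFS version, so check the platform API documentation):

# Retrieve SMB and NFS settings from the cluster
$baseUrl = 'https://isilon01.vlab.local:8080'
Invoke-RestMethod -Uri "$baseUrl/platform/1/protocols/smb/settings/global" -Headers $headers -Method Get
Invoke-RestMethod -Uri "$baseUrl/platform/1/protocols/nfs/settings/global" -Headers $headers -Method Get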

The output from the above examples is shown below. As you can see this gives a quick concise view of the protocol settings.

While this is only a simple example of retrieving data from the cluster, the possibilities are endless. Considering where we are in the transformation to cloud and automation, this type of enabling technology will be the foundation of great things to come.

Stay tuned…

Regards,

Dave

Merge Multiple EMC NAR files with PowerShell

While working on a project the other day I found the need to merge multiple NAR files. The NaviSecCLI provides a method to merge two NAR files but does not allow an option to merge multiple files. I was searching on the web for methods to do this and ran across a couple of scripts.

The first script I found was done in VBScript http://blog.edgoad.com/2011/03/merging-multiple-emc-nar-files.html

The second script I found was bash for linux http://jslabonte.wordpress.com/2012/02/01/how-to-merge-nar-files/

I thought this was something PowerShell could do much more easily, so here is a script to merge multiple NAR files. The script requires NaviSecCLI to be installed to work properly.
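The idea is simply to fold the file list with repeated two-file merges. A sketch (verify the analyzer -archivemerge switches against your NaviSecCLI version):

# Merge-NarFiles.ps1 - merge a list of NAR files pairwise with NaviSecCLI
param(
    [Parameter(Mandatory=$true)][string[]]$NarFile,
    [string]$OutFile = 'merged.nar'
)

$current = $NarFile[0]
for ($i = 1; $i -lt $NarFile.Count; $i++) {
    # Last pass writes the final output file; earlier passes use temp files
    $next = if ($i -eq $NarFile.Count - 1) { $OutFile } else { "merge_temp_$i.nar" }
    naviseccli analyzer -archivemerge -data $current $NarFile[$i] -out $next
    if ($i -gt 1) { Remove-Item $current }   # clean up intermediate files
    $current = $next
}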

I hope someone finds this useful.

Regards,

Dave

EMC VNXe Performance Analysis with PowerShell Part II

I appreciate the positive feedback I have received on the VNXePerformance module so far. I thought I would add to it and provide a script to generate a basic report. The script can be downloaded here.

The script will produce an HTML report and associated graphics with the following information.

  • Capacity information for the system and pools (total and allocated)
    • Maximum, Minimum, Average, Median
    • Historical graphs for system and each pool
  • Bandwidth usage per protocol
    • Maximum, Minimum, Average, Median
    • Historical graphs
  • IOPS usage per protocol
    • Maximum, Minimum, Average, Median
    • Historical graphs

Here is a sample.

The previous post used PowerGadgets for the charting functionality. That tool is not free, and it is also not yet supported with PowerShell 3.0. To correct this, I provided a function in this reporting script which uses the charting functionality in the .NET 4.0 framework. While this fixes the two issues mentioned, it does require more work to use, but it works well for our purposes here. This script uses the VNXePerformance.ps1 module from my previous post and a few new functions to produce an HTML report and associated graphic files. A command-line example to run the script is shown below.
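For example (parameter names here are illustrative; see the script itself for the real ones):

.\Start-VNXeHTMLPerformanceReport.ps1 -DatabasePath 'C:\VNXe\stats.db' -ReportPath 'C:\Reports\VNXe'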

The script uses data provided by the VNXePerformance module and the functions in the script to format and write the report data. Here is a brief description of the functions used.

Out-DataTable – this function converts the PSObject data output from the module functions to the System.Data.DataTable type. This is required for data binding to produce charts.

Out-LineChart – this function provides chart-generating functionality to produce a line chart from a provided DataTable and generate a .png graphic file.
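Stripped down, the heart of Out-LineChart looks something like this:

function Out-LineChart {
    param(
        [System.Data.DataTable]$DataTable,
        [string]$XColumn,
        [string]$YColumn,
        [string]$Path
    )
    Add-Type -AssemblyName System.Windows.Forms.DataVisualization

    $chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
    $chart.Width  = 800
    $chart.Height = 300
    $chart.ChartAreas.Add((New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea))

    # Bind the series to the DataTable columns and render as a line chart
    $series = $chart.Series.Add($YColumn)
    $series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Line
    $series.XValueMember  = $XColumn
    $series.YValueMembers = $YColumn

    $chart.DataSource = $DataTable
    $chart.DataBind()
    $chart.SaveImage($Path, 'Png')
}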

Get-SeriesRollup – This function creates summary data (maximum, minimum, average, median) for series data.

The following functions create HTML report output

  • ConvertTo-SeriesRollupHTML
  • Write-ChartHTML
  • Write-BlankHTMLTable
  • Write-HeaderHTMLTable

The first part of the script defines parameters, loads the charting assembly, contains the function declarations, and imports the module.

The next portion of the script sets the location of the SQLite database and begins the HTML report string.

The next portion of the script completes the report by using the VNXePerformance module to retrieve object data and then output HTML using the script functions.

The final portion of the script closes out the HTML file and writes it to disk.

This should provide a good starting point for reporting, though it has much room for improvement. Please comment with anything you discover about the SQLite data or with information you add to the report.

Start-VNXeHTMLPerformanceReport.zip
Regards,

Dave