Tuesday, December 6, 2011

Monitoring Microsoft Cluster Shared Volumes

So you built a nice failover cluster (in my case, for Hyper-V) using Cluster Shared Volumes on Windows Server 2008 R2.  Great! 

How do you know when they are full?  The base product doesn't give you a quick and easy way of monitoring disk usage of a CSV, only of the base drive letter.  The latest version of Operations Manager DOES appear to have counters to monitor and alert on CSVs, but if you don't have that running you might want to consider this shoestring approach:

Every morning I (and the other admins) receive an email similar to this:

(screenshot of the daily CSV status email not shown)

First, the brains of this operation came from this MSDN blog post: 
http://blogs.msdn.com/b/clustering/archive/2010/06/19/10027366.aspx
I modified the script a bit to get just the bare-minimum results I need, formatted to read well in an email.

Second, I am a batch file guy from way back; I use PowerShell now, but I'm by no means a guru and never got around to learning VBScript.  I'm sure one of you reading this will say "But you can send SMTP mail directly from PowerShell," and you would be correct, but I haven't tried that yet.  If anybody does make that happen, or parses the text output in a more meaningful way, I would love to hear about it - leave a reply!  One idea: if the last column could be grep'd for values, you could send the email only when a value drops below x%.
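For what it's worth, here is an untested sketch of that "only email when below x%" idea.  The two sample objects stand in for the real $objs collection that the script below builds, and the commented-out Send-MailMessage line (a built-in cmdlet since PowerShell 2.0) shows how you could skip bmail entirely - server name and addresses are placeholders:

```powershell
# Sketch: flag CSVs under a free-space threshold and only then send mail.
# The sample objects below stand in for the $objs the script below collects.
$objs = @(
    (New-Object PSObject -Property @{ Name = "Cluster Disk 1"; PercentFree = 42.5 }),
    (New-Object PSObject -Property @{ Name = "Cluster Disk 2"; PercentFree = 7.9 })
)

$threshold = 10   # alert when a CSV has less than 10% free

# @( ) forces an array so .Count works even when only one volume matches
$low = @($objs | Where-Object { $_.PercentFree -lt $threshold })

if ($low.Count -gt 0) {
    $body = ($low | ForEach-Object {
        "{0} is at {1:N2}% free" -f $_.Name, $_.PercentFree
    }) -join "`r`n"
    $body
    # To send directly from PowerShell (placeholder server/addresses):
    # Send-MailMessage -SmtpServer "smtpserver.yourdomain.com" `
    #     -To "your-email@yourdomain.com" -From "Clustername@yourdomain.com" `
    #     -Subject "Clustername CSV low space" -Body $body
}
```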

The idea is that we are going to run that PowerShell script from Task Scheduler (I have it set for once a day, which suits my needs) and have a command-line mailer send me the results. 

You will need:
  1. Admin access to one of the servers hosting the CSVs
  2. A command line SMTP sender.  I like BMAIL - it's been around for years; it works, it's free, and it's reliable:
    http://www.beyondlogic.org/solutions/cmdlinemail/cmdlinemail.htm

The process:
  • Create a folder on C:\ called "jobs".  If you use a different name you will have to modify my paths accordingly.
  • Save the text below as "DisplayCSVInfo2.ps1" in that folder
Import-Module FailoverClusters

# Gather name, path, size, and free space for every Cluster Shared Volume
$objs = @()
$csvs = Get-ClusterSharedVolume
foreach ( $csv in $csvs )
{
   $csvinfos = $csv | select -Property Name -ExpandProperty SharedVolumeInfo
   foreach ( $csvinfo in $csvinfos )
   {
      $obj = New-Object PSObject -Property @{
         Name        = $csv.Name
         Path        = $csvinfo.FriendlyVolumeName
         Size        = $csvinfo.Partition.Size
         FreeSpace   = $csvinfo.Partition.FreeSpace
         PercentFree = $csvinfo.Partition.PercentFree
      }
      $objs += $obj
   }
}

# Format the results as a table (sizes converted to GB) and write them to the
# file that csvcheck.cmd mails out
$objs | ft -auto Name,Path,
   @{ Label = "Size(GB)"      ; Expression = { "{0:N2}" -f ($_.Size/1024/1024/1024) } },
   @{ Label = "FreeSpace(GB)" ; Expression = { "{0:N2}" -f ($_.FreeSpace/1024/1024/1024) } },
   @{ Label = "PercentFree"   ; Expression = { "{0:N2}" -f ($_.PercentFree) } } |
   Out-File c:\jobs\result.txt
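Before wiring anything into the scheduler, it's worth running the script once by hand and eyeballing the output file.  If your execution policy blocks unsigned local scripts, the Bypass switch gets around it for a single invocation:

```
powershell.exe -ExecutionPolicy Bypass -File C:\jobs\DisplayCSVInfo2.ps1
type c:\jobs\result.txt
```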



  • Save the Bmail executable (bmail.exe) in the same folder.
  • Save the following to a text file named "csvcheck.cmd":
del c:\jobs\result.txt
powershell.exe C:\jobs\DisplayCSVInfo2.ps1
c:\jobs\bmail -s smtpserver.yourdomain.com -h -t your-email@yourdomain.com -f Clustername@yourdomain.com -a Clustername_CSV_STATUS -m c:\jobs\result.txt -c

To break that file down, it is saying:
  1. delete the old results file
  2. call PowerShell to run the data-generating script
  3. run Bmail.exe with these switches/arguments:
    • -s your_mail_server_name  (you have to have an email system that accepts smtp messages from your server.  I'm using Exchange 2010)
    • -h  (create standard headers)
    • -t  your_email_addy  ("To:"  I set up a group with all the sys admins' emails in it)
    • -f  some_real_or_fake_email_addy ("From:"   doesn't have to be a real email address, just have the usual format to get through basic spam filtering routines)
    • -a some_text  ("Subject:"  make it anything you like; I use the name of the cluster being monitored)
    • -m c:\jobs\result.txt  (use the indicated text file as the body of the message)
    • -c  (put a CR/LF in between the body and header so Outlook will display it properly)
  4. Schedule the job in Task Scheduler.  Make sure the account you use can create and delete files in the c:\jobs directory you created.  (Local System is probably fine.)
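If you'd rather create the task from the command line than click through the Task Scheduler UI, something along these lines should do it (the task name and start time are arbitrary; run it from an elevated prompt):

```
schtasks /Create /TN "CSV Check" /TR "c:\jobs\csvcheck.cmd" /SC DAILY /ST 06:00 /RU SYSTEM
```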
