Interacting with VMware vCO and the REST API using PowerShell – getting a list of workflows

Recently I needed to get a list of VMware vCO workflows from a remote server using PowerShell. A colleague of mine pointed me in the right direction by providing the URL to access the vCO REST API on, as well as letting me know what I needed to send in order to authenticate.

To connect and retrieve content in the PowerShell example below, we’ll need to:

  • Access the REST API URL for Orchestrator using a web client object
  • Send basic authentication in the header of our request

Notes:

One thing I did notice is that when you test the URL in a web browser, the result is returned to you as XML; however, when I used a web client object in PowerShell, the result came back as JSON. The PowerShell script below is therefore tailored to interpret the result as JSON. Because of this, you’ll need to make sure you are using PowerShell 3.0 or above, as the ConvertFrom-Json cmdlet is only available from PowerShell 3.0 onwards.
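If you’re not sure which version of PowerShell you’re running, a quick check like the following (just a convenience snippet, not part of the main script) will tell you:

# Check the PowerShell version - ConvertFrom-Json requires 3.0 or later
$PSVersionTable.PSVersion

# Confirm the ConvertFrom-Json cmdlet is actually available in your session
Get-Command ConvertFrom-Json -ErrorAction SilentlyContinue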

When sending your authentication details with the web client object request, make sure your username/password combo is used in this format:

Authorization: Basic username:password

This means the header you add to your web client object should use the string as per the above, but with the username:password part encoded using base64. The script below takes this all into account: all you need to do is provide your vCenter Orchestrator username and password to the PowerShell function, and it will handle the base64 encoding and pass the values to the web client itself.
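To make the encoding step concrete, here is the idea in isolation (the username and password are placeholders; the function below does exactly this for you):

# Illustrative only - the function handles this automatically
$pair    = "myuser:mypassword"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($pair))
$header  = "Authorization: Basic $encoded"
# $header now contains: Authorization: Basic bXl1c2VyOm15cGFzc3dvcmQ=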

Anyway, enough of that, let us get onto the actual script itself. This is presented as a PowerShell function. Load it into your session (copy-paste) or add it to your PS profile for future use. Apologies for the formatting – Syntax Highlighter really messes with the nice clean indentation I normally have in my scripts!

Here is a direct download of the PowerShell script if the script paste below doesn’t work for you:
[wpdm_file id=31]

 

Function Get-VcoWorkflow() 
{

<#
.SYNOPSIS
Fetches vCO Workflow information and details from a vCenter Orchestrator server

.DESCRIPTION
Fetches vCO Workflow information and details from a vCenter Orchestrator server

.PARAMETER Username
Username for the vCO server

.PARAMETER Password
Password for the vCO server

.PARAMETER Server
The vCO server hostname or IP address

.PARAMETER PortNumber
The port to connect on

.EXAMPLE
PS F:\> Get-VcoWorkflow -Username Sean -Password mypassword -Server 192.168.60.172 -PortNumber 8281

.LINK

http://www.shogan.co.uk

.NOTES
Created by: Sean Duffy
Date: 30/03/2014
#>

[CmdletBinding()]
param(
[Parameter(Position=0,Mandatory=$true,HelpMessage="Specify your vCO username.",
ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
[String]
$Username,

[Parameter(Position=1,Mandatory=$true,HelpMessage="Specify your vCO password.",
ValueFromPipeline=$false,ValueFromPipelineByPropertyName=$true)]
[String]
$Password,

[Parameter(Position=2,Mandatory=$true,HelpMessage="Specify your vCO Server or hostname.",
ValueFromPipeline=$false,ValueFromPipelineByPropertyName=$true)]
[String]
$Server,

[Parameter(Position=2,Mandatory=$true,HelpMessage="Specify your vCO Port.",
ValueFromPipeline=$false,ValueFromPipelineByPropertyName=$true)]
[ValidateRange(0,65535)] 
[Int]
$PortNumber
)

process 
{
	# Craft our URL and encoded authentication details. Note we escape the colons with a backtick.
	$vCoURL = "https`://$Server`:$PortNumber/vco/api/workflows"
	$UserPassCombined = "$Username`:$Password"
	$EncodedUsernamePassword = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($UserPassCombined))
	$Header = "Authorization: Basic $EncodedUsernamePassword"

	# Ignore SSL certificate warnings (vCO typically uses a self-signed certificate)
	[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

	# Create our web client object, and add a header for basic authentication with our encoded details
	$wc = New-Object System.Net.WebClient
	$wc.Headers.Add($Header)

	# Download the JSON response from the REST API, and convert it from JSON to objects
	$jsonResult = $wc.DownloadString($vCoURL)
	$jsonObject = $jsonResult | ConvertFrom-Json

	# Create a blank Report object array
	$Report = @()

	# Iterate over the results and transform the key/value pair formatting into a proper table.
	# Note I use a hashtable to populate a new PSObject with properties, using values found in the original JSON result
	foreach ($link in $jsonObject.link)
	{
		$kvps = $link.attributes
		$HashTable = @{}
		$kvps | foreach { $HashTable[$_.name] = $_.value } # For each attribute on the link object, populate the hashtable
		$NewPSObject = New-Object PSObject -Property $HashTable # Create a new PSObject and populate its properties/values using our hashtable

		# Add our populated object to our Report array
		$Report += $NewPSObject
	}

	return $Report
}
}

I hope that this script comes in handy, and gives you an idea as to how you can retrieve and convert data from vCO and its RESTful API for use in PowerShell 🙂
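Once the function is loaded, the output is just a normal collection of PowerShell objects, so you can sort, filter and format it as you like. For example (the property names depend on the attributes your vCO server returns, so treat “name” here as illustrative):

# Illustrative usage - sort the returned workflows by name and display them in a table
Get-VcoWorkflow -Username Sean -Password mypassword -Server 192.168.60.172 -PortNumber 8281 |
    Sort-Object name |
    Format-Table -AutoSize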

Results example after running the Function:
[Screenshot: Get-VcoWorkflow results]

 

Get Virtual Machine Inventory from a Hyper-V Failover Cluster using PowerShell

A colleague was asking around for a PowerShell script that would fetch some inventory data for VMs on a Hyper-V cluster the other day. Not knowing too much about Hyper-V and having only ever briefly looked at what was out there in terms of PowerShell cmdlets for managing Hyper-V, I decided to dive in tonight after I got home.

 

Here is a function that will fetch Inventory data for all VMs in a specified Failover Cluster. This is what it fetches:

  • VM Name
  • VM CPU Count
  • VM CPU Socket Count
  • VM Memory configuration
  • VM State (Up or Down)
  • Cluster Name the VM resides on
  • Hyper-V Host name the VM resides on
  • Network Virtual Switch Name
  • NIC Mac Address
  • Total VHD file size in MB
  • Total VHD Count

 

Being a function, you can pipe in the name of the cluster you want, for example Get-Cluster | Get-HyperVInventory. Or you could do Get-HyperVInventory -ClusterName “ExampleClusterName”. You could also send it to an HTML Report by piping it to “ConvertTo-HTML | Out-File example.html”

Download here, or copy it out from the script block below:
[download id=”15″]
 

# Requires: Imported HyperV PowerShell module (http://pshyperv.codeplex.com/releases/view/62842)
# Requires: Import-Module FailoverClusters
# Requires: Running PowerShell as Administrator in order to properly import the above modules

function Get-HyperVInventory {
<#
.SYNOPSIS
Fetches Hyper-V VM Inventory from a specified Hyper-V Failover cluster

.DESCRIPTION
Fetches Hyper-V VM Inventory from a specified Hyper-V Failover cluster

.PARAMETER ClusterName
The Name of the Hyper-V Failover Cluster to inspect

.EXAMPLE
PS F:\> Get-HyperVInventory -ClusterName "dev-cluster1"

.EXAMPLE
PS F:\> Get-Cluster | Get-HyperVInventory

.LINK
http://www.shogan.co.uk

.NOTES
Created by: Sean Duffy
Date: 09/07/2012
#>

[CmdletBinding()]
param(
[Parameter(Position=0,Mandatory=$true,HelpMessage="Name of the Cluster to fetch inventory from",
ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
[System.String]
$ClusterName
)

process {

	$Report = @()

	$Cluster = Get-Cluster -Name $ClusterName
	$HVHosts = $Cluster | Get-ClusterNode

	foreach ($HVHost in $HVHosts) {
		$VMs = Get-VM -Server $HVHost
		foreach ($VM in $VMs) {
			[long]$TotalVHDSize = 0
			$VHDCount = 0
			$VMName = $VM.VMElementName
			$VMMemory = $VM | Get-VMMemory
			$CPUCount = $VM | Get-VMCPUCount
			$NetSwitch = $VM | Get-VMNIC
			$NetMacAdd = $VM | Get-VMNIC
			# VM Disk Info - total up the size of all VHDs attached to the VM
			$VHDDisks = $VM | Get-VMDisk | Where { $_.DiskName -like "Hard Disk Image" }
			foreach ($disk in $VHDDisks) {
				$VHDInfo = Get-VHDInfo -VHDPaths $disk.DiskImage
				$TotalVHDSize = $TotalVHDSize + $VHDInfo.FileSize
				$VHDCount += 1
			}
			# Convert the total VHD size from bytes to MB
			$TotalVHDSize = $TotalVHDSize/1024/1024
			$row = New-Object -Type PSObject -Property @{
				Cluster = $Cluster.Name
				VMName = $VMName
				VMMemory = $VMMemory.VirtualQuantity
				CPUCount = $CPUCount.VirtualQuantity
				CPUSocketCount = $CPUCount.SocketCount
				NetSwitch = $NetSwitch.SwitchName
				NetMACAdd = $NetMacAdd.Address
				HostName = $HVHost.Name
				VMState = $HVHost.State
				TotalVMDiskSizeMB = $TotalVHDSize
				TotalVMDiskCount = $VHDCount
			} ## end New-Object
			$Report += $row
		}
	}
	return $Report
}
}

 

Example use cases – load the function into your PowerShell session, or place it in your $profile for easy access in future, and run the following:

# Example 1
Get-HyperVInventory -ClusterName "mycluster1"
# Example 2
Get-Cluster | Get-HyperVInventory
# Example 3
Get-HyperVInventory -ClusterName "mycluster1" | ConvertTo-HTML | Out-File C:\Report.html

 

The function includes help text and examples, so you can also issue the normal “Get-Help Get-HyperVInventory” or “Get-Help Get-HyperVInventory -Examples”. It is by no means perfect and could do with some improvements – for example, if there is more than one virtual switch network associated with a VM, all of the switch names and MAC addresses end up in that VM’s single row. Feel free to suggest any improvements or changes in the comments.
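For what it’s worth, one rough way to tidy that up would be to collapse the NIC values into a single comma-separated string per VM before building the report row. This is just a sketch against the same Get-VMNIC output the function already uses:

# Sketch: inside the VM loop, join multiple switch names / MAC addresses into single strings
$NetSwitch = ($VM | Get-VMNIC | ForEach-Object { $_.SwitchName }) -join ", "
$NetMacAdd = ($VM | Get-VMNIC | ForEach-Object { $_.Address }) -join ", "
# ...then assign these strings directly to the NetSwitch / NetMACAdd properties of the report row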

 

Veeam Backup stats report for all your VM Backup jobs in PowerShell

 

The other day I was asked to collect some statistics from our Veeam Backup & Replication server for as many VM backup jobs as possible. The environment has roughly 70 scheduled jobs that run either daily or weekly. After searching around a bit, I could not find any existing solution or built-in method to retrieve the information I needed in a quick or automated way. My first ideas were to either grab the info via SQL queries against the Veeam database, or to take a sampling of 10-20 different types of jobs and their backup sessions over one normal incremental run day and one normal full backup day (manually collecting this data from email reports would be quite a slow process).

 

After browsing around the Veeam Community Forums I suddenly remembered that there is a PowerShell module that Veeam includes with B&R. I read the basic documentation and got acquainted with a few simple cmdlets. I wanted to build a report that would loop through every single Veeam B&R job we have and grab data from the last 7 backup sessions of each (daily backups), giving me a good idea of both full and incremental backup performance, times taken and so on. My first attempt at a script got me almost all the way there (written during spare time in my evenings!) – I was, however, having trouble matching backup session data with the right day’s backup file stats – sometimes the ordering was out, and I would get metrics back for a backup file that was not from the correct day. Before I was able to resolve this myself, help arrived from “ThomasMc” over at the Veeam Community Forums (thanks Thomas!). We got a script together that was able to match up sessions correctly. I then added a few more features, as well as some nice HTML formatting and the ability to grab statistics for all jobs instead of just one sample job. The resulting script gets the following info for you:

 

  • Index (1 = the last backup session, 2 = the day before that, etc.)
  • Job Name
  • Start time of job
  • Stop time of job
  • File Name (Allows you to determine if the job was a full or incremental run)
  • Creation Time
  • Average Speed MB – average processing speed of the job
  • Duration – time the job took to complete
  • Result – Success/Warning/Failed (Failed is highlighted in red)

 

Here is an example of the report run on my Veeam Backup & Replication lab environment at home (thanks to Veeam for the NFR licenses they gave out to VCPs earlier this year!)

 
[download id=”1″]
[download id=”8″]

 

So, to run the above script, launch a PowerShell session from within Veeam B&R (Tools -> PowerShell). This makes sure your PowerShell session launches with the Veeam Automation/PowerShell snap-in loaded. Execute the script and you’ll get an HTML file output to the root of your C:\ drive. By default, all jobs you have in Veeam will be detailed. If you wish to sample a specific job, or only jobs with a certain word/phrase in their names, adjust the -match filter on the Get-VBRJob cmdlet line near the top of the script. The default setting is an empty string – i.e. “”. To change how many sessions the script fetches for each backup job, just change the “$sessionstofetch” variable defined at the top of the script.
I have added comments throughout the script for those interested in how it works. Lastly, you could also quite easily modify this script to e-mail you the report, or even run it as a scheduled task. Let me know if you need help doing this and I’ll gladly modify it as required.
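To picture the two settings mentioned above without opening the download, they look roughly like this (a hedged sketch only, not the full script; Get-VBRJob is the only Veeam cmdlet shown, and the variable name mirrors the one referred to in the post):

# Number of recent backup sessions to fetch for each job
$sessionstofetch = 7

# Grab all Veeam B&R jobs, or only those whose name matches a word/phrase (an empty string matches everything)
$jobs = Get-VBRJob | Where-Object { $_.Name -match "" }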

 

PowerCLI – checking for snapshots on VMs and emailing the report back

Checking for any snapshots running on VMs in various clusters can be quite repetitive if done manually, looking through vCenter at each of your VMs. In the clusters I work with there are a LOT of VMs to check, and naturally I wanted to automate this process. Sure, I could rely on the vCenter alarms for snapshot size warning, but these are not completely reliable, as they only alert me when snapshots start growing large in size. I wanted something that would alert me to the presence of a snapshot regardless of its size. I therefore set about learning the basics of PowerCLI (as you can see in my last post) and searched around for some sample cmdlets that would help me retrieve a list of VMs with snapshots on them.

 

The end result of running this snapshot checking script is an HTML report e-mailed to the address you specify. It uses PowerShell cmdlets to generate the HTML e-mail and send it across. You will of course need to ensure you can connect out on port 25 for mail, and either authenticate to your mail server or be sending from and to a domain hosted on your mail server (i.e. relaying mail internally). Enter your mail server, to, and from details in the script to customise it. You’ll also need to authenticate with your vCenter server before running the script – you could use a cmdlet in the script to do this automatically. I have just been authenticating manually for now, as I have not yet deployed this in production and have just been testing.
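The cmdlet I’m referring to for authenticating is Connect-VIServer; a minimal line you could drop at the top of the script would look like this (server name and credentials are placeholders):

# Authenticate to vCenter before the snapshot check runs (placeholder details - substitute your own)
Connect-VIServer -Server "vcenter01.yourdomain.local" -User "DOMAIN\youruser" -Password "yourpassword"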


So here is the all-important PowerCLI script!

 

#These are the properties assigned to the HTML table via the ConvertTo-HTML cmdlet - this is used to liven up the report and make it a bit easier on the eyes!

$tableProperties = "<style>"
$tableProperties = $tableProperties + "TABLE{border-width: 1px;border-style: solid;border-color: black;}"
$tableProperties = $tableProperties + "TH{border-width: 1px;padding: 5px;border-style: solid;border-color: black;}"
$tableProperties = $tableProperties + "TD{text-align:center;border-width: 1px;padding: 5px;border-style: solid;border-color: black;}"
$tableProperties = $tableProperties + "</style>"

# Main section of check
Write-Host "Looking for snapshots"
$date = get-date
$datefile = get-date -uformat '%m-%d-%Y-%H%M%S'
$filename = "F:\VMwareSnapshots_" + $datefile + ".htm"

#Get your list of VMs, look for snapshots. In larger environments, this may take some time as the Get-VM cmdlet is not very quick.
$ss = Get-VM | Get-Snapshot
Write-Host "   Complete" -ForegroundColor Green
Write-Host "Generating snapshot report"
$ss | Select-Object vm, name, description, powerstate | ConvertTo-HTML -head $tableProperties -body "<th><font style = `"color:#FFFFFF`"><big> Snapshots Report (the following VMs currently have snapshots on!)</big></font></th> <br></br> <style type=""text/css""> body{font: .8em ""Lucida Grande"", Tahoma, Arial, Helvetica, sans-serif;} ol{margin:0;padding: 0 1.5em;} table{color:#FFF;background:#C00;border-collapse:collapse;width:647px;border:5px solid #900;} thead{} thead th{padding:1em 1em .5em;border-bottom:1px dotted #FFF;font-size:120%;text-align:left;} thead tr{} td{padding:.5em 1em;} tbody tr.odd td{background:transparent url(tr_bg.png) repeat top left;} tfoot{} tfoot td{padding-bottom:1.5em;} tfoot tr{} * html tr.odd td{background:#C00;filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='tr_bg.png', sizingMethod='scale');} #middle{background-color:#900;} </style> <body BGCOLOR=""#333333""> <table border=""1"" cellpadding=""5""> <table> <tbody> </tbody> </table> </body>" | Out-File $filename
Write-Host "   Complete" -ForegroundColor Green
Write-Host "Your snapshot report has been saved to:" $filename

# Create mail message

$server = "yourmailserveraddress.com"
$port = 25
$to      = "youremailaddress"
$from    = "youremailaddress"
$subject = "vCenter Snapshot Report"

$message = New-Object system.net.mail.MailMessage $from, $to, $subject, $body

#Create SMTP client
$client = New-Object system.Net.Mail.SmtpClient $server, $port
# Credentials are necessary if the server requires the client to authenticate before it will send e-mail on the client's behalf.
$client.Credentials = [system.Net.CredentialCache]::DefaultNetworkCredentials

# Try to send the message

try {
    # Convert body to HTML
    $message.IsBodyHTML = $true
    $attachment = new-object Net.Mail.Attachment($filename)
    $message.attachments.add($attachment)
    # Send message
    $client.Send($message)
    "Message sent successfully"

}

# Catch an error

catch {

	"Exception caught in CreateTestMessage1(): "

}

 

Another point worth mentioning – you should change the path that the report is saved to on disk – in my script it is set to F:\, so just modify this to suit your environment. Kudos to Andrew at winception for his snapshot checking code – I have used a lot of it above, but modified it somewhat to include additional information and style the HTML table so that it is much easier on the eyes. I also added the e-mail functionality to the script. The following is a screenshot after I executed the script in PowerCLI manually. You would of course look to automate the process by scheduling this script on your machine.
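If you do want to schedule it, a basic scheduled task along these lines should do the trick (a sketch only – the script path is a placeholder, and you’d need to load the PowerCLI snap-in inside the script, e.g. with Add-PSSnapin VMware.VimAutomation.Core, plus handle the vCenter authentication as mentioned above):

# Create a daily scheduled task that runs the snapshot report at 06:00 (placeholder script path)
schtasks /create /tn "vCenter Snapshot Report" /sc daily /st 06:00 /tr "powershell.exe -NoProfile -File C:\Scripts\SnapshotReport.ps1"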


Enjoy, and please drop any comments, improvements or feedback in the comments section!