Changing Registry entries on multiple systems with PowerShell and Remoting

 

A few weeks ago, a colleague asked if I knew of a way to script changing the Registered Owner / Organization information on a Windows Server system (2003 or 2008). I knew this could be achieved with PowerShell and had some initial ideas, so I spent a few minutes whipping up the script below.

For this to work, you should ideally have all systems on the same Windows Domain and have enabled PowerShell remoting on each system that needs to be changed. Of course you could also just run the script on a single workstation/server on its own without the need for PSRemoting.

 

# On all remote machines that need their info changed
Set-ExecutionPolicy RemoteSigned
Enable-PSRemoting # Say yes to all prompts
#region This part only needed if machines do not belong to the same domain...
# Note: This can be a security risk, only use if you are sure you want to allow any host as a trusted host. (e.g. fine for lab environments)
cd wsman::localhost\client
Set-Item .\TrustedHosts * # Say yes to all prompts
#endregion
# Run on your management machine/machine you are using to update all others...
$computers = @("SERVER1","SERVER2","SERVER3")

foreach ($computer in $computers) {
    # Enter-PSSession is meant for interactive use; in a script, run the commands remotely with Invoke-Command
    Invoke-Command -ComputerName $computer -ScriptBlock {
        Set-ItemProperty -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion' -Name "RegisteredOwner" -Value "Auth User"
        Set-ItemProperty -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion' -Name "RegisteredOrganization" -Value "Lab"
    }
}

 

So the above should update your registered owner and organization details for each server listed in the $computers array (specify your own host names here). The script should be easy enough to modify if you are looking to change other registry entries. Finally, don't forget that you should always be careful when updating the registry, especially via script – make sure you have backups!
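As a quick sanity check, you could also read the values back from each machine afterwards to confirm the change went through – for example:

# Read the registered owner/organization back from each remote machine to confirm the change
foreach ($computer in $computers) {
    Invoke-Command -ComputerName $computer -ScriptBlock {
        Get-ItemProperty -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion' -Name RegisteredOwner, RegisteredOrganization
    } | Select-Object PSComputerName, RegisteredOwner, RegisteredOrganization
}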

 

How to use PoSH or PowerCLI to SSH into Devices & retrieve information (Gathering SHA1 Fingerprints)

 

I was listening to GetScripting podcast #29 the other day. The guest was Pete Rossi (PoSH Pete), and part of the discussion covered the data centre automation he has set up, which involves wrapping SSH with PowerShell so that he can automate various functions on any device he can SSH onto. This got me thinking about potential use cases, and soon enough I had a couple of scenarios of my own that could do with automating via SSH and PowerCLI. Pete mentioned he mainly uses an SSH component from a company called "WeOnlyDo Software", and Alan Renouf also mentioned having heard of "SharpSSH". I decided I wanted to try both out, so I set about figuring out how to get each of them working with PowerShell and PowerCLI. In this post (Part 1) I will cover using the SharpSSH DLL. In Part 2 I will go into the wodSSH component (a commercial product, and in my opinion the easier of the two to use).

 

SharpSSH (based on Tamir Gal’s .NET library)

 

I believe Tamir Gal originally created this library; it now seems to be maintained by others.

 

First of all, for SharpSSH to work with PowerShell or PowerCLI, you'll need the relevant DLL for your script to load. I found a version of SharpSSH being actively worked on and improved by Matt Wagner on Bitbucket, and downloaded that version (called SharpSSH.a7de40d119c7.dll) to get started. To load the functions we'll be using to SSH into devices, I used the following PowerShell function – just be sure it references the correct path to the SharpSSH DLL you downloaded. Download the function below:

 

[download id=”13″]
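In case you are curious what that loader function boils down to, the core of it is simply loading the DLL from disk so that its types are available in your session – something along these lines (the path below is just a placeholder for wherever you saved the DLL):

# Load the SharpSSH assembly into the current PowerShell session
# (adjust the path to wherever you saved the SharpSSH DLL)
Add-Type -Path "C:\Scripts\SharpSSH.a7de40d119c7.dll"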

 

Then as long as the functions are loaded in your PoSH session, you should be able to run the example below.

 

How to SSH into ESXi hosts and retrieve SHA1 Fingerprints using PowerCLI and SharpSSH

 

Example output after running the script detailed below against multiple ESX hosts

 

 

Now, first off I'll say that this isn't necessarily the best way of retrieving SSL fingerprints from your ESXi hosts in terms of security – ideally you would do this from the DCUI of each ESXi host so you can confirm the identity of each host is as you expect. (See this blog post and its comments over at Scott Lowe's blog for more detail on the security considerations.) With that said, here is my implementation of SharpSSH, used to SSH into each ESXi host (from a Get-VMHost call) and retrieve its SHA1 fingerprint. The script builds and outputs a table report, listing each ESX/ESXi host along with its SHA1 fingerprint signature.

 

Background for the Script

 

I believe this is actually quite an easy bit of info to collect using PowerCLI and the ExtensionData.Config properties on newer hosts / vSphere 5, but in the environment I was working with, the ESX 4.0 Update 4 hosts did not contain this fingerprint info in their ExtensionData sections when queried with PowerCLI. I therefore automated the process over SSH, where the command "openssl x509 -sha1 -in /etc/vmware/ssl/rui.crt -noout -fingerprint" generates the fingerprint remotely on each host. Note that the script will prompt for root credentials on each host it connects to – this could probably be changed fairly easily in the function (downloaded above). So here is the final script, which will list all ESXi hosts and their SHA1 fingerprints:

 

$Report = @()
$VMHosts = Get-VMHost | Sort Name

foreach ($vmhost in $VMHosts) {
	$rootlogin = $false
	New-SshSession root $vmhost
	if (Receive-SSH '#')
	{
		Write-Host "Logged in as root." -ForegroundColor Green
		# Generate the SHA1 fingerprint of the host's SSL certificate remotely over SSH
		$a = Invoke-SSH "openssl x509 -sha1 -in /etc/vmware/ssl/rui.crt -noout -fingerprint" 'SHA1'
		$temp = $a | Select-String -Pattern "SHA1 Fingerprint="
		$row = New-Object -TypeName PSObject -Property @{
			SHA1 = $temp
			HostName = $vmhost.Name
		}
		$Report += $row
		$rootlogin = $true
		Write-Host "Output complete." -ForegroundColor Green
	}
	if ($rootlogin -eq $true)
	{
		Write-Host "Exiting SSH session."
		Send-SSH exit
	}
	Write-Host "Terminating Session."
	Remove-SshSession
}

$Report
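As an aside, on newer hosts (vSphere 4.1/5 onwards) where the thumbprint is exposed through the vSphere API, something along these lines should get you the same data without SSH at all – note that the exact property path is from memory and may differ between versions:

# Sketch: pull the SSL thumbprint straight from the vSphere API where it is populated
Get-VMHost | Sort Name | Select Name, @{N="SHA1";E={$_.ExtensionData.Summary.Config.SslThumbprint}}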

 

Well, I hope this helps you out with a way to automate SSH access to devices to retrieve information or change settings. It could easily be adapted to send SSH commands to any other kind of device that accepts SSH logins – switches, routers, Linux servers, you name it! In my next blog post I will show you how to use the wodSSH library (from WeOnlyDo Software) to do SSH in PowerShell or PowerCLI – I have found that method a bit easier to use than SharpSSH, so look out for the next post coming soon!

Generating Graphical Charts with VMware PowerCLI & PowerShell

Charts are awesome – they can help make sense of endless reams of text and data, and they generally look pretty. So my question to myself was: "How do I get useful data I generate using PowerCLI into a nice, neat little chart?" I had a quick Google and found a couple of different solutions. The one that stood out as the easiest to start with was to use the "Microsoft Chart Controls for Microsoft .NET Framework 3.5".

I read a few blog posts detailing how to create these custom .NET charts in PowerShell, but doing it by hand tends to be quite a tedious process – akin to manually building a Windows Forms GUI in PowerShell – basically a complete pain. The posts definitely helped me understand how to create charts, and soon I was able to generate some pretty cool charts from PowerCLI (or PowerShell) data. I ultimately wanted to automate the creation of charts for my PowerCLI and PowerShell scripts, so I decided to create a function that could be used anywhere to generate a Bar or Pie chart on the fly.
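To give you an idea of what the raw .NET charting code looks like before it gets wrapped up in a function, here is a minimal sketch that plots a few placeholder values as a pie chart and saves it out to a PNG (the data, size and output path are just examples):

# Load the Microsoft Chart Controls assembly
[void][Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms.DataVisualization")

# Create the chart object and give it a chart area and title
$Chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
$Chart.Width = 750
$Chart.Height = 650
$ChartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
$Chart.ChartAreas.Add($ChartArea)
[void]$Chart.Titles.Add("Example VM Memory Chart")

# Add a series, set its type to Pie, then add some example data points
$Series = $Chart.Series.Add("Data")
$Series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Pie
$ExampleData = @{"VM01" = 1024; "VM02" = 2048; "VM03" = 512}
foreach ($Key in $ExampleData.Keys) {
    [void]$Series.Points.AddXY($Key, $ExampleData[$Key])
}

# Save the chart out as a PNG file
$Chart.SaveImage("C:\Temp\ExampleChart.png", "PNG")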

httpv://youtube.com/watch?v=EvQ2znWr0lk

Enter Create-Chart. This is the function I have made that accepts a bunch of parameters to create a custom chart and outputs it to a .PNG file. The data needs to be fed into the function via a hash table (this could be changed), so I also created a "helper" function called Create-HashTable which does the work of generating a hash table for use with Create-Chart. You could of course also just feed the Create-Chart function a manually created hash table – useful to know, because my Create-HashTable function is fairly basic and not very flexible. Here are a couple of examples of charts I created using these functions:

Pie Chart of VMs and their configured Memory Resource settings
The same data but now in a Bar chart
Host Chart created with a manually created Hash Table (Name and MemoryUsageMB Properties)

Download the two functions below to give them a try! Please do suggest any improvements – for one, the parameter handling in the Create-Chart script needs work: all of the parameters are effectively required, but they are not declared as mandatory because I couldn't get the function to work correctly when I did make them mandatory. The Create-HashTable function could also be improved – at the moment the "Cmdlet" parameter only accepts cmdlets that contain no double quotation marks and no $_ variables, because of the way I use the Invoke-Expression cmdlet inside the function. A simple value such as "Get-VM | Select Name, MemoryMB" works just fine, for example. Remember that the Create-Chart function needs to be fed a hash table – either one you build yourself, or one generated by the Create-HashTable function below. Here are the downloads:

Don’t forget the Microsoft Chart Controls – a requirement to run these functions
Download the functions here:

https://www.shogan.co.uk/wp-content/uploads/Create-Chart.zip

https://www.shogan.co.uk/wp-content/uploads/Create-HashTable.zip

Once you have the Functions loaded, here are some examples to show you how to use them:

Pie Chart of VMs and their MemoryMB setting:

Create-Chart -ChartType Pie -ChartTitle "Sean's Awesome VM Chart" -FileName seanchart3 -XAxisName "VMs" -YAxisName "MemoryMB" -ChartWidth 750 -ChartHeight 650 -DataHashTable (Create-HashTable -Cmdlet "Get-VM" -NameProperty Name -ValueProperty MemoryMB)

Bar Chart of ESXi Hosts and their Memory Usage (MB) values:

Create-Chart -ChartType Bar -ChartTitle "Sean's Awesome Host Chart" -FileName seanchart4 -XAxisName "VM Hosts" -YAxisName "Memory Usage (MB)" -ChartWidth 750 -ChartHeight 650 -DataHashTable (Create-HashTable -Cmdlet "Get-VMHost" -NameProperty Name -ValueProperty MemoryUsageMB)

Use your own Hash Table to input the data:

Create-Chart -ChartType Bar -ChartTitle "Custom Chart" -FileName seanchart5 -XAxisName "My Objects" -YAxisName "My Object Values" -ChartWidth 750 -ChartHeight 650 -DataHashTable $HashTable

So there you have it – a fairly easy way to chart the data you get from your PowerCLI or PowerShell cmdlets! I wrote these functions as part of a larger report that I am working on for another blog post coming soon. As I mentioned above, there is plenty of room for improvement – so if you do make any improvements or changes, please be sure to post them in the comments section.

PowerCLI Advanced Configuration Settings – Setting Host Syslog options

 

This is a PowerCLI script that sets up your ESXi host syslogging in two places: a shared datastore (that all ESXi hosts have visibility of) and a remote syslog host on the network. Alternatively, you could use it to change the syslog directory to a different local path on each ESXi host (rather than the default). The two advanced configuration settings we'll be working with are:

 

  • Syslog.global.logDir
  • Syslog.global.logHost
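Before changing anything, it is worth checking what these are currently set to on your hosts – for example:

# Check the current syslog advanced settings on each connected ESXi host
foreach ($VMHost in (Get-VMHost | Where {$_.ConnectionState -eq "Connected"})) {
	Get-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logDir"
	Get-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logHost"
}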

 

Why did I do this? Well, tonight I got a little sidetracked after looking into a small warning on a new scripted ESXi host I had deployed – it had not automatically set up logging during installation. My guess is that its local datastore had failed to initialise during installation: I couldn't see a local datastore after deploying the host via a script, and had received a warning during installation but went ahead anyway just to see what happened (this was a new test host in my home lab). So afterwards, I decided to set up a remote syslog location for it to clear the warning, and naturally I wanted to do this in PowerCLI.

 

The following script takes a few options you specify – a shared datastore name, a syslogging folder name, and a remote syslog host – and configures all the ESXi hosts in your specified Datacenter object to use them. If you have only one Datacenter object, the script will detect this and use it by default; otherwise it will ask you for the name of the specific Datacenter you want to work with. It will then create an "ESXi-Syslogging" folder on the shared datastore you specified (the script exits if the folder already exists) and, under this, a sub-folder for each ESXi host. It will then configure each ESXi host to send its syslog output to this datastore. As far as I can tell, certain syslog files are symlinked from /scratch/log on each ESXi host to the datastore specified. For example:

 

usb.log becomes:

usb.log -> /vmfs/volumes/1bec1487-1189988a/ESXi-Syslogging/esxi-04.noobs.local/usb.log

 

Naturally, this wouldn't be a good idea if the ESXi host could potentially lose access to this datastore, so the script also specifies a remote syslog host to log to. If you don't have one, check out the free "Kiwi Syslog Server". Here is the resulting script (download via the link below or just copy the script out of the block):

[download id=”4″]

 

# Remote Syslogging setup for ESXi Hosts in a particular datacenter object
# By Sean Duffy (http://www.shogan.co.uk)
# Version 1.0
# Get-Date = 15/12/2011

#region Set variables up
$SyslogDatastore = "SharedDatastore2" # The name of the Shared Datastore to use for logging folder location - must be accessible to all ESXi hosts
$SyslogFolderName = "ESXi-Syslogging" # The name of the folder you want to use for logging - must NOT already exist
$RemoteSyslogHost = "udp://192.168.0.85:514" #specify your own Remote Syslog host here in this format
$DataCenters = Get-Datacenter
$FoundMatch = $false
$SyslogDatastoreLogDir = "[" + $SyslogDatastore + "]"
#endregion

#region Determine DataCenter to work with
if (($DataCenters).Count -gt 1) {
	Write-Host "Found more than one DataCenter in your environment, please specify the exact name of the DC you want to work with..."
	$DataCenter = Read-Host
	foreach ($DC in $DataCenters) {
		if ($DC.Name -eq $DataCenter) {
			Write-Host "Found Match ($DataCenter). Proceeding..."
			$FoundMatch = $true
			break
			}
		else {
			$FoundMatch = $false
			}
	}
	if ($FoundMatch -eq $false) {
		Write-Host "That input did not match any valid DataCenters found in your environment. Please run script again, and try again."
		exit		
		}
	}
else {
	$DataCenter = ($DataCenters).Name
	Write-Host "Only one DC found, so using this one. ($DataCenter)"
	}
#endregion

#Enough of the error checking, lets get to the script!

$VMHosts = Get-Datacenter $DataCenter | Get-VMHost | Where {$_.ConnectionState -eq "Connected"}

cd vmstore:
cd ./$DataCenter
cd ./$SyslogDatastore
if (Test-Path $SyslogFolderName) {
	Write-Host "Folder already exists. Configure script with another folder SyslogFolderName & try again."
	exit
}
else {
	mkdir $SyslogFolderName
	cd ./$SyslogFolderName
}

# Crux of the script in here:
foreach ($VMHost in $VMHosts) {
	if (Test-Path $VMHost.Name) {
		Write-Host "Path for ESXi host log directory already exists. Aborting script."
		exit
	}
	else {
		mkdir $VMHost.Name
		Write-Host "Created folder: $VMHost.Name"
		$FullLogPath = $SyslogDatastoreLogDir + "/" + $SyslogFolderName + "/" + $VMHost.Name
		Set-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logHost" -Value $RemoteSyslogHost
		Set-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logDir" -Value $FullLogPath
		Write-Host "Paths set: HostLogs = $RemoteSyslogHost LogDir = $FullLogPath"
	}
}

 

Some results of running the above in my lab:

 

Script output after running against 3 x ESXi hosts

 

Advanced settings are showing as expected
Tasks completed by the PowerCLI script

 

Here is the structure of the Logging in my Datastore after the script had run.

 

 

Lastly, I have found that these values won’t update if they are already populated with anything other than the defaults (especially the LogDir advanced setting). You’ll get an error along the lines of:

 

Set-VMHostAdvancedConfiguration : 2011/12/15 11:37:01 PM    Set-VMHostAdvancedConfiguration    A general system error occurred: Internal error

 

If this happens, you could either set the value to an empty string, i.e. "", or you could use Set-VMHostAdvancedConfiguration [[-NameValue] <Hashtable>] (passing a hashtable of the advanced config setting as the NameValue) to set things back to the way they were originally. The default value for Syslog.global.logDir, for example, is "[] /scratch/log". It might be a good idea to grab this value and pipe it out to a file somewhere safe before you run the script, in case you need to revert back to the default local syslog location – you would then know what the value should be. You could use something like this to grab the value and put it into a text file:

 

Get-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logDir" | Out-File C:\default-sysloglocation.txt
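If you then do need to revert, the hashtable approach mentioned above should put logDir back to that default – assuming, of course, that your hosts were on the default to begin with:

# Revert Syslog.global.logDir back to the local scratch default using a NameValue hashtable
Set-VMHostAdvancedConfiguration -VMHost $VMHost -NameValue @{"Syslog.global.logDir" = "[] /scratch/log"}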

 

Also, remember to think this through carefully before changing anything in your own environment – it may not suit you at all to set your syslog location to anywhere other than the local disk of each host. However, you may find it useful to automatically set just the remote syslog host advanced setting for each of your ESXi hosts – in that case, simply modify the script above to apply that one setting instead of both, as in the example below. I hope this helps, and as always, if you have any suggestions, improvements or notice any bugs, please feel free to add your comments.
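Here is that stripped-down version – it only sets the remote syslog host on every connected ESXi host, using the same cmdlet and value format as the full script above:

# Set only the remote syslog host advanced setting on all connected ESXi hosts
$RemoteSyslogHost = "udp://192.168.0.85:514" # specify your own remote syslog host here
foreach ($VMHost in (Get-VMHost | Where {$_.ConnectionState -eq "Connected"})) {
	Set-VMHostAdvancedConfiguration -VMHost $VMHost -Name "Syslog.global.logHost" -Value $RemoteSyslogHost
	Write-Host "Set Syslog.global.logHost on $($VMHost.Name) to $RemoteSyslogHost"
}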

 

 

Veeam Backup stats report for all your VM Backup jobs in PowerShell

 

The other day I was asked to collect some statistics on our Veeam Backup & Replication server from as many VM backup jobs as possible. The environment in question has roughly 70 scheduled jobs that run either daily or weekly. After searching around a bit, I could not find any existing solution or built-in method to retrieve the info I needed in a quick or automated way. My first ideas were to either grab the info via SQL queries from the Veeam database, or to take a sampling of 10-20 different types of jobs and their backup sessions over one normal incremental run day and one normal full backup day (manually collecting this data from email reports would be quite a slow process).

 

After browsing around the Veeam Community Forums I suddenly remembered that there is a PowerShell snap-in that Veeam includes with B&R. I read the basic documentation and got acquainted with a few simple cmdlets. I wanted to build a report that would loop through every single Veeam B&R job we have and grab data from the last 7 backup sessions of each (daily backups), giving me a good idea of both full and incremental backup runs – performance, times taken and so on. My first attempt at a script (put together during spare time in my evenings!) got me almost all the way there – I was, however, having trouble matching backup session data with the right day's backup file stats: sometimes the ordering was out, and I would get metrics back for a backup file that was not from the correct day. Before I was able to resolve this myself, help arrived from "ThomasMc" over at the Veeam Community Forums (thanks Thomas!), and we got a script together that matches up sessions correctly. I then added a few more features, some nice HTML formatting, and the ability to grab statistics for all jobs instead of just one sample job. The resulting script gets the following info for you:

 

  • Index (1 = the last backup session, 2 = the day before that, etc.)
  • Job Name
  • Start time of job
  • Stop time of job
  • File Name (Allows you to determine if the job was a full or incremental run)
  • Creation Time
  • Average Speed MB – average processing speed of the job
  • Duration – time the job took to complete
  • Result – Success/Warning/Failed (Failed is highlighted in red)

 

Here is an example of the report, run against my Veeam Backup & Replication lab environment at home (thanks to Veeam for the NFR licenses they gave out to VCPs earlier this year!)

 
[download id=”1″]
[download id=”8″]

 

So, to run the above script, launch a PowerShell session from within Veeam B&R (Tools -> PowerShell). This will make sure your PowerShell session launches with the Veeam Automation/PowerShell snap-in loaded. Execute the script and you'll get an HTML file output to the root of your C:\ drive. By default, all jobs you have in Veeam will be detailed. If you wish to sample a specific job, or jobs with a certain word/phrase in their names, adjust the -match filter on the Get-VBRJob line near the top of the script (the default setting is an empty string, i.e. ""). To change how many sessions the script fetches for each backup job, just change the "$sessionstofetch" variable defined at the top of the script.
I have added comments throughout the script for those interested in how it works. Lastly, you could also quite easily modify this script to e-mail you the report, or even run it as a scheduled task. Let me know if you need help doing this and I’ll gladly modify it as required.
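For those who want a feel for the snap-in before grabbing the full script, a minimal sketch of loading it and filtering jobs looks something like this (the snap-in name here is the one used by B&R versions of this era, and the -match value works as described above):

# Load the Veeam PowerShell snap-in if the session was not launched from within B&R
if (-not (Get-PSSnapin -Name VeeamPSSnapIn -ErrorAction SilentlyContinue)) {
	Add-PSSnapin VeeamPSSnapIn
}

# Grab all jobs, or only those whose names match a word/phrase (empty string = all jobs)
$jobs = Get-VBRJob | Where-Object { $_.Name -match "" }
$jobs | Select-Object Name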