Can’t find/create new VMFS datastore after removing old one using Add Storage Wizard

 

Yesterday I was doing some out-of-hours work for a client who had a VM on a VMFS 3 datastore with only a 1MB block size, but needed a data disk larger than 256GB. Since this is a vSphere 4 environment running VMFS 3, the datastore needed to be recreated with a larger block size. I did a Storage vMotion to move the data off the datastore, then removed the datastore using one of the ESX hosts. When I checked the other ESX hosts, the datastore was still showing on them; refreshing the storage on each solved this and the datastore disappeared as expected.

 

I then went to re-create the datastore, happy in the knowledge that our SAN would still be presenting the LUN to our ESX hosts. When I arrived at the Add New Disk/LUN wizard, I saw that the LUN was, in fact, not showing. Here is the troubleshooting process I used to sort this issue out. It turned out to be two old ESX hosts, decommissioned a short while back, that were still “hanging on” to this datastore.

 

  • Checked each live ESX host for the Datastore and made sure it was removed
  • Rescanned each HBA on the ESX hosts for new storage etc…
  • Tried adding datastore again (still couldn’t find the storage)
  • Logged onto FC SAN and ensured the LUN was still being presented to all ESX hosts in the cluster
  • Used the vCenter “Maps” feature with the “VM to Datastore” relationship selected to ensure no VMs were mapped to this datastore or still referring to it (mounted CD/DVD images, for example).
  • Under Maps, I noticed the datastore in question was still visible at the very bottom, but nothing related to it – i.e. there were no branches to anything.
  • Went to the Datastores view in vCenter, and saw the datastore still listed there, but with “(inactive)” next to its name.
  • Highlighted the datastore and looked at its summary (see screenshot below for example) and saw 2 x ESX hosts still showing as connected to it. I then realised that these would be the old, decommissioned ESX hosts still sitting in the inventory and “holding on” to the datastore.
  • I removed these old hosts from the vCenter inventory and the datastore disappeared. I was then able to re-create it, and the presented LUN appeared in the Add Storage wizard as you would normally expect.
 
datastore showing as (inactive) even after removing it from all connected ESX hosts

 

2 x ESX/ESXi hosts were still connected to the old, removed storage, preventing it from being re-created (This example shows one with 5 hosts connected for reference)

 

So if you remove a datastore and, when you arrive at the Add Storage wizard, the LUN is not presented back to you as an option to create a new datastore with, try the above troubleshooting steps to ensure nothing is still referencing the old, deleted datastore. Once that is sorted out, the option should appear again.
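If you manage your hosts with PowerCLI, a quick way to double-check which hosts still see a datastore and to rescan storage across the cluster is something along the following lines. This is just a minimal sketch – the datastore and cluster names are made up, and it assumes you are already connected with Connect-VIServer:

# Minimal PowerCLI sketch - "OldDatastore01" and "ProdCluster" are hypothetical names,
# and you are assumed to already be connected to vCenter via Connect-VIServer.

# Which hosts in the inventory still see the datastore?
Get-VMHost -Datastore (Get-Datastore -Name "OldDatastore01") |
	Select-Object Name, ConnectionState

# Rescan all HBAs and refresh VMFS volumes on every host in the cluster
Get-Cluster "ProdCluster" | Get-VMHost | ForEach-Object {
	Get-VMHostStorage -VMHost $_ -RescanAllHba -RescanVmfs | Out-Null
}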

 

Veeam Backup stats report for all your VM Backup jobs in PowerShell

 

The other day I was asked to collect some statistics from our Veeam Backup & Replication server for as many VM backup jobs as possible. The environment has roughly 70 scheduled jobs that run either daily or weekly. After searching around a bit, I could not find any existing solution or built-in method to retrieve the info I needed in a quick or automated way. My first ideas were to either grab the info via SQL queries against the Veeam database, or to take a sampling of 10-20 different types of jobs and their backup sessions over one normal incremental run day and one normal full backup day (manually collecting this data from email reports would be quite a slow process).

 

After browsing around the Veeam Community Forums I suddenly remembered that there is a PowerShell snap-in that Veeam includes with B&R. I read the basic documentation and got acquainted with a few simple cmdlets. I wanted to build a report that would loop through every single Veeam B&R job we have and grab data from the last 7 backup sessions of each (daily backups), giving me a good idea of performance, times taken and so on for both full and incremental backup runs. My first attempt at a script (written during spare time in my evenings!) got me almost all the way there – I was, however, having trouble matching backup session data with the right day’s backup file stats – sometimes the ordering was out and I would get metrics back for a backup file that was not from the correct day. Before I was able to resolve this myself, help arrived from “ThomasMc” over at the Veeam Community Forums (thanks, Thomas!). We got a script together that was able to match up sessions correctly. I then added a few more features, as well as some nice HTML formatting and the ability to grab statistics for all jobs instead of just one sample job. The resulting script gets the following info for you:

 

  • Index (1 = the last backup session, 2 = the day before that, etc.)
  • Job Name
  • Start time of job
  • Stop time of job
  • File Name (Allows you to determine if the job was a full or incremental run)
  • Creation Time
  • Average Speed MB – average processing speed of the job
  • Duration – time the job took to complete
  • Result – Success/Warning/Failed (Failed is highlighted in red)

 

Here is an example of the report run on my Veeam Backup & Replication lab environment at home (thanks to Veeam for the NFR licenses they gave out to VCPs earlier this year!)

 

 

So, to run the above script, launch a PowerShell session from within Veeam B&R (Tools -> PowerShell). This will make sure your PowerShell session launches with the Veeam Automation/PowerShell snap-in loaded. Execute the script and you’ll get an HTML file output to the root of your C:\ drive. By default, all jobs you have in Veeam will be detailed. If you wish to sample a specific job, or a job with a certain word/phrase in its name, adjust the -match parameter on the Get-VBRJob cmdlet line near the top of the script. The default setting is an empty string – i.e. “”. To change how many sessions the script fetches for each backup job, just change the “$sessionstofetch” variable defined at the top of the script.
I have added comments throughout the script for those interested in how it works. Lastly, you could also quite easily modify this script to e-mail you the report, or even run it as a scheduled task. Let me know if you need help doing this and I’ll gladly modify it as required.
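To give an idea of the general shape of the approach (rather than the full report script itself), here is a minimal sketch of fetching the last few sessions per job with the Veeam snap-in and dumping them to HTML. The snap-in name, session property names and output path here are assumptions from my environment, so adjust them to suit yours:

# Minimal sketch only - not the full report script. Assumes the Veeam PowerShell
# snap-in (VeeamPSSnapIn) with the Get-VBRJob / Get-VBRBackupSession cmdlets;
# the session property names used below may differ between B&R versions.
Add-PSSnapin VeeamPSSnapIn -ErrorAction SilentlyContinue

$sessionstofetch = 7   # how many recent sessions to report per job
$jobfilter = ""        # an empty string matches every job, as in the full script

$report = foreach ($job in (Get-VBRJob | Where-Object { $_.Name -match $jobfilter })) {
	Get-VBRBackupSession |
		Where-Object { $_.JobName -eq $job.Name } |
		Sort-Object CreationTime -Descending |
		Select-Object -First $sessionstofetch |
		Select-Object JobName, CreationTime, EndTime, Result
}

# Turn the collected stats into a simple HTML report
$report | ConvertTo-Html -Title "Veeam Backup job stats" | Out-File "C:\VeeamReport.html"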

 

Upgrading to vSphere 5.0 from vSphere 4.x – Part 1 – vCenter Server

 

Introduction

 

vSphere 5.0 is a great step up from vSphere 4.1 in terms of new functionality and features. VMware have introduced some awesome new features that are definitely worth making use of by getting your environment upgraded. A few that caught my eye are sDRS (Storage DRS), Storage vMotion of VMs with snapshots, and of course the ability to run those “monster” VMs (much larger VMs are now supported).

 

Before you take the plunge and upgrade, I would recommend a good amount of planning and reading of the best practices documentation. I have linked to two very useful documents from VMware that will help with your planning: an upgrade checklist which you can work through systematically, and a best practices whitepaper which explains the process in good detail with some great screenshots. In this post I will be going through the process I followed to upgrade my lab environment from vSphere 4.1 to 5.0. As it is a lab environment, I didn’t do too much in terms of planning, but for production environments this would be a good idea – it never hurts to be prepared.

 

As part of the process, I also set up a brand new VMware Update Manager server in a VM once my vCenter server was updated to 5.0 to aid upgrading my ESXi hosts from 4.1 to 5.0. I only have three virtualised ESXi hosts running in my lab cluster, so it would have been quicker to use the ISO and do them manually, but I wanted to go through the process myself as I don’t spend enough time on Update Manager in my work environment.

 

Documentation to read

vSphere 5.0 upgrade checklist

VMware vSphere 5.0 Upgrade Best Practices Technical Whitepaper

 

So without further ado, let’s begin.

 

Update your vCenter Server

 

  • First of all, make sure you are running vCenter Server 4.0 or above. If you are, then chances are you’ll be running a 64bit OS. Just be sure to check though, as it is possible to run vCenter Server 4.0 on a 32bit OS. If you have vCenter Server running on a 64bit OS, then you are good to go, otherwise you’ll need to take a slightly more complicated route, which involves creating a new vCenter server on a 64bit OS, installing vCenter 5.0 and then migrating the existing database over to the new server. For specifics there is documentation on this process in the vSphere 5.0 Upgrade Guide from VMware.
  • Next up, check that the other minimum requirements are met; especially in terms of CPU and RAM.
  • Back up your existing vCenter database – this is clearly a very important step. Make sure you back up everything. Before I upgraded my lab, I took a backup of my vCenter SQL 2005 Express database as well as my SSL certificates. The vCenter 5.0 upgrade wizard will remind you about this too. Keep your backups safe. For my lab vCenter database, I simply installed Microsoft SQL Server Management Studio Express (free) and used the backup option in there to back up the database, as well as all the other SQL system databases just to be safe (a scripted alternative is sketched just below).
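If you prefer to script the database backup rather than click through Management Studio, something along these lines works against the bundled SQL Express instance. This is only a sketch – it assumes the default instance and database names (SQLEXP_VIM and VIM_VCDB) and that sqlcmd is available on the server, so substitute your own values:

# Minimal sketch - assumes the default bundled SQL Express instance (SQLEXP_VIM),
# the default vCenter database name (VIM_VCDB) and that sqlcmd is installed.
$backupFile = "C:\Backups\VIM_VCDB_pre50upgrade.bak"   # hypothetical backup path
sqlcmd -S ".\SQLEXP_VIM" -E -Q "BACKUP DATABASE [VIM_VCDB] TO DISK = N'$backupFile' WITH INIT"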

 

You can use SQL Server Management Studio Express to back up your SQL Express database.

 

  • Also take a backup of your vpxd.cfg file. On a Server 2003 machine this is located by default under %ALLUSERSPROFILE%\Application Data\VMware\VMware VirtualCenter\, while on Server 2008 it is by default under C:\ProgramData\VMware\VMware VirtualCenter\ (a quick scripted way of grabbing this, along with a DSN check, is sketched after this list).
  • Now you should look at your Database options and upgrade path. Review the Prerequisites documentation in the Upgrade Guide for databases and ensure yours is supported. If it is not, then be sure to follow the correct procedure to migrate to a supported one.
  • Ensure your vCenter Server has a 64bit DSN – it will already have one if you are running vCenter Server 4.1 (as this is 64bit only) but could be different if you are still on vCenter 4.0.
  • If you are running a Microsoft SQL Database (as I am in my lab), ensure your System DSN under ODBC connections is using the SQL Native Client driver.
  • Ensure you have Microsoft .NET 3.5 SP1 and Windows Installer 4.5 or greater installed on your vCenter server.
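For the vpxd.cfg backup and the DSN driver check, a couple of one-liners can save some clicking around. A minimal sketch – the paths and registry location shown are the usual defaults on a 64-bit Server 2008 machine, so verify them on your own vCenter server:

# Copy vpxd.cfg somewhere safe (Server 2008 default path shown; use the
# %ALLUSERSPROFILE%\Application Data\... path on Server 2003)
Copy-Item "C:\ProgramData\VMware\VMware VirtualCenter\vpxd.cfg" "C:\Backups\vpxd.cfg"

# List the 64-bit System DSNs and the driver each one uses - the vCenter DSN
# should be using the SQL Native Client driver
Get-ItemProperty "HKLM:\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources"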

 

Ensure your System DSN under ODBC connections is using the SQL Native Client driver.

 

If you already have a running vCenter 4.x deployment then you shouldn’t have too many more things to sort out, but a few other fairly obvious ones to check (taken from the Upgrade Checklist document I linked above) are:

 

    • Note down all your database login credentials and ensure the vCenter database login has db_owner permissions.
    • Make sure your current installation path of vCenter does not have any commas or periods in it. (I assume this can cause trouble for the upgrade!)
    • Ensure your vCenter Server name is not longer than 15 characters and is registered correctly with your AD domain’s DNS (a quick way to check both is sketched after this list).
    • Ensure all required ports in/out of the vCenter server are open.
    • Make sure any additional plug-ins you use for vCenter Server are compatible with 5.0 – also make sure you re-enable or reconfigure these post-upgrade.
    • Make sure the rest of your hardware is also compatible with vSphere 5.0 – i.e. check that your ESX / ESXi hosts are compatible and that there will be no issues there. You can use the vCenter Host Agent pre-upgrade checker utility included with the vCenter 5.0 installation media for this.
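For the server name and DNS checks in particular, here is a quick way to verify both from a PowerShell prompt on the vCenter server itself – a small sketch that only uses built-in .NET calls, so it should work on older PowerShell versions too:

# Quick sanity checks for the vCenter server name - a minimal sketch
$name = $env:COMPUTERNAME
"Computer name '$name' is $($name.Length) characters long (must be 15 or fewer)"

# Confirm the name resolves correctly in DNS
[System.Net.Dns]::GetHostEntry($name).AddressList | ForEach-Object { $_.IPAddressToString }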

 

At this point, if you are happy to proceed and have your backups safe, load up the vCenter 5.0 installation media and start the installation wizard – it does a great job of guiding you through the upgrade process and asking what configuration options you would like. They are fairly straightforward – choose as applicable to your installation. I have captured screenshots of the installer process I followed for my lab – most options were fine for me to leave on their defaults. See the following series of screenshots:

 

The main vCenter 5.0 Installation menu

 

Using Windows Authentication for SQL server.

 

We are upgrading, so this is the obvious choice.

 

Automatic upgrade of vCenter agent on hosts.

 

Configure the vCenter server service account.

 

Configure your ports - in most cases these *should* be the defaults

 

More ports to configure...

 

Web Services JVM Memory Configuration - based on how many hosts you have

 

Increase ephemeral ports available if you have this VM Power on requirement!

 

After this wizard is completed the upgrade will begin. If applicable, your vCenter database schema will also be upgraded, and soon you’ll be up and running with vCenter 5.0. Note that this upgrade does require your vCenter service to be down for the duration of the installer upgrade process – how long that is really depends on how big your vCenter DB is. My lab vCenter upgrade took about 25 minutes to run through, but its inventory is very small – for most production environments with up to 100 VMs or so I wouldn’t see this taking longer than 45-60 minutes, but remember it’s dependent on various other factors.

Once the installer is finished, it would be a good idea to also update the vSphere client you use to access your vCenter server – install the version that comes with your vCenter 5.0 installation media and log in again. You should see your inventory as per usual and hopefully all will be well. One thing I noticed immediately following my upgrade was that I had an alert for my datastores – I blogged about this over here. This is only really applicable if you are running a lab environment (or small production) with only one or two shared datastores. Other than this, the rest of my VMs were all running happily on their respective ESXi hosts and everything else was just fine.

 

So coming up in Part 2, I will cover the next step in upgrading your environment to vSphere 5.0 – the ESX(i) hosts along with a couple of different methods of doing this. If there is anything that I have missed, or you have any tips or additional info, please feel free to update using the comments section.

 

More in this series:

 

Part 2 – Upgrading to vSphere 5.0 from vSphere 4.x – Part 2 – Manually upgrading ESX(i) hosts
Part 3 – Upgrading to vSphere 5.0 from vSphere 4.x – Part 3 – Using VUM to upgrade ESX(i) hosts

Adding a Percent Free Property to your Get-Datastore cmdlet results using Add-Member

 

Update:

 

Thanks to Alan Renouf (@alanrenouf) for pointing out the New-VIProperty cmdlet to me – it let me go back to the drawing board and really shorten my original PowerCLI script. @PowerCLI on Twitter also pointed this out shortly afterwards. So, after taking a quick look at the reference documentation for the cmdlet, here is my new VIProperty to get the “PercentFree” for each Datastore object returned by the Get-Datastore cmdlet!

 

New-VIProperty -Name PercentFree -ObjectType Datastore -Value {"{0:N2}" -f ($args[0].FreeSpaceMB/$args[0].CapacityMB*100)} -Force

 

To use it, simply run the above in your PowerCLI session, then use “Get-Datastore | Select Name,PercentFree”. A better option would be to load the above New-VIProperty line into your PowerCLI / PowerShell profile. Taking it one step further, @LucD has kindly offered to add this to his VIProperty module, which means you could instead just load that module in your profile and benefit from all the other great VIProperty extensions! Example usage below:

 

New-VIProperty -Name PercentFree -ObjectType Datastore -Value {"{0:N2}" -f ($args[0].FreeSpaceMB/$args[0].CapacityMB*100)} -Force
Get-Datastore | Select Name,FreeSpaceMB,CapacityMB,PercentFree

 

You can find tons of other great VIProperties, or even download the entire module over at LucD’s blog.

 

Original Post:

 

I have been using the Get-Datastore cmdlet quite frequently at my workplace lately – mainly to gather information on various datastores which I export to CSV, then plug into Excel to perform further sorting and calculations on. To save myself a step in Excel each time (creating columns in spreadsheets to show and format the Percent Free figure of each Datastore), I decided to add this into my PowerCLI script.

 

Below, I’ll show you a fairly simple script that adds a “NoteProperty” member (a property with a static value) to your Datastore PS objects. In the script, we’ll grab all the datastores based on a search criterion (by default this will get all datastores, or datastores with the word “Shared” in their name, but you can change it to match whatever you like), then iterate through each one, calculate the percentage of free space from the two figures Get-Datastore already returns (CapacityMB and FreeSpaceMB), and finally add the member property to the current datastore object. Once this is done, we’ll output the results of the $datastores object, using “Select” to show the Name, Free Space in MB, Capacity in MB, and Percent Free of each object, to a CSV file (remove the | Export-Csv part on the end if you just want the results output to the console instead).

 

 

# Simple helper to calculate one number as a percentage of another
function CalcPercent {
	param(
	[parameter(Mandatory = $true)]
	[int]$InputNum1,
	[parameter(Mandatory = $true)]
	[int]$InputNum2)
	$InputNum1 / $InputNum2 * 100
}

# Grab all datastores in the connected vCenter, sorted by name
$datastores = Get-Datastore | Sort Name
ForEach ($ds in $datastores)
{
	# Matching "" catches every datastore; tighten the filter (e.g. just "Shared") if required
	if (($ds.Name -match "Shared") -or ($ds.Name -match ""))
	{
		# Calculate the percentage of free space and format it to two decimal places
		$PercentFree = CalcPercent $ds.FreeSpaceMB $ds.CapacityMB
		$PercentFree = "{0:N2}" -f $PercentFree
		# Attach the result to the datastore object as a new NoteProperty
		$ds | Add-Member -type NoteProperty -name PercentFree -value $PercentFree
	}
}
# Output the results to CSV (drop the | Export-Csv part to print to the console instead)
$datastores | Select Name,FreeSpaceMB,CapacityMB,PercentFree | Export-Csv c:\testcsv.csv

 

Here’s an example of our output:

 

Hope this helps! Note that you could also simplify the script by removing the CalcPercent function and adding it to your PowerShell or PowerCLI environment profile instead – a quick example of what that would look like is below.
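Once CalcPercent lives in your profile (profiles are covered in the next post), the script body shrinks down to just the loop – a small sketch of the same logic:

# Assumes the CalcPercent function is already defined in your PowerShell/PowerCLI profile
$datastores = Get-Datastore | Sort Name
ForEach ($ds in $datastores)
{
	$ds | Add-Member -type NoteProperty -name PercentFree -value ("{0:N2}" -f (CalcPercent $ds.FreeSpaceMB $ds.CapacityMB))
}
$datastores | Select Name,FreeSpaceMB,CapacityMB,PercentFree | Export-Csv c:\testcsv.csv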

 

How to create a PowerShell / PowerCLI profile & add Functions for later use

 

Ever wondered how to set up a PowerShell or PowerCLI profile for yourself? Or how to go about saving useful functions that you have created or picked up elsewhere to your profile for use in new sessions? Here I’ll detail the basics of PowerShell and PowerCLI session profiles and show you how to set one up, as well as how to save your first function to the profile for use in future sessions.

 

Creating a PowerShell (or PowerCLI) profile is a great idea if you are considering spending any decent amount of time in the shell or executing scripts. A profile allows you to customise your PowerShell / PowerCLI environment and apply useful functions or other elements to every new PowerShell / PowerCLI session you start, without the need to reload these items yourself each time. A few examples of what you can add to your profile are functions, variables, cmdlets and snap-ins. This makes life so much easier when scripting or working on your latest bit of automation.

 

To begin with, you can see the currently set profile path in your session by typing $profile in the shell. If you would then like to edit your profile, try issuing the command “notepad $profile”. This will (if the profile exists) open it in Notepad for editing. If it doesn’t exist, you’ll get an error from Notepad indicating so when it tries to load the file. If this is the case, or you would like to create a new PowerShell profile for use in PowerShell or PowerCLI, use the last bit of script listed in the next section to create a profile and get started with customising your environment.

 

Show your profile path or open it up for editing in notepad:

$profile
notepad $profile

 

Create a new PowerShell profile for the current user if one does not already exist, then open it up for editing in notepad:

if (!(Test-Path $profile))
	{ New-Item -Type File -Path $profile -Force }
notepad $profile

 

Once you have your profile created and ready for editing, try adding a few useful functions or variable declarations to get yourself going, then launch a new PowerShell or PowerCLI session to try them out. Here is a quick example of a function you can create to test your profile out in PowerCLI:

 

function VMInfo {
	param(
	[parameter(Mandatory = $true)]
    	[string[]]$VMName)
	Get-VM $VMName
}

 

Save your profile with the above function added, then start a new PowerCLI session. Connect to your vCenter server using Connect-VIServer and then try typing “VMInfo aVMname” (where aVMname is the name of an actual VM in your environment). As you may have noticed, this function just does the same job that the Get-VM cmdlet already does for you, and will simply return the info about the VM specified. Here is a quick rundown of the function above: the function is declared first (this is the name of the function), then we open curly brackets to start the code. The param section defines whether each parameter is mandatory (required) or not and what type it is (in our case a string array), as well as giving the parameter a variable name ($VMName). The last bit does the actual work, running the Get-VM cmdlet against the parameter ($VMName) passed to the function. Here’s the output when running the new function against a VM called “SEAN-DC01” in my PowerCLI session:

 

 

You should now be able to see how useful this can become – you can quickly add new functions, variables and more to your PowerShell profile for future use in new sessions. Try to think of a few useful functions for the cmdlets that you use on a day-to-day basis and add these to your profile. You won’t regret it in the long run!
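As a final illustration, here is the sort of thing a profile might contain once you start building it up – a small sketch in which the snap-in name is the standard PowerCLI one, while the vCenter server name and function are just made-up examples to adapt to your own environment:

# Example profile additions - the vCenter server name below is hypothetical.
# Load the PowerCLI snap-in so its cmdlets are available in plain PowerShell sessions too
if (-not (Get-PSSnapin VMware.VimAutomation.Core -ErrorAction SilentlyContinue)) {
	Add-PSSnapin VMware.VimAutomation.Core
}

# A shortcut function to connect to a lab vCenter server
function Connect-Lab {
	Connect-VIServer -Server "LAB-VCENTER01"
}

# A handy default variable, e.g. where exported reports get dropped
$ReportPath = "C:\Reports"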