
My VMware vSphere Home lab configuration

September 5th, 2012

I have always enjoyed running my own home lab for testing and playing around with the latest software and operating systems / hypervisors. Up until recently, it was all hosted on VMware Workstation 8.0 on my home gaming PC, which has an AMD Phenom II x6 (hex core) CPU and 16GB of DDR3 RAM. This has been great, and I still use it, but there are some bits and pieces I still want to be able to play with that are traditionally difficult to do on a single physical machine, such as working with VLANs and taking advantage of hardware feature sets.

 

To that end, I have been slowly building up a physical home lab environment. Here is what I currently have:

Hosts

  • 2 x HP Proliant N40L Microservers (AMD Turion Dual Core processors @ 1.5GHz)
  • 8GB DDR3 1333MHz RAM (2 x 4GB modules)
  • Onboard Gbit NIC
  • PCI-Express 4x HP NC360T Dual Port Gbit NIC as add-on card (modified to a low-profile bracket)
  • 250GB local SATA HDD (used only to host the ESXi installations)

Networking

  • As mentioned above, I am using HP NC360T PCI-Express NICs to give me a total of 3 x vmnics per ESXi host.
  • Dell PowerConnect 5324 switch (24 port Gbit managed switch)
  • 1Gbit Powerline Ethernet home plugs to uplink the Dell PowerConnect switch to the home broadband connection. This allows me to keep the lab in a remote location in the house, which keeps the noise away from the living area.
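With three vmnics per host and a managed switch, VLAN-tagged port groups become practical. As a rough sketch only (I have not listed my actual vSwitch layout here), this is the sort of thing the extra NC360T ports allow on an ESXi 5.x host — the names vSwitch1, vmnic1 and VLAN ID 20 are purely illustrative:

```shell
# Sketch only: run on an ESXi 5.x host. vSwitch1, vmnic1 and VLAN 20
# are example names, not the actual lab configuration.

# Create a second standard vSwitch and uplink one of the NC360T ports to it
esxcli network vswitch standard add --vswitch-name=vSwitch1
esxcli network vswitch standard uplink add --uplink-name=vmnic1 --vswitch-name=vSwitch1

# Add a port group and tag it with VLAN 20. The corresponding PowerConnect
# switch port must carry VLAN 20 tagged (trunk mode) for this to work.
esxcli network vswitch standard portgroup add --portgroup-name=VLAN20 --vswitch-name=vSwitch1
esxcli network vswitch standard portgroup set --portgroup-name=VLAN20 --vlan-id=20
```

The matching VLAN also has to be defined and tagged on the PowerConnect 5324 itself; the esxcli side alone isn't enough.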

Storage

  • This is a work in progress at the moment: I am finding that the low-end 2-bay home NAS devices are not sufficient performance-wise, while the higher-end models are too expensive to justify.
  • Repurposed Micro-ATX custom built PC, housed in a Silverstone SG05 micro-ATX chassis running FreeNAS 8.2 (Original build and pics of the chassis here)
  • Intel Core 2 Duo 2.4 GHz processor
  • 4GB DDR2-800 RAM
  • 1 Gbit NIC
  • 1 x 1TB 7200 RPM SATA II drive
  • 1 x 128GB OCZ Vertex 2E SSD (SATA II)
  • As this is temporary, each drive provides one datastore to the ESXi hosts. I therefore have one large datastore for general VMs, and one fast SSD-based datastore for high-priority VMs or VM disks. I am limited by the fact that the Micro-ATX board only has 2 x onboard SATA ports, so I may consider purchasing an add-on card to expand these.
  • Storage is presented as NFS. I am currently testing ZFS vs UFS, and the use of the SSD as a ZFS ZIL (log) and/or L2ARC (cache) device. To make this more reliable, I will need the above-mentioned add-on card to build redundancy into the system, as I would not like to lose a drive at this time!
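For anyone curious what the SSD-as-log/cache testing looks like in practice, here is a rough sketch of the FreeBSD/FreeNAS-style commands involved, plus mounting the NFS export on ESXi. The pool name "tank", the device name ada1, the IP address and the share path are all illustrative, not my actual configuration:

```shell
# Sketch only (FreeNAS 8.x / FreeBSD shell). "tank" and ada1 are example names.

# Attach the SSD to an existing ZFS pool as a dedicated ZIL (log) device...
zpool add tank log ada1

# ...or alternatively as an L2ARC read cache
zpool add tank cache ada1

# On the ESXi 5.x side, mount the NFS export as a datastore
# (IP, share path and datastore name are examples)
esxcli storage nfs add --host=192.168.1.50 --share=/mnt/tank/vmstore --volume-name=NFS-tank
```

Worth noting: an unmirrored log device is a single point of failure on older ZFS pool versions, which is exactly why the extra SATA ports for redundancy matter before relying on this.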

Platform / ghetto rack

  • IKEA Lack rack (black) – cheap and expandable : )

 

To do

Currently, one host only has 4GB RAM; I have an 8GB kit waiting to be added to bring both hosts up to 8GB. I also need to add the HP NC360T dual port NIC to this host, as it is a recent addition to the home lab.

On the storage side of things, I have just taken delivery of 2 x OCZ Vertex 2 128GB SSD drives, which I picked up at a bargain price the other day (£45 each). Once I have expanded the SATA connectivity in my Micro-ATX FreeNAS box, I will look into adding these drives for some super-fast SSD storage expansion.

 

The 2 x 128GB OCZ SSDs to be used for Shared Host Storage

HP NC360T PCI-Express NIC and 8GB RAM kit for the new Microserver

 

Lastly, the Dell PowerConnect 5324 switch I am using still has the original firmware loaded (from 2005). This needs to be updated to the latest version so that I can enable Link Layer Discovery Protocol (LLDP), which is newly supported on Distributed Virtual Switches with the VMware vSphere 5.0 release. LLDP can help with the configuration and management of network components in an infrastructure, and will mainly serve to let me play with this feature in my home lab. I seem to have lost my USB-to-Serial adapter though, so this firmware upgrade will need to wait until I can source a new one off eBay.

 

  1. Colin Westwater
    September 5th, 2012 at 20:41 | #1

    Sean, the N40L’s can take 16GB RAM…

  2. Andrey
    September 6th, 2012 at 04:09 | #2

    Hey Sean, I actually have a similar vSphere 5 environment, minus the switch which I am still shopping around for… I was actually looking to get the Synology DS411 with a couple of SSD’s. Although it is a bit pricey. Also looking to upgrade from 8GB to 16GB of memory on each host.

  3. September 6th, 2012 at 10:25 | #3

    Nice setup 🙂 – been working on my own setup also. Got my storage but need to get my whitebox(es).

    Are you using these lack racks (http://www.ikea.com/us/en/catalog/products/40104270/) ?

  4. September 13th, 2012 at 02:23 | #4

    Hi Sean,

    I am also building my virtual study lab, currently using my HP Microserver N36L, but as it is 1.3GHz, I am finding it hard installing vCenter Server 5 on it (requires 2.0GHz, the book says). How did you get it on yours then?

  5. September 13th, 2012 at 09:55 | #5

    Hi Babar,

    That should work just fine – the requirement in the docs is there for production cases – in this case it is just going to be a small lab environment on your Microserver. So ignore that, and just set it up as a VM, even with 1.3GHz it’ll still run.

    Cheers,
    Sean

  6. September 13th, 2012 at 12:41 | #6

    Thanks for your reply Sean.

    As I try to install vCenter on a Win7 VM, after the file extraction etc., the installer fails, saying it is not compatible with the platform.

    My VM is dual core 1.3GHz, 4GB memory and 45GB disk. Don't know what I could be doing incorrectly?

  7. September 13th, 2012 at 12:55 | #7

    @Babar Mughal

    That is your problem right there – you are trying to install on Win7 – which I don’t believe is a supported platform at all to run vCenter! You should use Windows Server 2008 R2 (64bit) or a server OS that is supported 🙂 Alternatively you can use the vCenter virtual appliance.

  8. September 13th, 2012 at 13:16 | #8

    Aah, that rings a bell. Thanks I will give that a go…
