nVidia introduces the world's "first virtualized GPU"

 

I usually only ever follow nVidia and AMD with regard to their GPU offerings for gamers, gaming being one of my pastimes, but this press release from the green team the other day caught my attention.

 

To summarise, nVidia are unveiling their "VGX" platform, which will allow IT to deliver virtualized desktops with graphics or GPU computing power similar to, or as close to, the real deal as possible, for users on any connected device (not just PCs or thin clients, for example). The VGX platform will consist of a few components, one of which is an add-on card for servers that is passively cooled and as energy efficient as possible (interesting when you consider how much power desktop gaming-grade GPUs generally consume!)

 

Some of the features nVidia are touting for their VGX platform thus far, according to their press release, are:

 

  • GPU-accelerated desktops (of course)
  • Ultra-low latency remote display capability
  • Energy-efficient, passively cooled hardware. Each board will have:
    • 4 x GPUs (each with 192 CUDA architecture cores and a 4GB frame buffer).
    • 16GB memory
    • Industry standard PCI Express interface
  • VGX GPU Hypervisor
    • This is a software layer that should integrate with commercial hypervisors (VMware, anyone?), enabling virtualization of the GPU
  • High user densities – shared graphics processing power for multiple users
    • (Up to 100 users can apparently be served from a single server powered by one VGX board)

 

Here are a few videos from the press release:

 

httpv://youtube.com/watch?v=kTnB_9ZgEvg

httpv://youtube.com/watch?v=e6BI2OTM-Hk

 

The article mentions Citrix technology support, but what about VMware View? I am sure this type of integration should be available – I wonder how PCoIP would work to deliver virtual desktops accelerated by the VGX platform. If the latency-reduction claims and acceleration benefits are anything to go by, then we should be in for an even better VDI experience!
