
You may come across scenarios such as a user who needs to do design work from home, or a GPU server for students who only attend a graphic design class for a couple of hours a week. An RDP session with a GPU suits these scenarios, and you can even set up a VM in Azure and turn it on only when needed.

We need to take these factors into consideration:

  • RDP performance: as of writing, RDP 10 with RemoteFX works well at eliminating delay and jitter.
  • GPU virtualisation for VDI/Remote Desktop Session Host: Microsoft Hyper-V vGPU or DDA. The supported NVIDIA cards are listed at https://docs.nvidia.com/grid/latest/product-support-matrix/index.html; see also https://en.wikipedia.org/wiki/RemoteFX.
  • vGPU licensing from NVIDIA.
  • Remote Desktop CALs for multiple users.
Hardware/specs choice

I will use an Azure Standard_NC6s_v3 instance with a Tesla V100 16 GB card, which is essentially a Hyper-V VM with GPU DDA and costs about 6 AUD per hour. I will then test Autodesk Fusion 360 as a RemoteApp on RDS.

You can set up similar VMs either in Azure or on premises with DDA, vGPU or a session host.

  • See the graphics virtualisation options here: https://docs.microsoft.com/en-us/windows-server/remote/remote-desktop-services/rds-graphics-virtualization.
  • Then choose a VM instance; all the GPU VM sizes are listed here: https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-gpu. Note that these VM families are only available in some regions and are not available on the free trial.

Once the VM size is chosen, you can register an Azure account and provide credit card details.
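As a rough sketch, the VM can also be created from Azure PowerShell instead of the portal; the resource group name, VM name, image alias and credentials below are placeholders, not values from this setup.

# Sketch: create an NC-series GPU VM with Azure PowerShell (all names are examples)
Connect-AzAccount
New-AzResourceGroup -Name "gpu-rds-rg" -Location "AustraliaEast"
New-AzVM -ResourceGroupName "gpu-rds-rg" `
    -Name "gpu-rds-vm" `
    -Location "AustraliaEast" `
    -Size "Standard_NC6s_v3" `
    -Image "Win2019Datacenter" `
    -Credential (Get-Credential)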

Install driver, extension and license

You can now connect to the VM via Remote Desktop. Note that by default the session uses the Microsoft Hyper-V video card; the dedicated GPU is not available because its driver is not yet installed. You also need to install a license for all series except the NV series, and you need to install the Azure VM extension to enable RDMA network connectivity.

We usually download graphics drivers from the NVIDIA website, but for Tesla and GRID cards we download them from the Microsoft page (https://docs.microsoft.com/en-us/azure/virtual-machines/windows/n-series-driver-setup), because the GRID driver there includes licensing for GRID Virtual GPU software in Azure (GPU virtualisation licensing is covered later).

For the Tesla V100 card, the Microsoft page only provides drivers for Windows Server 2016 and 2012 R2, and the standard NVIDIA driver for Server 2019 does not support licensing. If you want to run Windows Server 2019, the GRID driver for the Tesla V100 on Server 2019 works fine; I have tested it.

Once the driver is installed, run the command-line tool nvidia-smi from the folder C:\Program Files\NVIDIA Corporation\NVSMI. If the driver is installed correctly, you will see output similar to the following. GPU-Util shows 0% unless a GPU workload is currently running on the VM. Note that at this point the TCC/WDDM status is probably TCC, as shown below, which prevents the card from working as a display GPU; we need to install an extension in Azure.
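For reference, a quick way to check this from PowerShell (default NVSMI install path assumed); switching the driver model by hand should not be necessary once the extension below is in place:

# Sketch: verify the driver and TCC/WDDM status (default install path assumed)
cd "C:\Program Files\NVIDIA Corporation\NVSMI"
.\nvidia-smi.exe
# If needed, the driver model can be switched to WDDM manually (GPU index 0 assumed, reboot required):
# .\nvidia-smi.exe -i 0 -dm 0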

HpcVmDrivers extension

 

The HpcVmDrivers extension must be added to install the Windows network device drivers that enable RDMA connectivity. To add the VM extension to an RDMA-enabled N-series VM, use the Azure PowerShell cmdlets for Azure Resource Manager or the Azure portal.

Note that the command in the link above is out of date; use the command below, or use the Azure portal: open the virtual machine, choose Extensions, then click Add.

 

Find the NVIDIA GPU Driver Extension, read the explanation in the right-hand pane and click Create.

After installation, refresh the page and check the provisioning status to make sure it succeeded.

Alternatively, you can use the Azure PowerShell or Azure CLI commands below.

Azure PowerShell command:

Set-AzVMExtension -ResourceGroupName "myResourceGroup" `
    -VMName "myVM" `
    -Location "AustraliaEast" `
    -Publisher "Microsoft.HpcCompute" `
    -ExtensionName "NvidiaGpuDriverWindows" `
    -ExtensionType "NvidiaGpuDriverWindows" `
    -TypeHandlerVersion "1.2" `
    -SettingString '{}'

Azure CLI command:

az vm extension set --resource-group myResourceGroup --vm-name myVM --name NvidiaGpuDriverWindows --publisher Microsoft.HpcCompute --version 1.2 --settings '{}'

Once this is set up, reboot the VM and wait for about 5 minutes.
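If you prefer to reboot from your own workstation rather than inside the session, a sketch (resource group and VM names are placeholders):

# Restart the VM after the extension is provisioned (names are examples)
Restart-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM"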

Confirm the driver is installed and the extension is working:

Log on to the VM via Remote Desktop. Check the graphics adapter status in Device Manager and disable the Microsoft Hyper-V video card; then expand Monitors and disable the monitor that uses the Microsoft Hyper-V video card. Reboot the VM.

  • Run the nvidia-smi command mentioned above and make sure the driver model is WDDM instead of TCC.
  • Open Task Manager; you should see the GPU in the Performance tab.
  • Open Control Panel; you should see the NVIDIA Control Panel, and opening it should show the configuration options.
  • Click the Start menu, type dxdiag.exe, press Enter, then click the Display tab; you should see the correct Tesla V100 card information.

If the information above does not show correctly, troubleshoot the driver and extension installation.
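One quick check is the extension's provisioning state from Azure PowerShell; the resource group and VM names below are placeholders:

# Sketch: inspect the GPU driver extension status (names are examples)
Get-AzVMExtension -ResourceGroupName "myResourceGroup" -VMName "myVM" -Name "NvidiaGpuDriverWindows" |
    Select-Object Name, ProvisioningState, TypeHandlerVersion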

Licensing

For each user accessing a VM with an NVIDIA GPU attached, you need to pay a license fee; see the documentation below:

https://docs.nvidia.com/grid/latest/grid-licensing-user-guide/index.html

For Azure and other cloud services, some instances (with older GPUs) include a free NVIDIA license, while for others you need to bring your own license (BYOL); see the document below:

https://docs.nvidia.com/grid/cloud-service-support.html

You can register for a trial license and use it for 90 days on up to 128 VMs (https://www.nvidia.com/object/vgpu-evaluation.html). After that, you pay per user, either annually or as a perpetual license (https://www.nvidia.com/en-au/data-center/buy-grid/), and you need to set up a license server with a license file downloaded from the web portal.

The license server is designed to be installed locally within a customer's network and configured with licenses obtained from the NVIDIA Software Licensing Center (https://nvid.nvidia.com/dashboard). When a client machine boots, it leases one license from the license server and returns it when it shuts down.
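On the Windows guest side, pointing the machine at the license server can be done in the NVIDIA Control Panel or, as a sketch, via the registry; the key and value names below follow the NVIDIA GRID licensing guide, and the server address is a placeholder:

# Sketch: configure the GRID license client via the registry (run elevated; server address is an example)
$key = "HKLM:\SOFTWARE\NVIDIA Corporation\Global\GridLicensing"
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name "ServerAddress" -Value "gridlicense.example.local"
Set-ItemProperty -Path $key -Name "ServerPort" -Value "7070"
# A FeatureType value may also be required depending on the licensed edition; see the NVIDIA licensing guide.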

Let’s start by registering an account: go to https://www.nvidia.com/object/vgpu-evaluation.html and register; you will receive an email with your account login link, Product Activation Key, and the link to redeem the Product Activation Key.

Then, under Software & Services > Product Information, click NVIDIA GRID to download the software and manual. Note that as of writing, only Windows Server 2012 R2, Windows Server 2016 and Windows 10 are supported, and only the 32-bit installation works (see the Java requirement below).

One step missing from that manual is creating a system variable, as follows:

  1. Install the 32-bit Java SE JDK (it must be 32-bit; remove the 64-bit version if you already installed it). Right-click This PC > Properties > Advanced system settings > Environment Variables…, and under System Variables click New, with Variable name JAVA_HOME and Variable value C:\Program Files (x86)\Java\jre1.8.0_221 (or your install location). A PowerShell alternative is sketched after this list.
  2. Extract the package, then install by double-clicking setup or running setup.exe -I GUI. Follow the manual from step 2.5 to the end.
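As an alternative to clicking through the System Properties dialog, the variable can be set from an elevated PowerShell prompt; adjust the path to your Java install location:

# Sketch: set JAVA_HOME as a machine-wide environment variable (path is an example)
[Environment]::SetEnvironmentVariable("JAVA_HOME", "C:\Program Files (x86)\Java\jre1.8.0_221", "Machine")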

 

RemoteFX enabled for the Remote Desktop Session Host.

RDS installation

 

There are plenty of tutorials on how to set up RDS; you can do the same on Azure GPU-family VMs, but keep the licensing in mind.

You can follow the documentation below to set up RDS:

https://docs.microsoft.com/en-us/windows-server/remote/remote-desktop-services/rds-remotefx-vgpu

https://docs.microsoft.com/en-us/windows-server/remote/remote-desktop-services/rds-supported-config

https://docs.microsoft.com/en-us/windows-server/remote/remote-desktop-services/migrate-rds-cals

https://docs.microsoft.com/en-us/windows-server/remote/remote-desktop-services/rds-client-access-license
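As a minimal sketch, the core roles for a single-server deployment can be added from PowerShell before working through the documents above; the feature names are the standard Windows Server role names, and the rest of the deployment (collections, licensing mode, CALs) still follows the links.

# Sketch: install the RD Session Host and RD Licensing roles (single-server quick start)
Install-WindowsFeature -Name RDS-RD-Server, RDS-Licensing -IncludeManagementTools -Restart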

 

Reference

https://docs.microsoft.com/en-us/azure/virtual-machines/windows/n-series-driver-setup