You may come across scenarios such as a user who needs to do design work from home, or setting up a GPU server for students who only study a graphic design class for a couple of hours a week. An RDP session with a GPU would meet these scenarios; you could even set up a VM in Azure and turn it on whenever needed.

We need to take these factors into consideration:

  • RDP performance: as of this writing, RDP v10 works well with RemoteFX to eliminate delay and jitter.
  • GPU virtualisation for the VDI/Remote Desktop Session Host: Microsoft Hyper-V vGPU or DDA; the supported NVIDIA cards are listed here.
  • vGPU licensing from NVIDIA.
  • Remote Desktop CALs for multiple users.


The first option is to do it via Azure

Azure GPU instance


Hardware/spec choice

I will use an Azure Standard_NC6s_v3 instance with a Tesla V100 16 GB card, which is basically a Hyper-V VM with GPU DDA; pricing is about 6 AUD per hour. Then I will test Autodesk Fusion 360 as a RemoteApp on RDS.
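To sanity-check the turn-it-on-when-needed economics, here is a rough cost estimate. The 6 AUD/hour rate is the figure quoted above; the weekly hours are a hypothetical example for the part-time classroom scenario:

```shell
# Rough monthly cost for an on-demand Standard_NC6s_v3 instance.
# Rate is the ~6 AUD/hour quoted above; hours/weeks are example assumptions.
RATE_AUD_PER_HOUR=6
HOURS_PER_WEEK=4      # e.g. one design class per week
WEEKS_PER_MONTH=4

MONTHLY_AUD=$((RATE_AUD_PER_HOUR * HOURS_PER_WEEK * WEEKS_PER_MONTH))
FULL_TIME_AUD=$((RATE_AUD_PER_HOUR * 24 * 30))

echo "Part-time: ~${MONTHLY_AUD} AUD/month"    # ~96 AUD
echo "Always-on: ~${FULL_TIME_AUD} AUD/month"  # ~4320 AUD
```

The gap between the two figures is the whole argument for on-demand VMs in this scenario: deallocate the VM outside class hours and you pay a small fraction of the always-on price.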

You can set up similar VMs either in Azure or on premises with DDA, vGPU, or a session host.

  • See here for the virtualisation choices.
  • Then choose a VM instance; all VM sizes can be found here. Note that the GPU-family VMs are only available in some regions and are not available for the free trial.

Once the VM size is chosen, you can register an Azure account and provide credit card details.

Install driver, extension and license

Then you can connect to the VM instance via Remote Desktop. Note that by default you use the Microsoft Hyper-V video card and the dedicated GPU is not available, because the driver is not installed. Besides, you need to install a license for all series other than the NV series. You also need to install the Azure VM extension to enable RDMA network connectivity.

Usually we download the graphics driver from the NVIDIA website, but for Tesla and GRID cards we need to download it from the Microsoft website, because the GRID driver includes licensing for GRID Virtual GPU Software in Azure (we will talk about GPU virtualisation licensing later).

For the Tesla V100 card there are only drivers for Server 2016 and 2012 R2 on the Microsoft website, and the Server 2019 driver on the NVIDIA site does not support licensing. If you want to install a driver on Windows Server 2019, the GRID driver for the Tesla V100 on Server 2019 works; it has been tested.

Once the driver is installed, run the command-line tool nvidia-smi in the folder C:\Program Files\NVIDIA Corporation\NVSMI. If the driver is installed, you will see output similar to the following. GPU-Util shows 0% unless you are currently running a GPU workload on the VM. Note that at this point the TCC/WDDM status is probably TCC, as shown below, which prevents the card from working as a display GPU; we need to install an extension in Azure.

HpcVmDrivers extension


The HpcVmDrivers extension must be added to install the Windows network device drivers that enable RDMA connectivity. To add the VM extension to an RDMA-enabled N-series VM, use the Azure PowerShell cmdlets for Azure Resource Manager or the Azure Portal.

Note that the command in the link above is out of date; use the command below, or use the Azure Portal: open the virtual machine, choose Extensions, then click Add.


Find the NVIDIA GPU Driver Extension, read the explanation in the right-hand pane, and click Create.

After installation, refresh the page to check the provisioning status and make sure it succeeded.

Alternatively, you can use the PowerShell or Azure CLI commands below.

Azure PowerShell command:

Set-AzVMExtension -ResourceGroupName "myResourceGroup" -VMName "myVM" -Location "AustraliaEast" -Publisher "Microsoft.HpcCompute" -ExtensionName "NvidiaGpuDriverWindows" -ExtensionType "NvidiaGpuDriverWindows" -TypeHandlerVersion 1.2 -SettingString '{ }'

Azure CLI:

az vm extension set --resource-group myResourceGroup --vm-name myVM --name NvidiaGpuDriverWindows --publisher Microsoft.HpcCompute --version 1.2 --settings '{ }'

Once this is set up, reboot the VM and wait about 5 minutes.
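If you prefer the command line to the Portal for checking the extension, a query like the following should work (assuming the Azure CLI is installed and authenticated, and using the same resource group and VM names as the example commands above):

```shell
# Query the NVIDIA driver extension's provisioning state.
# Names match the example commands above; adjust for your deployment.
az vm extension show \
  --resource-group myResourceGroup \
  --vm-name myVM \
  --name NvidiaGpuDriverWindows \
  --query provisioningState \
  --output tsv
```

Expect the output to read Succeeded once the driver has finished installing.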

Confirm that the driver is installed and the extension is working.

Log on to the VM via Remote Desktop. In Device Manager, check the display adapter status and disable the Microsoft Hyper-V video card; then expand Monitors and disable the monitor that uses the Microsoft Hyper-V video card. Reboot the VM.

  • Then run the nvidia-smi command mentioned above and make sure the mode is WDDM instead of TCC.
  • Open Task Manager; you should see the GPU in the Performance tab.
  • Open Control Panel; you should see the NVIDIA control panel, and opening it should show the configuration options.
  • Click the Start menu, type dxdiag.exe, hit Enter, and click the Display tab; you should see the correct Tesla V100 card information.
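The WDDM check in the first bullet above can be scripted. A minimal sketch, assuming nvidia-smi's usual table layout; the sample line below is illustrative, not captured from a real VM:

```shell
# Grep the nvidia-smi table for the driver model of the first GPU.
# SAMPLE stands in for the output of:
#   "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"
SAMPLE='|   0  Tesla V100-PCIE...  WDDM  | 00000001:00:00.0 Off |'

if echo "$SAMPLE" | grep -q 'WDDM'; then
  MODE=WDDM
else
  MODE=TCC
fi
echo "Driver model: $MODE"
```

If the result is TCC, the card cannot serve as a display GPU and the extension installation needs revisiting.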

If the above information does not show correctly, troubleshoot the driver and extension installation.


For each user accessing a VM with an NVIDIA GPU attached, you need to pay a license fee; see the docs below.

For Azure and other cloud services, some instances (with older GPUs) include a free NVIDIA license, while for others you need to bring your own license (BYOL); see the doc below:

You can register a trial license and use it for a 90-day grace period covering up to 128 VMs. After that, you need to pay per user, either annually or perpetually, and you need to set up a license server with a license file downloaded from the web portal.

The License Server is designed to be installed locally within a customer's network and configured with licenses obtained from the NVIDIA Software Licensing Center. When a client machine boots, it leases one license from the license server and returns it when it shuts down.
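The lease/return behaviour described above can be pictured as a simple counter on the license server. A toy model only: the pool size of 128 matches the trial license mentioned above, and the client boot/shutdown events are hypothetical:

```shell
# Toy model of vGPU license leasing: the server holds a pool of licenses;
# each client boot takes one, each clean shutdown returns it.
POOL=128

lease()   { POOL=$((POOL - 1)); }   # a client machine boots
release() { POOL=$((POOL + 1)); }   # a client machine shuts down

lease   # workstation-1 boots
lease   # workstation-2 boots
echo "Licenses available: $POOL"    # 126
release # workstation-1 shuts down
echo "Licenses available: $POOL"    # 127
```

This is why sizing the license pool to peak concurrent users, not total enrolled users, is enough for the classroom scenario.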

Let's start by registering an account. Go to and register your account; you will receive an email with your account login link, your Product Activation Key, and a link to redeem the Product Activation Key.

Then, under Software & Services > Product Information, click NVIDIA GRID; you can download the software and manual there. Note that as of this writing it only supports Windows Server 2012 R2, 2016 and Windows 10, and only the 32-bit version can be installed.

The part missing from that manual is creating a system variable, with the steps below.

  1. Install the Java SE JDK 32-bit (it must be 32-bit; remove the 64-bit version if you already installed it). Right-click This PC > Properties > Advanced system settings > Environment Variables…; under System variables, click New. Variable name: JAVA_HOME; variable value: C:\Program Files (x86)\Java\jre1.8.0_221 (or your install location).
  2. Extract the download, then install it by double-clicking setup or with the command setup.exe -I GUI. Follow from step 2.5 until the end.
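The variable in step 1 can also be set from a Windows command prompt with setx instead of the GUI; a sketch, using the example path from step 1 (adjust it to your actual Java location):

```shell
REM Windows equivalent of step 1. The /M switch writes a system-wide
REM variable and requires an elevated prompt; path is the step-1 example.
setx JAVA_HOME "C:\Program Files (x86)\Java\jre1.8.0_221" /M
```

Open a new prompt afterwards, since setx does not update already-running sessions.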


Enable RemoteFX for the Remote Desktop Session Host

Change the Local Computer Policy, or create a GPO and link it to the OU where the RDS server sits, to use the hardware graphics adapter for all RDS sessions.

  1. Choose Local Computer Policy > Computer Configuration > Administrative Templates > Windows Components > Remote Desktop Services > Remote Desktop Session Host > Remote Session Environment.

  2. Enable the Use the hardware default graphics adapter for all Remote Desktop Services sessions option.

  3. Enable the Enable RemoteFX encoding for RemoteFX clients designed for Windows Server 2008 R2 SP1 option.
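For a standalone server, step 2 above can also be applied directly via the registry rather than the policy editor. A sketch, assuming the policy maps to the bEnumerateHWBeforeSW value documented for this RDS setting; in a domain, a linked GPO is still the better choice:

```shell
REM Registry equivalent of step 2 (local policy only). Value name is the
REM one documented for "Use the hardware default graphics adapter for all
REM Remote Desktop Services sessions"; verify against current MS docs.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" ^
    /v bEnumerateHWBeforeSW /t REG_DWORD /d 1 /f
```

Reboot or run gpupdate /force so new RDS sessions pick up the change.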

Onsite server

Tesla P40 24 GB

Download the vGPU driver from the NVIDIA Software Licensing Center, not the normal driver download page. If you go to the normal driver download page, you will see:

           NVIDIA Virtual GPU Customers

  • Enterprise customers with a current vGPU software license (GRID vPC, GRID vApps or Quadro vDWS), can log into the enterprise software download portal by clicking below. For more information about how to access your purchased licenses visit the vGPU Software Downloads page.


Log in to the NVIDIA Software Licensing Center. Go to Software & Services > Product Information > Current Releases > NVIDIA Virtual GPU Software, then click NVIDIA vGPU for Windows. Open the zip file and install the version for your server.

Follow the steps above to enable RemoteFX for remote sessions via Group Policy.

Don't log on to the server locally via a monitor, as the P40 and similar cards don't have an output port.

Now remote-desktop into your server and run C:\Program Files\NVIDIA Corporation\NVSMI>nvidia-smi.exe; you should see the Tesla P40 in WDDM mode. Go to Task Manager and you should see the GPU section and its usage.

RDS installation


There are plenty of tutorials on how to set up RDS; you can do it the same way on Azure GPU-family VMs, but keep the licensing in mind.

You can follow the steps below to set up RDS: