Making Remote Desktop use the graphics card


Tags: cuda, graphics-card, nvidia-graphics-card, remote-desktop, windows-8

So as most people know, when you use RDP to connect to your desktop, it disables the graphics card and uses generic CUDA.

I don't want Windows to revert to using CUDA instead of the graphics card. I have a GTX 780 Ti in the computer, but it isn't being used by RDP. Is there any way to force Windows to use the hardware graphics card?

I've tried TightVNC, RealVNC and LogMeIn, but I want to use RDP as it is the fastest and works best for me.

Best Answer

Firstly, you are getting your terms mixed up. CUDA is an NVIDIA technology for programming their GPUs (and other things, but that's the simplest description).

Microsoft's RDP uses its own graphics driver, which converts the rendered screen into network packets to send to the client.

This is the core of how RDP works and you cannot change it.
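You can see what Windows reports for yourself. The sketch below is a minimal, hypothetical check (assuming a Windows machine with the built-in wmic tool on PATH): run it from the local console and again inside an RDP session to compare. The GTX 780 Ti still shows up as installed hardware either way; the switch happens at the driver level, not by removing the device.

```python
# Minimal sketch (assumption: Windows with the built-in wmic tool on PATH).
# Lists the video controllers Windows reports; run it locally and again
# inside an RDP session to compare the output.
import subprocess

def list_video_controllers():
    # Win32_VideoController enumerates the display adapters the OS knows
    # about; the first line of wmic's output is the "Name" column header.
    out = subprocess.run(
        ["wmic", "path", "Win32_VideoController", "get", "Name"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    for name in list_video_controllers():
        print(name)
```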

On the server, RDP uses its own video driver to render display output by constructing the rendering information into network packets by using RDP protocol and sending them over the network to the client. On the client, RDP receives rendering data and interprets the packets into corresponding Microsoft Windows graphics device interface (GDI) API calls.

Source: https://msdn.microsoft.com/en-us/library/aa383015(v=vs.85).aspx
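As a side note, a program can detect that it is running under RDP, and is therefore being rendered by RDP's display driver, via the documented SM_REMOTESESSION system metric. A minimal sketch in Python, assuming a Windows machine, using ctypes to call the Win32 GetSystemMetrics API:

```python
# Minimal sketch (Windows only): detect whether the current process runs
# inside an RDP session using the documented SM_REMOTESESSION metric.
import ctypes

SM_REMOTESESSION = 0x1000  # GetSystemMetrics index for remote sessions

def in_rdp_session() -> bool:
    # GetSystemMetrics returns nonzero when the calling process is
    # running in a remote (RDP) session rather than the local console.
    return bool(ctypes.windll.user32.GetSystemMetrics(SM_REMOTESESSION))

if __name__ == "__main__":
    print("RDP session:", in_rdp_session())
```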

