So I got the Odyssey G9 and already updated it to the latest firmware as of today. I've turned on HDR in Windows 10, so everything below relates to HDR. I'm using the DisplayPort cable that came with the monitor, and my GPU is a Titan RTX.
At 240Hz, HDCP just isn't working. The Nvidia Control Panel says the display doesn't support HDCP.
If I switch to 120Hz, the Nvidia Control Panel says HDCP is up and running. However, if I try to watch any HDCP-protected content, the screen starts flickering, the audio out crackles, and the display shuts off for about ten seconds. Then it comes back for about one second, and the same thing repeats in an infinite loop until I reboot the PC to get the HDCP content off the screen.
Also, at 120Hz, the bit depth is locked to 8-bit (still in HDR). For some reason I can only get 10-bit at 240Hz (still in HDR). That doesn't make any sense to me, since at a lower refresh rate you should have MORE available bandwidth, so why do I get a LOWER bit depth? This isn't the main issue, though, because obviously I want to stay at 240Hz anyway (kinda the whole point of this thing); I just can't get HDCP working. Still, if 120Hz worked at 10-bit with HDCP, that would be an okay temporary workaround, but as it stands right now, 120Hz means 8-bit and HDCP still doesn't work.
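One possible explanation for the backwards-looking bit depths (this is a guess on my part, not something Samsung documents): at 240Hz the G9 likely drives the link with Display Stream Compression (DSC), which frees enough bandwidth for 10-bit, while at 120Hz it may run uncompressed, and uncompressed 5120x1440 at 120Hz with 10-bit color just barely exceeds DP 1.4's effective bandwidth. A rough back-of-envelope, ignoring blanking overhead (so real requirements are a bit higher):

```python
# Rough DisplayPort bandwidth check for 5120x1440 (Odyssey G9).
# Assumes DP 1.4 HBR3: 32.4 Gbps raw, ~25.92 Gbps effective after 8b/10b coding.
EFFECTIVE_DP14_GBPS = 25.92

def pixel_rate_gbps(width, height, hz, bits_per_channel):
    # 3 color channels (RGB) per pixel; ignores blanking intervals.
    return width * height * hz * bits_per_channel * 3 / 1e9

r120_10 = pixel_rate_gbps(5120, 1440, 120, 10)  # just over the limit
r120_8  = pixel_rate_gbps(5120, 1440, 120, 8)   # fits uncompressed
r240_10 = pixel_rate_gbps(5120, 1440, 240, 10)  # only possible with DSC

print(f"120Hz 10-bit: {r120_10:.1f} Gbps (effective limit {EFFECTIVE_DP14_GBPS})")
print(f"120Hz  8-bit: {r120_8:.1f} Gbps")
print(f"240Hz 10-bit: {r240_10:.1f} Gbps")
```

So 120Hz 10-bit lands at roughly 26.5 Gbps against a ~25.92 Gbps effective limit, which would force the fallback to 8-bit, while 240Hz can offer 10-bit only because DSC is compressing the stream.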
At 60Hz, I can get 10bpp and HDCP works (not just in theory, I can actually watch movies that are HDCP protected without the monitor crashing). But obviously 60Hz is a far cry from 240Hz and not really acceptable.
Anyone know what's wrong?
HDCP is digital copy protection intended to prevent piracy of protected content. It operates on the display link itself and isn't tied to a particular refresh rate, though most HDCP-protected content tops out at 4K 60Hz anyway.
It is NOT needed for 240Hz gaming or web browsing.
Google HDCP to be enlightened as to its history, versions, and limitations.