So I got the Odyssey G9 and already updated to the latest firmware as of today. I'm turning on HDR in Windows 10, so everything below relates to HDR. I'm using the DisplayPort cable that came with the monitor, and I have a Titan RTX GPU.
At 240Hz, HDCP just isn't working. Nvidia control panel says the display doesn't support HDCP.
If I switch to 120Hz, the Nvidia control panel says HDCP is up and running. However, if I try to watch any HDCP-protected content, the screen starts flickering, the audio out makes a crackling sound, and the monitor turns off for about ten seconds. Then it turns on for about a second, and the same thing repeats in an infinite loop until I reboot the PC to get the HDCP content off the screen.
Also, at 120Hz, the bit depth is locked to 8-bit (still in HDR). For some reason I can only get 10-bit at 240Hz (still in HDR). That doesn't make sense to me: at a lower refresh rate you should have MORE available bandwidth, so why do I get a LOWER bit depth? This isn't the main issue, though, because obviously I want to stay at 240Hz anyway (kinda the whole point of this thing); I just can't get HDCP working. If 120Hz worked at 10-bit with HDCP, that would be an okay temporary workaround, but as it stands, 120Hz means 8-bit and HDCP still doesn't work.
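For what it's worth, the 8-bit lock at 120Hz may come down to raw DisplayPort 1.4 bandwidth: uncompressed 10-bit RGB at 5120x1440/120Hz slightly exceeds DP 1.4's roughly 25.92 Gbit/s payload, while 240Hz can't fit uncompressed at any bit depth and so always runs with DSC compression, under which 10-bit fits. A rough back-of-the-envelope sketch (active pixels only, ignoring blanking intervals, which would only make the numbers tighter):

```python
# Rough uncompressed data-rate estimate for the Odyssey G9 (5120x1440).
# Ignores blanking intervals, so real requirements are somewhat higher.
ACTIVE_PIXELS = 5120 * 1440
DP14_PAYLOAD_GBPS = 25.92  # DP 1.4 HBR3, 4 lanes, after 8b/10b overhead

def data_rate_gbps(refresh_hz: int, bits_per_channel: int) -> float:
    """Uncompressed RGB data rate in Gbit/s (3 channels per pixel)."""
    return ACTIVE_PIXELS * refresh_hz * bits_per_channel * 3 / 1e9

print(data_rate_gbps(120, 8))   # ~21.2 Gbit/s -> fits in DP 1.4 uncompressed
print(data_rate_gbps(120, 10))  # ~26.5 Gbit/s -> just over the DP 1.4 limit
print(data_rate_gbps(240, 10))  # ~53.1 Gbit/s -> needs DSC regardless
```

If the monitor only enables DSC in its 240Hz modes, that would explain why 10-bit is available there but not at 120Hz. That last part is an assumption about the firmware's behavior, not something documented by Samsung.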
At 60Hz, I can get 10bpp and HDCP works (not just in theory, I can actually watch movies that are HDCP protected without the monitor crashing). But obviously 60Hz is a far cry from 240Hz and not really acceptable.
Anyone know what's wrong?
HDCP is digital copy protection against piracy. It is designed to run at 4K 60Hz for best results.
It is NOT for 240Hz gaming and web browsing.
Google HDCP to be enlightened as to its history, benefits, and performance.
I fully understand what HDCP is. It should not depend on refresh rate, and you shouldn't have to switch modes on your monitor to stream a movie. That's ridiculous.
It's ridiculous that on an expensive, top-of-the-line monitor I have to menu-dive and manually change the refresh rate in order to play HDCP content. Is this something that can be fixed via firmware update?
Exactly the same issue here: getting "This display does not support HDCP" in the Nvidia control panel when running 240Hz. Quite frustrating. Has any fix been released yet?
An Accepted Solution has been marked and provided for this thread. The thread will now be locked for further replies, in hopes of keeping the thread from steering in a non-technical direction. If you have a separate concern, feel free to post again or send one of our moderators a private message with more details. Please note that duplicate posts on similar subjects starting 2/1/21 will be removed to keep our community organized and make it easier for our users to find resolutions and needed content. If you do make a post, please include as many details about your symptoms as possible, and make sure your title is a good summary of the overall situation with your product. Thank you for being part of the community!