Denon AVR HDMI question

Neither Denon nor NVidia has been able to resolve a problem I have. My system consists of a Linux box running MPlayer through an NVidia GeForce 610 video card. The HDMI output of the card goes to the DVD input of a Denon AVR-3313CI receiver, and the output of the Denon goes to a 65" LG 4K OLED display. The resulting picture on the LG looks as if there is a serious interlace mismatch.

If the HDMI signal goes directly to the LG the picture is perfect.

I have tried other NVidia cards, including several versions of the 610, and all show the same problem. Other NVidia cards such as the 210 and 710 do not show the interlace problem.

As with any receiver, I would expect the Denon to query the display for its EDID capabilities and forward that information to any source connected to the Denon's inputs. Apparently that is not so.
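For readers unfamiliar with EDID: it is a 128-byte (plus extensions) data block the source reads from the sink over HDMI's DDC lines. On Linux the kernel usually exposes it under /sys/class/drm/card*/card*-HDMI-*/edid (the exact connector name varies). A minimal sketch of the basic sanity checks a source performs, assuming a raw EDID byte string:

```python
# Minimal EDID base-block check: verify the fixed 8-byte header and the
# block checksum, then decode the manufacturer ID (three 5-bit letters
# packed into bytes 8-9). LG panels typically report "GSM" (Goldstar).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(edid: bytes) -> dict:
    if len(edid) < 128:
        raise ValueError("EDID base block is 128 bytes")
    if edid[:8] != EDID_HEADER:
        raise ValueError("bad EDID header")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("bad EDID checksum")
    word = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                  for shift in (10, 5, 0))
    return {"manufacturer": mfg,
            "extension_blocks": edid[126]}  # e.g. CEA-861 extension count
```

A receiver that properly repeats the sink's EDID should present the TV's block (possibly edited) at its own input; if it presents something else, the source will pick a different mode, which fits the symptom described above.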

For the time being I have put a 2-way HDMI splitter in the line coming from the computer to the Denon. One output of the splitter goes to the Denon while the other output goes directly to the HDMI1 of the LG display. Simultaneously the HDMI output of the Denon goes to HDMI2 of the LG.

This arrangement works: nvidia-settings shows the display characteristics are set by the LG, although the Denon is still mentioned in the nvidia-settings output.

Starting with power off on all three components, I first turn on the LG, then the Denon, then I boot the computer. The computer runs 24/7 thereafter.

Could there be a simple solution for connecting the components together which might correct the problem?

Reply to
root

Could it be that the signal going from the Denon to the television is HDMI 1.4, while the signal out of the video card is HDMI 2.0?

".... the primary reason for the switch to HDMI 2.0 is that 4K Ultra HD televisions require much more bandwidth to realize their full potential. Since 4K Ultra HD is four times the resolution of 1080p, the former HD standard, it requires more throughput to handle extra data going back and forth. Lots more.

HDMI 1.4 supported 4K resolutions, yes, but only at 24 or 30 frames per second (fps). That works fine for movies but isn't useful for gaming and many TV broadcasts, which require 50 or 60 fps. Also, HDMI 1.4 limited 4K Ultra HD content to 8-bit color, though it is capable of 10- or 12-bit color. HDMI 2.0 fixed all of that because it could handle up to 18 gigabits per second, plenty enough to allow for 12-bit color and video up to 60 frames per second."
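The bandwidth numbers in that quote can be roughed out directly. A sketch that counts only active-pixel payload, ignoring blanking intervals and TMDS 8b/10b coding overhead (so real link requirements are somewhat higher than these figures):

```python
# Rough HDMI payload estimate: bits per second for the active pixels only.
# Real links also carry blanking and 8b/10b coding overhead, so these
# figures understate the required link rate.

def payload_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

uhd_30_8bit  = payload_gbps(3840, 2160, 30, 8)   # ~5.97 Gbps: fits HDMI 1.4
uhd_60_8bit  = payload_gbps(3840, 2160, 60, 8)   # ~11.94 Gbps: needs HDMI 2.0
uhd_60_12bit = payload_gbps(3840, 2160, 60, 12)  # ~17.92 Gbps: at the very
                                                 # edge of HDMI 2.0's 18 Gbps
```

This is why 4K60 at 12-bit generally needs chroma subsampling (4:2:2 or 4:2:0) even on an HDMI 2.0 link, while 4K24/30 fits comfortably within HDMI 1.4.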

formatting link

Reply to
Mike S

Thanks for responding. I bought a replacement (identical) video card and the problem was fixed. The Denon receiver specs do not indicate whether it is HDMI 1.4 or 2.x, but they mention 3D passthrough and 4K upconversion. I added an HDMI splitter in front of the receiver so that one output goes to the receiver and the other goes directly to the TV. As I understand HDMI, the source interrogates the display and sends the appropriate output. The TV will respond with its characteristics, but the receiver is not able to respond because it has no output. At least that is how I figure it works. With this arrangement, and a new video card, the system is working very well.
Reply to
root

Your analysis makes sense, glad you solved it.

Reply to
Mike S

Forget the splitter. You need to connect....

PC - TV - Receiver.

One of your TV's HDMI inputs should support ARC (Audio Return Channel); this will provide pass-through audio on all channels to the receiver.

It's an interesting exercise to check that all independent channel outputs actually work from your PC mixer. With everything on HDMI it should be a breeze.

In my case, as my own receiver doesn't support HDMI, I make do with the optical output from my TV, sending audio to the TV over HDMI encoded as Dolby Digital 5.1 using the A52 plugin.
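For anyone wanting to try the same route: the A52 plugin ships with alsa-plugins and live-encodes multichannel PCM to Dolby Digital (AC-3), which survives an optical/S/PDIF hop. A sketch of the invocation, assuming the plugin is installed and registered under the default device name "a52" (the device name and the file name here are assumptions, and your card index may differ):

```shell
# Play a film with 6-channel audio routed through the ALSA a52 plugin,
# which encodes to AC-3 on the fly for S/PDIF/optical or HDMI passthrough.
# "movie.mkv" is a placeholder; adjust -channels to your source material.
mplayer -ao alsa:device=a52 -channels 6 movie.mkv
```

If the device name clashes, alsa-plugins also accepts card selection in the device string (e.g. `device=a52:1`), but check your local ALSA configuration first.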

formatting link

Just mentioning but you definitely don't need that :)

--
Adrian C
Reply to
Adrian Caspersz

Thanks for responding. I found there is a significant delay in the sound when the receiver is set to TV. Also, the sound is then stereo instead of 7.2 over HDMI.

Reply to
root

That sounds like a delay function has been left active in the receiver; it should be turned off. It may have been enabled previously so that sound and picture stay in sync after the time taken for picture processing.

> Also the sound is then stereo instead of 7.2 over HDMI.

formatting link

Looking at the table on the above page, that is a consequence of using HDMI ARC rather than HDMI eARC.

Your TV needs to support HDMI 2.1 for eARC.

If you haven't got HDMI 2.1, I think your splitter implementation is possibly more justified now!

--
Adrian C
Reply to
Adrian Caspersz
