HDMI Distribution: Different HDTV Resolutions

airman210

New Member
Lately I'm learning that I can't simply use a 1x8 (4K-capable) HDMI splitter to feed a 1080p signal to various 720p, 1080p, and even 2160p TVs around the house. Information is scarce, but experts tell me that, faced with this mix, the splitter will revert to feeding the lowest resolution of the eight (720p) to all of those HDTVs... what a waste! That's right, they say if seven 2160p (4K) TVs and just one 720p set are connected to the splitter, all eight TVs will display 720p!  Is this true?
If you think about it, I guess it makes sense: if the system didn't auto-degrade the resolution, the 720p set would probably be blank. I need a "smart" splitter!
 
I'm planning a distribution system for a new house with the following: one 720p, three 1080p, and four 2160p 4K TVs. Does anybody know how I can [eventually] send 2160p to all of these and simultaneously have each one display the highest resolution it's capable of?
 
You can't.  This is not something a "smart splitter" would ever be able to overcome, because it's the source device that has to send a signal, and it can only send one.  What would be necessary is a per-output scaler that downconverted the incoming highest-quality signal in real time.  This is technologically possible, but not without spending a lot of money.
 
HDMI is a constantly-negotiated connection.  The source and the display devices negotiate (via each display's EDID), and the least-capable device dictates what the source can send.  Note that audio is negotiated in the same manner, so if a TV (which typically reports only 2-channel PCM) is part of the negotiation, nothing will get anything better than 2-channel stereo.  This can be sort-of overcome by using receivers instead of feeding HDMI straight into the TVs.  Then you're only at the mercy of the least-capable AVR for audio negotiation.
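 
If it helps to picture the "lowest common denominator" behavior, here's a toy model in Python. Purely illustrative: the mode lists and the merge rule are my assumptions about how a basic splitter behaves, not any vendor's actual firmware.

# Toy model of a basic HDMI splitter merging its sinks' capabilities.
# Real devices exchange EDID data blocks; this only shows the
# "lowest common denominator" outcome described above.
VIDEO_MODES = ["720p", "1080p", "2160p"]   # ordered low -> high

def merged_capability(sinks, modes=VIDEO_MODES):
    """Highest mode that every connected sink supports."""
    common = set(modes)
    for sink in sinks:
        common &= set(sink)
    return max(common, key=modes.index)

tvs = [
    ["720p"],                     # the lone 720p set
    ["720p", "1080p"],            # the 1080p sets
    ["720p", "1080p", "2160p"],   # the 4K sets
]
print(merged_capability(tvs))     # -> 720p: every output gets 720p

The same merge applies to audio capability lists, which is why one stereo-only TV in the chain drags everything down to 2-channel.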
 
It's often far less troublesome (and frequently cheaper) to replace the TVs than to fight with HDMI negotiation.  It's usually a LOT less trouble to have per-TV devices handling playback (e.g., cable boxes or streamers like a Tivo Mini) and not use HDMI distribution AT ALL.  Or, at the very least, accept that distributed HDMI will always be limited that way, and then use whatever player/streamer/cable box is best suited for higher quality only where it's actually "needed".
 
Really, you have to ask just what it is you're trying to accomplish.  If you're trying to save money and not have cable boxes in every room, then you'll also have to contend with the fact that multiple rooms are going to be forced to share the same channel (and then there's the 'adventure' of getting remote control from each room back to the source).
 
If you really need to distribute to multiple devices with different resolutions, you won't be able to use HDMI (or HDBaseT).  You might need to consider something like Video Storm (http://www.video-storm.com/proddetail.asp?prod=VRX010).  But I don't think their current products do UHD.
 
Why do you have a 720p device in the mix?  Those are going the way of the dinosaur.  And what sources for UHD (2160p) are you planning on using?  Those are few and far between right now.
 
Thanks, Bill, for your prompt reply and for shedding light on this neglected topic.
So it is true: in my worst-case example of seven UHDTVs and one 720p HDTV connected to the same 1x8 4K-capable splitter, we have maybe $12,000 worth of UHDTVs and one wanna-be HDTV all displaying 720p; i.e., a huge waste of money! I wonder how many amateur system designers have fallen into this trap. I think this issue should be better publicized; I haven't seen any warnings on the splitters I've considered.
 
Let me address your last paragraph. My current system is 13 monitors fed from a couple of AV901HD 1x9 Component A/V Distribution Amplifiers. Yes, it saves money. And yes, every TV displays the same channel, but the viewers (usually two) like the same programs. The 13 remote controls are no problem: I simply use the Next Generation™ system which is a transmitter/battery inserted into each remote. They all communicate back to one set-top box, and it works great.
 
As for the audio, I hadn't thought about that yet, so thanks for bringing it up. If I'm understanding the problem correctly, perhaps I could use 8 audio extractors to derive 5.1 SPDIF audio from HDMI. Or one extractor and split the SPDIF. Wonder if there would be lip-sync issues?
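 
If I understand the failure mode, the risk is that the TV's video processing adds a few frames of latency that the extracted SPDIF path never sees, so the fix would be an extractor with a configurable audio delay. Back-of-envelope in Python (the frame count below is purely my guess, not a measurement):

# Rough lip-sync math: delay extracted audio to match the video
# path's processing latency (scalers/TVs often add a few frames).
assumed_video_delay_frames = 3   # my guess; measure your actual gear
fps = 60
audio_delay_ms = assumed_video_delay_frames / fps * 1000
print(f"delay extracted audio by ~{audio_delay_ms:.0f} ms")   # ~50 ms

So I'd look for an extractor with an adjustable delay rather than splitting one SPDIF feed blindly.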
 
One final question: In a 2160p source system with one 720p, three 1080p, and four 2160p 4K TVs on the same 1x8 splitter (with everything powered-on), if the 720p TV was powered-off, would the remaining 7 TVs now display 1080p? Likewise, if the three 1080p sets were also powered-off, would the remaining four 4K UHDTVs now display 2160p? If all the lower-end devices are off, they can't negotiate lower-rez with the source, right?
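 
In terms of the little merge sketch above, I'm asking whether this is what happens (assuming the splitter actually re-reads capabilities when a sink drops off, which I gather not every model does):

# Re-running the toy merge without the powered-off sets.
MODES = ["720p", "1080p", "2160p"]

def best_common(sinks):
    common = set(MODES)
    for s in sinks:
        common &= set(s)
    return max(common, key=MODES.index)

# 720p set powered off, 1080p sets still on:
print(best_common([["720p", "1080p"],
                   ["720p", "1080p", "2160p"]]))   # -> 1080p
# only the 4K sets left on:
print(best_common([["720p", "1080p", "2160p"]]))   # -> 2160p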
 
 
 
Well, it's only a problem when you don't know that HDMI sources send different resolutions/formats based on negotiation with the target devices.  No previous consumer AV tech did this (DVI perhaps, and VGA only sorta-kinda).
 
There's no "neglect" involved.  It's a rude awakening to some, for sure.  Trouble is, the HDMI splitter & amp folks don't want to educate anyone about this because it'll lose sales for them.  The TV folks don't either, as they'd rather have you buying new sets, not keeping old ones in the mix.  The source device folks, well, they're fighting off nonsense from the content providers; they've already got their hands full.
 
Again, by the time you wrestle with all the issues it's often a lot less trouble to use per-display devices instead of trying to use centralized distribution.  Don't say you weren't warned.  Especially when you get to doing things like streaming from web sources and controlling them via tablet/phone apps: Chromecast, other TV-stick devices, Fire TV, Roku, etc.
 
As for remotes, you're sure that will work in the actual spaces planned here?  Lots of those gizmos that attempt to 'bridge' remotes via RF fail horribly in real-world conditions.  Yeah, they're "supposed to" work but then again so would an HDMI splitter...
 
I don't want to seem like I'm throwing a wet blanket.  More like a cold bucket of ice water to the face.  The realities behind high-res sources/formats are kinda like that.
 
Ice water welcomed! I've been trying to get educated on this topic for three years, with little progress, until you answered my posts. And you confirmed my suspicions about why this kind of info is hard to come by.
 
I haven't ruled out the per-display devices solution. Might even go half and half: DirecTV Genie Minis on the 4Ks, and HDMI on the others.
 
Remotes... In my present central TV system I've used the Next Gen transmitters in 13 remotes for more than 11 years in a 3370 ft² house with no problems. The new house will be 2166 ft²... what could possibly go wrong? (Don't answer that!)
 
And if I keep the lower-rez TVs turned off, I'll be viewing 2160p on the UHDTVs; problem solved.
 
Thank you very much for your time and expertise.
 
I think residential matrix switches are going the way of the dinosaur, but JustAddPower offers a 1080p solution that includes a scaler. Best for commercial applications and large homes, it uses transmitters, receivers, and a managed network switch.
 
Yeah, the HD-over-IP stuff certainly holds promise, at least for 1080p sources.  I do wonder about the claims of "visually lossless" when it comes to theatrical material at 1080p.  There are just so many bits of nuance that get mangled when transcoded.  What kind of price point is the JustAddPower stuff aiming toward?
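 
To put a rough number on that skepticism (assuming they're pushing this over ordinary gigabit Ethernet, which I haven't verified):

# Raw 1080p60 vs. a gigabit link: compression is unavoidable.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24
raw_bps = width * height * fps * bits_per_pixel
print(f"raw 1080p60: {raw_bps / 1e9:.2f} Gbit/s")           # ~2.99 Gbit/s
print(f"minimum squeeze for 1 GbE: {raw_bps / 1e9:.1f}:1")  # ~3:1

So roughly 3:1 compression just to fit the pipe, before any network overhead, and likely more in practice.  "Visually lossless" is doing some work in that sentence.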
 
There's definitely been a 'moving target' for all this stuff.  
 
JAP is shooting for high end resi and all commercial. I think the ideal application is a sports bar.

Yeah, some video purists can see the difference between native and the JAP-transcoded 1080p, but I'm not one of them. I don't need to watch Archer and Criminal Minds in perfect video. Maybe I'll enjoy movies again someday after getting a larger display; I've been on a 42" for the last 8 years.
 
Neurorad said:
Yeah, some video purists can see the difference...
 
I hear ya. Trouble is, once you see an encoding glitch you always notice it thereafter.  I ran into that watching some stuff on a Samsung we had in the master bedroom vs. the Pioneer plasma in the family room.  The blotchiness of the decoder in the Samsung was truly ugly, given the exact same source content (Tivo recordings off cable & Netflix streams).  So I'd be a little concerned about how the HD-over-IP scheme might introduce similar glitches.  Although I suppose as long as you never watch the content through anything other than the transcoders, you might not notice it.  That, and the audio decoders did a terrible job with Netflix streams.  I disliked the wife's Downton Abbey already; hearing the audio mangled (tinny pops) did not help.
 
Although I'd imagine today's encoder tech is a lot less crude than a 10 year old entry-level TV.
 
I used to use an A/V matrix (not HDMI though) to distribute everything throughout the house - for aesthetics as much as saving money.
 
However, now with media streamers I see no need for anything except a local device hidden behind the TV to cover all but the most advanced needs. On all but my main watching area, I use simple Fire TV boxes. They give me Kodi (XBMC), which can stream all of my stored media in its native format, plus its massive list of video addons along with the numerous other native apps, such as Hulu, HBO, ESPN, Netflix, Crackle, etc.
 
It won't get you 4K, but do you really need it everywhere?
 
This is what I was getting at: just how much is "needed" vs. what's desired or assumed to be necessary?  Realizing, of course, that you're not going to be getting >1080p content from any of the traditional cable/OTA delivery systems.  So it's going to be streamed, typically via network.  Question is whether it's going to be downloaded to a local drive and played from there (aka 4K wait-to-watch) or buffered (and presumably compressed) via some kind of active Internet source.  The latter pretty much says use streaming boxes behind each TV.  The former gets a little muddier, especially with regard to genuine 4K content.  Gizmos like the Kaleidescape with their licensing schemes might make it tempting to use a centralized box and distribute from that out via HDMI.  Those boxes ain't cheap.
 
Stuff like the Fire TV, Chromecast and the like really do make a pretty nice alternative.  For "live TV", however, they're a bit weak.  For that I've been very happy to be using Tivo DVRs and their Tivo Mini units.  For TVs where I want multiple live tuners I have a Tivo DVR.  But for places where I only need one tuner the Tivo Mini is an excellent solution.  It shares a tuner with a Tivo DVR and uses that for viewing live signals.  It's drop-dead simple to use, such that guests and in-laws make use of it with no instructions at all.  Keep this in mind as you move toward solutions.  Things that are capable of "everything" are often too much of a pain in the ass to use on an on-going basis (aka XBMC/Kodi).
 