Point, set and match!
And those people will continue to believe that the Emperor is wearing shiny new clothes... no matter how much you argue the technical merits of such claims. That's why I'm surprised that this thread has gone on for so long now. It isn't even like the speaker cable or interconnect debates where there are factors that obviously come into play and there is provable science behind it... HDMI is designed to transmit packetized data that either makes it there or doesn't.
Kudos to Habanero Monk for at least TRYING to offer an in-person comparison, despite the nitpicking that seems to have surrounded it. Not that it would have changed anyone's minds one way or the other.
So those characteristics, or analog thinking as you say, aren't possible in HDMI? How so? Transmission of data is one thing, how it sounds is another. Certainly we can agree that metallurgy alone, in its various forms, can complete the data transmission and sound somewhat different doing it... no?
The same reason all CD players don't sound the same. A laser is picking up the same data, but other aspects of that player will alter the final sound. Same with cables... the design, the metallurgy used, the connectors, all alter the sound to some degree, good or bad.
another thread that is about to get the AXE
Can you have a cable that is superior in construction? Absolutely. Does that alter the sound, presuming both the cheaper and the more expensive cable transmit the data packets to the sink end? NO. All better quality gets you is the potential for a longer run of cable, or slightly less work for the TMDS decoding at the sink end since it will have to fall back on redundancies less often. However, such redundancies do NOT have an effect on the quality of the sound - the packet either arrives or it doesn't. TMDS packeting rules out the jitter-related issues we saw with PCM over TOSlink. Therefore, all you have is 1) the quality of the source up to the HDMI port's encoder, and 2) the quality of the decoder on the sink end.

Now, if you were changing cabling beyond the decoder at the sink end, then YES, that could create an audible difference based on metallurgy, etc., because then you're back to an analog stage. See previous statements about speaker cables and interconnects. But from HDMI cable to HDMI cable, the sound from the source cannot logically change in the ways people claim they are hearing. The data either arrives or it doesn't, the same way it does on data cables for hard drives. It's a completely different paradigm than analog cables, because HDMI is a digital cable using digital transmission methods, not things like phase, voltage, resistance, RF interference, etc. that matter in the analog world.

The same goes for video, where people are claiming certain cables improved things like saturation or contrast... whereas that's quite impossible, since unlike analog, those signals aren't being reconstructed based on analog cables. They're being reconstructed at the display side from packetized digital data that tells the display's decoder what color combination goes where. An RGB value doesn't magically change because you got a better cable - the display still sees the same RGB value and its decoder displays that value on screen when called for.

When those packets don't arrive intact, it's a block of values that gets dropped, which is why, if enough packets are dropped (as happens when a cable is at the threshold of failure), you get visible artifacts that tell you the display can't decode the picture. This is usually exhibited by a screen with an overall pink tone or "snow" as the display tries to reconstruct the packets predictively to avoid complete failure (which is a feature of the HDMI chipset), or by the picture not displaying at all or dropping in and out as the transmission of the signal across the cable drops beneath the threshold. Pretty sure we've posted signal charts that show this phenomenon occurring.
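To put the "the packet arrives or it doesn't" point in concrete terms, here's a toy sketch of the logic at the sink end. The names and the checksum flag are made up for illustration; this is not the actual TMDS/HDMI silicon, just the either/or behaviour being described above:

# Toy model of "bit-perfect or dropped" - an illustration of the logic,
# not the real HDMI/TMDS implementation. Names are hypothetical.

def decode_pixel_packet(payload: bytes, checksum_ok: bool):
    """A digital sink either recovers the exact bytes the source encoded,
    or it flags the block as bad. There is no 'slightly warmer' in-between."""
    if not checksum_ok:
        return None  # dropped block -> visible sparkles/"snow", not a subtle tonal shift
    r, g, b = payload[0], payload[1], payload[2]
    return (r, g, b)

packet = bytes([200, 64, 32])

# The same intact packet decodes to the same RGB value no matter which
# cable carried it:
assert decode_pixel_packet(packet, checksum_ok=True) == (200, 64, 32)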
Can things like RF interference affect that signal? Absolutely. That's why HDMI carries its data on differential, phase-inverted pairs, much the way balanced audio cables do - to minimize the effects of RF interference. But the point is that analog effects on that digital signal don't exhibit themselves the way they do on strictly analog cables, since the square wave is just a means to transmit the data packets themselves - NOT THE AUDIO OR VIDEO via analog means. The packets arrive... or they don't. If they don't, you get a failure in decoding, which, according to the people who designed HDMI, can ONLY exhibit itself in the ways I've previously outlined.
Now, if the data error is unrecoverable, you will get a large dropout and the end user will certainly notice. Hence the block errors an HDMI display shows when a problem happens. But when a problem ISN'T happening, one HDMI cable isn't going to outdo another properly functioning HDMI cable.
That's where you are wrong, Monk. Metallurgy does make a difference in sound, otherwise everyone would use the same cable regardless. The old 1's and 0's argument has been proven false time and time again; you're a tad behind the curve on this one, my friend.
I'm not setting out to change anyone's mind on this; believe what you will, no skin off my nose. However, if you seek further enjoyment and wonder what a good quality cable can bring to the table, let your own ears/eyes be the judge - not me, and certainly not anyone else here.
The science and theory that some of you rely solely on has, over the years, often been proven wrong - sometimes by the same people who proclaimed things to be a certain way to begin with. One should never put all their eggs in one basket.
With two HDMI cables, as long as the 1's and 0's get there the same way at the same time, you get the same output.
I'm interested in any peer reviewed and published data that you have to present. So far Kuntasensei is the only one to present anything valid, and I'm the only one that has put out a more than friendly invite for people to put their conjecture to the test.
CAT5e as an example: I'm going to get the same data rates from one properly terminated length of cable as I will from any other properly terminated cable of the same length.
I'm telling you, if I put you in front of a test bed and have you A/B on a remote between mirrored outputs on an IPS display, you are going to fail miserably. You CAN NOT do it reliably. I guarantee it.
Any truth to what this guy says?
"HDMI is an abortion of an interface that was crammed down our throats by Sony and Hollywood. Silicon Image was the party that made it all possible.
The idea by Sony was to have the audio and the video both on the same cable, to avoid confusing the schmucks who buy their Sony TV sets at Best Buy and can't figure out how to connect it. Hollywood demanded "content protection", and it was decided that HDCP as developed by Intel would suffice. Silicon Image was determined to develop the silicon chips so that they could cash in on the cash cow.
Of the many problems associated with HDMI, the audio quality is totally handicapped for lack of -- a pin! They designed the connector before they finished designing the system. They didn't have enough pins to also have a master audio clock.
So with HDMI, the audio clock is derived from the video clock. For high-def TV, the video clock runs at either 74.25 MHz or 74.25 * (1000/1001) [thank you NTSC!!!]. The audio clock runs at multiples of 48 kHz. Of course, these are not related. So the receiver has a PLL to regenerate an audio clock based on instructions from the transmitter (source) telling it what to do.
The result is the worst jitter of any system yet invented. It truly sucks.
Much later, they added a thing called Audio Rate Control in HDMI 1.3a. This puts a buffer and the master audio clock in the receiver. Then commands are sent upstream on the CEC line telling the player to speed up and slow down as necessary to keep the buffer full.
The only people to use this are Sony (HATS) and Pioneer (PQLS), but both use proprietary implementations that prevent use with other equipment.
And the fee for using this pile of steaming dog dung? $30,000 per year in licensing fees. It's a beautiful world, no?"
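For what it's worth, the clock-derivation complaint in that quote lines up with how HDMI Audio Clock Regeneration is usually described: the source sends N and CTS values, and the sink rebuilds the audio clock from the video clock through a PLL. A rough back-of-the-envelope sketch - the N/CTS numbers below are the commonly cited pair for 48 kHz audio on a 74.25 MHz link, so treat them as an example rather than a spec citation:

# Sketch of the HDMI Audio Clock Regeneration relationship: 128 * fs = f_TMDS * N / CTS.
# Illustrative values for 48 kHz audio on a 74.25 MHz TMDS clock.

TMDS_CLOCK_HZ = 74_250_000   # 1080i/720p pixel clock
N, CTS = 6144, 74250         # regeneration parameters chosen by the source

regenerated_128fs = TMDS_CLOCK_HZ * N / CTS   # what the sink's PLL targets
audio_sample_rate = regenerated_128fs / 128

print(audio_sample_rate)  # 48000.0 - the audio clock is reconstructed, not carried directly

Whether that reconstruction "truly sucks" jitter-wise is a property of the sink's PLL and the source's N/CTS choices, not of which cable sits between them.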
Even component R/G/B was capable of 1080i. Good ole VGA, which is ~25 years old, is still capable of surpassing 1080p.
The minimal buffering isn't a problem if there is never a buffer underrun. The whole 'if a tree falls in a forest' thing.
And as usual, HDCP and ICT have all been hacked. HDCP has caused many a nightmare getting a DVD/BR player - receiver - TV chain to handshake properly.
The cable is minimal in all of this, however, since most of the issues are software. Hardware is typically the easier part. There is a reason Apple has DisplayPort and Thunderbolt.
The real solution has been out there for ages, however: CAT5e cabling. Even a 100 Mbit connection can do 1080p w/o breaking a sweat. Check out HDBaseT: http://www.hdbaset.org/
100 meter distance, and I'm sure AudioQuest would then come out with a $3000 100 meter run of the stuff and another 7-page debate would debut.
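For anyone who wants the back-of-the-envelope numbers on the bandwidth side, the arithmetic below is plain math, nothing vendor-specific. As I understand it, this is also why HDBaseT uses its own high-rate signaling over Cat5e/6 rather than ordinary 100 Mbit Ethernet framing:

# Back-of-the-envelope arithmetic for uncompressed 1080p60 video (8-bit RGB).
# Raw payload only; blanking intervals and encoding overhead push the real link rate higher.

width, height = 1920, 1080
frames_per_second = 60
bits_per_pixel = 24          # 8 bits each for R, G, B

raw_bits_per_second = width * height * frames_per_second * bits_per_pixel
print(raw_bits_per_second / 1e9)   # ~2.99 Gbit/s

# Compressed 1080p streams fit comfortably under 100 Mbit, but uncompressed
# HDMI-style video needs roughly 30x that - the gap HDBaseT's signaling covers.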
Just a little something from a Whathifi'er about his beliefs. The difference here, Monk, is he at least tried them. Not saying everyone will hear or see a difference, but most who do don't go back to Monoprice HDMI cables, for a reason.
I was wrong about HDMI cables!
For years I have argued how the cheap HDMI cable was no different to an expensive version.
However, today at the Bristol show I purchased 3 Chord Active Silver cables and they totally blew me away. For Blu-ray playback, or for anything where data is being read actively from a disc, there is a massive boost in sound quality: the sound was sharper, crisper and punchier. The picture was also warmer, deeper in colour and rich. I am still reeling from the shock; I fully expected there to be no difference, and to a certain degree I was right... Using it with Sky, no difference; PC, no difference. However, Blu-ray yes, music yes and even gaming.
My wife walked in and said "the tv looks better and sounds better". She was unaware of the test I was conducting, so that was unbiased from a non audiophile/videophile.
So, 8 pages of back and forth - what's the point? Who wants to make a closing statement on HDMI cables?
I want 10baseT to be the new standard. Then we can all argue about Ethernet cables. Cat 7 is the new standard. Screw HDMI
Monoprice HDMI cables have worked great for me too!
If you read up on what actually distinguishes HDMI cables (e.g. "5 Differences in HDMI Cables"), then you will quickly see that a $5 High Speed HDMI cable from any brand you like is probably the best choice. I am happy with my Amazon cables for more than 3 years now and have never had any issue with them. That's a pretty unemotional judgement :loneranger:
Crap systems and crap HDMI cables go hand in hand.
You guys do KNOW that there is a HISTORY to the above debate on this site? Go back a few years and see WHO believed WHAT about HDMI cables and WHAT they NOW believe. It is an eye-opener, to say the least. Suffice it to say that HDMI cables drifted into the analog debates over audio cables in "recent" years. And once that happened, all the old cable debates reproduced themselves. The above reprises those debates, which are perennial in the audiophile world!
I say, "enjoy" your cables!
Greeks....what can they possibly know about history. :razz::cheesygrin::cheesygrin:
The Integra DTR-60.5 HDBaseT Receiver preview at AH.
328 feet over Cat5/6 and 100 watts of Power over Ethernet (PoE). I hope this standard gains traction and replaces HDMI. Oh, to have a locking connector.
I'm using a red Monoprice 3 ft cable for my AVR-to-plasma HDMI connection, and am also using 4 cheap $15 HDMI cables for my DVR, Blu-ray player, PS3 and Apple TV. They have been working fine ever since.
I haven't noticed any degradation of picture, sound or anything else perceivable.
Maybe I'll try those AudioQuest or Monster cables and do a double-blind test. Someday.
Across all these "cable debates", I'm always amazed that there appears to be very little (if any) double-blind assessment. If I were a cable manufacturer, I'd think it was a nice marketing advantage to have data describing my claims.
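Agreed - and for anyone who does want to run a blind comparison at home, the statistics are simple enough to sanity-check yourself. Here's a quick sketch using plain binomial math (no special ABX software assumed) of how likely a given score is if the listener is actually just guessing:

from math import comb

def guessing_probability(correct: int, trials: int) -> float:
    """Chance of scoring at least `correct` out of `trials` ABX trials
    by pure coin-flip guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 12 right out of 16 trials happens by luck only ~3.8% of the time,
# while 9 out of 16 happens ~40% of the time - i.e. it proves nothing.
print(guessing_probability(12, 16))   # ~0.038
print(guessing_probability(9, 16))    # ~0.40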