OK, whatever you want; then Kassa, the Consumentenbond, and the rest of the techies all have it wrong.
For anyone who is interested, here's an explanation anyway:
Since this question gets asked like 15 times a day, and I usually end up responding to it, I'll make a general post... Sure would be nice to have it stickied, but since that won't happen, at least highlight it and keep the URL so you yourself will have an easy time "replying" to the onslaught of questions...
I originally wrote this as a reply to a post, but thought it made more sense standing on its own... So here goes...
"Question: Is there any difference between a cheap (i.e. $10 HDMI cable) and an expensive (i.e. $150 HDMI cable)???"
I have an EE degree. I work as a broadcast engineer. I live and breathe digital and analog signals every day. So yes, you could say I'm qualified to give the answer to this question...
That answer is: "No, an expensive HDMI cable will make NO difference in the quality of your picture OR sound."
I'll give you the more complex reason first, then an analogy... Hopefully one will make sense... If you don't want all the real technical stuff, just skip down to B for a real simple explanation...
A) Wires send electrical signals... Plain and simple. Anything sent over a wire is ultimately just a voltage/current applied to that cable. Let's say we're talking about an analog video signal that's 1 volt peak to peak... In other words, measuring from the LOWEST voltage to the HIGHEST voltage will give a result of 1 volt... With an analog signal you have "slices" of time that are "lines" of signal... It's too complex to go into here, but basically you have a "front porch", known as the "setup"... This is what helps your tv "lock onto" the signal and sets its "black level". After that you've got each line of the image (455 half cycles per line). Again I won't go into how chrominance (color information) and luminance (picture or brightness information) are combined, separated, etc.. It's too complex for this discussion, but regardless, just know that following that porch you've got all the lines of the picture (and some that don't show up on the picture... these carry closed captioning, test signals, etc...). All of these "lines" of information, when you look at them on a scope, look like this...
That waveform is all of that information in analog form... In other words, if you look at one VERY SMALL timeslice of that waveform, the EXACT position of the waveform (i.e. what voltage is present) represents what information is at that position...
Because of this, it's VERY EASY for other radiated signals to get "mixed in" with that information. When this happens, the more "noise" you get mixed into the signal, the more degraded the picture will be... You'll start to get snow, lines, weird colors, etc... Because "information" is getting into the waveform that doesn't belong there...
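If it helps, here's a tiny Python sketch of that idea... The numbers are completely made up (a handful of "brightness" samples and a little random noise, nothing like real video timing), but it shows the problem: with analog, the exact voltage IS the information, so any noise the cable picks up lands straight in the picture...

import random

# Hypothetical analog samples: the voltage itself is the brightness information.
signal = [0.10, 0.35, 0.62, 0.90, 0.50]                  # made-up levels, in volts

# Interference picked up along the cable (made-up +/- 50 mV of random noise).
noise = [random.uniform(-0.05, 0.05) for _ in signal]

# What actually arrives at the TV is signal plus noise, inseparably mixed.
received = [s + n for s, n in zip(signal, noise)]

for sent, got in zip(signal, received):
    print(f"sent {sent:.3f} V -> received {got:.3f} V")

Every stray millivolt shows up in the output, because the TV has no way of knowing which part of the received voltage was picture and which part was noise...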
With digital however (i.e. the signal sent over an HDMI cable), the information is encoded differently... At its lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON... See on the right side here, the "square wave" pattern?
That's what a digital signal looks like... For each "slice" of the signal, the "bit" is either on (if the signal is high) or off (if it's low)...
Because of that, even if you mix some noise, or even a LOT of noise into the signal, the bit will STILL be on or off... It doesn't matter...
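Here's the same kind of Python sketch for the digital case... Again the levels are made up (0 V / 5 V and a halfway threshold; real HDMI uses its own signaling, but the principle is identical): the receiver only asks "is this sample high or low?", so the noise simply gets thrown away...

import random

bits = [1, 0, 1, 1, 0, 0, 1, 0]        # the information we actually care about
HIGH, THRESHOLD = 5.0, 2.5             # made-up "on" voltage and decision point

# Send each bit as 0 V or 5 V, then add a generous amount of random noise.
sent = [HIGH * b for b in bits]
received = [v + random.uniform(-1.5, 1.5) for v in sent]

# The receiver doesn't care about the exact voltage, only high vs. low.
decoded = [1 if v > THRESHOLD else 0 for v in received]

print("sent:   ", bits)
print("decoded:", decoded)             # identical to what was sent, noise and all

Whether a sample comes in at 4.323 volts or 4.927 volts, it decodes to exactly the same bit, so the recovered stream is bit-for-bit what the source transmitted...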
Now, for a slightly easier to understand analogy...
B) Think of it this way... Let's say you have a ladder with 200 steps on it... An "analog" signal represents information by WHICH step the person is on at a certain time. As you move further and further away (get "noise" or interference in the signal), it's very easy to start making mistakes... For example, if the person is on the 101st step, you might say he's on the 102nd, or as you get further away, you might start making more and more mistakes... At some point you won't know if the person is on the 13th step or the 50th step....
NOW... In a digital signal, we don't care if he's on the 13th or 14th or 15th step... All we care about is whether he's at the TOP or the BOTTOM... So now, as we back you up further and further (introduce more noise), you might have no idea what STEP he's on, but you'll STILL be able to tell if he's a "1" or a "0"...
THIS is why digital signals aren't affected by cheaper cables, etc... Now eventually if you keep moving further and further back, there may come a point where you can no longer tell if he's up or down... But the good news is, digital signals don't "guess"... If they SEE the signal, they work... If they DON'T, they DON'T.. LOL
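If you want to see that "works or it doesn't" behavior in numbers, here's one more Python sketch using the same made-up 0 V / 5 V levels: it cranks the noise up step by step and counts how many bits come out wrong...

import random

HIGH, THRESHOLD, N_BITS = 5.0, 2.5, 10_000

for noise_amp in (0.5, 1.0, 2.0, 2.4, 2.6, 3.0):
    errors = 0
    for _ in range(N_BITS):
        bit = random.randint(0, 1)
        # Received voltage = ideal level plus random noise of +/- noise_amp volts.
        rx = HIGH * bit + random.uniform(-noise_amp, noise_amp)
        if (1 if rx > THRESHOLD else 0) != bit:
            errors += 1
    print(f"noise +/- {noise_amp:.1f} V -> {errors} bad bits out of {N_BITS}")

The error count stays at zero right up until the noise gets close to half the signal swing, and then the link falls off a cliff... There's no in-between where a pricier cable buys you a "slightly nicer" picture.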
So if anyone ever tells you they can "see the difference" between HDMI cables, etc... You can knowingly laugh to yourself and think about how much money the poor soul wasted on something that was pointless.
Now, I've seen others say that they make a difference in audio... ALL audio carried over HDMI is STILL in digital format... So again, since it's a digital signal, it will not make ANY difference at all....
I've also seen various posts in regards to things like "Make sure you get a v1.3 cable"... The various HDMI versions determine the capabilities of the DEVICES on either end of that cable (most of the HDMI versions (other than 1.0 to 1.1) have to do with AUDIO and how many channels / what type of audio are carried...). Because of this, the cable itself is NO DIFFERENT... It's just marketing that some companies charge more for a "v1.3" cable than for a "v1.1" cable, etc... The cables themselves will work now and WELL into the future for any other HDMI versions that come along the way....
So there you have it... Hopefully it's clear enough to understand and hopefully it will help prevent a few posts...