From: Jeremy "UncleHoot" Praay
Subject: HDMI cable confusion/paranoia
Date: 5 Mar 2010 11:16:56
Message: <4b912e78@news.povray.org>
I recently bought a new HDTV along with a Blu-Ray player. Prior to the
purchase, I spent a lot of time trying to understand the technology. One of
the things that struck me was the fact that Walmart's HDMI cables cost
between $30 and $40+ for a 6-foot cable. I can find other HDMI cables from
various reputable sources for less than $10. Heck, when I bought my new LCD
monitor last fall, it came with a "free" HDMI cable. I've used it and it
works great. Certainly, the manufacturer would not have included a $30
cable with the monitor. In fact, they probably wouldn't even include a $5
cable.
Since HDMI is a digital connection, any cable capable of handling a 1080p
connection (Blu-Ray) is good enough, right? Short answer: "Yes." Long
answer: "Maybe..." As the length of the cable grows, the quality of the
cable becomes more important. Gizmodo.com actually did a really good
write-up on the issue, eventually concluding that while way overpriced, the
"Monster" brand cables were actually higher quality, but again, so what? If
you have an 8-bit-color 1080p signal, any cable shorter than 10' should be
able to handle the job.
What's interesting is to read reviews of the Monster HDMI cables where they
sell a 3-foot HDMI cable for around $100. Some poor saps say it's well
worth the money because it's so much better than composite video. Well,
yeah, HDMI is much better, but you're still stupid for spending $100 on a
3-foot cable that you should be able to find for a few bucks. Others claim
that the $100 Monster cable gives them a much better picture than the other
(non-Monster) HDMI cables. Unless I'm missing something, that's like saying
your Internet looks a lot better since you switched from a Cat 5 ethernet
connection to Cat 6. In the analog realm, high-quality cables meant a lot.
In the digital realm, anything capable of handling the signal should be as
good as any other, although I've seen my share of really crappy-quality
ethernet cable as well. But they either work, or they don't. Sometimes you
may have partially working ethernet cables, but with HDMI, a bad cable
would become apparent very quickly, because you'd see a noticeably screwed
up picture on your TV, or no picture at all.
Another thing I hear mentioned is the higher bandwidth capacity of Monster
(and other) HDMI cables. Again, I'm a bit confused here. Some day, when
Blu-ray discs start using 12-bit color or perhaps higher resolutions, the
higher bandwidth might be necessary. But for now, what good is it? Isn't
that the same as running 100-megabit ethernet on a cable that can handle
gigabit ethernet? Yeah, it's nice for the future, but it's not going to
give you anything extra in the present. Am I wrong about this? This is the
part that confuses me the most, because I see a lot of reviewers claiming
that the higher bandwidth capacity actually works better for them. I
believe 1080p/60Hz/24-bit with audio is about 6 Gb/s, so why would a cable
capable of handling 15 Gb/s make anything any better? Isn't it still
transmitting at 6 Gb/s?
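For a rough sanity check, here's the back-of-the-envelope arithmetic in
Python (assuming the standard 148.5 MHz pixel clock for 1080p60 and HDMI's
10-bit TMDS symbols; it's a sketch, and it comes out a bit lower than my
guess above):

# Back-of-envelope HDMI bandwidth estimate for 1080p60, 24-bit color.
pixel_clock = 148.5e6   # Hz; standard 1080p60 pixel clock, blanking included
channels = 3            # TMDS data channels, one per color component
bits_per_symbol = 10    # TMDS sends each 8-bit value as a 10-bit symbol

tmds_rate = pixel_clock * channels * bits_per_symbol
print("TMDS line rate: %.2f Gb/s" % (tmds_rate / 1e9))      # ~4.46 Gb/s

# Raw video payload, ignoring blanking and encoding overhead:
payload = 1920 * 1080 * 24 * 60
print("Raw 1080p60 payload: %.2f Gb/s" % (payload / 1e9))   # ~2.99 Gb/s

Either way, the point stands: the cable only ever carries what the source
actually sends.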
Finally, 120Hz LCD. I suppose in the future, we may have a 120Hz signal
from a Blu-Ray (or whatever), but currently 60Hz is the max. So, you're
still transmitting a 60Hz signal into a 120Hz TV, so I don't see how a
better quality cable would help there either. In fact, I'm still not
convinced that 120Hz LCD makes any sense whatsoever. With CRTs, it made a
LOT of difference, because it substantially reduced flicker. LCD doesn't
have flicker. It has response times, and lowering the response time does
make a difference. I'm not convinced that a 5ms 60Hz screen would be any
different from a 5ms 120Hz screen, or even a 5ms 240Hz screen.
There it is. Let me know if I'm wrong. I can't believe that I could be
right about all of this stuff, as there seem to be so many people out there
who think I'm wrong.
On Fri, 05 Mar 2010 11:19:21 -0500, Jeremy "UncleHoot" Praay wrote:
> There it is. Let me know if I'm wrong. I can't believe that I could be
> right about all of this stuff, as there seem to be so many people out
> there who think I'm wrong.
One thing that I learned quickly was that if you start looking based on a
specification (1.2 vs 1.3 vs 1.3b), you're only looking at the connector
quality, not the actual cable quality.
I picked up a 50-foot cable from Firefold (whom I've been quite happy
with) for about $15, and it's generally worked well for me - though I did
have issues with losing HDMI sync when the system was cold (after it
warmed up for about 20 minutes, it was usually fine); that seems to be
resolved now. The things to look at are apparently the cable gauge and
the shielding, both of which matter.
Another thing to consider, though, is whether the equipment being used has
a signal booster in it. Our Harman/Kardon AVR-254 does signal boosting,
and that's probably why I can use a lower-grade cable with greater
success.
Jim
And lo On Fri, 05 Mar 2010 16:19:21 -0000, Jeremy "UncleHoot" Praay
<jer### [at] questsoftwarecmo> did spake thusly:
> I recently bought a new HDTV along with a Blu-Ray player. Prior to the
> purchase, I spent a lot of time trying to understand the technology. One
> of the things that struck me was the fact that Walmart's HDMI cables cost
> between $30 and $40+ for a 6-foot cable. I can find other HDMI cables
> from various reputable sources for less than $10. Heck, when I bought my
> new LCD monitor last fall, it came with a "free" HDMI cable. I've used it
> and it works great. Certainly, the manufacturer would not have included a
> $30 cable with the monitor. In fact, they probably wouldn't even include
> a $5 cable.
<snippety>
The latest Which? magazine had a quick segment where they sent a 6.2Gb-per-
second data stream through a £10, a £20, and a £100 HDMI cable of
unspecified length without a single error occurring.
--
Phil Cook
--
I once tried to be apathetic, but I just couldn't be bothered
http://flipc.blogspot.com
"Jeremy \"UncleHoot\" Praay" <jer### [at] questsoftwarecmo> wrote:
> Finally, 120Hz LCD. I suppose in the future, we may have a 120Hz signal
> from a Blu-Ray (or whatever), but currently 60Hz is the max. So, you're
> still transmitting a 60Hz signal into a 120Hz TV, so I don't see how a
> better quality cable would help there either. In fact, I'm still not
> convinced that 120Hz LCD makes any sense whatsoever. With CRTs, it made a
> LOT of difference, because it substantially reduced flicker. LCD doesn't
> have flicker. It has response times, and lowering the response time does
> make a difference. I'm not convinced that a 5ms 60Hz screen would be any
> different from a 5ms 120Hz screen, or even a 5ms 240Hz screen.
If a 120Hz LCD simply shows each frame of a 60Hz input twice, then there
ought to be no difference. However, if the LCD does some kind of
interpolation on every other output frame, then there might be some visible
difference. I don't know if they do, though.
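Purely as an illustration of the difference, a minimal sketch in Python
(the naive blend here is just a stand-in; I have no idea what, if anything,
real sets actually compute):

# Two ways to turn a 60 Hz frame sequence into 120 Hz.
def double_frames(frames):
    # Show each input frame twice: no new information is created.
    out = []
    for f in frames:
        out += [f, f]
    return out

def blend_frames(frames):
    # Insert a naive average between neighbors: a crude stand-in for
    # interpolating every other output frame.
    out = []
    for a, b in zip(frames, frames[1:]):
        mid = [(x + y) / 2 for x, y in zip(a, b)]
        out += [a, mid]
    out.append(frames[-1])
    return out

frames = [[0, 0], [10, 20], [20, 40]]   # toy two-pixel "frames"
print(double_frames(frames))            # each frame repeated
print(blend_frames(frames))             # synthesized in-between frames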
--
- Warp
On 3/5/2010 11:31 AM, Warp wrote:
> "Jeremy \"UncleHoot\" Praay"<jer### [at] questsoftwarecmo> wrote:
>> Finally, 120Hz LCD. I suppose in the future, we may have a 120Hz signal
>> from a Blu-Ray (or whatever), but currently 60Hz is the max. So, you're
>> still transmitting a 60Hz signal into a 120Hz TV, so I don't see how a
>> better quality cable would help there either. In fact, I'm still not
>> convinced that 120Hz LCD makes any sense whatsoever. With CRT's, it made a
>> LOT of difference, because it substantially reduced flicker. LCD doesn't
>> have flicker. It has response times, and lowering the response times does
>> make a difference. I'm not convinced that a 5ms 60Hz screen would be any
>> different from a 5ms 120Hz screen, or even a 5ms 240Hz screen.
>
> If a 120Hz LCD simply shows each frame of a 60Hz input twice, then there
> ought to be no difference. However, if the LCD does some kind of
> interpolation on every other output frame, then there might be some
> visible difference. I don't know if they do, though.
>
Both of you are assuming that Blu-Ray is the issue, too. Game systems, I
think, can output in native 120Hz, and computers certainly can, and *must*
in some cases if you want decent video. If you are running 3D shutter
glasses through them, then it's mandatory, etc. And that looks like
something coming up soon with Blu-Ray. Also, you are assuming that Blu-Ray
isn't using the full Hz, if available, which might be true of the cheaper
units, but... I wouldn't bet on it a) being that way with all of them, or
b) staying that way.
--
void main () {
    if version = "Vista" {
        call slow_by_half();
        call DRM_everything();
        call functional_code();
    }
    else
        call crash_windows();
}
Get 3D Models, 3D Content, and 3D Software at DAZ3D!
http://www.daz3d.com/index.php?refid=16130551
Jeremy "UncleHoot" Praay wrote:
> Since HDMI is a digital connection, any cable capable of handling a 1080p
> connection (Blu-Ray) is good enough, right?
You want few errors. How many errors you actually get depends on your
environment and your cable length.
> In the digital realm, anything capable of handling the signal
> should be as good as any other,
Remember that cables don't carry digital signals. They carry analog signals
that your equipment interprets as digital values. The cable must not
degrade the signal so much that *your* end equipment can no longer recover
the digital values. You can't hook the cable up to laboratory-quality test
equipment and say "Yep, no errors."
Which is not to say cheap cables aren't sufficient. It's merely to say that
cheap cables aren't necessarily sufficient just because it's digital.
It's the same reason you can only get ADSL service at certain distances from
the telco CO. It's an analog signal. It's only digital once it gets inside
your computer.
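A toy model of the point in Python (a minimal sketch with made-up
attenuation and noise figures; real TMDS signaling is differential and far
more involved):

import random

# A cable carries analog voltages; the receiver thresholds them back into
# bits. More attenuation and noise means more bit errors at *that* receiver.
def send(bits, attenuation, noise_sigma, threshold=0.5):
    errors = 0
    for b in bits:
        v = (1.0 if b else 0.0) * attenuation   # degraded signal level
        v += random.gauss(0, noise_sigma)       # interference picked up en route
        if (v > threshold) != bool(b):
            errors += 1
    return errors

random.seed(1)
bits = [random.randint(0, 1) for _ in range(100000)]
print("good cable:", send(bits, attenuation=0.9, noise_sigma=0.10), "errors")
print("bad cable: ", send(bits, attenuation=0.75, noise_sigma=0.12), "errors")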
> Finally, 120Hz LCD. I suppose in the future, we may have a 120Hz signal
> from a Blu-Ray (or whatever), but currently 60Hz is the max. So, you're
> still transmitting a 60Hz signal into a 120Hz TV, so I don't see how a
> better quality cable would help there either.
It wouldn't, unless you're actually getting bit errors.
> In fact, I'm still not
> convinced that 120Hz LCD makes any sense whatsoever.
Understand that there's a processor inside the TV that's interpolating
frames. It's not just painting the same frame twice. It's actually drawing
something different every frame, based on the motion vectors in the image.
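A minimal sketch in Python of what "based on the motion vectors" could mean
(block matching on toy one-dimensional frames; actual sets use far more
elaborate motion estimation):

# Estimate how far the picture shifted between two frames, then render the
# in-between frame half a shift along, instead of repeating or blending.
def estimate_shift(a, b, max_shift=3):
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err = sum((a[i] - b[i + s]) ** 2
                  for i in range(max(0, -s), min(len(a), len(a) - s)))
        if err < best_err:
            best, best_err = s, err
    return best

def midpoint_frame(a, b):
    half = estimate_shift(a, b) // 2
    # Shift frame a halfway toward b (edges clamped).
    return [a[min(max(i - half, 0), len(a) - 1)] for i in range(len(a))]

a = [0, 1, 9, 1, 0, 0, 0, 0]   # bright spot centered at index 2
b = [0, 0, 0, 1, 9, 1, 0, 0]   # it has moved to index 4
print(midpoint_frame(a, b))    # the synthesized frame puts it near index 3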
--
Darren New, San Diego CA, USA (PST)
The question in today's corporate environment is not
so much "what color is your parachute?" as it is
"what color is your nose?"
"Patrick Elliott" <sel### [at] npgcablecom> wrote in message
news:4b91529b$1@news.povray.org...
> Both of you are assuming that Blu-Ray is the issue, too.
Woops.
> Game systems, I think, can output in native 120Hz, and computers certainly
> can, and *must* in some cases if you want decent video.
120Hz or 120fps? Sometimes the two terms are used interchangeably, but not
always. It's true that getting 120fps on a 60Hz monitor won't look as
smooth as on a 120Hz monitor. I don't think any console systems on the
market today output at 120Hz, but I could be wrong.
> If you are running 3D shutter glasses through them, then it's mandatory,
> etc. And that looks like something coming up soon with Blu-Ray.
That would be awesome.
> Also, you are assuming that Blu-Ray isn't using the full Hz, if
> available, which might be true of the cheaper units, but... I wouldn't
> bet on it a) being that way with all of them, or b) staying that way.
My Blu-Ray player has the capability to output 12-bit color. Last I
checked, there were no Blu-Ray discs that contain 12-bit color. My TV is
8-bit. So, while some of the technology is there, it's definitely not all
there yet. It probably will be in the next couple of years, but it's not
going to enhance my viewing experience. Likewise, I'm pretty sure that no
Blu-ray discs are being produced at 120Hz, nor is that likely to happen over
the next several years. Heck, if Hollywood would start filming at a mere
60fps, it would be a huge improvement.
One of the things that most HDTVs do is anti-judder processing, which
synchronizes a 24fps signal with a 60Hz monitor. I've read that the 120Hz
TVs can really smooth out a 24fps signal, and it looks a lot like videotape
(in smoothness, not quality). That's neat if you want your movies to look
like videotape. But I wanted more of an authentic feel, so I went with the
Sony TV, which shows a 24fps Blu-ray movie at, you guessed it, 24Hz. This
seems like a simple solution to a simple problem.
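The cadence arithmetic behind that is easy to see in Python (a sketch; the
uneven 3:2 pattern for 24fps on 60Hz is the standard pulldown, the rest is
just division):

# How many display refreshes each film frame occupies, frame by frame.
def cadence(fps, refresh_hz, frames=4):
    repeats, acc = [], 0.0
    per_frame = refresh_hz / fps
    for _ in range(frames):
        acc += per_frame
        repeats.append(int(acc + 0.5) - sum(repeats))
    return repeats

print(cadence(24, 60))    # [3, 2, 3, 2] -> uneven hold times, i.e. judder
print(cadence(24, 120))   # [5, 5, 5, 5] -> even, hence the smoothing
print(cadence(24, 24))    # [1, 1, 1, 1] -> what the Sony does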
"Darren New" <dne### [at] sanrrcom> wrote in message
news:4b9154a5$1@news.povray.org...
>
> Remember that cables don't carry digital signals. They carry analog
> signals that your equipment interprets as digital values. The cable must
> not degrade the signal so much that *your* end equipment can no longer
> recover the digital values. You can't hook the cable up to
> laboratory-quality test equipment and say "Yep, no errors."
>
> Which is not to say cheap cables aren't sufficient. It's merely to say
> that cheap cables aren't necessarily sufficient just because it's digital.
>
> It's the same reason you can only get ADSL service at certain distances
> from the telco CO. It's an analog signal. It's only digital once it gets
> inside your computer.
Yes, but if the bits start out as 101 and end up as 101 on both the
high-quality cable and the low-quality cable (every time), then even though
the high-quality cable is transmitting the analog signal better, it really
makes no difference, as both ends interpret the digital portion the same
way. I think we're both on the same page here.
> Understand that there's a processor inside the TV that's interpolating
> frames. It's not just painting the same frame twice. It's actually drawing
> something different every frame, based on the motion vectors in the image.
It doesn't just paint the same frame? Hmmmm... If so, then it's similar to
the anti-judder processing (or an extension of it). That's more
interesting, now.
Jeremy "UncleHoot" Praay wrote:
> Yes, but if the bits start out as 101 and end up as 101 on both the
> high-quality cable and the low-quality cable (every time),
Right. I'm just saying that the equipment at each end determines how good a
cable you need. You can't test a cable on lab equipment and say "we got no
errors, so you won't either." It's like saying "we put X brand gasoline in
our car and got 40MPG, so you will too."
I think we understand each other. :-)
> It doesn't just paint the same frame?
That's what they *say*. Whether they're just full of it, I couldn't say.
The demos make it look better, but then they would.
--
Darren New, San Diego CA, USA (PST)
The question in today's corporate environment is not
so much "what color is your parachute?" as it is
"what color is your nose?"
"Jeremy \"UncleHoot\" Praay" <jer### [at] questsoftwarecmo> wrote:
> Yes, but if the bits start out as 101 and end up as 101 on both the
> high-quality cable and the low-quality cable (every time), then even
> though the high-quality cable is transmitting the analog signal better, it
> really makes no difference, as both ends interpret the digital portion the
> same way. I think we're both on the same page here.
Maybe if there's nearby interference (not completely unthinkable, given how
much electronics there is in today's homes), the likelihood of the bits
getting garbled in transport is lessened with a higher-quality cable.
--
- Warp