Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> That sucks. I don't really want to buy an HDTV because I can still see
>> all the digital compression artifacts after working on image compression
>> code for a few years.
>
> Finland switched to digital TV broadcasting recently (not necessarily
> a good thing IMO), but I haven't bothered buying any kind of receiver
> (too expensive, especially for the cable version).
>
> I recently was at a home where they were watching TV and I got to watch
> it from really close. I was shocked at the bad quality of the image! It was
> full of mpeg compression artifacts. It was almost like watching a youtube
> video in full-screen.
>
> It seems that digital TV broadcasts use less than half the bitrate of
> a typical DVD, and you can sometimes spot mpeg artifacts even in DVD movies.
> No wonder the image quality of the digital TV was so bad.
>
> I clearly remember how they advertised digital TV as increasing the
> image quality compared to analog broadcasts. BS.
>
The signal-to-noise ratio really is much better for digital, simply because
the artifacts are part of the broadcast signal: if you receive them
correctly, the transmission is within specification. What you would have
preferred is a better signal-to-disturbance ratio, where the signal is the
original uncompressed image and the disturbance is anything in the received
image that differs from the original, whether channel noise or compression
artifact. These are two entirely different concepts, and you can't blame an
advertiser for choosing the one that suits the paying company best.
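To make the distinction concrete, here is a minimal sketch (all names and
numbers are my own invention, not from the thread): coarse quantization
stands in for MPEG compression, and a tiny additive noise stands in for the
digital channel. Measured against what was actually broadcast, the SNR is
excellent; measured against the original, the ratio is far worse, because
the compression artifacts count as disturbance.

```python
import math
import random

def power(xs):
    # Mean squared value of a signal.
    return sum(x * x for x in xs) / len(xs)

def snr_db(reference, received):
    # Ratio of reference power to the power of the deviation
    # from that reference, in decibels.
    noise = [r - x for x, r in zip(reference, received)]
    return 10 * math.log10(power(reference) / power(noise))

random.seed(0)

# "original" is the uncompressed signal the studio produced.
original = [math.sin(0.1 * i) for i in range(1000)]

# Coarse quantization stands in for compression artifacts:
# this is what actually gets broadcast.
step = 0.25
broadcast = [round(x / step) * step for x in original]

# Digital transmission adds only tiny channel noise.
received = [x + random.gauss(0, 0.001) for x in broadcast]

# Against the broadcast signal, reception looks nearly perfect...
print(snr_db(broadcast, received))  # high: the advertised figure
# ...against the original, the artifacts dominate the disturbance.
print(snr_db(original, received))   # much lower
```

Both numbers are honest measurements; they just answer different
questions, which is exactly the point above.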