> So how come BluRay disks are 5x the storage capacity, yet still have
> roughly the same run time? I thought it was because HD video requires
> more space to store.
On DVD you'd be limited to well under 2 hours if you used the maximum
bitrate (around 10 MBit/s IIRC), so you're forced to use lower bitrates
if you want to include menus and various other extras on the disc. With
BluRay, even at an average of 30 MBit/s (which is extremely high quality
with h264 and likely never actually needed; in practice it's usually
around half that), you get about 4 hours of run time. That's why a
single disc can hold two versions of an entire film (eg the 2D and 3D
versions) plus all the extras, which would be impossible on DVD.
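The arithmetic behind those run times is just capacity divided by bitrate. A rough sketch (assuming a dual-layer 8.5 GB DVD and a dual-layer 50 GB BluRay, decimal gigabytes):

```python
def runtime_hours(capacity_gb, bitrate_mbps):
    """Hours of video that fit at a constant average bitrate."""
    total_bits = capacity_gb * 1e9 * 8        # decimal GB -> bits
    return total_bits / (bitrate_mbps * 1e6) / 3600

print(runtime_hours(8.5, 10))   # dual-layer DVD at its ~10 MBit/s ceiling: ~1.9 h
print(runtime_hours(50, 30))    # dual-layer BluRay at 30 MBit/s: ~3.7 h
```

This ignores audio tracks and filesystem overhead, so real figures come out a bit lower, but it shows why DVD can't do high bitrates for a full-length film while BluRay can.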
> I thought it was more or less the case that *all* codecs work by
> transforming the input, deciding how "important" each signal component
> is, and then keeping only the most important bits, according to what the
> requested bitrate was. I don't see anything there that makes a higher or
> lower bitrate change the amount of compute power required.
The whole point of video compression is finding patterns from frame to
frame to reduce the information needed to reconstruct the video
(otherwise you'd just have a series of JPEG images). The longer and more
thoroughly you search for such patterns, the lower the bitrate you can
achieve for a given level of quality, or equivalently the higher the
quality for a given bitrate. That search is where the extra compute
power goes: a slower, more exhaustive encode spends its time looking for
better matches, not on the transform-and-quantize step.
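To make the "searching for patterns" concrete, here's a toy version of what encoders call block matching: find where a block from the previous frame ended up in the current frame, by minimising the sum of absolute differences (SAD). Real encoders search hierarchically and at sub-pixel positions; the function names here are made up for the sketch.

```python
def sad(frame, x, y, block):
    """Sum of absolute differences between `block` and the frame region at (x, y)."""
    return sum(abs(frame[y + j][x + i] - block[j][i])
               for j in range(len(block)) for i in range(len(block[0])))

def best_match(frame, block):
    """Exhaustive search: (x, y) of the best-matching position in the frame."""
    h, w = len(frame), len(frame[0])
    bh, bw = len(block), len(block[0])
    candidates = [(x, y) for y in range(h - bh + 1) for x in range(w - bw + 1)]
    return min(candidates, key=lambda p: sad(frame, p[0], p[1], block))

prev_block = [[9, 9],
              [9, 9]]           # a 2x2 block from the previous frame
cur = [[0, 0, 0, 0],
       [0, 0, 9, 9],            # same block, now shifted to (2, 1)
       [0, 0, 9, 9],
       [0, 0, 0, 0]]
print(best_match(cur, prev_block))   # -> (2, 1): the motion vector
```

The exhaustive search above is why quality and compute trade off: widening the search window or trying more candidate positions costs more time but finds better matches, which means fewer residual bits at the same quality.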