Tim Cook <z99### [at] gmail com> wrote:
> Not sure if this is delving into the same direction as those GPU-POV
> threads, but is there an effective way to split off and only use the
> arbitrary-precision math when it falls outside the hardware capability?
How do you tell when to "split off"?
For example, how many mantissa bits are necessary to "accurately" represent
the number 0.1? (Note that 0.1 cannot be represented exactly with any
number of mantissa bits, just as 1/3 cannot be represented with any
number of digits in base-10 representation.)
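A quick Python sketch (not POV-Ray code, just an illustration) makes the point
concrete: the double nearest to 0.1 is not exactly 1/10, no matter how many
mantissa bits the hardware offers, because 1/10 has a factor of 5 in its
denominator and so is not a finite sum of powers of two.

```python
from decimal import Decimal
from fractions import Fraction

# Print the exact decimal expansion of the binary double closest to 0.1.
# It is slightly larger than one tenth.
print(Decimal(0.1))

# The stored value is a different rational number than 1/10.
print(Fraction(0.1) == Fraction(1, 10))   # False

# A familiar consequence of the same rounding:
print(0.1 + 0.2 == 0.3)                   # False
```

The same holds for any binary float width (single, double, quad): widening the
mantissa shrinks the error but never eliminates it.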
--
- Warp