This is a problem that comes up once in a while. I'm curious if anyone knows
a good solution.
You have two objects and circles that bound them. For each circle, you know
its position and its radius squared. (You don't have the radius unless you
take the square root, which is undesirable because it's slow, and caching it
isn't a good option.) You want to know if these circles intersect.
Obviously, it's easy to get the squared distance between the circles'
centers. What you want to test is whether dist < radius1 + radius2, but all
you have is distSq, radius1Sq, and radius2Sq.
The best I can do is to square both sides of the test:
distSq < (radius1 + radius2)^2
And then use the fact that (a+b)^2 = a^2 + 2*a*b + b^2 to get:
distSq < radius1Sq + radius2Sq + 2 * radius1 * radius2
And then, since the radii are positive and radius1 * radius2 <=
max( radius1, radius2 )^2 = max( radius1Sq, radius2Sq ), the right-hand
side can be replaced with the following, though the test will now succeed
more often than it should:
distSq < radius1Sq + radius2Sq + 2 * max( radius1Sq, radius2Sq )
So now we have a test that doesn't require taking any square roots, but it
isn't as "tight" as it could be - it will return true sometimes when the
circles aren't intersecting.
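In code, the loosened test would look something like this C sketch (the
Circle struct and the names are just made up for illustration; the point is
only that no sqrt is ever called):

typedef struct {
    double x, y;      /* center */
    double radiusSq;  /* squared radius; the radius itself is never stored */
} Circle;

/* Conservative test: never misses a real intersection, but may report
   true for some pairs that don't actually intersect, because 2*r1*r2
   has been replaced by the larger quantity 2*max(r1Sq, r2Sq). */
int circlesMayIntersect(const Circle *a, const Circle *b)
{
    double dx = b->x - a->x;
    double dy = b->y - a->y;
    double distSq = dx * dx + dy * dy;
    double maxRSq = a->radiusSq > b->radiusSq ? a->radiusSq : b->radiusSq;
    return distSq < a->radiusSq + b->radiusSq + 2.0 * maxRSq;
}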
Can this be improved?
- Slime
[ http://www.slimeland.com/ ]