Kevin Wampler wrote:
> Invisible wrote:
>> The paper I'm looking at has several places where the program has to
>> guess a solution that is "good enough" because finding the perfect
>> solution is "NP-complete". Makes it sound like solving such a problem
>> is somehow "impossible" - but as you say, all it really means is that
>> it's slow. (Or at least, *potentially* slow.)
>
> Yeah, that's basically correct. It's a pretty well justified approach
> though since often NP-complete problems will take too long to solve
> practically in the average case as well as the worst case.
>
> If you haven't done so before, it can be enlightening to write a program
> to solve some NP-complete problem by brute force, and then get a feel
> for how quickly its running time explodes on larger problems. It's easy
> to rationally know what `exponential' means, but I've known many people
> to still be surprised when they see for themselves what that entails in
> real life.
>
> What's the paper for out of curiosity?
GraphViz paper on graph layout.

For example, finding a global arrangement of all graph nodes that
minimises the number of edges that cross each other is meant to be
NP-complete.

In this instance, the point is that finding *the* minimum number of
crossings isn't hugely important, so long as the number can be kept
reasonably small. But it's just the implication that this is impossibly
difficult that bothered me. How long can it possibly take to try all
possible permutations of 10 nodes?
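For what it's worth, the 10-node experiment is easy to run. Below is a minimal sketch (my own toy model, not the GraphViz algorithm, which works layer by layer) that brute-forces the crossing-minimal order of nodes placed on a circle: edges drawn as chords cross exactly when their endpoints interleave. Fixing the first node quotients out rotations, so there are (n-1)! candidate orders — 362,880 for 10 nodes, which is nothing, but already about 87 billion for 15. That's the exponential (here factorial) explosion in miniature.

```python
from itertools import permutations, combinations

def crossings(order, edges):
    """Count chord crossings when nodes are placed on a circle in `order`."""
    pos = {v: i for i, v in enumerate(order)}
    count = 0
    for (a, b), (c, d) in combinations(edges, 2):
        if len({a, b, c, d}) < 4:
            continue  # edges sharing an endpoint never cross on a circle
        pa, pb = sorted((pos[a], pos[b]))
        # two chords cross iff exactly one endpoint of (c, d)
        # lies strictly between the endpoints of (a, b)
        count += (pa < pos[c] < pb) != (pa < pos[d] < pb)
    return count

def best_layout(nodes, edges):
    """Brute-force the order with fewest crossings; (n-1)! candidates."""
    first, rest = nodes[0], tuple(nodes[1:])
    return min(((first,) + p for p in permutations(rest)),
               key=lambda order: crossings(order, edges))
```

On K4 (complete graph on 4 nodes) every circular order leaves exactly one crossing — the two "diagonals" — while a 5-cycle can of course be drawn with none. Bump `n` up a couple of notches and watch the runtime climb by a factor of `n` each time.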