Warp wrote:
> I think that the difference between our definitions is that you are
> defining the concept of "choice" while I am trying to define the concept
> of "free choice". There's a difference.
Hmmm... To some extent.
> A computer makes choices based in its input. A conditional statement is
> basically choice-making: Depending on the input, it chooses either one
> execution branch or another.
Agreed.
> That kind of choice is deterministic and predictable.
Predictable in the small. In the large, it's possible that the only way to
predict what the result of a program will be is to run the program and find
out. In other words, the only way to predict what the program will choose is
to let it choose and see what the result is.
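To make that concrete with a toy example of my own (not anything Warp
brought up, just an illustration): the Collatz stopping time is computed by
a completely deterministic little loop, but as far as anyone knows the only
way to get the answer for a given number is to run the loop and watch.

def collatz_steps(n):
    """Count the steps until n reaches 1 under the Collatz rule.

    Completely deterministic: the same n always gives the same count.
    But nobody knows a way to predict the count short of running it.
    """
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27))          # 111, but you only find that out by running it
print(collatz_steps(2**64 + 1))   # deterministic, yet good luck predicting it

The program isn't doing anything mysterious; it's just that the only oracle
for its behavior is the program itself.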
> If a computer is
> reading a source of true randomness (such as sampling the noise produced
> by a resistor) and making choices based on that input, then these choices
> are non-deterministic and unpredictable.
Right.
> However, neither case is "free choice". The computer is bound by the laws
> of physics to act according to its input (be it deterministic or not). The
> computer is not a sentient being which is making a choice according to what
> it wants.
How complex can the computer get before you'd say it's a sentient being
making a choice based on what it wants? If we had SciFi levels of AI
around, would you claim they're not sentient, not really making choices?
> It's mechanically making choices according to its input and to the
> laws of physics. The computer is not "free" to make any choice it desires,
> because it has no desires.
What if it gets complex enough that you can't distinguish the computer's
desires from its programming? If it's too complex to understand why it made
a choice, even given everything you know about it? If it's so integrated
with its inputs that it's impossible to make it take the same path through
the program twice, even though it is deterministic?
It's like asking whether computers can be intelligent, then declaring "No,
because intelligence requires a soul, so even if a computer's behavior were
100% indistinguishable from a person's, it would still only be pretending to
be intelligent."
> So if we pose my original question in another way: Are we intelligent
> sentient beings capable of making free choices, or are we simply computers
> acting according to laws of physics based on input?
I'm claiming those two aren't incompatible. You're claiming the only way to
have free choice is to disobey the laws of physics (by defining it that
way). I'm questioning whether that's a useful definition, intuitive as it
may be.
I'm also questioning whether it even makes sense to ask such a question,
given you're asking about whether the laws of physics apply to our own
physical behavior. I.e., how could our free will *not* be part of physics if
it produces physical results?
I'm also questioning whether you can rationally distinguish between
"intrinsically random" and "chosen by supernatural means".
> Maybe free will cannot exist, and is just an invented, artificial and
> ultimately false notion. An illusion.
Possibly, given the definition that it's supernatural. But with that
definition, you've just defined it as being impossible to investigate.
That's the sense in which I'm saying it's not a useful definition: you've
defined it in a way that blocks any further useful inquiry into the subject.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Warp wrote:
> Darren New <dne### [at] san rr com> wrote:
>> Then I think you've trivially answered your question by making it impossible
>> to determine the answer of whether or not we have free choice.
>
> That's why they call it philosophy and not science. ;)
Yep. Don't get me wrong. It's a fun discussion.
>>> I find it a rather odd definition. "If the consequences are too complicated
>>> to be predicted by us, then it's free will acting there."
>
>> More like "if even we can't predict what our own choice would be in advance,
>> then it's a free choice."
>
> I don't consider purely random events "making a choice" either.
No, but a pile of random events with self-awareness might be making a
choice. :-)
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Warp wrote:
> Darren New <dne### [at] san rr com> wrote:
>> Warp wrote:
>>> I don't think so. The very definition of "deterministic" is predictability.
>
>> That's where we disagree. It's completely possible to be both deterministic
>> and unpredictable. Indeed, that's exactly what the halting problem is all about.
>
> But a deterministic program will always behave the same way. It doesn't
> change its behavior from one execution to another. Thus it behaves predictably.
> "It will do the same thing it did the last time."
Only if presented with the exact same input. And you have to run it at
least once to find out what that result is. And, in the real world, once
you've run it, its previous choice has become part of the input for the next
run. You cannot present a human with the same choice twice.
A Turing machine you couldn't start in a known state would be both
deterministic and unpredictable.
Now, I guess you could investigate free will if you could travel in time and
see whether the same events lead to the same results, but that would require
there being no randomness either.
>>> The very word itself is saying so. It's the opposite of "non-deterministic",
>>> which is unpredictability.
>
>> Also not quite true. Non-deterministic turing machines are very predictable
>> in their behavior.
>
> If it's non-deterministic, you cannot say how it will behave the next time
> it will be executed.
You can if it has exactly the same input. (Assuming there's only one
shortest path to the halting state, of course.)
I'm just pointing out that "non-deterministic" doesn't technically mean it
behaves differently each time. It simply means it's unpredictable *before*
you run the program. With the same input, an NDTM will run the same steps
each time. You just can't tell what those steps are before you run it.
>>> A chain of events is deterministic if it happens in a certain way because
>>> there's no other way it could have happened. If the exact same initial setup
>>> can be replicated, then the chain of events will happen in the exact same
>>> way again, completely predictably. That's the very definition of
>>> deterministic.
>
>> Yet, oddly, NDTMs behave that way. ;-)
>
> If it always behaves the same way, isn't it by definition deterministic?
Nope.
An NDTM has instructions that could do either X or Y when they're in the
same state with the same input. That's the "nondeterministic" part. But the
machine is required to execute the X (or the Y, whichever) that will cause
it to halt in the fewest number of steps. Which will it follow? You can't
tell, because you can't solve the halting problem. If it *does* halt, and
you feed it the same input again, will it follow the same steps again? Yes.
(Unless there are two paths that each lead to halting in the same number of
steps, at which point you really don't *care*, by the definition of the
math.) I.e., an NDTM will factor the number 18372819 the same way every
time: it'll guess that number's factors, and print them to the tape. You
just can't tell by looking at the machine what it'll guess.
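Here's a rough sketch of that in code (a toy model of my own, not a full
Turing machine, just states and branching moves): of all the legal runs, the
machine "takes" the shortest one that halts, and a breadth-first search pins
that down uniquely. Same input, same run, even though you can't see which
branch wins without doing the search.

from collections import deque

# Toy nondeterministic machine: from some states there is more than one
# legal move.  (The state names are my invention and mean nothing.)
MOVES = {
    "start":  ["grind", "guess"],
    "grind":  ["grind2"],
    "grind2": ["grind3"],
    "grind3": ["halt"],
    "guess":  ["check"],
    "check":  ["halt", "start"],   # a check can fail and throw us back
}

def shortest_run(start="start", goal="halt"):
    """Breadth-first search for the shortest halting run.

    That's how a nondeterministic machine's behavior is defined: of all
    the legal runs, the one that halts in the fewest steps.  The same
    start state gives the same run every time.
    """
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in MOVES.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no halting run at all

print(shortest_run())   # ['start', 'guess', 'check', 'halt'], every single time

(And if two halting runs tie in length, you get whichever one the search
enumerates first, which is exactly the case where, as above, you don't care.)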
I don't think you can tell if something's deterministic or not if you can't
examine its inner workings and you can't feed it the same input twice. Both
of these are true for people.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
> Mass and energy is quantized, and thus there's only a finite number of
> ways the balls can act. It doesn't matter how many possibilities there are,
> they are still finite.
Down at that level of detail, when stuff collides there is some random noise
due to the unpredictability of where the sub-atomic particles will be and at
what velocity. Unless you can predict that (which I believe is currently
thought to be impossible), you have no chance of exactly predicting the
outcome of a collision between two balls. The best you can do is say that,
within a certain percentage confidence, the ball will be in a
certain range of positions, and as there are more collisions and the game
goes on, the range of the expected position will grow exponentially. It
probably means that even with perfect knowledge of the starting positions,
after 10 shots with multiple collisions the best you could say is that
you're 99% sure the balls will be on the table :-)
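Just to put rough numbers on that growth (a back-of-the-envelope sketch of
mine; the assumption that each collision multiplies the angular error by
roughly the ball spacing divided by the ball radius is the usual hand-waving
estimate for convex scatterers, nothing exact):

import math

BALL_RADIUS   = 0.028   # m, roughly a pool ball (assumed)
BALL_SPACING  = 0.3     # m, assumed typical distance between collisions
AMPLIFICATION = BALL_SPACING / BALL_RADIUS   # roughly 10x error growth per collision

def angle_error_after(collisions, initial_error_rad=1e-9):
    """Angular uncertainty after some number of ball-ball collisions,
    assuming each collision multiplies it by a constant factor."""
    return initial_error_rad * AMPLIFICATION ** collisions

for n in (1, 5, 10, 15):
    err_deg = math.degrees(angle_error_after(n))
    print("after %2d collisions: ~%g degrees of uncertainty" % (n, err_deg))

By ten collisions the direction of travel is effectively unknown, which is
more or less the "99% sure they're still on the table" situation.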
Warp wrote:
> You are not deluded into thinking that something happened even though it
> didn't. You are deluded into thinking that it happened because you chose
> for it to happen, rather than it happening just as a consequence of previous
> events. If it happened as a consequence, it's not a choice.
>
Snort. Tell me something I don't know. It's not like I, at least, am not
perfectly aware that studies on how the brain thinks have shown
"awareness" as a post hoc attempt to invent a justification for an
action, which was already beginning even before we became consciously
aware of it. Sentience is a deterministic machine, basing its actions on
thousands of independent *processes*, all of which take place before
awareness, and lying to itself about why it did something. It's a useful
lie, since sometimes the more integrated "conscious" mind can filter the
multiple alternate paths that might have been chosen, and the data that
was ignored, and tell the action already in progress to halt, on the basis
of higher-level reasoning, including recognizing that the action in
progress is dangerous or unwanted. The fiction is necessary to make sure
that false actions, which can harm the individual or the things it values,
don't cause harm that wasn't directly perceptible at the *simpler*
levels. Duh!
--
void main () {
    if version = "Vista" {
        call slow_by_half();
        call DRM_everything();
    }
    else {
        call crash_windows();
    }
    call functional_code();
}
Get 3D Models, 3D Content, and 3D Software at DAZ3D!
<http://www.daz3d.com/index.php?refid=16130551>
Patrick Elliott wrote:
> perfectly aware that studies on how the brain thinks have shown
> "awareness" as a post hock attempt to invent justification for an
> action,
Which isn't at odds with the supposition that there's something supernatural
involved in helping people make "free will" choices. Perhaps the
supernatural part is what starts the chain of events.
> Sentience is a deterministic machine,
I don't think you know that either. :-) Certainly there's room for quantum
effects, even if you leave out the supernatural.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
scott wrote:
> collisions and the game goes on, the range of the expected position will
> grow exponentially.
In particular, it's caused to a great extent by the balls being convex,
thereby amplifying any difference between predicted and actual.
> It probably means that even with perfect knowledge
> of the starting positions, after 10 shots with multiple collisions the
> best you could say is that you're 99% sure the balls will be on the
> table :-)
The calculation I remember was after 15 shots you can't tell if any are on
the table. :-)
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Warp wrote:
> Maybe free will cannot exist, and is just an invented, artificial and
> ultimately false notion. An illusion.
Actually, I had a bit of an insight while food shopping this morning.
You're defining free will as something supernatural, and asking whether
that's an illusion.
I prefer to define free will as the illusion, and then assert "of course
we have free will."
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Darren New <dne### [at] san rr com> wrote:
> How complex can the computer get before you'd say it's a sentient being
> making a choice based on what it wants? If we had SciFi levels of AI
> around, would you claim they're not sentient, not really making choices?
One (maybe not completely physical) possibility would be if the
computer/brain is able to make decisions as a closed system. In other
words, it's capable of processing and changing information, and making
decisions, without those decisions being the direct and inevitable
consequence of external input or quantum randomness. The decisions may
be *based* on the external input, but they are not the inevitable and
deterministic consequence of it. The computer/brain might be able to
use its own internal logic to make choices based on the input, but in
such a way that from the outside it's impossible to predict which
choices will be made.
Can such a closed system exist in the physical world? Could that idea break
some law of physics (e.g. something along the lines of new information
not being able to be generated in a closed system)?
--
- Warp
On 14 Sep 2009 15:03:06 -0400, Warp <war### [at] tag povray org> wrote:
>
>> IMO it matters not a jot if we have free will or not. My stance is that we
>> should act as if we do have free will. The other way of living is a good excuse
>> for bad behaviour, again IMO.
>
> Why would it be a good excuse? Regardless of whether you have free will or
> not, you can still feel pain. If you make bad things, you are very probably
> going to suffer some physical or mental pain as a consequence (eg. sent to
> jail or whatever). It makes no sense to go to jail just because you think
> you have no free will.
Now I was thinking about certain religious sects who believed that they were
the "chosen ones" and would go to heaven no matter what they did. It was
predestined.
But to answer your post: not all people would feel bad because they did bad
things. If they could justify their actions by saying that that was just the
way they were, then it was OK. Sad, but not their fault.
You may not have met people like that but, unfortunately, I have.
--
Regards
Stephen