On 20.09.2011 22:25, Darren New wrote:
> On 9/20/2011 9:55, Mike Raiford wrote:
>> 1/0 is undefined, however the limit of 1/n as n approaches 0 is
>> infinity. :)
>
> It depends on your math, really. The only reason people invented
> "limits" is because they thought dividing by zero was somehow impure,
> like "irrational" numbers used to be.
Wrong. First of all, limits were invented for the general case of a
function f(x) that takes values in the real (or complex) numbers for
certain x, but is undefined elsewhere, e.g. due to ambiguities. While
f(x)=1/x is the most prominent example, there are others, e.g.:
f(x) = arctan x
f(x) = x^0
f(x) = 0^x
f(x) = g'(x), with g(x) = |x|
f(x) = |g'(x)|, with g(x) = |x|
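To see why g'(0) is undefined for g(x) = |x|, one can compare one-sided difference quotients numerically (a minimal Python sketch, not from the original post):

```python
def slope(h):
    """One-sided difference quotient of g(x) = |x| at x = 0."""
    return (abs(0 + h) - abs(0)) / h

# The two one-sided slopes disagree, so the derivative at 0 does not exist:
print(slope(1e-9))    # from the right: 1.0
print(slope(-1e-9))   # from the left: -1.0
```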
Also, limits were not invented because someone thought that some
mathematical operations were "somehow impure", but to systematically
prove assumptions that had simply been taken for granted in earlier
mathematical work. They are also a great tool for working with
functions where lim[x->X]f(x) depends on the direction from which you
approach X, or for examining the behaviour of a function as its
argument approaches (positive or negative) infinity.
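Both situations are easy to observe numerically (a quick Python illustration, added here for concreteness):

```python
import math

# lim[x->0] 1/x depends on the direction of approach:
for h in (1e-3, 1e-6, 1e-9):
    print(1 / h, 1 / -h)   # grows without bound; positive from the
                           # right, negative from the left

# Behaviour as the argument goes to infinity: lim[x->+inf] arctan x = pi/2.
print(math.atan(1e12), math.pi / 2)   # nearly equal
```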
Besides, there is no such thing in (modern day) mathematics as "somehow
impure"; there is just "defined" and "undefined", or "proven" and
"unproven". And x=1/0 /is/ undefined on the set of real (or complex)
numbers, because there is no solution to x*0=1 in that domain.
For any domain that includes an element inf:=1/0 satisfying inf*0=1,
each and every mathematical property of the field of real (or complex)
numbers must be re-evaluated with respect to this new element inf, and
exceptions need to be established for various mathematical operations.
For instance, if you want inf to have the property that inf+x=inf for
x!=0, you'll obviously need to break even such simple properties as
y+x=y <=> x=0.
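IEEE-754 floating point makes exactly this trade-off: it adds an inf
with inf+x=inf for finite x, and the property y+x=y <=> x=0 does break
(a small Python demonstration, not part of the original post; note that
IEEE defines inf*0 as NaN rather than 1):

```python
import math

inf = float("inf")

# y + x == y no longer implies x == 0 once inf is in the domain:
print(inf + 5.0 == inf)        # True, although 5.0 != 0

# And other identities fail too: inf * 0 is NaN, not 1.
print(math.isnan(inf * 0.0))   # True
```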