On 16-11-2009 17:15, Invisible wrote:
> Darren New wrote:
>
>> Which just goes to show the problem I have with 90% of all
>> mathematical notation. It's so utterly inconsistent that even
>> something like (f(f(x))) is ambiguous.
>
> Several millennia of mathematical discoveries, all made by different
> people in different places, and apparently several of them discovered
> the same or similar things, but gave them different names - or gave them
> names which clash with existing but unrelated things they didn't know
> about.
>
> Just for giggles: how many meanings can you find for "normal"?
>
> There's the normal distribution, normal vectors, a normed space...
Wednesday I have again an opportunity to give my talk on deriving
programs from specifications. One of the things I mention is that the
humble '=' has at least five and possibly more meanings depending on
context, interpretation, and the type of the object it is applied
to*. And I will not even mention that its general use is inconsistent in
the context of A=B+C where A, B, and C are matrices. Here '+' is
pointwise, while '=' has an implicit summation over all the entries**.
As far as I know there is not even a generally accepted symbol to
express pointwise equality. I know of people who try the opposite and
make the summation explicit, but that breaks most other mathematical
uses.
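The A=B+C point can be sketched in code. This is my own illustration, not anything from the post: '+' computes each entry independently, while '=' between matrices only holds through an implicit "for all entries" (realized here as a conjunction).

```python
# Sketch: pointwise '+' vs. the hidden "for all entries" in matrix '='.
# Matrices are plain nested lists; names are illustrative only.

def mat_add(A, B):
    """Pointwise '+': each entry is computed independently of the others."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_eq(A, B):
    """Matrix '=': an implicit conjunction ranging over *all* entries."""
    return all(a == b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

A = [[1, 2], [3, 4]]
C = [[2, 4], [6, 8]]
print(mat_eq(mat_add(A, A), C))  # True: every single entry matches
```

Note that `mat_eq` quietly quantifies over both indices, whereas `mat_add` never needs to look at more than one entry at a time.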
* equivalence, equality, definition, EXNOR, assignment, and perhaps one
or more that don't have names. The type plays a role in A=B=C, which is
OK if they are all booleans (or if A or C is) but not if they are all
integers. In A=5 it can be an expression of equality; if A was undefined
it can be a binding/assignment; but it can also denote a boolean that is
false everywhere except at 5.
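Programming languages have to pick one reading of A=B=C, and the choice shows. A small illustration of mine (not from the post): Python chains the comparison as "a == b and b == c", which can disagree with the left-to-right reading "(a == b) == c".

```python
# Two readings of A = B = C, which Python disambiguates by chaining.
a, b, c = 1, 2, 0

print(a == b == c)    # False: chained as (1 == 2) and (2 == 0)
print((a == b) == c)  # True: (1 == 2) is False, and False == 0 holds in Python
```

For integers the two readings can give different answers, which is exactly why the unadorned notation is ambiguous.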
** See also the concept of '=' in OO languages. Are two objects the same
if all fields are the same?