On 17/05/2013 4:24 AM, Nekar Xenos wrote:
> On Thu, 16 May 2013 23:17:42 +0200, Stephen <mca### [at] aol com> wrote:
>
>> On 16/05/2013 9:55 PM, Christian Froeschlin wrote:
>>> an algebraic mystification that is easily resolved ;)
>>>
>>> 1.0 * min + 0.5 * (max - min)
>>> = 1.0 * min + 0.5 * max - 0.5 * min
>>> = 0.5 * min + 0.5 * max
>>> = 0.5 * (min + max)
>>
>> You are right of course.
>> But why put that "1.0 + " on the first line?
>>
>> I think of it (the first line), from my mid 20th Cent schooling, as:
>> Half of the difference between max and min plus the "offset" of min.
>>
> It did help to clarify the issue though.
>
I am not criticizing; it is interesting and educational to see how
others view and do things. Is that a mathematician's view?
And why multiply by a half instead of dividing by two? Is that more
computer/CPU friendly?
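
As an aside, all three forms give the same midpoint. A minimal Python sketch (the min/max values are made up for illustration) checking that Christian's starting expression, the simplified form, and the divide-by-two variant agree:

```python
# Illustrative check of the midpoint identity (values are hypothetical).
mn, mx = 3.0, 11.0

a = 1.0 * mn + 0.5 * (mx - mn)  # min plus half the difference
b = 0.5 * (mn + mx)             # the simplified form
c = (mn + mx) / 2               # dividing by two instead

assert a == b == c  # all three yield the midpoint
print(a)
```

On the speed question: floating-point multiplication is typically cheaper than division on most CPUs, though compilers usually turn a divide-by-constant into a multiply anyway, so in practice the two spellings tend to perform the same.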
--
Regards
Stephen