  Re: Verizon math fail  
From: Warp
Date: 6 Feb 2009 14:05:30
Message: <498c89fa@news.povray.org>
Kevin Wampler <wampler+pov### [at] uwashingtonedu> wrote:
> I heard of some study once which looked at what the poorest and best 
> students had learned in a math class.  It turned out that the worst 
> students had actually learned *more* concepts.  For instance, they 
> learned the equation for the area of a square, the area of a rectangle, 
> the area of a right triangle, etc.  The best students, however, only 
> learned a few concepts, like "area" which they could then apply to solve 
> many different problems.

> I wonder if what's going on here is something like that where people 
> have learned the concept of "multiply small numbers" without ever really 
> grasping the meaning of "multiply" in general (or maybe even "number" in 
> general).

  It probably is indeed a big problem that, with certain subjects, some
people seem incapable of seeing the big picture, the generic rules behind
the specific examples. They only see (and often memorize) the individual
examples, but fail to make the connection to the more abstract, more
generic rule behind them.

  What I find strange in this particular example is that several people
at that company had the exact same misconception about numbers and monetary
units. (Judging from the unedited recording, those were not the first two
people he had talked to about the subject.)

  I suppose that in their mind it goes something like this: Any monetary
amount greater than or equal to 1 is in dollars, and anything smaller
than 1 (but larger than 0) is some amount of cents, i.e. not dollars. Thus
for example a price of 10 is "dollars" (ten of them), while a price of
0.1 is "less than a dollar, i.e. cents". Their mind just tells them
that it's an amount in "cents", but fails to conceptualize exactly how
many cents; they only understand that it's "cents". Thus it becomes
"0.1 cents". Consequently "0.1 dollars" doesn't make sense to them, because
0.1 is less than 1, and to them there can't be a dollar amount "less than
a dollar". Something less than a dollar is in cents, and thus it must be
"0.1 cents".
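
  (Just to spell out the missing rule, here is a small, purely illustrative
C++ snippet. The function name is made up, and it's only a sketch of the
conversion itself, not of anything their billing system actually does:)

#include <iostream>

// The single generic rule: 1 dollar = 100 cents, regardless of whether
// the amount of dollars is a whole number or a fraction.
double dollarsToCents(double dollars)
{
    return dollars * 100.0;
}

int main()
{
    std::cout << dollarsToCents(10.0)  << " cents\n"; // 10 dollars    -> 1000 cents
    std::cout << dollarsToCents(0.1)   << " cents\n"; // 0.1 dollars   -> 10 cents
    std::cout << dollarsToCents(0.002) << " cents\n"; // 0.002 dollars -> 0.2 cents
}

The same rule works in both directions: "0.1 dollars" and "10 cents" are
just two names for the same amount.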

  Of course, where this gets really confusing is what their brain tells
them when they encounter something like "10 cents". They understand
that "10 cents" is *not* the same thing as "10 dollars", that 10 cents is
actually less than a dollar, but... well, at this point I fail to follow
exactly what goes through their mind.

  Perhaps they can grasp the concept of integral amounts in different
units, but have a hard time understanding the concept of fractional amounts
in different units. 10 is a nice integer, easy to count, so maybe they
understand "10 cents" because you can physically have 10 coins of 1 cent
each. However, as soon as the amount becomes "0.1", it becomes impossible
to grasp, because you can't have 0.1 of a coin. At this point their brain
switches to a completely different "0.something is cents" mode.

  Maybe there's a failure to connect integers with decimals less than 1:
the general rule that applies to both is simply not there.

-- 
                                                          - Warp

