Darren New wrote:
> Actually, the annoyance isn't the lack of precision. It's that a literal
> without a marking saying what kind of literal it is is assumed to be a
> specific type even when the compiler can trivially determine that that
> type is the wrong type.
I could once again point out that Haskell allows a numeric literal to
represent any numeric type, automatically deducing which one you
intended without you having to do anything special. And you can extend
this to custom numeric types you invent yourself. But since nobody here
uses Haskell, I guess it's still moot. ;-)
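For the curious, here's a minimal sketch of how that extension works:
you give your own type a Num instance, and integer literals are then
converted into it via fromInteger. (The Mod7 type is purely
hypothetical, made up for illustration.)

  -- A toy modular-arithmetic type, just for illustration.
  newtype Mod7 = Mod7 Integer deriving (Show, Eq)

  instance Num Mod7 where
    Mod7 x + Mod7 y = Mod7 ((x + y) `mod` 7)
    Mod7 x * Mod7 y = Mod7 ((x * y) `mod` 7)
    Mod7 x - Mod7 y = Mod7 ((x - y) `mod` 7)
    abs             = id
    signum (Mod7 0) = Mod7 0
    signum _        = Mod7 1
    fromInteger n   = Mod7 (n `mod` 7)

  -- The literals 5 and 4 are typed as Mod7 here, via fromInteger:
  example :: Mod7
  example = 5 + 4   -- Mod7 2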
Then again, Haskell is a language which automatically deduces types all
over the place anyway, so I guess it's not that surprising.
Unfortunately, while Haskell does this for numbers, it does *not* do it
for strings. (There is a little-known GHC extension, OverloadedStrings,
which lets you change this.) If you write a string literal, its type is
"list of characters". Which is kind of annoying if you actually wanted
"array of characters" or something else.
More irritating still is the fact that Haskell has a special notation
for writing lists, but it only works for lists. If you want to
initialise an array, you have to write the initialiser as a list; how
dumb is that?
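To make that concrete: with Data.Array (from the array package that
ships with GHC), there is no array literal syntax at all; the elements
have to be supplied as an ordinary list:

  import Data.Array (Array, listArray)

  -- No array literal exists; listArray takes bounds plus a plain list.
  squares :: Array Int Int
  squares = listArray (0, 4) [0, 1, 4, 9, 16]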
Also: I've just finished The Milkman Conspiracy level from Psychonauts,
and as I type this, my computer screen literally seems to be bending.
Which is really freaky, by the way...
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*