Warp wrote:
> You can hardly blame C (developed in 1972) and C++ (developed in 1979) for
> not having native support for Unicode (first standardized in 1991).
I have no idea why you think I'm blaming inanimate objects for anything at
all. It's an example. I'm sure in 20 years I'll be complaining that the
grammars for my speech recognition library and my natural language deduction
library aren't compatible.
If I asked how to get a program written before graphics displays were common
running on a modern UI, I'd get a whole raft of suggestions, everything from
using a terminal emulator to using a VM to ....
Yet if I mention that there's a lack of information published about solving
a common problem in computer programming, and I happen to mention that it's
affecting my *C* programming, suddenly I'm somehow insulting you or
something? You're upset that I'm programming in C and C++ and asking people
how they solve the sorts of problems I'm encountering? I'm sorry you don't
ever have that sort of problem, but if you haven't anything to add to the
conversation, why not just let it go?
But yes, now that you mention it, I *am* exactly blaming C, first developed
in 1972, for not having standard support for Unicode, which was first
standardized in 1991.
What or who else would I blame for that, if blame must be assigned? Indeed,
the fact that C predates applications where you need logging, Unicode,
network protocol stacks, message catalogs, and even standard-size integers
for talking to custom hardware is *exactly* the source of the problem I'm
trying to solve. Why wouldn't I blame C for that? I certainly can't blame
the inventors of C, and it's certainly C's age that's at fault for its
missing many of the features people commonly use 30 years after its invention...
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".