POV-Ray : Newsgroups : povray.off-topic : Land of Lisp
  Land of Lisp (Message 11 to 20 of 22)  
From: Invisible
Subject: Re: Land of Lisp
Date: 2 Nov 2010 06:24:41
Message: <4ccfe6e9$1@news.povray.org>
On 02/11/2010 02:00 AM, nemesis wrote:
> Invisible<voi### [at] devnull>  wrote:
>> Is it possible to create variables with local rather than global scope?
>
> you're kidding me, right?

Well, I've never seen anybody do it before, that's all.

>> When you get to Haskell, of course, automatic type inference blows #1
>> out of the water
>
> yeah, except when it can't decide on the types all by itself and requests type
> annotations anyway.

Which is fairly rare.

In truth, you're probably going to put annotations on all your top-level 
stuff anyway, for documentation's sake. What automatic type inference 
means is that you don't have to put type annotations absolutely 
everywhere you go (and change them all over the place when something 
changes), just in the few places you need them.
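
For example (a quick sketch, with made-up names):

   -- top-level annotation, written mostly for documentation's sake:
   countWords :: String -> Int
   countWords s = length ws
     where
       -- no annotation needed here; the compiler infers ws :: [String]
       ws = words s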

> I would also say the Hindley-Milner kinda cheats in this regard.  You usually
> don't have to type annotate whenever the function operates on numbers, strings
> or Algebraic Data Types.  And Algebraic Datatypes are a dead giveaway of the
> type of an expression, because they usually have a name of their own to describe
> values of their type.  So, a function like:
>
> withBuf run = run . Buf 0 ini =<<  mallocBytes ini
>    where ini = 1024
>
> needs no type annotation because it already has a friggin reference there to
> type:
>
> data Buf = Buf !Int !Int !(Ptr Word8)
>
> so, you exchange type annotations everywhere for type names for values of this
> type in the expressions.  Not that much of a fair trade:  instead of providing a
> single type declaration, you provide it everywhere you use it.

I wouldn't agree with that.

If you want a point, you say Point 3 7. Which, as you say, rather gives 
away the fact that it's a Point value. But how about saying "True"? 
That's of type Bool, but you didn't write that in your expression. 
Similarly, when you write

   Branch (Leaf 3) (Leaf 7)

you didn't say anywhere that this is of type Tree Int, did you? The 
compiler infers this automatically. And don't even get me started on 
expressions which are polymorphic, and could have /several/ possible types.
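
To make that last point concrete (a sketch of the sort of Tree I mean, 
written here):

   data Tree a = Leaf a | Branch (Tree a) (Tree a)

   -- ghci> :type Branch (Leaf 3) (Leaf 7)
   -- Branch (Leaf 3) (Leaf 7) :: Num a => Tree a

Notice the compiler doesn't even commit to Tree Int; the literals are 
polymorphic, so the whole expression is too.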

> I'm talking ill of the Hindley-Milner type system, but you should be aware that
> the same lameness can happen in Lisps too:  here and there you will see things
> like fl* fx+ or fl<... that is, functions to operate on specific numeric types.
> Again, they exchange a single type declaration for type-specific operators
> everywhere.  Makes me wonder if the Algol folks got it right...

Not so in Haskell. If you want to add two numbers, you use (+). It's 
polymorphic like that. And functions that use it end up being 
polymorphic too (unless you specifically don't want them to be, usually 
for performance reasons).

I'm actually surprised that a dynamically-typed language would do 
something like that. I thought the whole point of dynamic typing was 
that you can completely ignore types.

(Haskell also has the nice trick of numeric literals being polymorphic. 
Sadly, the same does not hold for lists or strings...)
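
For instance (a sketch):

   double :: Num a => a -> a   -- which is also exactly what GHC infers
   double x = x + x

   -- the literal 2 is really (fromInteger 2) at whatever Num type is
   -- wanted, so both of these work:
   a = double (2 :: Int)       -- 4
   b = double (2.5 :: Double)  -- 5.0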

>> Really, in the modern age, there is no reason for dynamically-typed
>> languages to exist at all. It just invites bugs.
>
> Dynamically typed languages are just perfect for quick prototyping of systems
> where invariants are still not well known.  The fact that some of them can not
> just quickly prototype working systems, but prototype them with good performance
> is only a plus.

All I know is that every time I've used a dynamically-typed language, 
I've spent 90% of my time fixing type errors - errors which *could* have 
and *should* have been caught automatically by the machine, but I have 
ended up catching them myself, by hand. Yuck!

>> "Why is Lisp the best?"
>> "Because it has macros!"
>
> I almost never use macros and still find Lisp the best. :)

I just don't get what's to like about it. It seems messy and ad hoc to me.

> I suspect its homoiconic nature has lots to do with it.  Plus, if you've never
> edited Lisp code with hierarchical parenthetical editing you don't know what a
> blessing it is compared to any other language in existence.

More like "without automatic bracket match, it's impossible to produce 
code which is even syntactically correct". It wouldn't be so bad, but it 
seems all Lisp programmers habitually indent their code incorrectly.

I can see how having a language where everything is a list is 
technically quite a nice idea. It's simple and elegant. Unfortunately, 
the language as a whole seems to be messy and inconsistent. Too much 
backwards compatibility, too much muddled thinking. Still, I guess it 
was the first attempt...

> you don't need to type brackets with a proper Lisp editor such as emacs or
> DrRacket.  You usually just type alt+shift+( and it opens a pair and puts you
> right in.  You can select a big code section spanning several lines by just
> alt+shift+(right or left).

In other words "Lisp is so hard to write that you need a special program 
to help you do it". (The same accusation could be levelled at XML of 
course...)

>> I still don't think Lisp is very functional. :-P
>
> It's because you still don't know it well.

Sure, functions are first-class. But it lacks purity. It also doesn't 
appear to place any emphasis at all on coding in a vaguely functional 
style - not in the language design nor the standard libraries.

>> Depends on what you consider to be "functional programming". There are
>> really two separate characteristics that most FP languages have:
>>
>> 1. Pure functions.
>> 2. Functions as first-class objects.
>>
>> Any programming language that supports functions can support writing
>> pure functions. But as far as I can tell, Lisp does not in any way
>> encourage or facilitate doing so. Heck, C++ has facilities for marking
>> things that won't be altered after being initialised, whereas Lisp does
>> not. That makes C++ slightly more FP than Lisp.
>
> That's purely an implementation detail, not a language feature. Pure math-like
> functions with no side-effects only depend on you writing side-effect-free code.

No, a design that enforces purity has profound consequences for any 
implementation of that design. (Not to mention the entire way that the 
language is used.)

> Would you really call C++ more FP than ML because nothing in the language spec
> say "hey, this section is pure, you can optimize it away for concurrency"?

Who said anything about ML? I just said C++ seems more functional than Lisp.

> BTW, Haskell is the only such (well-known) language enforcing purity even for
> IO.

And Clean doesn't count because...?

>> You complain about "operator obfuscation" and then tout "custom syntax"
>> as improving readability. Which is it? :-P
>
> new syntax doesn't mean &*%#$.  macro names are just as self-describing as
> function names.

To me, "new syntax" means that I can write something that doesn't look 
like the host language. From what little I've seen, a Lisp DSL just 
looks like Lisp.

>> Since Lisp explicitly permits self-modifying code, wouldn't that mean
>> that the compiler has to be present at run-time?
>
> You got that "self-modifying code" all messed up.  Lisp macros can operate on
> Lisp code to produce new custom code, but that happens at compile-time.
>
> If you really want it, you have eval at runtime, but it's as slow as everywhere
> else.

As I say, it's news to me that Lisp can be compiled. I thought the fact 
that you have an eval function more or less prevents this. Every other 
language I've ever seen that has an eval function has been interpreted.

>>>> And, unlike Lisp
>>>> macros (AFAIK), Template Haskell statically guarantees that the code
>>>> written by a template is *always* syntactically correct. (It does _not_
>>>> guarantee that it is well-typed, just that there are no syntax errors.)
>>>
>>> that's good for a language heavy on syntax.
>>
>> I won't claim to be a Lisp expert, but I was under the impression that
>> not every possible list is a valid executable Lisp expression.
>
> I mean there's not much syntax in Lisp besides brackets, an expression evaluating
> to a function (or macro) at the head and arguments for that function (or macro)
> in the rest of that list.

And a Haskell expression consists of some combination of exactly 6 (go 
count them) structures. The rest is just sugar. That's peanuts compared 
to (say) the number of possible statements in Java. (For-loops, 
while-loops, if-then-else, return, switch/break...)
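
If you want that concretely, here's a toy model of the core (my own 
names, and I'm glossing over details like type annotations):

   type Name    = String
   type Pattern = String              -- a stub; real patterns are richer

   data Literal = LInt Integer | LChar Char

   -- the core expression forms; everything else is sugar for these
   data Expr
     = Var Name                       -- variables (and constructors)
     | Lit Literal                    -- literals
     | App Expr Expr                  -- function application
     | Lam Name Expr                  -- lambda abstraction
     | Let [(Name, Expr)] Expr        -- (recursive) let bindings
     | Case Expr [(Pattern, Expr)]    -- case analysis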

>>    From what I can tell, Lisp macros work by you passing some data to
>> them, which they then transform into some sort of executable code.
>> Template Haskell allows you to inspect parts of the current module not
>> passed in as arguments. So you can, e.g., look up the definition of a
>> user-defined data structure.
>
> that sounds as painfully side-effectful as when I first heard it.  Hence Ninja
> Patching.  google it up...

So, what, I can tell the compiler to import some extra modules that I'm 
about to reference, and I can look at the definition of a user-defined 
data type and generate some boilerplate code to work on it? Yes, sounds 
terrible.
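
Something like this, say (a sketch only, against a recent 
template-haskell; conCount is a name I just made up, and the exact 
fields of DataD have shifted between versions):

   {-# LANGUAGE TemplateHaskell #-}
   import Language.Haskell.TH

   -- look up a user-defined data type and count its constructors,
   -- all at compile time
   conCount :: Name -> Q Exp
   conCount name = do
     TyConI (DataD _ _ _ _ cons _) <- reify name
     litE (integerL (fromIntegral (length cons)))

so that $(conCount ''Bool) compiles to the literal 2.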

Or perhaps you meant that people could abuse it? Well, yes, obviously 
any powerful feature can be abused. So what?

>>> BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
>>> r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
>>> it's used a lot.
>>
>> I must be missing something... those two expressions look almost
>> identical to me.
>
> yeah, you're missing that you write the first and the compiler turns it into the
> second, at compile time:  yes, it evaluates (fact 5) during compilation.  It
> would certainly not do any magic and evaluate (fact n) if n was not known at
> compile time, but since it's a literal...

Oh, right, I hadn't noticed that fact 5 had vanished.

Now a Haskell compiler /might/ evaluate that at compile-time without any 
special hints. You know why? IT CAN'T HAVE SIDE EFFECTS. :-P

Anyway, with Template Haskell, I can merely write

   $(let x = factorial 5 in [| x |])

to achieve the same effect. Not exactly intractably complex, eh?
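
One wrinkle, in fairness: GHC's stage restriction means factorial has to 
live in a different module from the splice that uses it. A sketch:

   -- Fact.hs
   module Fact where

   factorial :: Integer -> Integer
   factorial n = product [1..n]

   -- Main.hs
   {-# LANGUAGE TemplateHaskell #-}
   module Main where

   import Fact (factorial)

   fact5 :: Integer
   fact5 = $(let x = factorial 5 in [| x |])   -- embeds the literal 120

   main :: IO ()
   main = print fact5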

>>> you're taking a comic book too seriously...
>>
>> Perhaps. Maybe I'm just tired of Lisp fanboys claiming that this one
>> single language is some kind of perfect Utopia.
>
> now, there are not that many Lisp fanboys out there, are there?  Most I know are
> more like Lisp fanoldmen...
>
> Scheme doesn't have many fanboys either, more like irate compsci freshmen...

Whatever. I try not to make wild claims like "Haskell is perfect". I 
think it's a fantastic language, but I know for a fact that it's not 
perfect.

>>> nah, inc is 1 char shorter. ;)
>>
>> And (+1) is more descriptive.
>
> not in Lisp where it looks like a silly function application... :p

It does what it says on the tin.

>> Also, I can soon change that to "iterate (*2) 1" to generate a geometric
>> sequence instead. I bet Lisp doesn't have a predefined "multiply by 2"
>> function. :-P
>
> we could have something like:
> (define (pf op x) (lambda (n) (op n x)))
>
> (iterate (pf * 2) 1)

Yes, that's just as succinct as (*2). Oh, wait...
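
(For reference, typed into GHCi:

   ghci> take 10 (iterate (*2) 1)
   [1,2,4,8,16,32,64,128,256,512]

A geometric sequence, no helper definitions required.)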

>> Actually, just the other day I was trying to pin down in my head exactly
>> which sorts of programs Haskell is good for and bad for.
>
> math problems and formal proof automation would be my guess for best suited.
> The real world is more complicated:  say, a script to automate a task.  You don't
> really need all that type theory to get it done...

You don't really need "all that type theory" to understand or use 
Haskell. :-P

Anyway, the basic answer seems to be programs where getting the right 
answer is more important than getting it fast, or where figuring out how 
to solve the problem at all is really complex and you need all the help 
you can get...



From: Darren New
Subject: Re: Land of Lisp
Date: 2 Nov 2010 13:13:24
Message: <4cd046b4@news.povray.org>
Invisible wrote:
> All I know is that every time I've used a dynamically-typed language, 
> I've spent 90% of my time fixing type errors 

You're not thinking clearly, then.  Type errors in a dynamically typed 
language are not something that crop up a lot for people who use such 
languages frequently. It takes a little getting used to, remembering what 
types your variables are while you're writing a function, but how hard can that be?

Now, if you use a lot of crappy undocumented libraries, sure, that can be 
problematic. But that's true in any language.

The only time I get type problems is when I occasionally mix up something like 
a list of list of X with a list of X, but that bombs out the first time you 
run it if the type system isn't absurdly forgiving, so it's not really any 
more of a problem than it getting caught the first time you compile it.

(Of course, if your IDE catches such things before you've even finished 
typing the body of the function, it's nice.)

> From what little I've seen, a Lisp DSL just looks like Lisp.

That's because the people writing and using the DSLs are comfortable with 
LISP. LISP has read macros, so they can look like anything you want.

> Every other language I've ever seen that has an eval function has been
> interpreted.

The *eval* call has to be interpreted, sure. But if you don't use eval, 
everything is compiled. If you do use eval, everything but the eval is compiled.

You're aware that C# has (or will have, depending how strict you want to be) 
the equivalent of eval, right?  As does Java, in some sense?

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: nemesis
Subject: Re: Land of Lisp
Date: 2 Nov 2010 13:15:00
Message: <web.4cd046952bc10e38b282d9920@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> On 02/11/2010 02:00 AM, nemesis wrote:
> > Invisible<voi### [at] devnull>  wrote:
> >> Is it possible to create variables with local rather than global scope?
> >
> > you're kidding me, right?
>
> Well, I've never seen anybody do it before, that's all.

no, you've never seen it before, period.  Not even in the examples I posted
earlier.  If you had any curiosity about the subject, plenty of examples abound
in places like Project Euler or the language shootout.

> If you want a point, you say Point 3 7. Which, as you say, rather gives
> away the fact that it's a Point value. But how about saying "True"?
> That's of type Bool, but you didn't write that in your expression.
> Similarly, when you write
>
>    Branch (Leaf 3) (Leaf 7)
>
> you didn't say anywhere that this is of type Tree Int, did you? The
> compiler infers this automatically.

it infers it from unique names relating to a type, like those sprinkled
everywhere in the code.  Think about it... :)

> And don't even get me started on
> expressions which are polymorphic, and could have /several/ possible types.

expressions are polymorphic by default in dynamically-typed languages. :)

(define (foo bar) (boz (boz bar)))
(define (boz bar) (+ bar bar))

an implementation could infer foo ultimately works on numbers because boz calls
+.  But + could be redefined as

(define +
  (let ((o+ +))                ; capture the original +
    (lambda args
      (cond
        ((null? args) 0)                                ; (+) is the identity
        ((number? (car args)) (apply o+ args))          ; numbers: as before
        ((string? (car args)) (apply string-append args))))))  ; strings: concat

and thus foo works on numbers, strings and whatever else gets into the
redefinition of +.

or at least it could in Scheme a few revisions ago.  Today implementations say
+ can't be redefined... :p

> > I'm talking ill of Hindley-Milner type system, but you should be aware that the
> > same lameness can happen in Lisps too:  here and now you will see things like
> > fl* fx+ or fl<... that is, functions to operate on specific numeric types.
> > Again, they exchange a single type declaration for type-specific operators
> > everywhere.  Makes me wonder if the Algol folks got it right...
>
> Not so in Haskell. If you want to add two numbers, you use (+). It's
> polymorphic like that.

yes, + in Lisp is polymorphic over numbers, be it float or integer.  fl+ or fx+
are performance-geared optimizations.

I still think a simple declaration of a variable as being of a given type would
be better than sprinkling the code with operators/names that force expressions
into a given type.

> I'm actually surprised that a dynamically-typed language would do
> something like that. I thought the whole point of dynamic typing was
> that you can completely ignore types.

you can, except when you want performance.

> All I know is that every time I've used a dynamically-typed language,
> I've spent 90% of my time fixing type errors - errors which *could* have
> and *should* have been caught automatically by the machine, but I have
> ended up catching them myself, by hand. Yuck!

only you could do that in a language "without types" :p

I don't have trouble providing arguments of the right types to functions, but
I must confess I've been burned already by typos in the code only detected at
runtime. :p

Thankfully, the really nice thing about development with a REPL is that you can
do fine-grained iterative development:  write little expressions, test them, put
them into a function, repeat.  You write and test at the same time.  Shame that
Haskell's REPL is very limited and doesn't allow for full type definitions and
such.
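
(What I mean, from a GHCi session, written here from memory:

   ghci> let twice f x = f (f x)   -- little expressions work fine
   ghci> twice (+3) 10
   16
   ghci> data Buf = Buf Int Int    -- but a data declaration is rejected

you can't even declare a type without going back to a file.)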

> > I suspect its homoiconic nature has lots to do with it.  Plus, if you've never
> > edited Lisp code with hierarchical parenthetical editing you don't know what a
> > blessing it is compared to any other language in existence.
>
> More like "without automatic bracket match, it's impossible to produce
> code which is even syntactically correct". It wouldn't be so bad, but it
> seems all Lisp programmers habitually indent their code incorrectly.

Lisp n00bs usually indent their code incorrectly, especially those who actually
care about the parentheses and treat them like brackets in C languages, even
going as far as matching open and close in the same indented column.  Lisp
experts have long developed a code style that makes brackets virtually invisible
by hiding them to the rightmost end.

BTW, I typed the expressions above right here, but copied them into a Scheme
buffer to test them.  You can do marvels by merely typing () in sequence and
then going in between them.

> I can see how having a language where everything is a list is
> technically quite a nice idea. It's simple and elegant. Unfortunately,
> the language as a whole seems to be messy and inconsistent. Too much
> backwards compatibility, too much muddled thinking. Still, I guess it
> was the first attempt...

There is quite a lot of backwards compatibility in CL, that's for sure:  it was
designed by a committee to comply with several older Lisp dialects, so they
basically just made a big union of features.  Not so much with Scheme.

I'm especially annoyed at the car/cdr family of operators so dear to old
Lispers.  Those were the names of two macro-assembler instructions on the first
machine Lisp was implemented on.  They are truly lame legacy misnomers for
head/tail...

I discussed them a bit here, to some strong reactions from old farts:

http://groups.google.com.br/group/comp.lang.lisp/browse_thread/thread/16135e3a09d5f6f3#

> > It's because you still don't know it well.
>
> Sure, functions are first-class. But it lacks purity.

like I said, aside from Haskell, no other language goes that far to ensure a
pure, side-effect-free mystical environment.  Including others from functional
programming land, like scheme or ML.

this is side-effect free:

(let fact ((n 25) (r 1)) (if (< n 2) r (fact (- n 1) (* n r))))

even though implementations so far don't go as far as detecting its purity to
allow for fine-grained parallelism.

(BTW, once again written in the browser)

> It also doesn't
> appear to place any emphasis at all on coding in a vaguely functional
> style - not in the language design nor the standard libraries.

if you're talking about Common Lisp you're right:  it's a very imperative
version of Lisp.  Scheme was in part developed in the 70's as a functional
rework of the original Lisp principles.

> No, a design that enforces purity has profound consequences for any
> implementation of that design. (Not to mention the entire way that the
> language is used.)

true.  But whenever I want an all pure and white design I know where I can find
Haskell... ;)

> And Clean doesn't count because...?

because it is even more obscure than Haskell itself?

> To me, "new syntax" means that I can write something that doesn't look
> like the host language. From what little I've seen, a Lisp DSL just
> looks like Lisp.

true.  We actually consider that a benefit. ;)

> And a Haskell expression consists of some combination of exactly 6 (go
> count them) structures.

nobody codes Haskell without such sugar, just as nobody codes Scheme like

((lambda (fact) (fact fact 5))
 (lambda (fact n) (if (< n 2) 1 (* n (fact fact (- n 1))))))

(once again written here)

> The rest is just sugar. That's peanuts compared
> to (say) the number of possible statements in Java. (For-loops,
> while-loops, if-then-else, return, switch/break...)

that, for sure, is a gain of all functional programming languages:  creating
larger things from the composition of a few core functions.

> > (iterate (pf * 2) 1)
>
> Yes, that's just as succinct as (*2). Oh, wait...

it is as succinct as you can get without having to alter the parser to allow for
special syntax and corner cases here and there.  Simpler design. :)



From: Orchid XP v8
Subject: Re: Land of Lisp
Date: 2 Nov 2010 14:39:45
Message: <4cd05af1@news.povray.org>
>> If you want a point, you say Point 3 7. Which, as you say, rather gives
>> away the fact that it's a Point value. But how about saying "True"?
>> That's of type Bool, but you didn't write that in your expression.
>> Similarly, when you write
>>
>>     Branch (Leaf 3) (Leaf 7)
>>
>> you didn't say anywhere that this is of type Tree Int, did you? The
>> compiler infers this automatically.
>
> it infers it from unique names relating to a type, like those sprinkled
> everywhere in the code.  Think about it... :)

For monomorphic expressions, sure. You do tend to end up sprinkling 
these things around. But that's really no different to (say) Java having 
"new Foo()", "new Bar()" etc sprinkled everywhere. [Except that you 
don't have to uselessly duplicate it in the variable declarations as well!]

The polymorphic code, the stuff that works for any type, is mostly 
devoid of this kind of clutter.

>> Not so in Haskell. If you want to add two numbers, you use (+). It's
>> polymorphic like that.
>
> yes, + in Lisp is polymorphic over numbers, be it float or integer.  fl+ or fx+
> are performance-geared optimizations.

Now, see, in Haskell you'd just write a type signature that fixes the 
expression to one specific type, and the compiler does the rest. 
Typically you only need to fix it in one place too.
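
Something like this (a made-up example):

   -- one signature at the top pins everything below to Double;
   -- no fl+ / fx+ style operators needed anywhere in the body
   meanSq :: [Double] -> Double
   meanSq xs = sum (map (^2) xs) / fromIntegral (length xs)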

> I still think a simple declaration of a variable as being of a given type would
> be better than sprinkling the code with operators/names that force expressions
> into a given type.

And I still think it requires a lot less "sprinkling" than you seem to 
think.

>> All I know is that every time I've used a dynamically-typed language,
>> I've spent 90% of my time fixing type errors - errors which *could* have
>> and *should* have been caught automatically by the machine, but I have
>> ended up catching them myself, by hand. Yuck!
>
> only you could do that in a language "without types" :p

Oh, it *has* types, it's just that it only checks them when it's about 
to use them. Which means that a tiny number of programs are accepted 
that might otherwise be rejected, and a vast number of programs are 
accepted that crash as soon as you run them - or worse...

> I don't have trouble providing arguments of the right types to functions, but
> I must confess I've been burned already by typos in the code only detected at
> runtime. :p
>
> Thankfully, the really nice thing about development with a REPL is that you can
> do fine-grained iterative development:  write little expressions, test them, put
> them into a function, repeat.  You write and test at the same time.

And this is significantly easier with pure code. :-P

> Shame that
> Haskell's REPL is very limited and doesn't allow for full type definitions and
> such.

Now that _is_ a shame. It seems such a pointless and arbitrary 
limitation, IMHO.

>> More like "without automatic bracket match, it's impossible to produce
>> code which is even syntactically correct". It wouldn't be so bad, but it
>> seems all Lisp programmers habitually indent their code incorrectly.
>
> Lisp
> experts have long developed a code style that makes brackets virtually invisible
> by hiding them to the rightmost end.

...which is worrying, given that the brackets are fundamental to the 
language syntax, and moving one of them even slightly can radically 
transform the entire meaning of an expression.

>> I can see how having a language where everything is a list is
>> technically quite a nice idea. It's simple and elegant. Unfortunately,
>> the language as a whole seems to be messy and inconsistent. Too much
>> backwards compatibility, too much muddled thinking. Still, I guess it
>> was the first attempt...
>
> There is quite a lot of backwards compatibility in CL, that's for sure:

I dislike backwards compatibility. Even Haskell has too much of that.

> I'm especially annoyed at the car/cdr family of operators so dear to old Lispers.

Yes, that's precisely the single most annoying part I came across. I 
imagine entire flamewars have been fought over that one...

>> Sure, functions are first-class. But it lacks purity.
>
> like I said, aside from Haskell, no other language goes that far to ensure a
> pure, side-effect-free mystical environment.

IMHO, this is why Haskell is superior to every other language.

>> It also doesn't
>> appear to place any emphasis at all on coding in a vaguely functional
>> style - not in the language design nor the standard libraries.
>
> if you're talking about Common Lisp you're right:  it's a very imperative
> version of Lisp.  Scheme was in part developed in the 70's as a functional
> rework of the original Lisp principles.

No wonder they hate each other! ;-)

>> And Clean doesn't count because...?
>
> because it is even more obscure than Haskell itself?

Funny, people seem to have heard of Clean. And OCaml. And Erlang. And 
Lisp. And even Prolog. And yet, nobody seems to have ever heard of 
Haskell...

>> To me, "new syntax" means that I can write something that doesn't look
>> like the host language. From what little I've seen, a Lisp DSL just
>> looks like Lisp.
>
> true.  We actually consider that a benefit. ;)

What, that you can define a new syntax which is identical to the 
existing one? Why bother?

>> And a Haskell expression consists of some combination of exactly 6 (go
>> count them) structures.
>
> nobody codes Haskell without such sugar.

And? I'm talking about the ease with which you can programmatically 
construct expressions using Template Haskell. The fact that the entire 
language is 6 constructs makes that really easy.

> that, for sure, is a gain of all functional programming languages:  creating
> larger things from the composition of a few core functions.

Amen.

>>> (iterate (pf * 2) 1)
>>
>> Yes, that's just as succinct as (*2). Oh, wait...
>
> it is as succinct as you can get without having to alter the parser to allow for
> special syntax and corner cases here and there.  Simpler design. :)

...and if your language already supports it, you don't need to alter the 
parser. ;-)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Orchid XP v8
Subject: Re: Land of Lisp
Date: 2 Nov 2010 15:42:18
Message: <4cd0699a$1@news.povray.org>
>> All I know is that every time I've used a dynamically-typed language,
>> I've spent 90% of my time fixing type errors
>
> You're not thinking clearly, then.

Right. Sure.

> The only time I get type problems is when I occasionally mix up something
> like a list of list of X with a list of X, but that bombs out the first
> time you run it if the type system isn't absurdly forgiving, so it's not
> really any more of a problem than it getting caught the first time you
> compile it.

Unless, of course, the problem is on a rarely-executed code path. Like, 
you make the same mistake a few times, and all the commonly-used code 
paths get fixed, but there's one slightly rare path that you 
accidentally miss.

The other thing, of course, is that you can't *change* part of the code 
and then just follow the type errors to find all the parts that need 
updating. You have to manually do several million test runs and hope you 
hit everything.

Don't get me wrong, I can do it. (See my crazy JavaScript stuff, for 
example.) It's just drastically harder than it should be.

> You're aware that C# has (or will have, depending how strict you want to
> be) the equivalent of eval, right? As does Java, in some sense?

You're aware that C# and Java are interpreted languages, right? ;-) (In 
the case of C#, it's not even possible to install the runtime without 
installing the compiler too...)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Darren New
Subject: Re: Land of Lisp
Date: 2 Nov 2010 18:46:26
Message: <4cd094c2$1@news.povray.org>
Orchid XP v8 wrote:
> Unless, of course, the problem is on a rarely-executed code path. Like, 
> you make the same mistake a few times, and all the commonly-used code 
> paths get fixed, but there's one slightly rare path that you 
> accidentally miss.

So you're writing code that you never tested? That's not something a 
compiler will help with.

> The other thing, of course, is that you can't *change* part of the code 
> and then just follow the type errors to find all the parts that need 
> updating. You have to manually do several million test runs and hope you 
> hit everything.

This is true. But not what you were talking about.

I'm not saying dynamic languages are superior. I'm simply saying that when 
you're used to using dynamic languages, the "you got the wrong type" is 
really a rare problem.

> You're aware that C# and Java are interpreted languages, right?

No they're not.  No more than the ability to load a DLL means that C is an 
interpreted language.

Plus, look up "C# aot compiler" and rejoice in the glory of it. For example,
http://msdn.microsoft.com/en-us/library/6t9t5wcf%28VS.80%29.aspx
or http://tirania.org/blog/archive/2006/Aug-17.html

> the case of C#, it's not even possible to install the runtime without 
> installing the compiler too...)

Nor is that true. What are you smoking over there?  Can I have some?

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: nemesis
Subject: Re: Land of Lisp
Date: 2 Nov 2010 19:20:00
Message: <web.4cd09c1c2bc10e38b282d9920@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Orchid XP v8 wrote:
> > Unless, of course, the problem is on a rarely-executed code path. Like,
> > you make the same mistake a few times, and all the commonly-used code
> > paths get fixed, but there's one slightly rare path that you
> > accidentally miss.
>
> So you're writing code that you never tested? That's not something a
> compiler will help with.
>
> > The other thing, of course, is that you can't *change* part of the code
> > and then just follow the type errors to find all the parts that need
> > updating. You have to manually do several million test runs and hope you
> > hit everything.
>
> This is true. But not what you were talking about.
>
> I'm not saying dynamic languages are superior. I'm simply saying that when
> you're used to using dynamic languages, the "you got the wrong type" is
> really a rare problem.

thanks, I couldn't have said it any better...

> > the case of C#, it's not even possible to install the runtime without
> > installing the compiler too...)
>
> Nor is that true. What are you smoking over there?  Can I have some?

must be the lady.  They can turn bright men into imbeciles. :)



From: Invisible
Subject: Re: Land of Lisp
Date: 3 Nov 2010 05:18:39
Message: <4cd128ef$1@news.povray.org>
>> Unless, of course, the problem is on a rarely-executed code path.
>> Like, you make the same mistake a few times, and all the commonly-used
>> code paths get fixed, but there's one slightly rare path that you
>> accidentally miss.
>
> So you're writing code that you never tested? That's not something a
> compiler will help with.

I take it you haven't heard the expression "if it compiles, it usually 
works correctly" then?

Obviously no compiler is ever going to eliminate the need for testing 
completely. But it can drastically reduce the amount of time you have to 
spend hunting for silly typos.

>> The other thing, of course, is that you can't *change* part of the
>> code and then just follow the type errors to find all the parts that
>> need updating. You have to manually do several million test runs and
>> hope you hit everything.
>
> This is true. But not what you were talking about.

Isn't it?

> I'm not saying dynamic languages are superior. I'm simply saying that
> when you're used to using dynamic languages, the "you got the wrong
> type" is really a rare problem.

I've always preferred statically-typed languages. They catch so much 
more of the type of bugs I tend to write. With an inflexible type system 
like Java, I can understand people being exasperated by it and wanting 
to use a dynamically-typed language instead. But really, the benefits of 
static typing vastly outweigh the drawbacks.

>> You're aware that C# and Java are interpreted languages, right?
>
> No they're not. No more than the ability to load a DLL means that C is
> an interpreted language.

Right. And the fact that it runs on top of a VM doesn't count because...?

>> the case of C#, it's not even possible to install the runtime without
>> installing the compiler too...)
>
> Nor is that true. What are you smoking over there? Can I have some?

Uhuh. And so when you install the .NET "runtime", and it spends 45 
minutes running "ngen.exe", "native code generator", that doesn't count 
as "running the compiler" because...?



From: Mike Raiford
Subject: Re: Land of Lisp
Date: 3 Nov 2010 09:49:18
Message: <4cd1685e$1@news.povray.org>
On 11/3/2010 4:18 AM, Invisible wrote:

>>> You're aware that C# and Java are interpreted languages, right?
>>
>> No they're not. No more than the ability to load a DLL means that C is
>> an interpreted language.
>
> Right. And the fact that it runs on top of a VM doesn't count because...?
>

Not a VM at all. CLR IL is compiled on-the-fly as it's encountered. This 
creates a few initial delays, but once everything has been jitted, it's 
every bit as native as C. Actually, if you step through a CLR app in a 
debugger and switch over to the disassembly, everything is in x86 
machine code by that point.

>>> the case of C#, it's not even possible to install the runtime without
>>> installing the compiler too...)
>>
>> Nor is that true. What are you smoking over there? Can I have some?
>
> Uhuh. And so when you install the .NET "runtime", and it spends 45
> minutes running "ngen.exe", "native code generator", that doesn't count
> as "running the compiler" because...?

I believe you can turn that off, but it slows application start-up for 
.NET apps.

-- 
~Mike



From: Darren New
Subject: Re: Land of Lisp
Date: 3 Nov 2010 11:17:25
Message: <4cd17d05$1@news.povray.org>
Invisible wrote:
> I take it you haven't heard the expression "if it compiles, it usually 
> works correctly" then?

That happens with me, but that's because I have 30 years of experience.

As I say, I almost never have that sort of typo.

>> I'm not saying dynamic languages are superior. I'm simply saying that
>> when you're used to using dynamic languages, the "you got the wrong
>> type" is really a rare problem.
> 
> I've always preferred statically-typed languages. They catch so much 
> more of the type of bugs I tend to write. With an inflexible type system 
> like Java, I can understand people being exasperated by it and wanting 
> to use a dynamically-typed language instead. But really, the benefits of 
> static typing vastly outweigh the drawbacks.

I'm not disagreeing.

>>> You're aware that C# and Java are interpreted languages, right?
>>
>> No they're not. No more than the ability to load a DLL means that C is
>> an interpreted language.
> 
> Right. And the fact that it runs on top of a VM doesn't count because...?

Because the VM compiles them to machine code before it runs, and the VM it 
runs on top of is statically typed, strongly checked, and not unlike a CPU 
in architecture.

> Uhuh. And so when you install the .NET "runtime", and it spends 45 
> minutes running "ngen.exe", "native code generator", that doesn't count 
> as "running the compiler" because...?

It's translating CIL into machine code for your specific machine. It's 
basically pre-JITting your libraries.

I'm not sure how you can decide that NGEN is a compiler, then tell me that 
C# isn't compiled because it runs on top of a VM.

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."


