  Land of Lisp (Message 13 to 22 of 22)  
From: nemesis
Subject: Re: Land of Lisp
Date: 2 Nov 2010 13:15:00
Message: <web.4cd046952bc10e38b282d9920@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> On 02/11/2010 02:00 AM, nemesis wrote:
> > Invisible<voi### [at] devnull>  wrote:
> >> Is it possible to create variables with local rather than global scope?
> >
> > you're kidding me, right?
>
> Well, I've never seen anybody do it before, that's all.

no, you've just never seen it before, period.  Not even in the previous examples I
posted.  If you had any curiosity about the subject, you'd find plenty of examples
in places like Project Euler or the language shootout.

> If you want a point, you say Point 3 7. Which, as you say, rather gives
> away the fact that it's a Point value. But how about saying "True"?
> That's of type Bool, but you didn't write that in your expression.
> Similarly, when you write
>
>    Branch (Leaf 3) (Leaf 7)
>
> you didn't say anywhere that this is of type Tree Int, did you? The
> compiler infers this automatically.

it infers it from unique constructor names tied to a type, like the ones sprinkled
everywhere in the code.  Think about it... :)

> And don't even get me started on
> expressions which are polymorphic, and could have /several/ possible types.

expressions are polymorphic by default in dynamically-typed languages. :)

(define (foo bar) (boz (boz bar)))   ; foo never mentions a type; it just calls boz twice
(define (boz bar) (+ bar bar))       ; boz "adds" its argument to itself

an implementation could infer foo ultimately works on numbers because boz calls
+.  But + could be redefined as

(define +
  (let ((o+ +))                 ; capture the original + before shadowing it
    (lambda args
      (cond
        ((null? args) 0)                                       ; (+) => 0, as before
        ((number? (car args)) (apply o+ args))                 ; numbers: defer to the old +
        ((string? (car args)) (apply string-append args))))))  ; strings: concatenate

and thus foo works on numbers, strings, and whatever else gets added to the
redefinition of +.
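
for example (assuming an implementation that still lets you redefine + at the top
level):

(foo 3)      ; => 12          boz doubles numbers via the captured o+
(foo "ab")   ; => "abababab"  + now string-appends, so boz concatenates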

or at least it could in Scheme a few revisions ago.  Today's implementations say
+ can't be redefined... :p

> > I'm talking ill of Hindley-Milner type system, but you should be aware that the
> > same lameness can happen in Lisps too:  here and now you will see things like
> > fl* fx+ or fl<... that is, functions to operate on specific numeric types.
> > Again, they exchange a single type declaration for type-specific operators
> > everywhere.  Makes me wonder if the Algol folks got it right...
>
> Not so in Haskell. If you want to add two numbers, you use (+). It's
> polymorphic like that.

yes, + in Lisp is polymorphic over numbers, be it float or integer.  fl+ or fx+
are performance-geared optimizations.
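
a rough sketch, assuming the R6RS libraries (the names vary per implementation):

(import (rnrs base)
        (rnrs arithmetic flonums)
        (rnrs arithmetic fixnums))

(+ 1 2.5)        ; => 3.5   generic + dispatches over the whole numeric tower
(fl+ 1.0 2.5)    ; => 3.5   flonums only, no runtime dispatch
(fx+ 1 2)        ; => 3     fixnums only, no runtime dispatch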

I still think simple declaration of a variable as of a given type would be
better than sprinkling code with operators/names that force expressions into a
given type.

> I'm actually surprised that a dynamically-typed language would do
> something like that. I thought the whole point of dynamic typing was
> that you can completely ignore types.

you can, except when you want performance.

> All I know is that every time I've used a dynamically-typed language,
> I've spent 90% of my time fixing type errors - errors which *could* have
> and *should* have been caught automatically by the machine, but I have
> ended up catching them myself, by hand. Yuck!

only you could do that in a language "without types" :p

I don't have trouble providing arguments of the right types to functions, but
I must confess I've already been burned by typos in the code that were only
detected at runtime. :p

Thankfully, the really nice thing about development with a REPL is that you can do
fine-grained iterative development:  write little expressions, test them, put
them into a function, repeat.  You write and test at the same time (something like
the little session below).  Shame that Haskell's REPL is very limited and doesn't
allow for full type definitions and such.
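
something like this, going from throwaway expression to named function without
leaving the prompt:

(* 5 5)                        ; try the bare expression first: 25
(define (square x) (* x x))    ; happy with it?  give it a name
(map square '(1 2 3))          ; then test the function too: (1 4 9)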

> > I suspect its homoiconic nature has lots to do with it.  Plus, if you've never
> > edited Lisp code with hierarchical parenthetical editing you don't know what a
> > bless it is compared to any other language in existence.
>
> More like "without automatic bracket match, it's impossible to produce
> code which is even syntactically correct". It wouldn't be so bad, but it
> seems all Lisp programmers habitually indent their code incorrectly.

Lisp n00bs usually indent their code incorrectly, especially those who actually
care about the parentheses and treat them like brackets in C-like languages, even
going as far as matching open and close in the same indented column.  Lisp
experts have long developed a code style that makes brackets virtually invisible
by hiding them to the rightmost end.
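
for example, the same trivial function both ways (typed here, so mind the typos):

;; n00b style, matching parentheses like C braces:
(define (sum-list lst)
  (if (null? lst)
      0
      (+ (car lst) (sum-list (cdr lst)))
  )
)

;; idiomatic style:  the closing parentheses pile up on the right and disappear
(define (sum-list lst)
  (if (null? lst)
      0
      (+ (car lst) (sum-list (cdr lst)))))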

BTW, I typed the expressions above right here, but copied them into a Scheme
buffer to test them.  You can do marvels by merely typing () and then moving
inside the parentheses to fill them in.

> I can see how having a language where everything is a list is
> technically quite a nice idea. It's simple and elegant. Unfortunately,
> the language as a whole seems to be messy and inconsistent. Too much
> backwards compatibility, too much muddled thinking. Still, I guess it
> was the first attempt...

There is quite a lot of backwards compatibility in CL, that's for sure:  it was
designed by a committee to comply with several older Lisp dialects, so they
basically just made a big union of features.  Not so much with Scheme.

I'm especially annoyed at the car/cdr family of operators so dear to old Lispers.
Those were the names of two assembler macros on the IBM 704, the first machine
Lisp was implemented on.  They are truly lame legacy misnomers for head/tail...
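
nothing stops you from aliasing them, of course (two lines, typed here):

(define head car)       ; "contents of the address register", IBM 704 legacy
(define tail cdr)       ; "contents of the decrement register"

(head '(1 2 3))         ; => 1
(tail '(1 2 3))         ; => (2 3)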

I discussed them a bit here, drawing some strong reactions from old farts:

http://groups.google.com.br/group/comp.lang.lisp/browse_thread/thread/16135e3a09d5f6f3#

> > It's because you still don't know it well.
>
> Sure, functions are first-class. But it lacks purity.

like I said, aside from Haskell, no other language goes that far to ensure a
pure, side-effect-free mystical environment.  That includes the others from
functional programming land, like Scheme or ML.

this is side-effect free:

(let fact ((n 25) (r 1))
  (if (< n 2) r (fact (- n 1) (* n r))))

even though implementations so far don't go as far as detecting its purity to
allow for fine-grained parallelism.

(BTW, once again written in the browser)

> It also doesn't
> appear to place any emphasis at all on coding in a vaguely functional
> style - not in the language design nor the standard libraries.

if you're talking about Common Lisp you're right:  it's a very imperative
version of Lisp.  Scheme was in part developed in the 70's as a functional
rework of the original Lisp principles.

> No, a design that enforces purity has profound consequences for any
> implementation of that design. (Not to mention the entire way that the
> language is used.)

true.  But whenever I want a design that's all pure and white, I know where I
can find Haskell... ;)

> And Clean doesn't count because...?

because it is even more obscure than Haskell itself?

> To me, "new syntax" means that I can write something that doesn't look
> like the host language. From what little I've seen, a Lisp DSL just
> looks like Lisp.

true.  We actually consider that a benefit. ;)

> And a Haskell expression consists of some combination of exactly 6 (go
> count them) structures.

nobody codes Haskell without such sugar, just as nobody codes Scheme like

((lambda (fact) (fact fact 5))
 (lambda (fact n) (if (< n 2) 1 (* n (fact fact (- n 1))))))

(once again written here)
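
the sugared version being, of course, just:

(define (fact n)
  (if (< n 2) 1 (* n (fact (- n 1)))))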

> The rest is just sugar. That's peanuts compared
> to (say) the number of possible statements in Java. (For-loops,
> while-loops, if-then-else, return, switch/break...)

that, for sure, is a gain of all functional programming languages:  creating
larger things from the composition of a few core functions.
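
e.g. a quick sketch (typed here):  a sum of squares built entirely out of core
functions, with no loop in sight:

(define (square x) (* x x))
(define (sum lst) (apply + lst))
(define (sum-of-squares lst) (sum (map square lst)))

(sum-of-squares '(1 2 3 4))   ; => 30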

> > (iterate (pf * 2) 0)
>
> Yes, that's just as succinct as (*2). Oh, wait...

it is as succinct as you can get without having to alter the parser to allow for
special syntax and corner cases here and there.  Simpler design. :)
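
I don't have the original definition of pf at hand, but something of the sort is
a two-line macro, no parser surgery required.  A hypothetical sketch (typed here,
as usual):

(define-syntax pf              ; hypothetical "partial function" macro
  (syntax-rules ()
    ((_ op arg ...) (lambda (x) (op x arg ...)))))

((pf * 2) 21)                  ; => 42, roughly what Haskell's (*2) buys you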



From: Orchid XP v8
Subject: Re: Land of Lisp
Date: 2 Nov 2010 14:39:45
Message: <4cd05af1@news.povray.org>
>> If you want a point, you say Point 3 7. Which, as you say, rather gives
>> away the fact that it's a Point value. But how about saying "True"?
>> That's of type Bool, but you didn't write that in your expression.
>> Similarly, when you write
>>
>>     Branch (Leaf 3) (Leaf 7)
>>
>> you didn't say anywhere that this is of type Tree Int, did you? The
>> compiler infers this automatically.
>
> it infers it from unique constructor names tied to a type, like the ones sprinkled
> everywhere in the code.  Think about it... :)

For monomorphic expressions, sure. You do tend to end up sprinkling 
these things around. But that's really no different to (say) Java having 
"new Foo()", "new Bar()" etc sprinkled everywhere. [Except that you 
don't have to uselessly duplicate it in the variable declarations as well!]

The polymorphic code, the stuff that works for any type, is mostly 
devoid of this kind of stuff.

>> Not so in Haskell. If you want to add two numbers, you use (+). It's
>> polymorphic like that.
>
> yes, + in Lisp is polymorphic over numbers, be it float or integer.  fl+ or fx+
> are performance-geared optimizations.

Now, see, in Haskell you'd just write a type signature that fixes the 
expression to one specific type, and the compiler does the rest. 
Typically you only need to fix it in one place too.

> I still think simple declaration of a variable as of a given type would be
> better than sprinkling code with operators/names that force expressions into a
> given type.

And I still think it requires a lot less "sprinkling" than you seem to 
think.

>> All I know is that every time I've used a dynamically-typed language,
>> I've spent 90% of my time fixing type errors - errors which *could* have
>> and *should* have been caught automatically by the machine, but I have
>> ended up catching them myself, by hand. Yuck!
>
> only you could do that in a language "without types" :p

Oh, it *has* types, it's just that it only checks them when it's about 
to use them. Which means that a tiny number of programs are accepted 
that might otherwise be rejected, and a vast number of programs are 
accepted that crash as soon as you run them - or worse...

> I don't have trouble providing arguments of the right types to functions, but
> I must confess I've already been burned by typos in the code that were only
> detected at runtime. :p
>
> Thankfully, the really nice thing about development with a REPL is that you can do
> fine-grained iterative development:  write little expressions, test them, put
> them into a function, repeat.   You write and test at the same time.

And this is significantly easier with pure code. :-P

> Shame that
> Haskell's REPL is very limited and doesn't allow for full type definitions and
> such.

Now that _is_ a shame. It seems such a pointless and arbitrary 
limitation, IMHO.

>> More like "without automatic bracket match, it's impossible to produce
>> code which is even syntactically correct". It wouldn't be so bad, but it
>> seems all Lisp programmers habitually indent their code incorrectly.
>
> Lisp
> experts have long developed a code style that makes brackets virtually invisible
> by hiding them to the rightmost end.

...which is worrying, given that the brackets are fundamental to the 
language syntax, and moving them around slightly radically transforms 
the entire meaning of an expression.

>> I can see how having a language where everything is a list is
>> technically quite a nice idea. It's simple and elegant. Unfortunately,
>> the language as a whole seems to be messy and inconsistent. Too much
>> backwards compatibility, too much muddled thinking. Still, I guess it
>> was the first attempt...
>
> There is quite a lot of backwards compatibility in CL, that's for sure:

I dislike backwards compatibility. Even Haskell has too much of that.

> I'm especially annoyed at the car/cdr family of operators so dear to old Lispers.

Yes, that's specifically the single most annoying part I came across. I 
imagine entire flamewars have been fought over that one...

>> Sure, functions are first-class. But it lacks purity.
>
> like I said, aside from Haskell, no other language goes that far to ensure a
> pure, side-effect-free mystical environment.

IMHO, this is why Haskell is superior to every other language.

>> It also doesn't
>> appear to place any emphasis at all on coding in a vaguely functional
>> style - not in the language design nor the standard libraries.
>
> if you're talking about Common Lisp you're right:  it's a very imperative
> version of Lisp.  Scheme was in part developed in the 70's as a functional
> rework of the original Lisp principles.

No wonder they hate each other! ;-)

>> And Clean doesn't count because...?
>
> because it is even more obscure than Haskell itself?

Funny, people seem to have heard of Clean. And OCaml. And Erlang. And 
Lisp. And even Prolog. And yet, nobody seems to have ever heard of 
Haskell...

>> To me, "new syntax" means that I can write something that doesn't look
>> like the host language. From what little I've seen, a Lisp DSL just
>> looks like Lisp.
>
> true.  We actually consider that a benefit. ;)

What, that you can define a new syntax which is identical to the 
existing one? Why bother?

>> And a Haskell expression consists of some combination of exactly 6 (go
>> count them) structures.
>
> nobody codes Haskell without such sugar.

And? I'm talking about the ease with which you can programmatically 
construct expressions using Template Haskell. The fact that the entire 
language is 6 constructs makes that real easy.

> that, for sure, is a gain of all functional programming languages:  creating
> larger things from the composition of a few core functions.

Amen.

>>> (iterate (pf * 2) 0)
>>
>> Yes, that's just as succinct as (*2). Oh, wait...
>
> it is as succinct as you can get without having to alter the parser to allow for
> special syntax and corner cases here and there.  Simpler design. :)

...and if your language already supports it, you don't need to alter the 
parser. ;-)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Orchid XP v8
Subject: Re: Land of Lisp
Date: 2 Nov 2010 15:42:18
Message: <4cd0699a$1@news.povray.org>
>> All I know is that every time I've used a dynamically-typed language,
>> I've spent 90% of my time fixing type errors
>
> You're not thinking clearly, then.

Right. Sure.

> The only time I get type problems is I'll occasionally mix up something
> like a list of list of X with a list of X, but that bombs out the first
> time you run it if the type system isn't absurdly forgiving, so it's not
> really any more of a problem than it getting caught the first time you
> compile it.

Unless, of course, the problem is on a rarely-executed code path. Like, 
you make the same mistake a few times, and all the commonly-used code 
paths get fixed, but there's one slightly rare path that you 
accidentally miss.

The other thing, of course, is that you can't *change* part of the code 
and then just follow the type errors to find all the parts that need 
updating. You have to manually do several million test runs and hope you 
hit everything.

Don't get me wrong, I can do it. (See my crazy JavaScript stuff, for 
example.) It's just drastically harder than it should be.

> You're aware that C# has (or will have, depending how strict you want to
> be) the equivalent of eval, right? As does Java, in some sense?

You're aware that C# and Java are interpreted languages, right? ;-) (In 
the case of C#, it's not even possible to install the runtime without 
installing the compiler too...)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Darren New
Subject: Re: Land of Lisp
Date: 2 Nov 2010 18:46:26
Message: <4cd094c2$1@news.povray.org>
Orchid XP v8 wrote:
> Unless, of course, the problem is on a rarely-executed code path. Like, 
> you make the same mistake a few times, and all the commonly-used code 
> paths get fixed, but there's one slightly rare path that you 
> accidentally miss.

So you're writing code that you never tested? That's not something a 
compiler will help with.

> The other thing, of course, is that you can't *change* part of the code 
> and then just follow the type errors to find all the parts that need 
> updating. You have to manually do several million test runs and hope you 
> hit everything.

This is true. But not what you were talking about.

I'm not saying dynamic languages are superior. I'm simply saying that when 
you're used to using dynamic languages, the "you got the wrong type" is 
really a rare problem.

> You're aware that C# and Java are interpreted languages, right?

No they're not.  No more than the ability to load a DLL means that C is an 
interpreted language.

Plus, look up "C# aot compiler" and rejoice in the glory of it. For example,
http://msdn.microsoft.com/en-us/library/6t9t5wcf%28VS.80%29.aspx
or http://tirania.org/blog/archive/2006/Aug-17.html

> the case of C#, it's not even possible to install the runtime without 
> installing the compiler too...)

Nor is that true. What are you smoking over there?  Can I have some?

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: nemesis
Subject: Re: Land of Lisp
Date: 2 Nov 2010 19:20:00
Message: <web.4cd09c1c2bc10e38b282d9920@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Orchid XP v8 wrote:
> > Unless, of course, the problem is on a rarely-executed code path. Like,
> > you make the same mistake a few times, and all the commonly-used code
> > paths get fixed, but there's one slightly rare path that you
> > accidentally miss.
>
> So you're writing code that you never tested? That's not something a
> compiler will help with.
>
> > The other thing, of course, is that you can't *change* part of the code
> > and then just follow the type errors to find all the parts that need
> > updating. You have to manually do several million test runs and hope you
> > hit everything.
>
> This is true. But not what you were talking about.
>
> I'm not saying dynamic languages are superior. I'm simply saying that when
> you're used to using dynamic languages, the "you got the wrong type" is
> really a rare problem.

thanks, I couldn't have said it any better...

> > the case of C#, it's not even possible to install the runtime without
> > installing the compiler too...)
>
> Nor is that true. What are you smoking over there?  Can I have some?

must be the lady.  They can turn bright men into imbeciles. :)



From: Invisible
Subject: Re: Land of Lisp
Date: 3 Nov 2010 05:18:39
Message: <4cd128ef$1@news.povray.org>
>> Unless, of course, the problem is on a rarely-executed code path.
>> Like, you make the same mistake a few times, and all the commonly-used
>> code paths get fixed, but there's one slightly rare path that you
>> accidentally miss.
>
> So you're writing code that you never tested? That's not something a
> compiler will help with.

I take it you haven't heard the expression "if it compiles, it usually 
works correctly" then?

Obviously no compiler is ever going to eliminate the need for testing 
completely. But it can drastically reduce the amount of time you have to 
spend hunting for silly typos.

>> The other thing, of course, is that you can't *change* part of the
>> code and then just follow the type errors to find all the parts that
>> need updating. You have to manually do several million test runs and
>> hope you hit everything.
>
> This is true. But not what you were talking about.

Isn't it?

> I'm not saying dynamic languages are superior. I'm simply saying that
> when you're used to using dynamic languages, the "you got the wrong
> type" is really a rare problem.

I've always preferred statically-typed languages. They catch so much 
more of the type of bugs I tend to write. With an inflexible type system 
like Java, I can understand people being exasperated by it and wanting 
to use a dynamically-typed language instead. But really, the benefits of 
static typing vastly outweigh the drawbacks.

>> You're aware that C# and Java are interpreted languages, right?
>
> No they're not. No more than the ability to load a DLL means that C is
> an interpreted language.

Right. And the fact that it runs on top of a VM doesn't count because...?

>> the case of C#, it's not even possible to install the runtime without
>> installing the compiler too...)
>
> Nor is that true. What are you smoking over there? Can I have some?

Uhuh. And so when you install the .NET "runtime", and it spends 45 
minutes running "ngen.exe", "native code generator", that doesn't count 
as "running the compiler" because...?



From: Mike Raiford
Subject: Re: Land of Lisp
Date: 3 Nov 2010 09:49:18
Message: <4cd1685e$1@news.povray.org>
On 11/3/2010 4:18 AM, Invisible wrote:

>>> You're aware that C# and Java are interpreted languages, right?
>>
>> No they're not. No more than the ability to load a DLL means that C is
>> an interpreted language.
>
> Right. And the fact that it runs on top of a VM doesn't count because...?
>

Not a VM at all. CLR IL is compiled on the fly as it's encountered. This 
creates a few initial delays, but once everything has been JITted, it's 
every bit as native as C. Actually, if you step through a CLR app in a 
debugger and switch over to the disassembly, everything is in x86 
machine code by that point.

>>> the case of C#, it's not even possible to install the runtime without
>>> installing the compiler too...)
>>
>> Nor is that true. What are you smoking over there? Can I have some?
>
> Uhuh. And so when you install the .NET "runtime", and it spends 45
> minutes running "ngen.exe", "native code generator", that doesn't count
> as "running the compiler" because...?

I believe you can turn that off, but it slows application start-up for 
.NET apps.

-- 
~Mike



From: Darren New
Subject: Re: Land of Lisp
Date: 3 Nov 2010 11:17:25
Message: <4cd17d05$1@news.povray.org>
Invisible wrote:
> I take it you haven't heard the expression "if it compiles, it usually 
> works correctly" then?

That happens with me, but that's because I have 30 years of experience.

As I say, I almost never have that sort of typo.

>> I'm not saying dynamic languages are superior. I'm simply saying that
>> when you're used to using dynamic languages, the "you got the wrong
>> type" is really a rare problem.
> 
> I've always preferred statically-typed languages. They catch so much 
> more of the type of bugs I tend to write. With an inflexible type system 
> like Java, I can understand people being exasperated by it and wanting 
> to use a dynamically-typed language instead. But really, the benefits of 
> static typing vastly outweigh the drawbacks.

I'm not disagreeing.

>>> You're aware that C# and Java are interpreted languages, right?
>>
>> No they're not. No more than the ability to load a DLL means that C is
>> an interpreted language.
> 
> Right. And the fact that it runs on top of a VM doesn't count because...?

Because the VM compiles them to machine code before it runs, and the VM it 
runs on top of is statically typed, strongly checked, and not unlike a CPU 
in architecture.

> Uhuh. And so when you install the .NET "runtime", and it spends 45 
> minutes running "ngen.exe", "native code generator", that doesn't count 
> as "running the compiler" because...?

It's translating CIL into machine code for your specific machine. It's 
basically pre-JITting your libraries.

I'm not sure how you can decide that NGEN is a compiler, then tell me that 
C# isn't compiled because it runs on top of a VM.

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: Nicolas Alvarez
Subject: Re: Land of Lisp
Date: 29 Nov 2010 10:12:06
Message: <4cf3c2c6@news.povray.org>
Orchid XP v8 wrote:
>>> And a Haskell expression consists of some combination of exactly 6 (go
>>> count them) structures.
>>
>> nobody codes Haskell without such sugar.
> 
> And? I'm talking about the ease with which you can programmatically
> construct expressions using Template Haskell. The fact that the entire
> language is 6 constructs makes that real easy.

The entire Whirl language consists of 2 instructions. That doesn't make it 
simple or easy.



From: Invisible
Subject: Re: Land of Lisp
Date: 29 Nov 2010 10:44:33
Message: <4cf3ca61$1@news.povray.org>
>> And? I'm talking about the ease with which you can programmatically
>> construct expressions using Template Haskell. The fact that the entire
>> language is 6 constructs makes that real easy.
>
> The entire Whirl language consists of 2 instructions. That doesn't make it
> simple or easy.

It does if you're interested in how easily a machine can write it, not 
how easily a human can write it.



