  Land of Lisp (Message 3 to 12 of 22)  
From: Warp
Subject: Re: Land of Lisp
Date: 29 Oct 2010 15:12:11
Message: <4ccb1c8b@news.povray.org>
10 PRINT "BASIC RULES!"
20 GOTO 10

-- 
                                                          - Warp



From: nemesis
Subject: Re: Land of Lisp
Date: 29 Oct 2010 15:40:00
Message: <web.4ccb21ed2bc10e381b46489c0@news.povray.org>
Invisible <voi### [at] devnull> wrote:

so friggin' big I thought it would be fair for the original author to take a
look, otherwise all criticism was in vain... so I mailed it to Conrad Barski...



From: nemesis
Subject: Re: Land of Lisp
Date: 30 Oct 2010 00:00:01
Message: <web.4ccb979b2bc10e38d995c6290@news.povray.org>
in any case...

Invisible <voi### [at] devnull> wrote:
> "LAND of LISP: Secrets of the Seven Guilds."
>
> Hey, you really *don't* need drawing skills to do a web comic!

you should know that by now, XKCD fan...

> "Any humans foolish enough to resist with their primitive weapons were
> dealt with in short order."
>
> There goes Java, Python, C# and Ruby. (What, no C or C++?)

I think it's implicit... ;)

> "There's a long-forgotten place where they have weapons SO POWERFUL that
> they can defeat ANY BUG! They call it the Land of Lisp."
>
> Uhuh. So an untyped language with a single global namespace and which
> touts self-modifying code as its single most significant feature is the
> way to beat program bugs?

if by global namespace you mean no modules/libs, you're very wrong.

Lisp is not untyped either, just leaves type-handling to runtime. :)

> "Greetings, your highness!"
>
> Creatures with 12 eyes and an arm for a nose: WTF-O-Meter: 1.5

your WTF-O-Meter should go up once you realize that is how he depicts Lispers...
which BTW is not really that far from reality... :)

> "SILENCE! Back in the eighties, we showed you how to program WITHOUT
> HAVING ANY BUGS! We warned you what would happen, but you didn't listen
> to us."
>
> Uh, yeah, right. I'm pretty sure there's no programming language in
> existence that completely prevents bugs. :-P

comic license... you did notice the title can be shortened as LOL, right?

> "Each of the Seven Guilds possesses a powerful bug-killing weapon that
> is unique to Lisp!"
>
> Oh yeah?
>
> So macros, functional programming, software transactional memory,
> restartable code and "conciseness" are unique only to Lisp?

all in the same line, yes. :)

well, not quite "conciseness", as Lisp languages tend to favor words and
long, descriptive names rather than single-char operator obfuscation.  Unless
we're talking about big problems, where DSL-building with macros gives a clear
gain in readability and can indeed bring conciseness...

> "Functional programming is a mathematical approach to programming that
> was pioneered by Lisp."
>
> I beg to differ.

you don't:  in the 50's and 60's when Lisp was born there was no Haskell,
Miranda, ML nor even lowly C:  FORTRAN or the soon to appear COBOL were your
only alternatives for high-level code... and those were definitely not
functional.  Lisp was.

> Actually, I'm not even sure why people consider Lisp to be a functional
> programming language. JavaScript is about as functional as Lisp!

It's because any language that supports creating functions on the fly,
passing functions as arguments to other functions, and returning functions as
the result of evaluating functions is able to use the functional programming
paradigm.
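
For illustration, a minimal sketch of those three abilities in Haskell syntax
(the function names here are invented for the example):

   twice :: (a -> a) -> (a -> a)   -- takes a function and returns a new function
   twice f = f . f

   example :: [Int]
   example = map (twice (+ 3)) [1, 2, 3]   -- (+ 3) is built on the fly and passed along
   -- example == [7, 8, 9]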

> And writing code in a language which *enforces* a functional style makes
> it drastically easier to debug. :-P

yes, and writing it in a lazy evaluation language makes it damn hard to debug as
you also well know.

>    if_then True  x y = x
>    if_then False x y = y
>
> is sooo much harder to implement, eh? Newsflash: lazy evaluation
> eliminates one of the major reasons for wanting macros in the first place.

It's true that unevaluated argument-passing (call-by-name) is one of the fine
reasons for macros or, obviously, lazy evaluation.  But I'd say it's a minor
bonus compared to their main use: generating code at compile-time and building
custom syntax for the sake of readability.

> For the other reasons, we have Template Haskell. This allows you to
> write Haskell at compile-time.

> (Lisp, AFAIK, doesn't have a
> "compile-time", so things are slightly different.)

Pretty much all Common Lisp implementations are compilers, most notably the
commercial Allegro CL and the open-source sbcl.  In Scheme, there's gambit,
bigloo and a few other such batch-compilers.  So, yeah, macros are expanded at
compile-time and the generated expressions are all eventually transformed down
to lambdas in continuation-passing-style, which are then finally compiled to
either C or native code.

But even in implementations using JITs to compile to native-code, like racket,
there's a clear separation of phases, even though no executable is formally
created.

Now, given their dynamically-typed nature, they usually won't generate code as
fast as statically-typed compilers do, but hopefully not by much:

http://shootout.alioth.debian.org/u64q/benchmark.php?test=all&lang=sbcl&lang2=ghc

OTOH, when writing code for speed, you actually pin down types for variables
with (declare ...) compiler directives in the source code.  CL allows it
intrusively, in the bodies of functions, but I prefer the gambit scheme way of
allowing such declares on the command line for the compiler. :)

In any case, it seems the racket (previously PLT-Scheme) JIT kinda humiliates
sbcl's old-school batch compiler without even needing to pin down types:

http://shootout.alioth.debian.org/u64/benchmark.php?test=all&lang=racket&lang2=ghc

> And, unlike Lisp
> macros (AFAIK), Template Haskell statically guarantees that the code
> written by a template is *always* syntactically correct. (It does _not_
> guarantee that it is well-typed, just that there are no syntax errors.)

that's good for a language heavy on syntax.

> Also unlike Lisp macros, a Haskell template can inspect and alter
> anything in the current module (up to and including completely rewriting
> it).

sounds as deadly as ninja patching. :)

other than that, I'm not sure what you mean by inspect and alter anything in the
current module...

do you have any practical example other than the short but already confusing
examples at:

http://www.haskell.org/ghc/docs/6.12.2/html/users_guide/template-haskell.html

the examples there seem to deal with a problem that doesn't exist in Lisp in the
first place:  generating code to deal with different types.

> Like Lisp macros, Haskell templates are written in vanilla Haskell.
>
> "Using restarts and the Lisp REPL, a bug can be fixed in a running program."
>
> While that /does/ sound pretty cool, it's only possible because Lisp is
> interpreted and untyped.

they compile a module and load it into the same executable image in order to run
in place of the older version.

> "They're using this incredible device called a Wii. Say cadet, why
> aren't you shooting anything?"
>
> "I'm trying to, but the controller keeps thinking that I want to HUG the
> insectiod storm-troopers!"
>
> LOLrus.

completely random Nintendo ad... :p

> "Those ships are from the DSL Guild."
>
> Uhuh. Because no other language allows you to embed a DSL right into
> your programming language. Or even, you know, embed a language with a
> syntax entirely different from that of the host language. (Haskell's
> "quasi-quoting" allows you to embed a DSL that gets read using a parser
> that you write, using any parser libraries you desire.)

yeah, but it seems so mind-numbingly complex that nobody uses it.

BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
it's used a lot.

> "Those ships are from the CLOS Guild."
>
> I have no idea if this is good or not. Nor do Lispers, apparently.

The Common Lisp Object System is one of the things old Lispers take the most
pride in chatting about.  It's the ultra-flexible, polymorphic, late-binding
object system that comes with Common Lisp.

I've never used it but can see its value for OO folks.

> "What the... Oh, I forgot about those obnoxious Schemers from the
> Continuation Guild."
>
> Again, because no other programming language has continuations, right?

yes, this is actually an inaccurate description of Scheme, but they got it right
that Lispers usually don't get along with Schemers, even though it's all Lisp.
Scheme is also far more geared towards functional programming than CL, another
factual error there.

All programs have continuations, it's just that so few programming languages
offer them as first-class objects.

first-class continuations and the tail-call optimization used by all functional
programming languages were born or brought to light with Scheme as described in
the lambda-the-ultimate papers in the 70's:

http://library.readscheme.org/page1.html

> "Basically, continuations let you put 'time travel' into your code."
>
> Actually, you don't need continuations to do that, necessarily.

you may not need special syntax for it if you explicitly write your programs in
continuation-passing style, in which case you get continuations for free.  Of
course, that requires tail-call optimization, otherwise your stack blows up...
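
A minimal sketch of continuation-passing style (written in Haskell here, but the
idea is the same in Scheme):

   addCPS :: Int -> Int -> (Int -> r) -> r
   addCPS x y k = k (x + y)                -- the caller supplies "what happens next"

   -- the continuation k is an ordinary value: it can be stored, dropped or called later
   example :: Int
   example = addCPS 1 2 (\n -> n * 10)     -- 30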

> "Weaknesses: Continuations are such an awesome feature that they don't
> really have a downside."
>
> O RLY?
>
> How about the ease with which you can make your program so complicated
> that it becomes totally unmaintainable?

remember continuations are essentially gotos. ;)

you're taking a comic book too seriously...

> "Those ships are form the Brevity Guild."
>
> Yeah, true. There are no other programming languages that are brief,
> right? (Scroll upwards for one tiny Haskell v Lisp example.)

some people consider Python a quite apt Lisp.  Including Google's old Lisp
visionary Peter Norvig.

> "(accum a (for n 1 1000 (unless (some [is 0 (mod n _)] (range 2 (- n
> 1))) a.n)))"
>
> That computes the prime numbers from 1 to 1000? OK, how about
>
> primes = let f (p:xs) = f $ filter (\x -> x `mod` p > 0) xs in f [2..]
>
> Yes, *clearly* only Lisp can be brief and unintelligible.
>
> (Note well that the Haskell variant generates *all* the prime numbers in
> the universe, not just the ones less than 1000.)
>

> As an aside, does Lisp have arbitrary-precision arithmetic yet? Cos that
> Haskell fragment gives you arbitrary-precision results. Using the GMP,
> no less.

arbitrary-precision arithmetic is something Lisp has had since ancient times --
they call it a full numeric tower, no overflow errors.  In fact, if you want
performance out of small toy examples, you should state something like fixnum...
they also predate GMP, but some newer implementations are using it for the sake
of maintenance...

> "(take 20 (filter even? (iterate inc 0)))"
>
> Or, to put it another way,
>
>    take 20 (filter even (iterate (+1) 0))
>
> or even
>
>    take 20 $ filter even $ iterate (+1) 0

nah, inc is 1 char shorter. ;)

whew, I'll call it a draw here and reply about the Seven Guilds of Haskell
later...



From: nemesis
Subject: Re: Land of Lisp
Date: 30 Oct 2010 00:05:01
Message: <web.4ccb99342bc10e38d995c6290@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> 10 PRINT "BASIC RULES!"
> 20 GOTO 10

(let goto ()
  (display "Scheme rulez! ")
  (goto))

:)



From: Invisible
Subject: Re: Land of Lisp
Date: 1 Nov 2010 04:58:51
Message: <4cce814b$1@news.povray.org>
On 30/10/2010 05:04 AM, nemesis wrote:
> Warp<war### [at] tagpovrayorg>  wrote:
>> 10 PRINT "BASIC RULES!"
>> 20 GOTO 10
>
> (let goto ()
>    (display "Scheme rulez! ")
>    (goto))

fix (putStrLn "Haskell rules." >>)



From: Invisible
Subject: Re: Land of Lisp
Date: 1 Nov 2010 05:20:54
Message: <4cce8676$1@news.povray.org>
On 29/10/2010 08:35 PM, nemesis wrote:
> so friggin' big

That's what SHE said...

> I thought it would be fair for the original author to take a
> look, otherwise all criticism was in vain... so I mailed it to Conrad Barski...

Well... just so long as he knows I wasn't being entirely serious. 
(Neither was the comic - presumably.)



From: Invisible
Subject: Re: Land of Lisp
Date: 1 Nov 2010 06:07:58
Message: <4cce917e$1@news.povray.org>
>> "There's a long-forgotten place where they have weapons SO POWERFUL that
>> they can defeat ANY BUG! They call it the Land of Lisp."
>>
>> Uhuh. So an untyped language with a single global namespace and which
>> touts self-modifying code as its single most significant feature is the
>> way to beat program bugs?
>
> if by global namespace you mean no modules/libs, you're very wrong.

Is it possible to create variables with local rather than global scope?

> Lisp is not untyped either, just leaves type-handling to runtime. :)

Strictly speaking, an /untyped/ language is one where you can apply any 
operator to any data - no matter how stupid. (E.g., you can multiply two 
strings together, or take the cosine of a pointer.) Very few real 
programming languages fall into this category (although a few 
intermediate compiler languages do).

The main distinction is between statically-typed and dynamically-typed
languages. A statically-typed language does extensive and usually
exhaustive checking in a fully automated way to statically guarantee that
your program cannot malfunction due to mismatched types. A
dynamically-typed language doesn't bother doing any compile-time
checking, and just lets your program crash at runtime if you make a mistake.

The advantages of static typing are obvious (i.e., vast swathes of bugs
are eliminated before you even *get* to run-time). Dynamic typing, on
the other hand, has only two tiny advantages: #1, you don't have to write
type signatures everywhere; #2, the compiler doesn't stop you performing
valid actions just because they upset the type system.

Most statically-typed languages have a primitive, puny type system and 
demand explicit type signatures everywhere. This makes the language 
designer's job much easier, but isn't much fun for the programmer. 
Coding in (say) Java, it's actually quite common to need to work around 
the type system.

When you get to Haskell, of course, automatic type inference blows #1 
out of the water, and the sophisticated Turing-complete type system more 
or less kills #2 as well. If you really can't express what you want to 
do easily, you can always make a small, carefully controlled hole in the 
corner of the type system to get around that, and still keep the 
overwhelming benefits of strong type-checking for the rest of the program.
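
A tiny sketch of what the inference buys you (nothing beyond the Prelude is
assumed here):

   swap (x, y) = (y, x)   -- no annotation, yet GHC infers swap :: (a, b) -> (b, a)

   -- swap True           -- would be rejected at compile time: Bool is not a pair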

Really, in the modern age, there is no reason for dynamically-typed
languages to exist at all. Dynamic typing just invites bugs.

>> "Greetings, your highness!"
>>
>> Creatures with 12 eyes and an arm for a nose: WTF-O-Meter: 1.5
>
> your WTF-O-Meter should go up once you realize that is how he depicts Lispers...
> which BTW is not really that far from reality... :)

I've always been puzzled by the assertion that Lisp is THE ultimate 
solution to every programming problem. It seems to me to be more of a 
religion than a rational argument. The exchange

"Why is Lisp the best?"
"Because it has macros!"

is as nonsensical as

"Why do you believe in God?"
"Because Christ died on the cross to save mankind from sin!"

It makes no sense.

>> "Each of the Seven Guilds possesses a powerful bug-killing weapon that
>> is unique to Lisp!"
>>
>> Oh yeah?
>>
>> So macros, functional programming, software transactional memory,
>> restartable code and "conciseness" are unique only to Lisp?
>
> all in the same line, yes. :)

I'm not sure how you came to that particular conclusion...

> well, not quite "conciseness" as Lisp languages tend to favor words and
> long-descriptive names rather than single-char operator obfuscation.  Unless
> we're talking about big problems where DSL-building with macros give a clear
> gain in readability can indeed bring conciseness...

I would say that (0 - b + sqrt (b*b - 4*a*c))/(2*a) is significantly 
clearer and more concise than (div (add (sub 0 b) (sqrt (sub (mul b b) 
(mul (mul 4 a) c)))) (mul 2 a)). (Note that I had to get out a text 
editor with syntax highlighting just to type all that with the correct 
number of brackets!)

Yes, it's possible to overuse special characters as operators. But 
generally a few carefully chosen operator names can greatly shorten the 
code /and/ make it more readable.

>> "Functional programming is a mathematical approach to programming that
>> was pioneered by Lisp."
>>
>> I beg to differ.
>
> you don't:  in the 50's and 60's when Lisp was born there was no Haskell,
> Miranda, ML nor even lowly C:  FORTRAN or the soon to appear COBOL were your
> only alternatives for high-level code... and those were definitely not
> functional.  Lisp was.

I still don't think Lisp is very functional. :-P

>> Actually, I'm not even sure why people consider Lisp to be a functional
>> programming language. JavaScript is about as functional as Lisp!
>
> It's because any language with support for creating functions on the fly,
> receiving functions as arguments to other functions and returning functions as
> the result of evaluation of functions is able to use the functional programming
> paradigm.

Depends on what you consider to be "functional programming". There are 
really two separate characteristics that most FP languages have:

1. Pure functions.
2. Functions as first-class objects.

Any programming language that supports functions can support writing 
pure functions. But as far as I can tell, Lisp does not in any way 
encourage or facilitate doing so. Heck, C++ has facilities for marking 
things that won't be altered after being initialised, whereas Lisp does 
not. That makes C++ slightly more FP than Lisp.
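
A small sketch of the distinction, in Haskell terms (the IORef bit is only there
to show what impurity looks like):

   import Data.IORef

   -- pure: same arguments, same result, no side effects
   square :: Int -> Int
   square x = x * x

   -- first-class but impure: returns a different value on every call
   tick :: IORef Int -> IO Int
   tick ref = modifyIORef ref (+ 1) >> readIORef ref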

>> And writing code in a language which *enforces* a functional style makes
>> it drastically easier to debug. :-P
>
> yes, and writing it in a lazy evaluation language makes it damn hard to debug as
> you also well know.

No, lacking a decent debugger makes debugging hard. Lazy evaluation 
makes *performance* harder to predict/control, but it makes *debugging* 
easier, since functions always produce repeatable results.

And, in fact, LoL touts lazy evaluation as an *advantage* just a few 
points further down the list. :-P

>> Newsflash: lazy evaluation
>> eliminates one of the major reasons for wanting macros in the first place.
>
> It's true that unevaluated argument-passing (call-by-name) is one of the fine
> reasons for macros or, obviously, lazy evaluation.  But I'd say it's a minor
> bonus when compared to its main use to generate code at compile-time and build
> custom syntax for the sake of readability.

You complain about "operator obfuscation" and then tout "custom syntax" 
as improving readability. Which is it? :-P

>> (Lisp, AFAIK, doesn't have a
>> "compile-time", so things are slightly different.)
>
> Pretty much all Common Lisp implementations are compilers

Since Lisp explicitly permits self-modifying code, wouldn't that mean 
that the compiler has to be present at run-time?

>> And, unlike Lisp
>> macros (AFAIK), Template Haskell statically guarantees that the code
>> written by a template is *always* syntactically correct. (It does _not_
>> guarantee that it is well-typed, just that there are no syntax errors.)
>
> that's good for a language heavy on syntax.

I won't claim to be a Lisp expert, but I was under the impression that 
not every possible list is a valid executable Lisp expression.

>> Also unlike Lisp macros, a Haskell template can inspect and alter
>> anything in the current module (up to and including completely rewriting
>> it).
>
> sounds as deadly as ninja patching. :)

I have no idea what that is.

> other than that, I'm not sure what you mean by inspect and alter anything in the
> current module...

From what I can tell, Lisp macros work by you passing some data to 
them, which they then transform into some sort of executable code. 
Template Haskell allows you to inspect parts of the current module not 
passed in as arguments. So you can, e.g., look up the definition of a 
user-defined data structure.
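
Roughly, a sketch of that using Template Haskell's reify (the compile-time print
is only for illustration):

   {-# LANGUAGE TemplateHaskell #-}
   import Language.Haskell.TH

   -- look up the definition of a named, user-defined type at compile time
   showDefinition :: Name -> Q [Dec]
   showDefinition name = do
     info <- reify name      -- e.g. $(showDefinition ''Bool) as a top-level splice
     runIO (print info)      -- printed while the module is being compiled
     return []               -- splices in no new declarations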

> do you have any practical example other than the short but confusing already
> examples at:

No.

> the examples there seem to deal with a problem that doesn't exist in Lisp in the
> first place:  generate code to deal with different types.

The examples demonstrate how to use TH, not why you'd /need/ TH. It's 
perfectly possible to write highly polymorphic code with nothing but 
plain vanilla Haskell '98.

>> "They're using this incredible device called a Wii. Say cadet, why
>> aren't you shooting anything?"
>>
>> "I'm trying to, but the controller keeps thinking that I want to HUG the
>> insectiod storm-troopers!"
>>
>> LOLrus.
>
> completely random Nintendo ad... :p

Or... anti-ad I suppose?

>> "Those ships are from the DSL Guild."
>>
>> Uhuh. Because no other language allows you to embed a DSL right into
>> your programming language. Or even, you know, embed a language with a
>> syntax entirely different from that of the host language. (Haskell's
>> "quasi-quoting" allows you to embed a DSL that gets read using a parser
>> that you write, using any parser libraries you desire.)
>
> yeah, but it seems so mindnumbing complex nobody uses.

I get the impression that people avoid it because it's not part of the 
official language spec, it's GHC-specific. But mainly, because it's not 
really needed especially often.

> BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
> r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
> it's used a lot.

I must be missing something... those two expressions look almost 
identical to me.

>> "Basically, continuations let you put 'time travel' into your code."
>>
>> Actually, you don't need continuations to do that, necessarily.
>
> you may not need special syntax for it if you explicitly write your programs in
> continuation-passing-style, in which case you get continuations for free.  Of
> course, that requires tail-call optimizations otherwise your stack blows...

Haskell provides a monad version of CPS, which means you don't even have 
to obfuscate all your code to do crazy continuation manipulations. 
Personally, this kind of stuff makes my mind melt. :-}
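
For the curious, a minimal sketch using Control.Monad.Cont (from the mtl package):

   import Control.Monad (when)
   import Control.Monad.Cont

   -- callCC hands you an "escape" continuation you can invoke to jump out early
   halve :: Int -> Cont r Int
   halve n = callCC $ \escape -> do
     when (odd n) (escape 0)    -- bail out with 0 for odd input
     return (n `div` 2)

   -- runCont (halve 10) id == 5
   -- runCont (halve 7)  id == 0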

>> "Weaknesses: Continuations are such an awesome feature that they don't
>> really have a downside."
>>
>> O RLY?
>>
>> How about the ease with which you can make your program so complicated
>> that it becomes totally unmaintainable?
>
> remember continuations are essentially gotos. ;)

I think that says it all. :-)

> you're taking a comic book too seriously...

Perhaps. Maybe I'm just tired of Lisp fanboys claiming that this one 
single language is some kind of perfect Utopia.

>> As an aside, does Lisp have arbitrary-precision arithmetic yet? Cos that
>> Haskell fragment gives you arbitrary-precision results. Using the GMP,
>> no less.
>
> arbitrary-precision arithmetic is what Lisp got since ancient times.

OK, fair enough.

>> "(take 20 (filter even? (iterate inc 0)))"
>>
>> Or, to put it another way,
>>
>>     take 20 (filter even (iterate (+1) 0))
>>
>> or even
>>
>>     take 20 $ filter even $ iterate (+1) 0
>
> nah, inc is 1 char shorter. ;)

And (+1) is more descriptive.

Also, I can soon change that to "iterate (*2) 1" to generate a geometric 
sequence instead. I bet Lisp doesn't have a predefined "multiply by 2" 
function. :-P

> whew, I'll call it a draw here

My point was that here the Lisp and Haskell versions are so utterly 
similar that nobody can seriously claim that Lisp is more concise than 
Haskell (or vice versa). So saying "only Lisp can be concise - look at 
this!" isn't very compelling.

> and reply about the Seven Guilds of Haskell later...

Heh, yeah, I need to go think about that myself.

Actually, just the other day I was trying to pin down in my head exactly 
which sorts of programs Haskell is good for and bad for.



From: nemesis
Subject: Re: Land of Lisp
Date: 1 Nov 2010 22:05:00
Message: <web.4ccf70b62bc10e3852ad8a6a0@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> Is it possible to create variables with local rather than global scope?

you're kidding me, right?

> (let ((x 1)) (+ 1 x))
=> 2
> x
reference to undefined identifier: x

BTW, let is mere syntactic sugar for function application:
((lambda (x) (+ 1 x)) 1)

> (let fact ((n 5)) (if (< n 2) 1 (* n (fact (- n 1)))))
=> 120
> fact
reference to undefined identifier: fact

fully lexically scoped.

> When you get to Haskell, of course, automatic type inference blows #1
> out of the water

yeah, except when it can't decide on the types all by itself and requests type
annotations anyway.

I would also say the Hindley-Milner type system kinda cheats in this regard.  You
usually don't have to type-annotate whenever the function operates on numbers,
strings or algebraic data types.  And algebraic data types are a dead-on giveaway
of the type of an expression, because they usually have a name of their own to
describe values of their type.  So, a function like:

withBuf run = run . Buf 0 ini =<< mallocBytes ini
  where ini = 1024

needs no type annotation because it already has a friggin' reference right there
to the type:

data Buf = Buf !Int !Int !(Ptr Word8)

so, you exchange type annotations everywhere for type names for values of this
type in the expressions.  Not that much of a fair trade:  instead of providing a
single type declaration, you provide it everywhere you use it.

I'm talking ill of the Hindley-Milner type system, but you should be aware that
the same lameness can happen in Lisps too:  here and there you will see things
like fl*, fx+ or fl<... that is, functions that operate on specific numeric types.
Again, they exchange a single type declaration for type-specific operators
everywhere.  Makes me wonder if the Algol folks got it right...

> Really, in the modern age, there is no reason for dynamically-typed
> languages to exist at all. It just invites bugs.

Dynamically typed languages are just perfect for quick prototyping of systems
where the invariants are still not well known.  The fact that some of them can
not only quickly prototype working systems, but prototype them with good
performance, is only a plus.

> "Why is Lisp the best?"
> "Because it has macros!"

I almost never use macros and still find Lisp the best. :)

I suspect its homoiconic nature has lots to do with it.  Plus, if you've never
edited Lisp code with hierarchical parenthetical editing, you don't know what a
blessing it is compared to any other language in existence.

> I would say that (0 - b + sqrt (b*b - 4*a*c))/(2*a) is significantly
> clearer and more concise than (div (add (sub 0 b) (sqrt (sub (mul b b)
> (mul (mul 4 a) c)))) (mul 2 a)). (Note that I had to get out a text
> editor with syntax highlighting just to type all that with the correct
> number of brackets!)

you don't need to type brackets with a proper Lisp editor such as emacs or
DrRacket.  You usually just type alt+shift+( and it opens a pair and puts you
right in.  You can select a big code section spanning several lines by just
alt+shift+(right or left).

> I still don't think Lisp is very functional. :-P

It's because you still don't know it well.

> Depends on what you consider to be "functional programming". There are
> really two separate characteristics that most FP languages have:
>
> 1. Pure functions.
> 2. Functions as first-class objects.
>
> Any programming language that supports functions can support writing
> pure functions. But as far as I can tell, Lisp does not in any way
> encourage or facilitate doing so. Heck, C++ has facilities for marking
> things that won't be altered after being initialised, whereas Lisp does
> not. That makes C++ slightly more FP than Lisp.

That's purely an implementation detail, not a language feature.  Pure math-like
functions with no side-effects only depend on you writing side-effect-free code.
Would you really call C++ more FP than ML because nothing in the language spec
says "hey, this section is pure, you can optimize it for concurrency"?

BTW, Haskell is the only such (well-known) language enforcing purity even for
IO.

> You complain about "operator obfuscation" and then tout "custom syntax"
> as improving readability. Which is it? :-P

new syntax doesn't mean &*%#$.  Macros have names as self-describing as
functions do.

> >> (Lisp, AFAIK, doesn't have a
> >> "compile-time", so things are slightly different.)
> >
> > Pretty much all Common Lisp implementations are compilers
>
> Since Lisp explicitly permits self-modifying code, wouldn't that mean
> that the compiler has to be present at run-time?

You got that "self-modifying code" all messed up.  Lisp macros can operate on
Lisp code to produce new custom code, but that happens at compile-time.

If you really want it, you have eval at runtime, but it's as slow as everywhere
else.

> >> And, unlike Lisp
> >> macros (AFAIK), Template Haskell statically guarantees that the code
> >> written by a template is *always* syntactically correct. (It does _not_
> >> guarantee that it is well-typed, just that there are no syntax errors.)
> >
> > that's good for a language heavy on syntax.
>
> I won't claim to be a Lisp expert, but I was under the impression that
> not every possible list is a valid executable Lisp expression.

I mean there's not much syntax in Lisp besides brackets: an expression evaluating
to a function (or macro) at the head, and arguments for that function (or macro)
in the rest of the list.

>  From what I can tell, Lisp macros work by you passing some data to
> them, which they then transform into some sort of executable code.
> Template Haskell allows you to inspect parts of the current module not
> passed in as arguments. So you can, e.g., look up the definition of a
> user-defined data structure.

that sounds as painfully side-effectful as when I first heard it.  Hence Ninja
Patching.  google it up...

> > BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
> > r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
> > it's used a lot.
>
> I must be missing something... those two expressions look almost
> identical to me.

yeah, you're missing that you write the first and the compiler turns it into the
second, at compile time:  yes, it evaluates (fact 5) during compilation.  It
would certainly not do any magic and evaluate (fact n) if n was not known at
compile time, but since it's a literal...

> > you're taking a comic book too seriously...
>
> Perhaps. Maybe I'm just tired of Lisp fanboys claiming that this one
> single language is some kind of perfect Utopia.

now, there are not that many Lisp fanboys out there, are there?  Most I know are
more like Lisp fanoldmen...

Scheme doesn't have many fanboys either, more like irate compsci freshmen...

> >> "(take 20 (filter even? (iterate inc 0)))"
> >>
> >> Or, to put it another way,
> >>
> >>     take 20 (filter even (iterate (+1) 0))
> >>
> >> or even
> >>
> >>     take 20 $ filter even $ iterate (+1) 0
> >
> > nah, inc is 1 char shorter. ;)
>
> And (+1) is more descriptive.

not in Lisp where it looks like a silly function application... :p

> Also, I can soon change that to "iterate (*2) 1" to generate a geometric
> sequence instead. I bet Lisp doesn't have a predefined "multiply by 2"
> function. :-P

we could have something like:
(define (pf op x) (lambda (n) (op n x)))

(iterate (pf * 2) 0)

> Actually, just the other day I was trying to pin down in my head exactly
> which sorts of programs Haskell is good for and bad for.

math problems and formal proof automation would be my guess for best suited.
Real world is more complicated.  say a script to automate a task.  You don't
really need all that type theory to get it done...



From: Invisible
Subject: Re: Land of Lisp
Date: 2 Nov 2010 06:24:41
Message: <4ccfe6e9$1@news.povray.org>
On 02/11/2010 02:00 AM, nemesis wrote:
> Invisible<voi### [at] devnull>  wrote:
>> Is it possible to create variables with local rather than global scope?
>
> you're kidding me, right?

Well, I've never seen anybody do it before, that's all.

>> When you get to Haskell, of course, automatic type inference blows #1
>> out of the water
>
> yeah, except when it can't decide on the types all by itself and requests type
> annotations anyway.

Which is fairly rare.

In truth, you're probably going to put annotations on all your top-level 
stuff anyway, for documentation's sake. What automatic type inference 
means is that you don't have to dot type annotations absolutely 
everywhere you go (and change them all over the place when something 
changes), just in the few places you need them.

> I would also say the Hindley-Milner kinda cheats in this regard.  You usually
> don't have to type annotate whenever the function operates on numbers, strings
> or Algebraic Data Types.  And Algebraic Datatypes are a dead on giveaway of the
> type of an expression, because it usually have a name of its own to describe
> values of its type.  So, a function like:
>
> withBuf run = run . Buf 0 ini =<<  mallocBytes ini
>    where ini = 1024
>
> needs no type annotation because it already has a friggin reference there to
> type:
>
> data Buf = Buf !Int !Int !(Ptr Word8)
>
> so, you exchange type annotations everywhere for type names for values of this
> type in the expressions.  Not that much of a fair trade:  instead of providing a
> single type declaration, you provide it everywhere you use it.

I wouldn't agree with that.

If you want a point, you say Point 3 7. Which, as you say, rather gives 
away the fact that it's a Point value. But how about saying "True"? 
That's of type Bool, but you didn't write that in your expression. 
Similarly, when you write

   Branch (Leaf 3) (Leaf 7)

you didn't say anywhere that this is of type Tree Int, did you? The 
compiler infers this automatically. And don't even get me started on 
expressions which are polymorphic, and could have /several/ possible types.
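
(Spelling out the Tree type assumed in that example:)

   data Tree a = Leaf a | Branch (Tree a) (Tree a)

   -- no annotation anywhere, yet the whole expression is fully type-checked;
   -- GHC works out the numeric tree type on its own
   example = Branch (Leaf 3) (Leaf 7)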

> I'm talking ill of Hindley-Milner type system, but you should be aware that the
> same lameness can happen in Lisps too:  here and now you will see things like
> fl* fx+ or fl<... that is, functions to operate on specific numeric types.
> Again, they exchange a single type declaration for type-specific operators
> everywhere.  Makes me wonder if the Algol folks got it right...

Not so in Haskell. If you want to add two numbers, you use (+). It's 
polymorphic like that. And functions that use it end up being 
polymorphic too (unless you specifically don't want them to be, usually 
for performance reasons).
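
For instance, a two-line sketch:

   double :: Num a => a -> a   -- one definition, works for Int, Integer, Double, ...
   double x = x + x

   -- double (3 :: Int) == 6,  double 2.5 == 5.0,  double (2 ^ 100) is a huge Integer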

I'm actually surprised that a dynamically-typed language would do 
something like that. I thought the whole point of dynamic typing was 
that you can completely ignore types.

(Haskell also has the nice trick of numeric literals being polymorphic. 
Sadly, the same does not hold for lists or strings...)

>> Really, in the modern age, there is no reason for dynamically-typed
>> languages to exist at all. It just invites bugs.
>
> Dynamically typed languages are just perfect for quick prototyping of systems
> where invariants are still not well known.  The fact that some of them can not
> just quickly prototype working systems, but prototype them with good performance
> is only a plus.

All I know is that every time I've used a dynamically-typed language, 
I've spent 90% of my time fixing type errors - errors which *could* have 
and *should* have been caught automatically by the machine, but I have 
ended up catching them myself, by hand. Yuck!

>> "Why is Lisp the best?"
>> "Because it has macros!"
>
> I almost never use macros and still find Lisp the best. :)

I just don't get what's to like about it. It seems messy and ad hoc to me.

> I suspect its homoiconic nature has lots to do with it.  Plus, if you've never
> edited Lisp code with hierarchical parenthetical editing you don't know what a
> bless it is compared to any other language in existence.

More like "without automatic bracket matching, it's impossible to produce 
code which is even syntactically correct". It wouldn't be so bad, but it 
seems all Lisp programmers habitually indent their code incorrectly.

I can see how having a language where everything is a list is 
technically quite a nice idea. It's simple and elegant. Unfortunately, 
the language as a whole seems to be messy and inconsistent. Too much 
backwards compatibility, too much muddled thinking. Still, I guess it 
was the first attempt...

> you don't need to type brackets with a proper Lisp editor such as emacs or
> DrRacket.  You usually just type alt+shift+( and it opens a pair and puts you
> right in.  You can select a big code section spanning several lines by just
> alt+shift+(right or left).

In other words "Lisp is so hard to write that you need a special program 
to help you do it". (The same accusation could be levelled at XML of 
course...)

>> I still don't think Lisp is very functional. :-P
>
> It's because you still don't know it well.

Sure, functions are first-class. But it lacks purity. It also doesn't 
appear to place any emphasis at all on coding in a vaguely functional 
style - not in the language design nor the standard libraries.

>> Depends on what you consider to be "functional programming". There are
>> really two separate characteristics that most FP languages have:
>>
>> 1. Pure functions.
>> 2. Functions as first-class objects.
>>
>> Any programming language that supports functions can support writing
>> pure functions. But as far as I can tell, Lisp does not in any way
>> encourage or facilitate doing so. Heck, C++ has facilities for marking
>> things that won't be altered after being initialised, whereas Lisp does
>> not. That makes C++ slightly more FP than Lisp.
>
> That's purely an implementation detail, not a language feature. Pure math-like
> functions with no side-effects only depend on you writing side-effect-free code.

No, a design that enforces purity has profound consequences for any 
implementation of that design. (Not to mention the entire way that the 
language is used.)

> Would you really call C++ more FP than ML because nothing in the language spec
> say "hey, this section is pure, you can optimize it away for concurrency"?

Who said anything about ML? I just said C++ seems more functional than Lisp.

> BTW, Haskell is the only such (well-known) language enforcing purity even for
> IO.

And Clean doesn't count because...?

>> You complain about "operator obfuscation" and then tout "custom syntax"
>> as improving readability. Which is it? :-P
>
> new syntax doesn't mean&*%#$.  macros have as much self-describing names as
> functions.

To me, "new syntax" means that I can write something that doesn't look 
like the host language. From what little I've seen, a Lisp DSL just 
looks like Lisp.

>> Since Lisp explicitly permits self-modifying code, wouldn't that mean
>> that the compiler has to be present at run-time?
>
> You got that "self-modifying code" all messed up.  Lisp macros can operate on
> Lisp code to produce new custom code, but that happens at compile-time.
>
> If you really want it, you have eval at runtime, but it's as slow as everywhere
> else.

As I say, it's news to me that Lisp can be compiled. I thought the fact 
that you have an eval function more or less prevents this. Every other 
language I've ever seen that has an eval function has been interpreted.

>>>> And, unlike Lisp
>>>> macros (AFAIK), Template Haskell statically guarantees that the code
>>>> written by a template is *always* syntactically correct. (It does _not_
>>>> guarantee that it is well-typed, just that there are no syntax errors.)
>>>
>>> that's good for a language heavy on syntax.
>>
>> I won't claim to be a Lisp expert, but I was under the impression that
>> not every possible list is a valid executable Lisp expression.
>
> I mean there not much syntax in Lisp besides brackets, an expression evaluating
> to a function (or macro) at the head and arguments for that function (or macro)
> in the rest of that list.

And a Haskell expression consists of some combination of exactly 6 (go 
count them) structures. The rest is just sugar. That's peanuts compared 
to (say) the number of possible statements in Java. (For-loops, 
while-loops, if-then-else, return, switch/break...)
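
For example, if/then/else is nothing but sugar for a two-branch case expression:

   clamp x = if x < 0 then 0 else x       -- what you write

   clamp' x = case x < 0 of               -- what it desugars to
                True  -> 0
                False -> x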

>>    From what I can tell, Lisp macros work by you passing some data to
>> them, which they then transform into some sort of executable code.
>> Template Haskell allows you to inspect parts of the current module not
>> passed in as arguments. So you can, e.g., look up the definition of a
>> user-defined data structure.
>
> that sounds as painfully side-effectful as when I first heard it.  Hence Ninja
> Patching.  google it up...

So, what, I can tell the compiler to import some extra modules that I'm 
about to reference, and I can look at the definition of a user-defined 
data type and generate some boilerplate code to work on it? Yes, sounds 
terrible.

Or perhaps you meant that people could abuse it? Well, yes, obviously 
any powerful feature can be abused. So what?

>>> BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
>>> r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
>>> it's used a lot.
>>
>> I must be missing something... those two expressions look almost
>> identical to me.
>
> yeah, you're missing that you write the first and the compiler turns it into the
> second, at compile time:  yes, it evaluates (fact 5) during compilation.  It
> would certainly not do any magic and evaluate (fact n) if n was not known at
> compile time, but since it's a literal...

Oh, right, I hadn't noticed that fact 5 had vanished.

Now a Haskell compiler /might/ evaluate that at compile-time without any 
special hints. You know why? IT CAN'T HAVE SIDE EFFECTS. :-P

Anyway, with Template Haskell, I can merely write

   $(let x = factorial 5 in [| x |])

to achieve the same effect. Not exactly intractably complex, eh?

>>> you're taking a comic book too seriously...
>>
>> Perhaps. Maybe I'm just tired of Lisp fanboys claiming that this one
>> single language is some kind of perfect Utopia.
>
> now, there are not that many Lisp fanboys out there, there are?  Most I know are
> more like Lisp fanoldmen...
>
> Scheme doesn't have many fanboys either, more like irate compsci freshmen...

Whatever. I try not to make wild claims like "Haskell is perfect". I 
think it's a fantastic language, but I know for a fact that it's not 
perfect.

>>> nah, inc is 1 char shorter. ;)
>>
>> And (+1) is more descriptive.
>
> not in Lisp where it looks like a silly function application... :p

It does what it says on the tin.

>> Also, I can soon change that to "iterate (*2) 1" to generate a geometric
>> sequence instead. I bet Lisp doesn't have a predefined "multiply by 2"
>> function. :-P
>
> we could have something like:
> (define (pf op x) (lambda (n) (op n x)))
>
> (iterate (pf * 2) 0)

Yes, that's just as succinct as (*2). Oh, wait...

>> Actually, just the other day I was trying to pin down in my head exactly
>> which sorts of programs Haskell is good for and bad for.
>
> math problems and formal proof automation would be my guess for best suited.
> Real world is more complicated.  say a script to automate a task.  You don't
> really need all that type theory to get it done...

You don't really need "all that type theory" to understand or use 
Haskell. :-P

Anyway, the basic answer seems to be programs where getting the right 
answer is more important than getting it fast, or where figuring out how 
to solve the problem at all is really complex and you need all the help 
you can get...



From: Darren New
Subject: Re: Land of Lisp
Date: 2 Nov 2010 13:13:24
Message: <4cd046b4@news.povray.org>
Invisible wrote:
> All I know is that every time I've used a dynamically-typed language, 
> I've spent 90% of my time fixing type errors 

You're not thinking clearly, then.  Type errors in a dynamically typed 
language are not something that crop up a lot for people who use such 
languages frequently. It takes a little getting used to to remember what 
types variables are while you're writing a function, but how hard can that be?

Now, if you use a lot of crappy undocumented libraries, sure, that can be 
problematic. But that's true in any language.

The only time I get type problems is when I occasionally mix up something like 
a list of lists of X with a list of X, but that bombs out the first time you 
run it if the type system isn't absurdly forgiving, so it's not really any 
more of a problem than it getting caught the first time you compile it.

(Of course, if your IDE catches such things before you've even finished 
typing the body of the function, it's nice.)

> From what little I've seen, a Lisp DSL just looks like Lisp.

That's because the people writing and using the DSLs are comfortable with 
LISP. LISP has read macros, so they can look like anything you want.

> Every other language I've ever seen that has an eval function has been interpreted.

The *eval* call has to be interpreted, sure. But if you don't use eval, 
everything is compiled. If you do use eval, everything but the eval is compiled.

You're aware that C# has (or will have, depending how strict you want to be) 
the equivalent of eval, right?  As does Java, in some sense?

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."


