  Re: Land of Lisp  
From: nemesis
Date: 1 Nov 2010 22:05:00
Message: <web.4ccf70b62bc10e3852ad8a6a0@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> Is it possible to create variables with local rather than global scope?

you're kidding me, right?

> (let ((x 1)) (+ 1 x))
=> 2
> x
reference to undefined identifier: x

BTW, let is mere syntactic sugar for function application:
((lambda (x) (+ 1 x)) 1)

> (let fact ((n 5)) (if (< n 2) 1 (* n (fact (- n 1)))))
=> 120
> fact
reference to undefined identifier: fact

fully lexically scoped.
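
(and the named let above is itself just sugar for a local recursive function; roughly, a sketch:)

(letrec ((fact (lambda (n) (if (< n 2) 1 (* n (fact (- n 1)))))))
  (fact 5))
=> 120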

> When you get to Haskell, of course, automatic type inference blows #1
> out of the water

yeah, except when it can't decide on the types all by itself and requests type
annotations anyway.

I would also say the Hindley-Milner type system kinda cheats in this regard.  You
usually don't have to type-annotate whenever the function operates on numbers,
strings or Algebraic Data Types.  And Algebraic Data Types are a dead giveaway of
the type of an expression, because the type usually has a name of its own to
describe values of that type.  So, a function like:

withBuf run = run . Buf 0 ini =<< mallocBytes ini
  where ini = 1024

needs no type annotation because it already has a friggin' reference right there
to the type:

data Buf = Buf !Int !Int !(Ptr Word8)

so you exchange type annotations for type names on the values of this type
throughout the expressions.  Not much of a fair trade:  instead of providing a
single type declaration, you provide it everywhere you use it.

I'm talking ill of the Hindley-Milner type system, but you should be aware that
the same lameness can happen in Lisps too:  here and there you will see things
like fl*, fx+ or fl<... that is, functions that operate on specific numeric types.
Again, they exchange a single type declaration for type-specific operators
everywhere.  Makes me wonder if the Algol folks got it right...
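
To make that concrete, a quick sketch (Racket's racket/flonum operators assumed;
the function names are made up for illustration):

(require racket/flonum)

; generic version: works on any numbers, dispatches at run time
(define (norm2 x y)
  (sqrt (+ (* x x) (* y y))))

; flonum-specific version: faster, but the "type declaration" is now
; smeared over every single operator
(define (flnorm2 x y)
  (flsqrt (fl+ (fl* x x) (fl* y y))))

(flnorm2 3.0 4.0)
=> 5.0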

> Really, in the modern age, there is no reason for dynamically-typed
> languages to exist at all. It just invites bugs.

Dynamically typed languages are just perfect for quick prototyping of systems
whose invariants are not yet well known.  The fact that some of them let you not
just quickly prototype working systems, but prototype them with good performance,
is only a plus.

> "Why is Lisp the best?"
> "Because it has macros!"

I almost never use macros and still find Lisp the best. :)

I suspect its homoiconic nature has a lot to do with it.  Plus, if you've never
edited Lisp code with hierarchical parenthetical editing, you don't know what a
blessing it is compared to any other language in existence.
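
Homoiconicity in two REPL lines: the program is just a list you can take apart
and rebuild (a sketch; try it at the REPL):

> (car '(+ 1 2))
=> +
> (eval (list '+ 1 2))
=> 3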

> I would say that (0 - b + sqrt (b*b - 4*a*c))/(2*a) is significantly
> clearer and more concise than (div (add (sub 0 b) (sqrt (sub (mul b b)
> (mul (mul 4 a) c)))) (mul 2 a)). (Note that I had to get out a text
> editor with syntax highlighting just to type all that with the correct
> number of brackets!)

you don't need to type brackets with a proper Lisp editor such as emacs or
DrRacket.  You usually just type alt+shift+( and it opens a matched pair and puts
you right inside.  You can select a big code section spanning several lines with
just alt+shift+(right or left).

> I still don't think Lisp is very functional. :-P

It's because you still don't know it well.

> Depends on what you consider to be "functional programming". There are
> really two separate characteristics that most FP languages have:
>
> 1. Pure functions.
> 2. Functions as first-class objects.
>
> Any programming language that supports functions can support writing
> pure functions. But as far as I can tell, Lisp does not in any way
> encourage or facilitate doing so. Heck, C++ has facilities for marking
> things that won't be altered after being initialised, whereas Lisp does
> not. That makes C++ slightly more FP than Lisp.

That's purely an implementation detail, not a language feature.  Pure math-like
functions with no side effects only depend on you writing side-effect-free code.
Would you really call C++ more FP than ML because nothing in the language spec
says "hey, this section is pure, you can optimize it away for concurrency"?

BTW, Haskell is the only such (well-known) language enforcing purity even for
IO.

> You complain about "operator obfuscation" and then tout "custom syntax"
> as improving readability. Which is it? :-P

new syntax doesn't mean &*%#$.  Macros have names just as self-describing as
functions do.
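
For instance, a little macro with a perfectly boring, self-describing name
(Racket's define-syntax-rule assumed; the macro name is made up for illustration):

(define-syntax-rule (unless-empty lst body ...)   ; reads like any other call site
  (if (null? lst) (void) (begin body ...)))

(unless-empty '(1 2 3) (display "still work to do"))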

> >> (Lisp, AFAIK, doesn't have a
> >> "compile-time", so things are slightly different.)
> >
> > Pretty much all Common Lisp implementations are compilers
>
> Since Lisp explicitly permits self-modifying code, wouldn't that mean
> that the compiler has to be present at run-time?

You got that "self-modifying code" all messed up.  Lisp macros can operate on
Lisp code to produce new custom code, but that happens at compile-time.

If you really want it, you have eval at runtime, but it's as slow as everywhere
else.
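
A tiny Racket-flavored sketch of the difference:

; the macro is expanded before the program ever runs...
(define-syntax-rule (twice e) (begin e e))
(twice (display "hi "))                 ; compiles as (begin (display "hi ") (display "hi "))

; ...whereas eval really does interpret a fresh expression at run time
(eval '(+ 1 2) (make-base-namespace))   ; => 3, paying the interpreter's price each call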

> >> And, unlike Lisp
> >> macros (AFAIK), Template Haskell statically guarantees that the code
> >> written by a template is *always* syntactically correct. (It does _not_
> >> guarantee that it is well-typed, just that there are no syntax errors.)
> >
> > that's good for a language heavy on syntax.
>
> I won't claim to be a Lisp expert, but I was under the impression that
> not every possible list is a valid executable Lisp expression.

I mean there's not much syntax in Lisp besides brackets: an expression evaluating
to a function (or macro) at the head, and arguments for that function (or macro)
in the rest of the list.

>  From what I can tell, Lisp macros work by you passing some data to
> them, which they then transform into some sort of executable code.
> Template Haskell allows you to inspect parts of the current module not
> passed in as arguments. So you can, e.g., look up the definition of a
> user-defined data structure.

that sounds as painfully side-effectful as when I first heard of it.  Hence Ninja
Patching.  Google it up...

> > BTW, you know quasi-quoting comes from Lisp, don't you?  `(let ((r ,(fact 5)))
> > r) will turn into (let ((r 120)) r) at compile-time... it's so friggin' easy
> > it's used a lot.
>
> I must be missing something... those two expressions look almost
> identical to me.

yeah, you're missing that you write the first and the compiler turns it into the
second, at compile time:  yes, it evaluates (fact 5) during compilation.  It
would certainly not do any magic and evaluate (fact n) if n was not known at
compile time, but since it's a literal...
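
Try it at the REPL: quasiquote builds the list and unquote evaluates the hole;
inside a macro the same thing happens at expansion time:

> (define (fact n) (if (< n 2) 1 (* n (fact (- n 1)))))
> `(let ((r ,(fact 5))) r)
=> (let ((r 120)) r)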

> > you're taking a comic book too seriously...
>
> Perhaps. Maybe I'm just tired of Lisp fanboys claiming that this one
> single language is some kind of perfect Utopia.

now, there are not that many Lisp fanboys out there, are there?  Most I know are
more like Lisp fanoldmen...

Scheme doesn't have many fanboys either, more like irate compsci freshmen...

> >> "(take 20 (filter even? (iterate inc 0)))"
> >>
> >> Or, to put it another way,
> >>
> >>     take 20 (filter even (iterate (+1) 0))
> >>
> >> or even
> >>
> >>     take 20 $ filter even $ iterate (+1) 0
> >
> > nah, inc is 1 char shorter. ;)
>
> And (+1) is more descriptive.

not in Lisp where it looks like a silly function application... :p

> Also, I can soon change that to "iterate (*2) 1" to generate a geometric
> sequence instead. I bet Lisp doesn't have a predefined "multiply by 2"
> function. :-P

we could have something like:
(define (pf op x) (lambda (n) (op n x)))

(iterate (pf * 2) 1)
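
And since standard Scheme has no lazy iterate built in, a fuller eager sketch
(iterate-n is made up here) to actually run it:

; first n terms of (x, f(x), f(f(x)), ...)
(define (iterate-n f x n)
  (if (zero? n) '() (cons x (iterate-n f (f x) (- n 1)))))

(iterate-n (pf * 2) 1 10)
=> (1 2 4 8 16 32 64 128 256 512)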

> Actually, just the other day I was trying to pin down in my head exactly
> which sorts of programs Haskell is good for and bad for.

math problems and formal proof automation would be my guess for best suited.
The real world is more complicated:  say, a script to automate a task.  You don't
really need all that type theory to get it done...

