On 29/10/2010 01:12 AM, nemesis wrote:
> http://landoflisp.com/
>
> This is a very fun webcomic turned into book, go well down into the rabbit hole
> to see what it is all about.
"Convinced? BUY NOW! Unsure? DOWNLOAD FREE CHAPTER! Skeptical? Scroll
down..."
.
.
.
.
[Christ, how far down does this thing go?!]
.
.
.
[Still going...]
.
.
[Hey, I can see my house from here!]
.
.
"LAND of LISP: Secrets of the Seven Guilds."
Hey, you really *don't* need drawing skills to do a web comic!
Giant talking ants? What the... oh, I see what you did there. *sigh*
"Holy cow, I LOVE honey!"
WTF-O-Meter: 4.6
"Any humans foolish enough to resist with their primitive weapons were
dealt with in short order."
There goes Java, Python, C# and Ruby. (What, no C or C++?)
"There's a long-forgotten place where they have weapons SO POWERFUL that
they can defeat ANY BUG! They call it the Land of Lisp."
Uhuh. So an untyped language with a single global namespace and which
touts self-modifying code as its single most significant feature is the
way to beat program bugs?
O RLY?
"Greetings, your highness!"
Creatures with 12 eyes and an arm for a nose: WTF-O-Meter: 1.5
"SILENCE! Back in the eighties, we showed you how to program WITHOUT
HAVING ANY BUGS! We warned you what would happen, but you didn't listen
to us."
Uh, yeah, right. I'm pretty sure there's no programming language in
existence that completely prevents bugs. :-P
"Each of the Seven Guilds possesses a powerful bug-killing weapon that
is unique to Lisp!"
Oh yeah?
So macros, functional programming, software transactional memory,
restartable code and "conciseness" are unique only to Lisp?
I say again: O RLY?
"Functional programming is a mathematical approach to programming that
was pioneered by Lisp."
I beg to differ.
Actually, I'm not even sure why people consider Lisp to be a functional
programming language. JavaScript is about as functional as Lisp!
"How it kills bugs: Writing code in a functional style guarantees that
[...] This makes it very easy to debug."
And writing code in a language which *enforces* a functional style makes
it drastically easier to debug. :-P
(No need to worry about whether any of the libraries you're using took
shortcuts. No temptation to take shortcuts yourself.)
"True macros are one of Lisp's most unique and amazing features. True
macros allow you to add new functionality to Lisp in a very fundamental
way."
Such as...?
"Lisp macros are so powerful that you can write your own if-then command!
(defmacro three-way-if (expr a b &rest c)
  (let ((val (gensym)))
    `(let ((,val ,expr))
       (cond ((and (numberp ,val) (zerop ,val)) ,a)
             (,val ,@c)
             (t ,b)))))
[...] you'll make your life much easier by writing a macro!"
Riiight. Because
if_then True x y = x
if_then False x y = y
is sooo much harder to implement, eh? Newsflash: lazy evaluation
eliminates one of the major reasons for wanting macros in the first place.
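To labour the point, here's the three-way-if as a plain function (my own
sketch and naming, nothing official): laziness means only the branch you
actually pick ever gets evaluated, so no macro is needed.
threeWayIf :: (Ord a, Num a) => a -> b -> b -> b -> b
threeWayIf x ifZero ifPos ifNeg
  | x == 0    = ifZero
  | x > 0     = ifPos
  | otherwise = ifNeg
-- threeWayIf 5 (error "never evaluated") "positive" "negative"  ==>  "positive"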
For the other reasons, we have Template Haskell. This allows you to
write Haskell at compile-time. (Lisp, AFAIK, doesn't have a
"compile-time", so things are slightly different.) And, unlike Lisp
macros (AFAIK), Template Haskell statically guarantees that the code
written by a template is *always* syntactically correct. (It does _not_
guarantee that it is well-typed, just that there are no syntax errors.)
Also unlike Lisp macros, a Haskell template can inspect and alter
anything in the current module (up to and including completely rewriting
it). Like Lisp macros, Haskell templates are written in vanilla Haskell.
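For the curious, a minimal sketch of what that looks like (the names are
mine): a quotation builds an expression at compile time, and a splice drops
it into your code.
{-# LANGUAGE TemplateHaskell #-}
module MakeAdder where

import Language.Haskell.TH

-- build the expression (\x -> x + n) at compile time
makeAdder :: Integer -> Q Exp
makeAdder n = [| \x -> x + n |]

-- in a *different* module (the stage restriction forbids splicing in the
-- module that defines the template):
--   addFive :: Integer -> Integer
--   addFive = $(makeAdder 5)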
"Using restarts and the Lisp REPL, a bug can be fixed in a running program."
While that /does/ sound pretty cool, it's only possible because Lisp is
interpreted and untyped. Also, they managed to give an example fragment
that is a perfect candidate for software transactions. That's cute.
(I am unable to determine enough about the semantics of restarts to
comment further.)
"To modify the value of a variable in Common Lisp, you use setf.
However, this command also has an amazing special power: Instead of a
variable name, you can pass it a complex Lisp expression that retrieves
a value."
Right. Because you could never implement something like this in any
other language...
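For instance, plain record-update syntax already lets you write to a "place"
named by an expression (a quick sketch, all the types and field names
invented by me; lens-style libraries make it prettier):
data Point  = Point  { px :: Double, py :: Double }        deriving Show
data Circle = Circle { centre :: Point, radius :: Double } deriving Show

-- update a nested "place": the x coordinate of a circle's centre
moveRight :: Double -> Circle -> Circle
moveRight dx c = c { centre = (centre c) { px = px (centre c) + dx } }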
"They're using this incredible device called a Wii. Say cadet, why
aren't you shooting anything?"
"I'm trying to, but the controller keeps thinking that I want to HUG the
insectoid storm-troopers!"
LOLrus.
"Those ships are from the DSL Guild."
Uhuh. Because no other language allows you to embed a DSL right into
your programming language. Or even, you know, embed a language with a
syntax entirely different from that of the host language. (Haskell's
"quasi-quoting" allows you to embed a DSL that gets read using a parser
that you write, using any parser libraries you desire.)
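To be concrete, a quasi-quoter is just a record of functions from the quoted
string to code, and the "parser" inside can be anything you like. A toy
sketch (entirely my own invention, not from the comic or the tutorial):
{-# LANGUAGE TemplateHaskell #-}
module SumQQ (ints) where

import Language.Haskell.TH
import Language.Haskell.TH.Quote

-- a toy "DSL": whitespace-separated integers, summed at compile time
ints :: QuasiQuoter
ints = QuasiQuoter
  { quoteExp  = \s -> litE (integerL (sum (map read (words s))))
  , quotePat  = error "ints: no pattern context"
  , quoteType = error "ints: no type context"
  , quoteDec  = error "ints: no declaration context"
  }

-- elsewhere, with {-# LANGUAGE QuasiQuotes #-}:
--   total = [ints| 1 2 3 40 |]   -- the literal 46, computed at compile time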
"Those ships are from the CLOS Guild."
I have no idea if this is good or not. Nor do Lispers, apparently.
"What the... Oh, I forgot about those obnoxious Schemers from the
Continuation Guild."
Again, because no other programming language has continuations, right?
"Basically, continuations let you put 'time travel' into your code."
Actually, you don't necessarily need continuations to do that.
"Weaknesses: Continuations are such an awesome feature that they don't
really have a downside."
O RLY?
How about the ease with which you can make your program so complicated
that it becomes totally unmaintainable?
"Holy s**t! What are those?! Jeez! Those are not the kind of bugs we had
in the eighties! Wait... I can't believe it! The NEW Lisp guilds have
come to join the battle!"
OK...
"Those ships are form the Brevity Guild."
Yeah, true. There are no other programming languages that are brief,
right? (Scroll upwards for one tiny Haskell v Lisp example.)
"(accum a (for n 1 1000 (unless (some [is 0 (mod n _)] (range 2 (- n
1))) a.n)))"
That computes the prime numbers from 1 to 1000? OK, how about
primes = let f (p:xs) = p : f (filter (\x -> x `mod` p > 0) xs) in f [2..]
Yes, *clearly* only Lisp can be brief and unintelligible.
(Note well that the Haskell variant generates *all* the prime numbers in
the universe, not just the ones less than 1000.)
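A quick sanity check in GHCi, with the definition above loaded:
ghci> take 10 primes
[2,3,5,7,11,13,17,19,23,29]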
As an aside, does Lisp have arbitrary-precision arithmetic yet? Cos that
Haskell fragment gives you arbitrary-precision results. Using the GMP,
no less.
"Now that computers have multiple cores, there is a lot of interest in
finding elegant ways to use them. One popular approach is Software
Transactional Memory."
Wow, that sounds amazing! Yep, that's definitely a unique Lisp feature.
Oh, wait...
(Pity you can't guarantee that the code in your transactions has no
side-effects, eh?)
"And finally, the Lazy Guild."
Oh wow. This, truly, no other programming language has ever had...
"(take 20 (filter even? (iterate inc 0)))"
Or, to put it another way,
take 20 (filter even (iterate (+1) 0))
or even
take 20 $ filter even $ iterate (+1) 0
No bracket-counting. :-P
"Humanity had been saved! Programmers could go back to programming, no
longer in fear of bugs. By learning the lessons of the Ten Guilds of
Lisp, programs could become richer and more robust than ever before."
I realise that this is of course propaganda, and nobody is _seriously_
suggesting that Lisp is the silver bullet to end all programming
problems, but seriously. Any programmer who does not fear bugs either
doesn't care if his programs work or not, or is a fool.
Hmm, now here's a thought. What would the Seven Guilds of Haskell look
like? What are the top ten bug-killing features of the language?
1. The type system.
Haskellers have an expression: If it compiles, it usually does what it's
supposed to.
While I emphasize "usually" [i.e., not "always"], this expression exists
for a reason. It's almost creepy how often the type checker can figure
out that your code is broken without even running it. Yes, it's annoying
when the type checker stops your code from running when it's perfectly
"obvious" what it should do. But I lose count of the number of times
I've battled against the type system, only to eventually realise some
fatal flaw in the design of my program - all without ever having
actually run it! The type system forces you to *think* about what you're
doing, and often you figure out that you've missed an important
possibility because the types don't line up.
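A trivial example of the kind of thing it catches (my own, obviously):
-- mixing Int and Double is a compile-time error, not a silent truncation
average :: [Double] -> Double
average xs = sum xs / fromIntegral (length xs)
-- write  sum xs / length xs  instead and it's rejected before the program
-- ever runs; plenty of languages would happily run it and give you *something*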
And then, yes, now and then you want to do something which actually *is*
perfectly safe, but the type system won't let you. It doesn't happen
very often though...
2. Functional purity.
The comic talks about time travel and how useful it is. Well, if your
data never changes, going backwards in time becomes a trivial operation.
(Going forwards, of course, is another matter...)
Go look at the interactive Huffman compressor. I initially wrote it
using destructive updates. But after battling with all sorts of obscure
bugs, I eventually came to realise that I wanted to keep different
copies of (say) the min-heap containing the symbol probabilities at
different stages of construction. And that's really, really hard if you
keep destructively updating it. If you look at the code I ended up with,
you'll find almost all the data is actually immutable.
(You may say that's because I've been brainwashed by Haskell and that
I'm incapable of programming any other way now. I would suggest that
this is not the case. Immutable data actually makes the code simpler to
write in this instance.)
The trouble with destructive updates is that you can easily forget that
you're doing them. Haskell, with its wonderful type system, *lets* you
perform destructive updates, but *reminds* you that you're doing them
with the type signatures.
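A two-line illustration of what I mean (names mine):
import Data.IORef

bumpPure :: Int -> Int           -- cannot possibly mutate anything
bumpPure n = n + 1

bumpRef :: IORef Int -> IO ()    -- the IO in the type is the reminder
bumpRef r = modifyIORef r (+ 1)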
3. ???
Haskell has many fantastic features. I'm not sure which ones I could
specifically point to as "bug killers" though.
Multi-core programming would probably be one to point at. Writing
multi-core programs is notoriously bug-prone. Where Smalltalk, Java,
Eiffel and so on give you threads and locks, Haskell (or more precisely,
the de facto Haskell implementation) gives you half a dozen different
tools, all of which can be used simultaneously:
- OS threads.
- Lightweight threads.
- Sparks.
- Nested data parallelism.
- Locks.
- Transactions.
Lightweight threads scale to thousands or tens of thousands of threads
in a single program, without killing a modest laptop. You can use them
with locks, or you can use transactions (the Software Transactional
Memory mentioned by Lisp).
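For anyone who hasn't seen it, the core of the STM API really is this small.
A minimal sketch (the bank-account example is mine):
import Control.Concurrent.STM

-- move money between two accounts; either both writes happen or neither does
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to amount = do
  x <- readTVar from
  writeTVar from (x - amount)
  y <- readTVar to
  writeTVar to (y + amount)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  atomically (transfer a b 40)
  atomically (readTVar a) >>= print   -- 60
  atomically (readTVar b) >>= print   -- 40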
That handles /concurrency/ (i.e., doing several different things at
once). For /parallelism/ (i.e., doing the same thing with multiple
cores), we have sparks and data parallelism.
Sparks are trivially lightweight code annotations. You add one extra
function call, and it's parallel. No threads, no locks, no semaphores,
no race conditions, no memory leaks, nothing. The only possible hazard
is laziness. (I.e., your program might not do as much work in parallel
as you had intended.)
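Concretely, the "one extra function call" is par (a tiny sketch of mine;
compile with -threaded and run with +RTS -N2 to actually get two cores):
import Control.Parallel (par, pseq)

-- spark the first half of the work off in parallel, evaluate the second
-- half here, then combine
parSum :: [Int] -> Int
parSum xs = a `par` (b `pseq` (a + b))
  where
    (ys, zs) = splitAt (length xs `div` 2) xs
    a = sum ys
    b = sum zs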
Nested data parallelism is even more mental. It's still in early
development, but the idea is that you specify a set of array
transformations, and the run-time engine figures out the best way to
spread the work across multiple cores, taking into account cache
coherence, load-balancing, and so forth. (And who says you need Matlab
to do that?)
Then there are tools like Thread Scope, which let you interactively
visualise your parallel [or serial] Haskell application's run-time
performance in real-time. (Currently it only analyses thread performance,
though; older profiling tools handle space analysis and so on.)
But we're drifting away from bug-killing now.
Also related to bug-killing is Haskell's total absence of null-pointers,
and its "typed unions". Slightly less related is the monadic action
notation, which allows you to do the kinds of time-travelling the Lisp
comic talks about, but with a syntax that's no different than ordinary
I/O operations.
(This of course is due to Haskell's hardwired "do-notation".
Unfortunately, other sequencing primitives such as arrows get no such
benefit. I guess we need somebody to write a quasi-quoter or
something...)
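Back to the null-pointer point for a second, because it deserves a concrete
example (a completely stock one, nothing clever): the "typed union" way of
saying "there might not be a result" is Maybe, and the compiler nags you if
you forget the Nothing case.
import Data.Map (Map)
import qualified Data.Map as Map

-- Map.lookup returns a Maybe; there is no null to forget to check
greeting :: Map String String -> String -> String
greeting table name =
  case Map.lookup name table of
    Just g  -> g ++ ", " ++ name ++ "!"
    Nothing -> "Hello, stranger."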
> It's the brainchild of Conrad Barski, from lisperati. He also has something to
> say about haskell:
>
> http://lisperati.com/haskell/
Hoookay, let's take a look...
"The only preparation you need to do is to install the Glasgow Haskell
compiler - you can get the latest version from here."
Actually the preferred way to do this now is to install the Haskell
Platform. But anybody following the link will find that out, so no drama.
"Now you have all you need to run the Hello World program below."
16 lines of code for Hello World?
Oh, there goes half our audience...
Best of all, the only 2 lines of /executable/ code could actually be
trivially rewritten as just 1 line. But I'm presuming he's done it this
way because he intends to append to it. (?)
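For the record, the one-liner I have in mind is presumably just something like
main = putStrLn "Hello, world!"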
"runhaskell tutorial.hs"
Well, at least he did it the correct way (runhaskell, not runghc). He
could have told everyone to put a hash-bang at the top... >_<
"Every programmer should know about regular expressions."
O RLY?
Also, according to /my/ documentation, Text.Regex doesn't exist. You
have to import Text.Regex.Posix (or some other sub-module, depending on
exactly which regex variant you want - but only POSIX is provided by
default). Then again, we don't know what GHC version he tested against
(and therefore which version of the regex package).
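For what it's worth, the kind of thing that does work against my install
looks like this (a throwaway sketch, not his code):
import Text.Regex.Posix ((=~))

main :: IO ()
main = putStrLn ("picnic2000.svg" =~ "[0-9]+" :: String)   -- prints "2000"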
"I'm not happy if I can't use 'comparing', and I think we all want this
to be a /happy/ tutorial."
WTF-O-Meter: 1.9
"The way Haskell compilers handle types puts them head and shoulders
above most other compilers."
I think you mean "the way the Haskell language handles types", but sure,
it's one of the major strong points of the language.
"The first few 'type' lines should be self-explanatory; a Point is just
a tuple of floating numbers, and a Polygon is just a list of Points"
Yeah, you neglected to mention that "[Int]" means "list of Int". :-P
The fact that (Float,Float) is a pair of Floats is fairly self-evident,
but [Point] being a list of points is decidedly /not/ obvious.
"For instance, the EnergyFunction takes an arbitrary type 'a'"
You haven't explained why "Float" is a concrete type, but "a" is an
arbitrary type. In fact, you haven't even explained that Haskell can
handle arbitrary types polymorphically.
Then again, this is a tutorial. Maybe he's expecting to go over this later.
"Just paste this new code to the button of the existing program."
Thought so.
I wonder how many people will notice the indentation? It's critical that
it matches the code above it, and this isn't very visually obvious. (!)
"
let people :: [Person]
people = read people_text
"
Or, as I like to put it,
let people = read people_text :: [Person]
"Something else that's really 'good' is that when we read our people
from the file, we didn't have to tell Haskell what type of data we were
reading."
Uh, looks to /me/ like you did, actually. :-P
Now, if you had arranged your example so that the compiler could /infer/ the
type, with no signature needed, your statement would be far more persuasive.
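By way of contrast, here's a case where no annotation is needed, because the
way the result is used pins the type down (a toy example of mine):
main :: IO ()
main = do
  let n = read "42"        -- no type signature here...
  print (n + 1 :: Int)     -- ...because this line forces n :: Int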
"We're going to draw pictures in a really cool way: We're going to write
our own SVG files from scratch!"
I was waiting to see how a vanilla GHC install does graphics. I was
thinking maybe PPM, but SVG is a master-stroke... I'll have to remember
that one.
<big wedge of code>
OK, that's a pretty big wedge of code. In fact, it demonstrates a common
Haskell characteristic: it's a zillion miles wide, and only a few lines
long.
He's apparently defining every function in the program as a local
variable. That's very bad style, but presumably he wants to be able to
just concatenate each block of code to the end of the program.
Also, there's a HELL of a lot of concatMap calls in there. I doubt too
many people will figure out WTF that is all about.
Also, "zip.repeat"? I think you mean "zip . repeat". :-P And you're
going to have to explain a whole bunch about curried functions and
function composition before it makes any semblance of sense.
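Spelled out long-hand, it's just this (my own expansion of his definition):
colorize :: a -> [b] -> [(a, b)]
colorize c xs = zip (repeat c) xs
-- point-free: colorize = zip . repeat
-- colorize 'r' [1,2,3]  ==  [('r',1),('r',2),('r',3)]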
"Haskellers never use loops- Instead, they either use recursion to do
looping, or they use functions like map that take other functions as
parameters. Functions that do this are called higher order functions."
Well that's as clear as mud. (Functions that do "this"? Which "this" are
you referring to?) Sure, *I* know what he's talking about, but it's not
the best sentence structure ever.
"In this example, we're using a clever variant called concatMap that
also concatenates the result of the mapping together, saving us a step."
...which only makes sense once you know that the result of the mapping
step is required to be a list or string.
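For anyone lost at this point: concatMap is just concat composed with map.
concatMap' :: (a -> [b]) -> [a] -> [b]
concatMap' f = concat . map f
-- concatMap' (\n -> replicate n n) [1,2,3]  ==  [1,2,2,3,3,3]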
"If you can understand what colorize = zip.repeat means, you'll
understand probably most of what Haskell has to offer in just three words!"
Yeah, that's what I said a few paragraphs ago. :-P
"You've gotta admit, a language that can do that is pretty cool! Writing
functions with the "dangling values" removed is called writing in a
point-free style."
I'm sure at least a few people are at this point thinking "dude, WTF?
That's not *cool*, it's *confusing*!"
"Arguably, Haskell is the one language that has the toughest type
checking and allows for the briefest code- That means Haskell allows you
write code that's more bug free than almost any other language. I would
estimate that I spent only about 3% of my development time debugging
this program, which I could never accomplish that in another language.
(Now, mind you, it can be a b*** to get your program to compile properly
in Haskell in the first place, but once you get to that point, you're
home free :-)"
I think he likes it...
<another huge chunk of code>
Oh... oh my god. Now I see why he's using the regex package. He's taking
SVG as input, and using a regex to process it, rather than a real
parsing library. o_O
(Need I even begin to explain why using a regex to process XML is an
absurdly bad idea?)
Then again, I guess it's better than explaining how to use Parsec [which is
also available out-of-the-box] to build a true parser; properly parsing XML
is a big job. Still, it would be nice to see him try...
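Just to show it's not a huge ask, here's roughly what the start of a
Parsec-based tag reader could look like (entirely my own sketch, and nowhere
near a real XML parser):
import Text.ParserCombinators.Parsec

-- parse a single opening tag such as  <rect width="5" height="3">
openTag :: Parser (String, [(String, String)])
openTag = do
  _     <- char '<'
  name  <- many1 (letter <|> digit)
  attrs <- many (try attribute)
  spaces
  _     <- char '>'
  return (name, attrs)

attribute :: Parser (String, String)
attribute = do
  spaces
  key <- many1 (letter <|> digit)
  _   <- char '='
  val <- between (char '"') (char '"') (many (noneOf "\""))
  return (key, val)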
"This code illustrates another situation where Haskell's laziness really
makes things easy for us: All the regular expression functions just take
regular 'ol text strings."
Yeah, lazy I/O can be great. It can also result in massive memory leaks
or files held open too long if you're not careful. Might be worth
mentioning that.
"Somewhere, deep in the brain of every programmer, is a neuron that has
only one role in life: Whenever a sound is recorded by the hair cells of
the ear that sounds like "Cartesian Product", this neuron will dump
unimaginable quantities of neurotransmitters at the nodes of other
neurons, that translate into the neuron-equivalent of "Oh my God!!
INEFFICIENCY ALARM RED!! All Neurons to full alert! RUN RUN RUN!!!"
Enough said."
LOLrus.
"What's Bad About this Code?
Well, these four functions are going to be the bread and butter of this
program- Probably, about 99% of CPU time will be spent in this
little bit of code... Consequently, every little inefficiency in this
code will hurt performance hundredfold... And boy, is this code
inefficient."
"These things can be fixed relatively easily by ugli-fying the code"
Well there's a good advert for using Haskell then. >_<
"Of course, the total number of annealing steps we're doing (500) is not
enough for a very good annealing- You'd need to run a few million steps
and use GHC to compile the program to machine language to get an optimal
result- Here's how you'd compile it:
ghc -O2 -fglasgow-exts -optc-march=pentium4 -optc-O2 -optc-mfpmath=sse
-optc-msse2 --make picnic.hs"
Or, you know,
ghc -O2 --make picnic
which is going to produce near enough the same damned result. (Most
especially, -fglasgow-exts enables extra language features which the
tutorial DOES NOT USE. Adding this has NO EFFECT!) To make it go faster,
you should change the horrifyingly inefficient algorithms mentioned
above, not twiddle compiler switches.
"As this example shows, Haskell's powerful typing system allows us to
prevent leakage from different sections of code in ways almost no other
language can match."
The statement is true, but the example does not demonstrate this
particularly vividly.
"Despite its many advantages, I humbly suggest, therefore, that in the
future there will continue to be a rift between the "imperative" and
"functional" camps of programming, until someone comes up with a truly
robust way of uniting these two camps- And I think that some profound
programming discoveries still need to be made in the future before this
problem is really resolved- I get the feeling it's just not good enough
to wave at the problem and say "Monads"."
We shall see...
*I* would suggest that there are far more practical problems preventing
widespread adoption of Haskell. For example, half of Hackage won't
compile on Windows. That's a pretty big show-stopper, right there.
Calling "floor" is 5x slower in Haskell than in C. That's a pretty big
show-stopper if you want numerical performance. Nothing to do with lofty
questions of theory, just dull old real-world concerns.
Random note: Take a look at this announcement from Debian.
http://www.debian.org/News/2010/20100806
The freeze notice babbles on about KDE, GNOME, OpenOffice, Apache, PHP,
Python, Ruby, GCC and... GHC? In the same sentence?
It appears people are starting to take notice...