Newsgroup: povray.off-topic
Subject: Re: gathering infos from web pages
From: Invisible
Date: 21 Nov 2007 10:46:43
Message: <474452e3$1@news.povray.org>
Fa3ien wrote:

>> Mmm, because everybody hates Haskell? ;-)
> 
> Fears, not hate.

Ah, OK. I rephrase then: *most* people hate Haskell. The rest just ph33r 
it. ;-)

> Personally, whenever you post Haskell code, I'm quite admiring
> of the power of what you say it does with such concise
> code.  But I am also scared by the fact that I don't understand
> how what it does relates to what the code looks like.

It seems Haskell has both the power to be completely transparent, and 
also entirely opaque. A bit like mathematical formulas, really!

> I've been delighted to see that this line of code,
> built with my thin, newly acquired knowledge:
> 
> print Net::HTTP.get(URI.parse("http://www.google.be"))
> 
> produced exactly what I expected it to do! (putting the content
> of a web page into a string)

Yeah. In Haskell you'd have to spend a few minutes installing GHC, a few 
more minutes downloading and compiling the third-party HTTP library, and 
then you'd have to write something like

   let uri = fromJust $ parseURI "http://www.google.be"
   maybePage <- httpGet uri
   let page = fromJust maybePage

(And replace those fromJust calls with some slightly larger construct 
if you actually want to do real error handling.)
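That "slightly larger construct" is just pattern-matching on the Maybe values instead of crashing through fromJust. A sketch, using hypothetical stand-ins parseURI' and httpGet' so it's self-contained (the real parseURI lives in the network library, and httpGet is whatever your HTTP package provides):

```haskell
import Data.Char (isSpace)

-- Hypothetical stand-in for the library's parseURI: rejects empty
-- strings and anything containing whitespace.
parseURI' :: String -> Maybe String
parseURI' url
  | null url || any isSpace url = Nothing
  | otherwise                   = Just url

-- Hypothetical stand-in for httpGet: always "succeeds" with a canned page.
httpGet' :: String -> IO (Maybe String)
httpGet' _ = return (Just "<html>...</html>")

-- Real error handling: each Nothing becomes a descriptive Left,
-- and the happy path falls through to Right.
fetchPage :: String -> IO (Either String String)
fetchPage url =
  case parseURI' url of
    Nothing  -> return (Left ("malformed URL: " ++ url))
    Just uri -> do
      maybePage <- httpGet' uri
      case maybePage of
        Nothing   -> return (Left "HTTP request failed")
        Just page -> return (Right page)
```

The caller then decides what a Left means, instead of the program dying somewhere inside fromJust.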

> Ruby is a gem !

LOL! I bet you're not the first to think that one up...



Alternatively, if you feel ill, you might try to write the Haskell 
version as

   page <- (httpGet $ fromJust $ parseURI "http://www.google.be") >>= 
(return . fromJust)

Certainly I can see where the "scary" issue comes from...
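For what it's worth, the `m >>= return . f` idiom is just `fmap f m`, which takes some of the scariness out of that one-liner. Another sketch with the same hypothetical stand-ins (and still no error handling, so Nothing crashes):

```haskell
import Data.Maybe (fromJust)

-- Hypothetical stand-ins for the library's parseURI and httpGet:
parseURI' :: String -> Maybe String
parseURI' url = if null url then Nothing else Just url

httpGet' :: String -> IO (Maybe String)
httpGet' _ = return (Just "<html>...</html>")

-- (httpGet' uri) >>= (return . fromJust) rewritten with fmap:
fetchPage :: String -> IO String
fetchPage url = fmap fromJust (httpGet' (fromJust (parseURI' url)))
```

Same pipeline, one fewer operator to decode.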

