POV-Ray : Newsgroups : povray.off-topic : Wikipath
  Wikipath (Message 8 to 17 of 47)
From: Darren New
Subject: Re: Wikipath
Date: 7 Aug 2008 11:44:33
Message: <489b1861$1@news.povray.org>
Invisible wrote:
> don't know much about COBOL myself. All I heard is "if you miss out one 
> dot, the compiler will report an error message 800 lines later".

Well, I can tell you that if you spell it "IDNETIFICATION DIVISION" on 
the first line, you'll get 700 error messages on a 300-line program. ;-)

COBOL wasn't bad for its time. It's wordy because everything is line 
oriented, and it was supposed to be easy for non-programmers to read, 
and a 32K machine with 4 meg of disk space and millisecond cycle times 
was a high-end mainframe.

There's no dynamic layout or allocation of memory. Operating systems 
weren't nearly as uniform as they are today.

(COBOL has evolved, of course, but I'm talking about the 60's versions.)

> (I don't know whether that's true or not, but any programming language 
> where beginners might realistically write an 800 line program worries me!)

It was, IIRC, some 350 lines to write a program to look through the 
employee file and calculate and print how many days each employee had 
been employed, in descending order. COBOL was pretty darn verbose.

A great deal of that was boilerplate, of course, identifying the 
program, naming the files it used and the layouts of their records, etc 
etc etc. The actual body of the code was probably 100 lines.

-- 
Darren New / San Diego, CA, USA (PST)
  Ever notice how people in a zombie movie never already know how to
  kill zombies? Ask 100 random people in America how to kill someone
  who has reanimated from the dead in a secret viral weapons lab,
  and how many do you think already know you need a head-shot?



From: Warp
Subject: Re: Wikipath
Date: 7 Aug 2008 13:02:33
Message: <489b2aa9@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> COBOL wasn't bad for its time. It's wordy because everything is line 
> oriented, and it was supposed to be easy for non-programmers to read, 
> and a 32K machine with 4 meg of disk space and millisecond cycle times 
> was a high-end mainframe.

  I always got the impression that COBOL was the "BASIC" of the 70's,
while all real programmers used Fortran. (Heck, even nowadays some old
gurus state that real programmers use Fortran.)

-- 
                                                          - Warp



From: Jim Henderson
Subject: Re: Wikipath
Date: 7 Aug 2008 13:04:00
Message: <489b2b00@news.povray.org>
On Thu, 07 Aug 2008 13:02:33 -0400, Warp wrote:

>   I always got the impression that COBOL was the "BASIC" of the 70's,

Try the 60's. ;-)

Jim



From: Clarence1898
Subject: Re: Wikipath
Date: 7 Aug 2008 14:15:00
Message: <web.489b3adeeafae58b91a2a0d40@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Invisible wrote:
> > don't know much about COBOL myself. All I heard is "if you miss out one
> > dot, the compiler will report an error message 800 lines later".
>
> Well, I can tell you that if you spell it "IDNETIFICATION DIVISION" on
> the first line, you'll get 700 error messages on a 300-line program. ;-)
>
> COBOL wasn't bad for its time. It's wordy because everything is line
> oriented, and it was supposed to be easy for non-programmers to read,
> and a 32K machine with 4 meg of disk space and millisecond cycle times
> was a high-end mainframe.
>
> There's no dynamic layout or allocation of memory. Operating systems
> weren't nearly as uniform as they are today.
>
> (COBOL has evolved, of course, but I'm talking about the 60's versions.)
>
> > (I don't know whether that's true or not, but any programming language
> > where beginners might realistically write an 800 line program worries me!)
>
> It was, IIRC, some 350 lines to write a program to look through the
> employee file and calculate and print how many days each employee had
> been employed, in descending order. COBOL was pretty darn verbose.
>
> A great deal of that was boilerplate, of course, identifying the
> program, naming the files it used and the layouts of their records, etc
> etc etc. The actual body of the code was probably 100 lines.

COBOL has evolved quite a bit.  In its latest incarnation it has built-in
support for CICS, XML, WebSphere and DB2.  It can even talk to Java.  That has
to at least double the number of error messages.

Isaac



From: Clarence1898
Subject: Re: Wikipath
Date: 7 Aug 2008 14:30:00
Message: <web.489b3dfdeafae58b91a2a0d40@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> Darren New <dne### [at] sanrrcom> wrote:
> > COBOL wasn't bad for its time. It's wordy because everything is line
> > oriented, and it was supposed to be easy for non-programmers to read,
> > and a 32K machine with 4 meg of disk space and millisecond cycle times
> > was a high-end mainframe.
>
>   I always got the impression that COBOL was the "BASIC" of the 70's,
> while all real programmers used Fortran. (Heck, even nowadays some old
> gurus state that real programmers use Fortran.)

COBOL was a little more than BASIC, but it was "THE" business programming
language.  For most large businesses today, it is still the primary programming
language for their mainstream applications (i.e. order processing, inventory,
billing, etc.), at least in the USA.  Since I'm not in the scientific or
engineering field, I don't know how much Fortran is still being used.  Most
COBOL programmers looked down their noses at the nerdy little techie Fortran
programmers, while, as you said, Fortran programmers never considered COBOL
programmers to be real programmers.  Since Fortran was my first language, I
never have cared much for COBOL, but unfortunately I have to deal with it most
every day.

Isaac



From: Darren New
Subject: Re: Wikipath
Date: 7 Aug 2008 14:57:07
Message: <489b4583$1@news.povray.org>
Warp wrote:
>   I always got the impression that COBOL was the "BASIC" of the 70's,

Vice versa. BASIC was the COBOL of the 70's. They were both pretty much 
the same programming model, tho. Flat code space, flat data space, 
everything preallocated, etc.

> while all real programmers used Fortran. (Heck, even nowadays some old
> gurus state that real programmers use Fortran.)

Depends what you were programming. People doing databases and business 
math (i.e., decimal, reports, etc) didn't use FORTRAN. People doing 
physics simulations didn't use COBOL.

-- 
Darren New / San Diego, CA, USA (PST)
  Ever notice how people in a zombie movie never already know how to
  kill zombies? Ask 100 random people in America how to kill someone
  who has reanimated from the dead in a secret viral weapons lab,
  and how many do you think already know you need a head-shot?



From: Invisible
Subject: Re: Wikipath
Date: 14 Aug 2008 06:49:39
Message: <48a40dc3$1@news.povray.org>
Clarence1898 wrote:

> Since I am unfamiliar with lambda calculus, not being taught in the FORTRAN
> class I took at college, I looked it up on wikipedia.  After a few paragraphs,
> my eyes glazed over and could no longer focus.

In seriousness now...

A Turing machine (as you may or may not know) is a "computer" simplified 
down to the barest essentials. It turns out, the simpler a machine 
becomes, the harder it becomes to program it.

[Strictly, a *universal* Turing machine is a simplified programmable 
computer, if you want to nitpick.]

Similarly, the Lambda calculus is sort of the simplest possible 
programming language. And again, it turns out the simpler the language 
is, the harder it is to program anything with it!

The Lambda calculus is a programming language with 1 datatype and 1 
operator. And it's Turing-complete. That's a pretty impressive result!

The only datatype is "function", and the only operator is "function 
call". You pass each function an argument (which must be a function, 
since there aren't any other datatypes), and it returns a result (which, 
again, has got to be another function).

It's like binary. Binary isn't "complicated" - actually it's extremely 
simple. The confusing thing is that you need *a lot* of binary digits to 
actually "say" anything. And 0010010101100100110111010111010110 makes 
you go dizzy after a while.

Similarly, the Lambda calculus isn't "complicated", it's just that the 
whole "function takes a function and yields a function" seems a little 
confusing initially.

Many people speak of the "enriched lambda calculus", which is lambda 
calculus with more datatypes and operators so it's not *quite* so insane 
to use. The enriched lambda calculus really is quite simple. For example,

   λx· 3*x + 5

is a lambda function. It does roughly the same thing as

   function NONAME(x)
   {
     return 3*x + 5;
   }

in JavaScript. The difference - obviously - is that lambda functions 
don't have names.
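
In modern JavaScript the arrow syntax is an almost literal transcription 
of the lambda notation (binding it to the name `f` here is purely for 
convenience - the function itself is still anonymous):

```javascript
// λx· 3*x + 5, transcribed directly:
const f = x => 3 * x + 5;

console.log(f(2)); // 11
```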

The freaky part about the "pure" lambda calculus is that you use 
functions to represent things that aren't functions. For example, you 
say that any 2-argument function that throws away the second argument 
and returns the first one represents "true", and any 2-argument 
function that throws away the first argument and returns the second 
represents "false". (This is the convention that makes "true" the K 
combinator mentioned further down.)
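
As a sketch, here's that encoding in JavaScript, using the usual 
convention where "true" keeps its first argument (the names TRUE, FALSE 
and ifThenElse are just labels I've picked; the functions themselves are 
the values):

```javascript
// Church booleans: a boolean is a two-argument selector.
const TRUE  = a => b => a;  // keep the first argument
const FALSE = a => b => b;  // keep the second argument

// "if p then t else f" is just function application:
const ifThenElse = p => t => f => p(t)(f);

// Logical operators fall out of the encoding:
const not = p => a => b => p(b)(a);  // swap the branches
const and = p => q => p(q)(p);       // if p then q else p

console.log(ifThenElse(TRUE)("yes")("no"));      // "yes"
console.log(ifThenElse(not(TRUE))("yes")("no")); // "no"
```

Notice that ifThenElse needs no special machinery at all: a Church 
boolean *is* the selection.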

You can represent numbers as functions, and then write a function that 
takes two of these functions-that-represent-numbers and constructs a 
function that represents the number you get when you add these two 
numbers together.

...which is a really confusing way of saying that you can perform 
mathematical addition in the pure lambda calculus. Instead of taking two 
numbers, converting them to binary and putting them through some digital 
circuitry, you take two numbers, convert them to functions, apply the 
addition function, and convert the resulting function back into a 
number. Same deal, just more perplexing.
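
A sketch of that scheme in JavaScript - these are the so-called Church 
numerals, where the number n is represented as "apply a function n 
times" (the names ZERO, SUCC, ADD and toInt are mine):

```javascript
// Church numerals: n is the function that applies f to x, n times.
const ZERO = f => x => x;
const SUCC = n => f => x => f(n(f)(x));

// Addition: apply f m times, then n more times.
const ADD = m => n => f => x => m(f)(n(f)(x));

// Convert back to an ordinary number by counting applications:
const toInt = n => n(k => k + 1)(0);

const TWO   = SUCC(SUCC(ZERO));
const THREE = SUCC(TWO);

console.log(toInt(ADD(TWO)(THREE))); // 5
```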



Now if you REALLY WANT TO MAKE YOUR HEAD FRIGGIN HURT... Try the SKI 
combinator calculus.

The lambda calculus says: If you can construct a function with any 
possible shape, then you can represent any kind of data and perform any 
algorithm on that data.

The SKI combinator calculus says: You can construct any shape of 
function from just 3 functions, named "S", "K" and "I".

For example, the "true" function becomes K, and the "false" function 
becomes KI. The "2" function is S(S(KS)K)I, and so forth.

...in other words, you write entire programs out of lots of S's, K's, 
I's and brackets!
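
Here's a sketch of S, K and I in JavaScript, which makes those claims 
easy to check mechanically:

```javascript
// The three SKI combinators:
const S = x => y => z => x(z)(y(z));
const K = x => y => x;
const I = x => x;

// K is the "true" selector (keeps its first argument), and
// K(I) is "false": K(I)(a)(b) = I(b) = b.
console.log(K("yes")("no"));    // "yes"
console.log(K(I)("yes")("no")); // "no"

// The Church numeral 2 built from S, K and I, as in the text;
// applied to a function f and a value x it computes f(f(x)).
const TWO = S(S(K(S))(K))(I);
console.log(TWO(k => k + 1)(0)); // 2
```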



IF YOU WANT TO TOTALLY BREAK YOUR MIND, you may try the Iota calculus.

SKI calculus says: You can make any function from just S, K and I.

Iota calculus says: You can make S, K and I from the X function.

In other words, the Iota calculus means you write whole programs that 
just consist of X's and brackets. That's all! Basically, the shape of 
the parse tree is the program! o_O
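
This, too, can be checked mechanically. A sketch in JavaScript of the 
author's X combinator (usually written iota, and defined as X f = f S K), 
with S and K defined directly so the block stands alone:

```javascript
const S = x => y => z => x(z)(y(z));
const K = x => y => x;

// The single Iota combinator: X f = f S K.
const X = f => f(S)(K);

// I, K and S can all be recovered from X alone:
const I2 = X(X);           // behaves as I
const K2 = X(X(X(X)));     // behaves as K
const S2 = X(X(X(X(X))));  // behaves as S

console.log(I2(42));          // 42
console.log(K2("yes")("no")); // "yes"
console.log(S2(K2)(K2)(7));   // S K K = I, so 7
```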

Somewhere I had a printout for the Iota calculus program to compute 2+2. 
Suffice it to say, it's 4 lines long, and looks something like

   X(X(XX)X(X)X(XXX)X(X)X(X(XX)))X(X(X))))X(X)...

So when I said that the lambda calculus is the "simplest" programming 
language, actually I lied. The Iota calculus is. If you write

   X(XX)

as

   *X*XX

instead (the "*" being a prefix application operator), then every Iota 
program becomes something like

  ****X*X**XX*X**X*XXX*X*XXX...

and you have a programming language that contains only two symbols - "X" 
and "*" - that is Turing-complete, and hence can express any possible 
algorithm.

The only real way to top that would be to convert the symbols to 1s and 
0s, encode the whole program as a binary integer, and then convert it to 
a unary integer. That way, your program would be a line of (several 
billion) 1s. But even I'm not *that* crazy!

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Mike Raiford
Subject: Re: Wikipath
Date: 14 Aug 2008 07:59:42
Message: <48a41e2e$1@news.povray.org>
Invisible wrote:

> Iota calculus says: You can make S, K and I from the X function.
> 
> In other words, the Iota calculus means you write whole programs that 
> just consist of X's and brackets. That's all! Basically, the shape of 
> the parse tree is the program! o_O
> 
> Somewhere I had a printout for the Iota calculus program to compute 2+2. 
> Suffice it to say, it's 4 lines long, and looks something like
> 
>   X(X(XX)X(X)X(XXX)X(X)X(X(XX)))X(X(X))))X(X)...

I think this is the invention of a sick mind. My brain hurts.

Why would someone want to do this? (other than the obvious self-abuse?)



From: Invisible
Subject: Re: Wikipath
Date: 14 Aug 2008 08:06:26
Message: <48a41fc2$1@news.povray.org>
>> In other words, the Iota calculus means you write whole programs that 
>> just consist of X's and brackets. That's all! Basically, the shape of 
>> the parse tree is the program! o_O

>>   X(X(XX)X(X)X(XXX)X(X)X(X(XX)))X(X(X))))X(X)...

> I think this is the invention of a sick mind. My brain hurts.
> 
> Why would someone want to do this? (other than the obvious self-abuse?)

To prove it can be done.

It's an interesting theoretical question: "How simple can a programming 
language be before it stops working?"

The answer - apparently - is "extremely simple".

As I said, as the language gets simpler, all the programs become more 
complex. Until, eventually, you arrive at the Iota calculus, the 
simplest programming language that can exist. And its programs are 
FREAKIN CRAZY!

I just wish there was some way to "run" these programs to prove that 
they do actually work, and it's not just line noise. ;-)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Darren New
Subject: Re: Wikipath
Date: 14 Aug 2008 12:21:35
Message: <48a45b8f$1@news.povray.org>
Invisible wrote:
> [Strictly, a *universal* Turing machine is a simplified programmable 
> computer, if you want to nitpick.]

More completely, a Turing machine is a (very simple) machine with the 
program in ROM. A universal Turing machine is a Turing machine whose 
program (in ROM) is an interpreter for a program stored in the RAM. 
That's the sense in which it's "programmable".

The benefit is that if you can prove something is true of the universal 
Turing machine, you can prove it's true of all Turing machines, because 
the universal Turing machine can be programmed to behave like any other 
Turing machine.

Mike Raiford wrote:
 > Why would someone want to do this? (other than the obvious self-abuse?)

Because it's math. If you can prove that X, with enough work, is 
equivalent to any Y, and then you prove some property of X holds, then 
you can prove that property holds for every Y.

If you can prove that (for example) you can't solve a problem with Iota 
calculus, and you can prove that Iota calculus can solve any problem a 
normal desktop computer can solve, then you can prove your desktop 
computer can't solve it.

By being very very simple, you make it easy to prove things about it 
that you'd never really want to *do* in actual real life.

-- 
Darren New / San Diego, CA, USA (PST)
  Ever notice how people in a zombie movie never already know how to
  kill zombies? Ask 100 random people in America how to kill someone
  who has reanimated from the dead in a secret viral weapons lab,
  and how many do you think already know you need a head-shot?




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.