POV-Ray : Newsgroups : povray.off-topic : Prehistoric dust Server Time
4 Sep 2024 15:21:28 EDT (-0400)
  Prehistoric dust (Message 51 to 60 of 145)  
From: nemesis
Subject: Re: Dusty
Date: 18 May 2010 14:48:48
Message: <4bf2e110@news.povray.org>
Orchid XP v8 escreveu:
>>>>> 1997? o_O
>>>>
>>>> just in time for the first version of GHC... :-)
>>>
>>> Actually about ten years *after* the first version of GHC.
>>>
>>> Yes, I realise that sounds utterly absurd...
>>
>> Timing was just as inaccurate as the original post.  All for the sake 
>> of a good joke. :)
> 
> Hey, *I* had to look it up.
> 
> It still slightly frightens me that Haskell is actually this old... Just 
> think how much better the world could be today if its ideas had caught 
> on back then?

hey, Lisp was born before C or even Algol and that didn't help either! :P

-- 
a game sig: http://tinyurl.com/d3rxz9



From: Orchid XP v8
Subject: Re: Dusty
Date: 18 May 2010 14:51:35
Message: <4bf2e1b7@news.povray.org>
>> It still slightly frightens me that Haskell is actually this old... 
>> Just think how much better the world could be today if its ideas had 
>> caught on back then?
> 
> hey, Lisp was born before C or even Algol and that didn't help either! :P

Oh well. I guess, as in all other aspects of technology, the least 
effective option always wins.

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Clarence1898
Subject: Re: Dusty
Date: 18 May 2010 14:55:01
Message: <web.4bf2e17aecb621efaba2b8dc0@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> >> So the concept of a filesystem storing named files already existed at
> >> this time?
> >
> > Generally, yes. But you usually wound up pre-allocating files, and they
> > were contiguous on disk.
>
> OK.
>
> Files only on disk? Or on tape too? (From what I've seen, punch cards
> didn't have this level of abstraction. It wouldn't be terribly necessary
> I guess...)

Data on punch cards are treated no differently than data on mag tape, paper tape
or disk.  It is a sequential file with a fixed record length of 80 bytes.
A program only knows it reads 80-byte records; it doesn't care what physical
device it's on. Just don't try to read the punch card data backwards.
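That device independence can be sketched in a few lines. A minimal modern-Python illustration (the file path and helper name are hypothetical), treating any input as a stream of fixed 80-byte card-image records:

```python
RECORD_LENGTH = 80  # one punched card's worth of data


def read_records(path):
    """Yield fixed 80-byte records sequentially.  The caller never
    learns whether the bytes came from cards, tape, or disk."""
    with open(path, "rb") as f:
        while True:
            record = f.read(RECORD_LENGTH)
            if len(record) < RECORD_LENGTH:
                break  # end of file (a short tail is discarded)
            yield record
```

The point is that the reading loop is identical for every medium; only the driver underneath differs.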

> >> Interesting. So the system actually "knows" where each field of a
> >> record is then?
> >
> > Records were fixed size, so it was trivial to calculate.
>
> OK. But does the system know where the *fields* in a record are? Or just
> what size the records are?

The file system does not understand fields.  It understands file organization
(sequential, indexed, random access), record type (fixed, variable, undefined),
and maximum record length and block size.  Interpretation of data at the field
level is done by the program.  If the file is a part of a database, the database
manager handles field definition.
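To illustrate that division of labour: the record layout below is entirely made up, not any real system's. The file system would see only "fixed-length, 80 bytes"; the field offsets live in the program:

```python
# Hypothetical layout for an 80-byte record; only the program knows it.
FIELDS = {
    "name":   (0, 20),   # (offset, length) in bytes
    "dept":   (20, 4),
    "salary": (24, 8),
}


def parse_record(record):
    """Slice named fields out of one raw fixed-length record."""
    return {name: record[off:off + length].decode("ascii").strip()
            for name, (off, length) in FIELDS.items()}
```

Two programs reading the same file with different `FIELDS` tables would see different fields, and the file system would be none the wiser.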

>
> >> Really? I didn't think anybody had mainframes any more... just big
> >> server farms.
> >
> > The people who want to do lots of I/O have machines where instead of
> > GPUs they have IOPs.  A 800,000 line phone switch, for example, is
> > pretty much all IOP, with something like a 68000 running the actual
> > switching part.
> >
> > Of course, what one might call a "PC" nowadays has a terabyte of RAM and
> > 96 quad-core processor chips, so the lines blur.
>
> Yeah, I think the term "mainframe" is probably obsolete now. There are
> probably more exact ways to describe what type of computer you mean.
>

There is a lot of overlap between a "mainframe" and a "PC". Some PC
configurations are more powerful than some mainframes, so what is or is not a
mainframe can be debatable. There are very few things a modern mainframe can do
that a PC can't, and vice versa.  You can run Linux on an IBM z/Series
machine, just like on a PC; as a user it would be difficult to tell the
difference.  It's all a matter of scale.

> --
> http://blog.orphi.me.uk/
> http://www.zazzle.com/MathematicalOrchid*

Isaac



From: Clarence1898
Subject: Re: Dusty
Date: 18 May 2010 15:00:01
Message: <web.4bf2e35fecb621efaba2b8dc0@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> >> It still somewhat blows my mind that you could do anything useful with
> >> so little memory. Presumably for processing large datasets, most of
> >> the data at any one time would be in secondary storage?
> >
> > Large datasets then were also very tiny compared to large datasets of
> > today. :)
>
> Sure. But 1MB is such a tiny amount of memory, it could only hold a few
> thousand records (depending on their size). It would almost be faster to
> process them by hand then go to all the trouble of punching cards and
> feeding them through a computer. So it must have been possible to
> process larger datasets than that somehow.

You never read the entire dataset into memory; you process it a record at a
time.  The only limit on file size is the media, not memory.  There is no
difference in memory consumption between processing 10 records or 10 million
records.
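A minimal sketch of that constant-memory, record-at-a-time style (the numeric field at a fixed offset is a made-up example):

```python
def total_field(records, offset, length):
    """Sum one numeric field over a stream of fixed-length records.
    Only the current record is ever held in memory, so the cost is
    the same for 10 records or 10 million."""
    total = 0
    for record in records:
        total += int(record[offset:offset + length])
    return total
```

Because `records` can be a generator, nothing forces the whole dataset to exist in memory at once; the media really is the only limit.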

>
> > see the revolution that were programs like ed (and its successor vi) in
> > bringing flexible terminal text editing rather than wasting tons of
> > paper... :)
>
> ...not to mention card...
>
> --
> http://blog.orphi.me.uk/
> http://www.zazzle.com/MathematicalOrchid*

Isaac



From: Orchid XP v8
Subject: Re: Dusty
Date: 18 May 2010 15:14:32
Message: <4bf2e718$1@news.povray.org>
>>>> It still somewhat blows my mind that you could do anything useful with
>>>> so little memory. Presumably for processing large datasets, most of
>>>> the data at any one time would be in secondary storage?
>>> Large datasets then were also very tiny compared to large datasets of
>>> today. :)
>> Sure. But 1MB is such a tiny amount of memory, it could only hold a few
>> thousand records (depending on their size). It would almost be faster to
>> process them by hand then go to all the trouble of punching cards and
>> feeding them through a computer. So it must have been possible to
>> process larger datasets than that somehow.
> 
> You never read the entire dataset into memory. You process it a record at a
> time.

Right. That's what I figured.

> The only limit on file size is the media, not memory.  There is no difference in
> memory consumption between processing 10 records or 10 million records.

If you're trying to, say, sort data into ascending order, how do you do 
that? A bubble sort? (Requires two records in memory at once - but, more 
importantly, requires rewritable secondary storage.)

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: andrel
Subject: Re: Dusty
Date: 18 May 2010 15:20:03
Message: <4BF2E866.9090900@gmail.com>
On 18-5-2010 12:56, Invisible wrote:
> I had always assumed that the first computers were like current 
> computers, just using relays or whatever instead of transistors, and 
> with vastly inferior specifications.
> 
> However, it appears that this isn't the case.
> 
> For example, I thought they all used latch circuits for memory, but 
> apparently not. There were things like core memory, which I'd never 
> heard of. Presumably it's faster and cheaper to make core memory as 
> opposed to wiring up thousands of latch circuits?
> 
> Another example. According to legend, there was a time when if you 
> wanted to run a program, you used a machine not unlike a typewriter to 
> punch holes into a card. You "type in" the program onto punch cards like 
> this, and only once the entire program and all its data has been punched 
> do you even go near the actual *computer*. You feed the cards into a 
> reader. It reads them all, and then spends the next six months running 
> the program. Finally, you get a stack of new punched cards representing 
> the results.
> 
> Does anybody know approximately when this time was?

1980, except that it was the next day and not six months later.

> For that matter, does anybody have a broad timeline of when various 
> technologies were in use? What are the dates for things like core 
> memory, drum memory, punch cards, magnetic tape, relays, vacuum tubes, 
> transistors, ICs, etc?

There is probably a Wikipedia page about that; have fun searching.

> Was there ever a time when programs were entered into memory via 
> switches rather than some other medium?

Yes. Like scott, I used a hex pad myself in 1984, and I had a friend who had 
built his own computer and booted it by manually starting the boot 
process using a row of binary toggle switches.

New programs were entered the same way.

> Was there ever a "punched tape" medium similar to punch cards?

Yes. E.g. used for boot loading the machine in my former department until 
1987 or so.

> Similarly, you hear people talk about the VAX, the PDP, the varouis IBM 
> mainframes and Cray supercomputers. Does anybody know the timeline for 
> these, the technologies used and the basic design and performance details?

Wikipedia again, I assume.

(Some answers on http://www.computerhistory.org/timeline/?category=cmptr)

> (Sure, you can look up individual questions on Wikipedia, but the 
> articles tend to contain huge amounts of minute detail about specific 
> things. I'm trying to get a general overview of an entire era.)

The page you are looking for must be there also.



From: Clarence1898
Subject: Re: Dusty
Date: 18 May 2010 15:30:00
Message: <web.4bf2ea0fecb621efaba2b8dc0@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> >>>> It still somewhat blows my mind that you could do anything useful with
> >>>> so little memory. Presumably for processing large datasets, most of
> >>>> the data at any one time would be in secondary storage?
> >>> Large datasets then were also very tiny compared to large datasets of
> >>> today. :)
> >> Sure. But 1MB is such a tiny amount of memory, it could only hold a few
> >> thousand records (depending on their size). It would almost be faster to
> >> process them by hand then go to all the trouble of punching cards and
> >> feeding them through a computer. So it must have been possible to
> >> process larger datasets than that somehow.
> >
> > You never read the entire dataset into memory. You process it a record at a
> > time.
>
> Right. That's what I figured.
>
> > The only limit on file size is the media, not memory.  There is no difference in
> > memory consumption between processing 10 records or 10 million records.
>
> If you're trying to, say, sort data into ascending order, how do you do
> that? A bubble sort? (Requires two records in memory at once - but, more
> importantly, requires rewritable secondary storage.)

Right, you read as many records in as you have memory for, and use rewritable
secondary storage for work files.  Preferably disk, but if you're not in a hurry
mag tape will work.  When VIPs would come around for a tour of our datacenter,
we would always start up a sort using tape for work files.  It was fun to watch
the sort program read, read backwards, rewind, and forward-space the tapes.
Slow, but fun to watch.
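The work-file technique Isaac describes is essentially an external merge sort. A minimal sketch in Python, with temporary files standing in for the tape work files (the function names are hypothetical):

```python
import heapq
import os
import tempfile


def _write_run(sorted_lines):
    """Write one sorted run to a temporary work file; return its path."""
    f = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
    f.writelines(sorted_lines)
    f.close()
    return f.name


def external_sort(lines, chunk_size=1000):
    """Sort an arbitrarily large stream of newline-terminated strings
    using at most chunk_size lines of memory plus work files."""
    runs, chunk = [], []
    for line in lines:
        chunk.append(line)
        if len(chunk) == chunk_size:
            runs.append(_write_run(sorted(chunk)))
            chunk = []
    if chunk:
        runs.append(_write_run(sorted(chunk)))
    # Merge phase: heapq.merge reads each run one line at a time,
    # much as a tape sort reads each work tape sequentially.
    files = [open(r) for r in runs]
    try:
        yield from heapq.merge(*files)
    finally:
        for f in files:
            f.close()
        for r in runs:
            os.remove(r)
```

With `chunk_size=10` this sorts a stream of any length while holding only ten lines in memory at once; the read-backwards trick on real tape drives saved some of the rewinds this version pays for in disk seeks.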

>
> --
> http://blog.orphi.me.uk/
> http://www.zazzle.com/MathematicalOrchid*

Isaac



From: Warp
Subject: Re: Prehistoric dust
Date: 18 May 2010 15:33:37
Message: <4bf2eb91@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> >>>>>   There were no children many hundred billion years ago.
> >>>> I was waiting for that one... *sigh*
> >>>   It's a so-called mathematician's answer.
> > 
> >> Technically, 1 hundred billion years is about 7.3 times the estimated 
> >> age of the universe
> > 
> >   Depends on whether you are using American billions or European billions.

> NAAAAAARGH!! >_<

> I never did like Americans... (Then again, they probably don't like me 
> either.)

  Then why are you using the American billion above?

-- 
                                                          - Warp



From: Stephen
Subject: Re: Dusty
Date: 18 May 2010 15:45:18
Message: <4bf2ee4e$1@news.povray.org>
On 18/05/2010 7:20 PM, Clarence1898 wrote:
> Stephen<mca### [at] aolDOTcom>  wrote:
>> On 18/05/2010 4:44 PM, Invisible wrote:
>>>> As I recall about 60 characters per second.  The tape was paper, was 8
>>>> holes
>>>> wide, and easily broken or scrunched.
>>>
>>> Mmm, that's fairly fast for an optical system.
>>
>> Was it optical? I seem to remember it was mechanical with spring loaded
>> teeth.
>>
>>
>> --
>>
>> Best Regards,
>>   Stephen
>
> I was thinking it was optical.  I don't ever remember having maintenance on the
> read head.


The mechanical readers had teeth that moved up 
and down. Of course the high-speed tape readers must have been optical.

> But that's been over 35 years ago. Now I sometimes have a hard time
> remembering what I had for breakfast.
>

I get round that problem by having the same thing every day. ;-)

-- 

Best Regards,
	Stephen



From: Stephen
Subject: Re: Dusty
Date: 18 May 2010 15:51:32
Message: <4bf2efc4$1@news.povray.org>
On 18/05/2010 7:23 PM, Orchid XP v8 wrote:
>>> Was there ever a "punched tape" medium similar to punch cards?
>>
>> Yes. That's why DEL is up at 127. Think about it.
>
> Oh... oh dear god. You *are* kidding me, right??

He is not. It is the smart thing to do, is it not?

Oh! On second thoughts you could glue the chad back into the holes. :-P
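For anyone who hasn't thought it through: DEL is 127 because on 7-bit paper tape that is every hole punched, and punching can only ever add holes, so any mistyped character could be "rubbed out" by punching the rest of its holes. A quick sanity check (the helper name is made up):

```python
DEL = 0x7F  # 127: all seven bits set = all seven holes punched


def punch_more_holes(code, extra):
    """Punching a hole sets a bit; holes can never be un-punched,
    so the only reachable codes are supersets of the current bits."""
    return code | extra


# Any 7-bit character can be overpunched into DEL...
assert all(punch_more_holes(c, DEL) == DEL for c in range(128))
# ...e.g. 'A' (1000001): punch the six missing holes.
assert punch_more_holes(ord("A"), DEL & ~ord("A")) == DEL
```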

-- 

Best Regards,
	Stephen




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.