POV-Ray : Newsgroups : povray.off-topic : It had to happen again... Server Time
30 Jul 2024 10:16:51 EDT (-0400)
  It had to happen again... (Message 5 to 14 of 54)  
From: Warp
Subject: Re: It had to happen again...
Date: 25 Mar 2011 16:27:36
Message: <4d8cfab8@news.povray.org>
nemesis <nam### [at] gmailcom> wrote:
> In any case, PC gaming is cramped by the success of consoles:  what does it
> matter to have all that power when it's used basically to run console game ports
> with resolutions going higher than your monitor supports and frame rates higher
> than your eyes support...

  With more RAM you can have higher-resolution textures, shadow maps and
so on. Also most game engines support all kinds of graphical effects which
require both RAM and computing prowess and can be turned on and off. With
the console, with its limited RAM and somewhat aging display hardware, they
tune the game settings so that it will work at a good framerate on the
console, while on the PC they leave it up to the player to tune the settings
to the max if he wants and his PC can handle it. And it's not only about
textures and visuals: Also the sheer amount of objects to render can be
limited on the console due to lack of RAM and the aging display hardware.

  The difference can sometimes be pretty striking when comparing side by
side: Textures, view distances, shadows, visual effects can all be
significantly better on a top-of-the-line PC compared to the Xbox 360.

-- 
                                                          - Warp



From: nemesis
Subject: Re: It had to happen again...
Date: 25 Mar 2011 22:15:00
Message: <web.4d8d4b3c984fdae84f17b3b30@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> nemesis <nam### [at] gmailcom> wrote:
> > In any case, PC gaming is cramped by the success of consoles:  what does it
> > matter to have all that power when it's used basically to run console game ports
> > with resolutions going higher than your monitor supports and frame rates higher
> > than your eyes support...
>
>   With more RAM you can have higher-resolution textures, shadow maps and
> so on.

at least as far as producers provide them. ;)

> Also most game engines support all kinds of graphical effects which
> require both RAM and computing prowess and can be turned on and off. With
> the console, with its limited RAM and somewhat aging display hardware, they
> tune the game settings so that it will work at a good framerate on the
> console, while on the PC they leave it up to the player to tune the settings
> to the max if he wants and his PC can handle it. And it's not only about
> textures and visuals: Also the sheer amount of objects to render can be
> limited on the console due to lack of RAM and the aging display hardware.
>
>   The difference can sometimes be pretty striking when comparing side by
> side: Textures, view distances, shadows, visual effects can all be
> significantly better on a top-of-the-line PC compared to the Xbox 360.

Yep, but a top-of-the-line PC is also significantly more expensive, and games only
run on Windows.

More than that, my earlier comparison between Shadow of the Colossus on the PS2 and
Assassin's Creed as current-gen tech should illustrate what I expect from
next-gen:  more and better everything, not simply a better framerate, higher-res
textures or more depth of field showing exactly the same geometry.

SotC was top-notch on the PS2, but compared to AC it looks barren and empty, with a
far less detailed world and a far less detailed character. AC brings not just more
of the world visible at once, but a much more detailed one, populated by a
stunning number of characters that are themselves far more detailed than the main
character in SotC.  Plus much better texturing, lighting, effects, etc.

That's next-gen, not just better tech settings or slightly clearer textures...
we're still to see that, even on PC.  I suspect we only will when Crytek and Epic
reveal their next-gen engines running on the latest GPUs...



From: Orchid XP v8
Subject: Re: It had to happen again...
Date: 27 Mar 2011 06:58:40
Message: <4d8f1860$1@news.povray.org>
On 25/03/2011 07:32 PM, Warp wrote:

>    I warmly recommend Assassin's Creed 2. I don't know how it is on the PS3,
> but on the Xbox 360 it looks really gorgeous.

I found the graphics to be fairly tame on the PC versions of AS1 and 
AS2. I also liked the style of AS1 more than AS2. (It's really hard to 
take people seriously when they speak with an Italian accent. It just 
sounds like a bad comedy sketch.) That said, AS2 is certainly not a 
*bad* game. It's much bigger than AS1, and a lot more varied too. It's 
just much harder to keep track of what's supposed to be happening...

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Warp
Subject: Re: It had to happen again...
Date: 28 Mar 2011 13:37:21
Message: <4d90c751@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> I found the graphics to be fairly tame on the PC versions of AS1 and 
> AS2.

  Depending on how powerful your PC is, you might be simply comparing it
to other PC games, rather than to other Xbox 360 games.

  The Xbox 360 has a pretty decent CPU and GPU even by today's standards
(although the GPU is probably getting a bit antiquated by the month), but
the absolutely major flaw of the system is the amount of RAM: 512 MB, which
by today's standards is absolutely puny. (Many PC games have had a minimum
requirement of 1GB for years.) What is worse, the RAM is shared by the CPU
and the GPU, in other words, the GPU doesn't have its own RAM (as is
customary in a PC). This means that graphics and other game data have to
share the 512 MB (unlike on a PC, where graphics can be done mostly on the
GPU's RAM and whatever the game needs on the main RAM).

  This is a real shame because the hardware is otherwise pretty capable.
The major problem is that games run out of RAM pretty fast, and have to
make do with less memory-hungry graphical settings such as lower-resolution
textures, shadow maps, and so on. It also often limits visibility
distances and forces somewhat drastic LOD changes.

  Having played several Xbox 360 games and, while they were graphically
nice, being slightly disappointed at some of the graphics (especially
low-resolution textures are sometimes a bit annoying), I was positively
surprised at the graphical quality of Assassin's Creed 2. Most textures
are really crisp, and the levels of detail are really high even at very
large distances. While you can sometimes see changes in LOD as you move,
it's surprisingly rare. And this on an open sandbox game.

  I'm sure they had to resort to pretty clever tricks to get this while
being constrained by 512 MB of RAM in total.

-- 
                                                          - Warp



From: Alain
Subject: Re: It had to happen again...
Date: 28 Mar 2011 15:21:41
Message: <4d90dfc5@news.povray.org>

> Orchid XP v8<voi### [at] devnull>  wrote:
>> I found the graphics to be fairly tame on the PC versions of AS1 and
>> AS2.
>
>    Depending on how powerful your PC is, you might be simply comparing it
> to other PC games, rather than to other Xbox 360 games.
>
>    The Xbox 360 has a pretty decent CPU and GPU even by today's standards
> (although the GPU is probably getting a bit antiquated by the month), but
> the absolutely major flaw of the system is the amount of RAM: 512 MB, which
> by today's standards is absolutely puny. (Many PC games have had a minimum
> requirement of 1GB for years.) What is worse, the RAM is shared by the CPU
> and the GPU, in other words, the GPU doesn't have its own RAM (as is
> customary in a PC). This means that graphics and other game data have to
> share the 512 MB (unlike on a PC, where graphics can be done mostly on the
> GPU's RAM and whatever the game needs on the main RAM).
>

It's one of the reasons to refuse any PC with integrated video. Most 
laptops are in that case, and all netbooks. Those also share the main 
RAM between the CPU and graphics.
Another problem with that is that the CPU always has to wait for the 
GPU. Whenever the GPU, or other display hardware, is accessing your 
RAM, NOTHING else can access it. If it were the other way around, you'd 
get a corrupted display with shearing, horizontal and vertical rolling, 
and a lot of flicker.

Depending on the resolution, colour depth and refresh rate used, it can 
hit your CPU performance by over 50%.
On multicore systems, it's even worse.



Alain



From: Warp
Subject: Re: It had to happen again...
Date: 28 Mar 2011 15:39:40
Message: <4d90e3fc@news.povray.org>
Alain <aze### [at] qwertyorg> wrote:
> It's one of the reason to refuse any PC with an integrated video.

  When I bought my current PC a friend of mine had a bit earlier bought
an almost identical one (same motherboard, same CPU, same amount of RAM).
The difference was that he wasn't going to use his PC for gaming, so he
was content with the integrated GPU on the motherboard, while I bought
one of the top GPUs of the time. (Well, another difference was that he
had more hard drives, but that's inconsequential.)

  Because it's not every day that one can test almost identical hardware
setups side-by-side, with the only difference being the GPU, we tested
with a version of 3DMark. The difference in speed was outright astonishing.
His computer ran it at about 1-2 frames per second at worst, while mine
ran it at a comfortable 15-20 frames per second at the lowest (higher on
average).

-- 
                                                          - Warp



From: Darren New
Subject: Re: It had to happen again...
Date: 28 Mar 2011 18:11:57
Message: <4d9107ad$1@news.povray.org>
On 3/28/2011 12:21, Alain wrote:
> Another problem with that, is that the CPU always have to wait for the GPU.

The Amiga (as well as some other machines of that era) and some old 
mainframes would clock the memory at a multiple of the CPU speed and 
time-slice the DMA that way. Now that memory is the slowest non-moving part 
of the system, one can't really do that any more.

-- 
Darren New, San Diego CA, USA (PST)
   "Coding without comments is like
    driving without turn signals."



From: scott
Subject: Re: It had to happen again...
Date: 29 Mar 2011 04:04:35
Message: <4d919293$1@news.povray.org>
>> requirement of 1GB for years.) What is worse, the RAM is shared by the
>> CPU
>> and the GPU, in other words, the GPU doesn't have its own RAM (as is
>> customary in a PC). This means that graphics and other game data have to
>> share the 512 MB (unlike on a PC, where graphics can be done mostly on
>> the
>> GPU's RAM and whatever the game needs on the main RAM).

It has advantages though, like not needing to transfer any data between 
the CPU RAM and the GPU RAM (which is a major bottleneck in a PC, 
causing many complex algorithms to be developed).  On the xbox the CPU 
can update textures and meshes directly, without needing to either write 
a complex vertex/pixel shader or transfer large amounts of data 
per frame.

The lack of CPU RAM isn't as large a problem as you might expect.  A 
well written PC game engine tries to minimise transfer between CPU RAM 
and GPU RAM each frame, spreading out large data transfers (eg due to 
the player entering a new part of the level) over many frames.  On a 
console you load the new data from disc directly to the shared RAM 
rather than from CPU RAM.
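The amortized-streaming idea above — splitting one big upload across many frames under a fixed per-frame transfer budget — can be sketched roughly like this (all numbers and names here are illustrative, not from any real engine):

```python
# Sketch of budgeted streaming: upload a large asset to GPU RAM a
# chunk at a time, so no single frame exceeds its transfer budget.
# The budget value is a made-up illustrative figure.

FRAME_BUDGET = 2 * 1024 * 1024  # pretend we allow ~2 MB of uploads per frame


def stream_upload(asset_bytes, budget=FRAME_BUDGET):
    """Yield one frame's worth of the asset at a time until fully uploaded."""
    offset = 0
    while offset < len(asset_bytes):
        chunk = asset_bytes[offset:offset + budget]
        offset += len(chunk)
        yield chunk  # hand this chunk to the GPU this frame


# A 5 MB texture is spread over 3 frames at a 2 MB/frame budget.
frames = list(stream_upload(bytearray(5 * 1024 * 1024)))
```

The same pattern applies whether the destination is dedicated GPU RAM (PC) or the shared pool (console); only the copy step differs.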

Of course some things need a lot of CPU RAM which are not possible on 
the xbox (or need some clever programming), but I guess they decided the 
unified RAM between CPU and GPU gave the best bang-for-buck for most games.

> Another problem with that, is that the CPU always have to wait for the
> GPU. Whenever the GPU, and other display hardware, is accessing your
> RAM, NOTHING else can access it.

The difference between the xbox and normal PCs with shared memory is 
that the xbox has a memory bandwidth of 22.4 GB/s.  Cheap PC 
motherboards today with onboard GPU have something like 8 GB/s memory 
bandwidth, laptops probably lower.  Even today the latest DDR3-2133 RAM 
has "only" 17 GB/s.
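The 17 GB/s figure follows from the usual peak-bandwidth arithmetic — transfer rate times bus width — assuming a standard single 64-bit memory channel:

```python
# Peak theoretical bandwidth of one DDR3-2133 channel:
# 2133 million transfers/s * 8 bytes (64-bit bus) per transfer.
transfers_per_sec = 2133e6
bytes_per_transfer = 8          # one 64-bit memory channel
bandwidth = transfers_per_sec * bytes_per_transfer
print(bandwidth / 1e9)          # ~17.1 GB/s, matching the "only 17 GB/s" figure
```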

> If it was the other way around, you'd
> get corrupted display with shearing, horizontal and vertical rolling,
> and lot of flicker.

Well you can still write to the RAM (obviously not at *exactly* the same 
time as the GPU is reading it), you just make sure what you are writing 
to is not needed by the GPU that frame (usually you write to a shadow 
copy which then gets used the next frame).
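The shadow-copy scheme is essentially double buffering: the CPU writes only a back copy while the GPU reads the front copy, and the two swap between frames. A minimal sketch (class and method names are illustrative):

```python
# Minimal double-buffer sketch: within a frame the CPU only writes the
# "back" copy and the GPU only reads the "front" copy, so they never
# touch the same buffer at the same time. Names are illustrative.

class DoubleBuffer:
    def __init__(self, size):
        self.front = bytearray(size)  # what the GPU reads this frame
        self.back = bytearray(size)   # what the CPU writes this frame

    def cpu_write(self, offset, data):
        self.back[offset:offset + len(data)] = data

    def swap(self):
        """Called between frames, once the GPU is done with `front`."""
        self.front, self.back = self.back, self.front


buf = DoubleBuffer(4)
buf.cpu_write(0, b"new!")
buf.swap()
# After the swap, the GPU-visible copy holds the CPU's update.
```

Real engines do the same thing with vertex/constant buffers, usually letting the driver rename buffers behind the scenes rather than swapping by hand.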



From: Orchid XP v8
Subject: Re: It had to happen again...
Date: 29 Mar 2011 14:53:03
Message: <4d922a8f$1@news.povray.org>
>> I found the graphics to be fairly tame on the PC versions of AS1 and
>> AS2.
>
>    Depending on how powerful your PC is, you might be simply comparing it
> to other PC games, rather than to other Xbox 360 games.

I haven't seen any Xbox games. I'm just comparing AS1 and AS2 on the PC 
to, say, Team Fortress 2 on the PC. (I could compare it to Crysis, but 
surely nothing else is *that* insane...)

Don't get me wrong, the graphics weren't *bad*. They just weren't 
especially great, as PC games go.

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Alain
Subject: Re: It had to happen again...
Date: 29 Mar 2011 19:11:38
Message: <4d92672a$1@news.povray.org>

>>> requirement of 1GB for years.) What is worse, the RAM is shared by the
>>> CPU
>>> and the GPU, in other words, the GPU doesn't have its own RAM (as is
>>> customary in a PC). This means that graphics and other game data have to
>>> share the 512 MB (unlike on a PC, where graphics can be done mostly on
>>> the
>>> GPU's RAM and whatever the game needs on the main RAM).
>
> It has advantages though, like not needing to transfer any data between
> the CPU RAM and the GPU RAM (which is a major bottle-neck in a PC,
> causing many complex algorithms to be developed). On the xbox the CPU
> can update textures and meshes directly without needing to either write
> a complex vertex/pixel shader or transfer over large amounts of data
> per-frame.

Not at all in the case of PCs. The GPU doesn't see the memory used by 
the CPU, and the CPU doesn't see the memory used by the GPU. For the CPU, 
the RAM allocated to the video just doesn't exist. The boundary is set at 
the BIOS level and can't change at run time.
The data needs to be moved from the main part of the RAM to the video 
part. It passes through a simulated PCI/AGP/PCIe interface.

>
> The lack of CPU RAM isn't as large a problem as you might expect. A well
> written PC game engine tries to minimise transfer between CPU RAM and
> GPU RAM each frame, spreading out large data transfers (eg due to the
> player entering a new part of the level) over many frames. On a console
> you load the new data from disc directly to the shared RAM rather than
> from CPU RAM.

On a PC, there is NO shared RAM. On a game console, you may have it.
On a PC, the video RAM looks just like that of a dedicated video card.

>
> Of course some things need a lot of CPU RAM which are not possible on
> the xbox (or need some clever programming), but I guess they decided the
> unified RAM between CPU and GPU gave the best bang-for-buck for most games.
>
>> Another problem with that, is that the CPU always have to wait for the
>> GPU. Whenever the GPU, and other display hardware, is accessing your
>> RAM, NOTHING else can access it.
>
> The difference between the xbox and normal PCs with shared memory is
> that the xbox has a memory bandwidth of 22.4 GB/s. Cheap PC motherboards
> today with onboard GPU have something like 8 GB/s memory bandwidth,
> laptops probably lower. Even today the latest DDR3-2133 RAM has "only"
> 17 GB/s.

I'd guess that is dual-ported DDR, possibly dual channel.

>
>> If it was the other way around, you'd
>> get corrupted display with shearing, horizontal and vertical rolling,
>> and lot of flicker.
>
> Well you can still write to the RAM (obviously not at *exactly* the same
> time as the GPU is reading it), you just make sure what you are writing
> to is not needed by the GPU that frame (usually you write to a shadow
> copy which then gets used the next frame).

There is a hardware, BIOS-controlled mutex with absolute priority given to 
the GPU. There is only ONE data bus.
To do what you mention, you'd need dual-ported RAM (that's totally 
different from dual channel). Dual porting lets you have 2 concurrent 
data exchanges, 2 reads, 2 writes, or a read and a write, at the same 
time from 2 sources or destinations. Some video RAM IS dual-ported, and 
maybe some consoles do use that kind of memory. It's not new; Matrox 
video cards used it in the '90s at the consumer level. I have one, 
a Matrox Mystique.

As Warp was able to test, a discrete video card on a PC can mean a 
tenfold performance boost.

I originally mentioned a 50% hit; looks like it's more like an 80 to 90% hit...



Alain




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.