http://www.tomshardware.com/news/windows-cpu-gpu,6645.html
Crysis with no GPU? Now I've seen everything!
> http://www.tomshardware.com/news/windows-cpu-gpu,6645.html
"Systems built to take advantage of WARP from a hardware standpoint will be
able to display graphics even when the video card is missing-or toasted."
So what are they going to do, put the DVI connector on the motherboard or
something? Sounds cool.
> Crysis with no GPU? Now I've seen everything!
Looks like this is just a beefed up version of the reference rasteriser that
comes with the DirectX SDK.
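For anyone who hasn't poked at one: here's a toy sketch of the core of what a CPU rasteriser like the reference device does per pixel (coverage test only, no shading/depth/clipping; the function names and the Python are mine, not anything from the SDK):

```python
# Minimal CPU triangle rasteriser using edge functions (half-space tests).
# A real reference rasteriser does this, plus shading, depth, clipping, etc.

def edge(ax, ay, bx, by, px, py):
    # Signed area term; >= 0 means p lies on the left of edge a->b
    # for a counter-clockwise triangle.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterise_triangle(v0, v1, v2, width, height):
    """Return the set of (x, y) pixels whose centres the triangle covers."""
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                covered.add((x, y))
    return covered

# Counter-clockwise right triangle covering the lower-left half of an 8x8 grid.
pixels = rasterise_triangle((0, 0), (8, 0), (0, 8), 8, 8)
```

Brute-forcing every pixel against every triangle like this is exactly why a naive software rasteriser crawls compared to a GPU.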
scott wrote:
>> http://www.tomshardware.com/news/windows-cpu-gpu,6645.html
>
> "Systems built to take advantage of WARP from a hardware standpoint will
> be able to display graphics even when the video card is missing-or
> toasted."
>
> So what are they going to do, put the DVI connector on the motherboard
> or something? Sounds cool.
I guess the idea is that if the GPU is toast but the card itself still
works, it's still usable... but wouldn't that be the case anyway? *shrugs*
>> Crysis with no GPU? Now I've seen everything!
>
> Looks like this is just a beefed up version of the reference rasteriser
> that comes with the DirectX SDK.
Beefed up to support all DirectX 10.1 features, yeah...
> I guess the idea is that if the GPU is toast but the card itself still
> works, it's still usable... but wouldn't that be the case anyway? *shrugs*
But how is it going to work if the entire video card is "missing" like they
say?
>>> Crysis with no GPU? Now I've seen everything!
>>
>> Looks like this is just a beefed up version of the reference rasteriser
>> that comes with the DirectX SDK.
>
> Beefed up to support all DirectX 10.1 features, yeah...
The reference rasteriser already supports all DirectX 10.1 features; that's
the idea of it. You use it to check that your code/GPU combination produces
the same output as the reference rasteriser, to rule out any weird driver
issues. I have the DX10 reference rasteriser here from the March 2008 SDK.
Amusingly, I just tried it on one of the samples: 20fps on the GPU, 0.2fps
on the CPU :-) I guess they made it a bit faster :-)
>> I guess the idea is that if the GPU is toast but the card itself still
>> works, it's still usable... but wouldn't that be the case anyway?
>> *shrugs*
>
> But how is it going to work if the entire video card is "missing" like
> they say?
Doesn't make any sense to me either. Maybe the reviewer got their facts
wrong?
> The reference rasteriser already supports all DirectX 10.1 features,
> that's the idea of it.
So I'm guessing it's coded for correctness rather than speed. If that's
true, there are probably a few easy ways to speed it up.
> Amusingly I just tried it on one of the
> samples, 20fps on the GPU, 0.2fps on the CPU :-) I guess they made it a
> bit faster :-)
Which GPU? Which CPU? ;-)
> Doesn't make any sense to me either. Maybe the reviewer got their facts
> wrong?
It seems from reading the MS site that it can be used to render when no GPU
card is present, and then send the results over the network for remote
viewing. This is probably what the reviewer meant.
>> The reference rasteriser already supports all DirectX 10.1 features,
>> that's the idea of it.
>
> So I'm guessing it's coded for correctness rather than speed. If that's
> true, there's probably a few easy ways to speed it up.
Yeah, the MS site also says that WARP is based on the reference rasteriser.
Apparently it is designed to make full use of SSE4.1 and multiple cores.
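MS hasn't published WARP's internals, but the multi-core half of that is easy to picture: split the framebuffer's scanlines into bands and hand each band to a core. A toy sketch (hypothetical names, Python threads standing in for worker cores; the SSE part would be per-pixel SIMD on top of this):

```python
# Split rasterisation work across cores by giving each worker a band
# of scanlines. "shade" stands in for the real per-pixel work
# (coverage tests, pixel shaders, blending).
from concurrent.futures import ThreadPoolExecutor

def shade(x, y):
    # Placeholder per-pixel computation.
    return (x * 31 + y * 17) % 256

def render_rows(y0, y1, width):
    return [[shade(x, y) for x in range(width)] for y in range(y0, y1)]

def render(width, height, workers=4):
    band = (height + workers - 1) // workers  # rows per worker, rounded up
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [
            pool.submit(render_rows, y, min(y + band, height), width)
            for y in range(0, height, band)
        ]
    rows = []
    for f in futures:          # reassemble bands in scanline order
        rows.extend(f.result())
    return rows

frame = render(64, 64)
```

The result is identical to rendering all the rows sequentially; the bands just run concurrently.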
>> Amusingly I just tried it on one of the samples, 20fps on the GPU, 0.2fps
>> on the CPU :-) I guess they made it a bit faster :-)
>
> Which GPU? Which CPU? ;-)
Quadro FX1700 and E8500.
>> Doesn't make any sense to me either. Maybe the reviewer got their
>> facts wrong?
>
> It seems from reading the MS site that it can be used to render when no
> GPU card is present, and then send the results over the network for
> remote viewing. This is probably what the reviewer meant.
Yeah, seems plausible. They didn't explain that part...
(Even so, if you're just worried about your PC booting up, it's already
usable over the network. Hmm.)
>>> The reference rasteriser already supports all DirectX 10.1 features,
>>> that's the idea of it.
>>
>> So I'm guessing it's coded for correctness rather than speed. If
>> that's true, there's probably a few easy ways to speed it up.
>
> Yeh, the MS site also says that WARP is based on the reference
> rasteriser. Apparently it is designed to make full use of SSE4.1 and
> multiple cores.
The review suggests it plain won't work without SSE. But yeah...
>>> Amusingly I just tried it on one of the samples, 20fps on the GPU,
>>> 0.2fps on the CPU :-) I guess they made it a bit faster :-)
>>
>> Which GPU? Which CPU? ;-)
>
> Quadro FX1700 and E8500.
Ah.
Isn't the Quadro designed for fast 2D work?
> Isn't the Quadro designed for fast 2D work?
It's basically the same as a normal nVidia card, but they get to charge more
for it:
http://en.wikipedia.org/wiki/Quadro
scott wrote:
> "Systems built to take advantage of WARP from a hardware standpoint will
> be able to display graphics even when the video card is missing-or
> toasted."
>
> So what are they going to do, put the DVI connector on the motherboard
> or something? Sounds cool.
Well, I do have a machine with an onboard DVI connector. Nothing new,
actually - onboard graphics have been around for decades. The
performance is far from top of the line, of course, but perfectly
sufficient for a Linux-based "render slave" (I don't have a display
connected to it anyway :-)).
>> So what are they going to do, put the DVI connector on the motherboard or
>> something? Sounds cool.
>
> Well, I do have a machine with an onboard DVI connector. Nothing new,
> actually - onboard graphics have been around for decades. The
> performance is far from top of the line, of course, but perfectly
> sufficient for a Linux-based "render slave" (I don't have a display
> connected to it anyway :-)).
What I initially thought when I read the article was that they were going to
put the DVI connector on the mainboard and then have a slot for a 3D card.
The 3D card would then feed the display signal back into the mainboard and
out the DVI connector. That was the only way I saw it possible to allow the
CPU to keep showing a display when the graphics card is removed...