nemesis <nam### [at] gmailcom> wrote:
> yes, that's some food for thought. Seeing as hardware nowadays is so fast that
> raytracing is starting to get used in game engines for real-time applications
> and NVidia buying Mental Images mainly for its highly-touted real-time shading
> tech Mental Mill, perhaps it is time for stills to move on to more precise and
> automatic lighting models...
I would not want to wait 11 hours for a single sphere to render.
Being slow for the sake of being slow is not the answer.
--
- Warp
Warp wrote:
> Trying to support both methods at the same time is a bit problematic
> too. It's hard to mix them. Basically you would need two separate
> rendering engines and a way to choose which one is used where. The
> full BRDF definitions are basically useless in method 1.
>
> If you want POV-Ray to support raytracing with BRDFs, that would
> mean basically that two different raytracers would need to be packaged
> into one executable, and you simply choose which one is used. The advantage
> in this can be dubious. It may be much better to simply have two different
> programs.
>
The main advantage is that the raytracing algorithms (as in, ray-object
intersections), SDL parsing, camera projections, and maybe even pigments
would be common to both renderers. finish{}es would probably need
totally different properties for each renderer, but I think that's it.
Anything that alters ("fakes") the normals may be problematic (smooth
heightfields, Bezier patches, meshes with smooth triangles, or normal{}
on any object), since I have seen lighting models that do not use the
normal vector at all. But almost everything else would just work.
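To make the split concrete, here is a rough sketch of the kind of structure I
have in mind (purely hypothetical names, not actual POV-Ray internals):
everything up to the ray/object intersection is shared, and only the shading
stage is chosen per render.

struct Vec3  { double x = 0, y = 0, z = 0; };
struct Ray   { Vec3 origin, direction; };
struct Hit   { Vec3 point, normal; int object_id = -1; };
struct Color { double r = 0, g = 0, b = 0; };
struct Scene { /* parsed SDL: objects, pigments, camera, ... */ };

// Shared by both engines: SDL parsing, camera projection, ray/object
// intersection.  (Stubbed out here.)
bool Intersect(const Scene&, const Ray&, Hit*) { return false; }

// Only the shading stage differs between the two renderers.
class ShadingEngine {
public:
    virtual ~ShadingEngine() {}
    virtual Color Shade(const Scene&, const Ray&, const Hit&) const = 0;
};

// Method 1: the classic finish{} model (ambient/diffuse/phong terms).
class ClassicEngine : public ShadingEngine {
    Color Shade(const Scene&, const Ray&, const Hit&) const override { return Color(); }
};

// Method 2: full BRDF evaluation with stochastic sampling.
class BrdfEngine : public ShadingEngine {
    Color Shade(const Scene&, const Ray&, const Hit&) const override { return Color(); }
};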
Warp <war### [at] tagpovrayorg> wrote:
> I would not want to wait 11 hours for a single sphere to render.
Of course you know that is not true; it was intentional exaggeration. Even
LuxRender, the open-source unbiased renderer, still at an early stage, is capable
of rendering moderately complex scenes in a few hours on today's hardware.
The output is a bit noisy, but bearable...
nemesis <nam### [at] gmailcom> wrote:
> Warp <war### [at] tagpovrayorg> wrote:
> > I would not want to wait 11 hours for a single sphere to render.
> Of course you know that is not true; it was intentional exaggeration. Even
> LuxRender, the open-source unbiased renderer, still at an early stage, is capable
> of rendering moderately complex scenes in a few hours on today's hardware.
> The output is a bit noisy, but bearable...
But, you see, I want my simple scenes to render in a few minutes, not
in a few hours. There are scenes where I don't *need* that kind of accuracy
or photorealism, where a simple Phong lighting model is more than sufficient,
and I want them to render fast and clean.
For example, if I want to render some "3D-looking" buttons for a graphical
interface, I don't want to spend hours waiting for them to look good. I want
them to render in a few seconds, and POV-Ray can do exactly that.
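(For reference, that lighting model amounts to little more than a couple of dot
products per light source, which is exactly why it renders in seconds. A rough
sketch of the idea, not the actual POV-Ray source:)

#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

static double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Classic Phong term for one light: ambient + diffuse + specular highlight.
// n = surface normal, l = direction to the light, v = direction to the eye,
// all assumed normalized.  'shininess' plays the role of phong_size.
double PhongIntensity(const Vec3& n, const Vec3& l, const Vec3& v,
                      double ambient, double diffuse,
                      double phong, double shininess)
{
    double ndl = dot(n, l);
    // Mirror direction of the light about the normal: r = 2(n.l)n - l.
    Vec3 r { 2*ndl*n.x - l.x, 2*ndl*n.y - l.y, 2*ndl*n.z - l.z };
    double diffuse_term  = std::max(0.0, ndl);
    double specular_term = std::pow(std::max(0.0, dot(r, v)), shininess);
    return ambient + diffuse*diffuse_term + phong*specular_term;
}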
I'm also happy that I can render images like these if I need to:
http://warp.povusers.org/pics/RubiksRevenge2.jpg
http://warp.povusers.org/pics/screws.jpg
--
- Warp
"H. Karsten" <h-karsten()web.de> wrote in message
news:web.478286cd8ffe2b1020776f0@news.povray.org...
> Hi people
>
> For months I have been working a lot with brute-force rendering. Such a feature in
> POV-Ray would be great!!!!
>
Isn't that similar to photons? Or am I completely off track?
--
-Nekar Xenos-
"Nekar Xenos" <nek### [at] gmailcom> wrote:
> "H. Karsten" <h-karsten()web.de> wrote in message
> news:web.478286cd8ffe2b1020776f0@news.povray.org...
> > Hi people
> >
> > For months I have been working a lot with brute-force rendering. Such a feature in
> > POV-Ray would be great!!!!
> >
>
> Isn't that similar to photons? Or am I completely off track?
>
>
> --
> -Nekar Xenos-
POV-Ray uses photons for caustics (I think), not for radiosity.
But in theory, yes.
nemesis wrote:
>Seeing as hardware nowadays is so fast that
>raytracing is starting to get used in game engines for real-time applications
>and NVidia buying Mental Images mainly for its highly-touted real-time shading
>tech Mental Mill, perhaps it is time for stills to move on to more precise and
>automatic lighting models...
I agree.
We are seeing more and more elements from raytracing being brought into the
graphics cards. First-order reflections, shadows and procedural textures have
existed for quite a while, and the best graphics cards at the time of writing
even have support for radiosity-like features.
It is only a matter of time before the graphics cards will be able to generate
higher-order reflections, as well as simple refraction and caustics. Simple -
but convincing. If POV-ray doesn't keep up with these developments, staying a
couple of steps ahead all the time, it will eventually die out.
I am not familiar with the concept of "brute-force rendering" but I like the
sound of it. :P
Hymyly.
And there was light.
"Hymyly" <chr### [at] hotmailcom> wrote:
> It is only a matter of time before the graphics cards will be able to generate
> higher-order reflections, as well as simple refraction and caustics. Simple -
> but convincing.
Real-time scanline imagery with faked but convincing caustics and reflections is
dramatically displayed in games like Crysis.
> I am not familiar with the concept of "brute-force rendering" but I like the
> sound of it. :P
It basically means stochastic forward raytracing. It takes ages, but produces very
physically accurate and realistic images.
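Roughly speaking, the core of such a renderer is just this loop, repeated with
many random samples per pixel (a generic sketch of the idea, not any particular
renderer's code):

#include <random>

struct Vec3  { double x = 0, y = 0, z = 0; };
struct Color { double r = 0, g = 0, b = 0; };
struct Ray   { Vec3 origin, direction; };

// Stubs standing in for the real machinery: build a jittered camera ray,
// then follow one random light path through the scene, sampling the BRDF
// at every bounce.
Ray   CameraRay(int x, int y, double ju, double jv)           { return Ray(); }
Color TracePath(const Ray& ray, std::mt19937& rng, int depth) { return Color(); }

// "Brute force": average a large number of random paths per pixel.
// The image starts out grainy and converges as the sample count grows.
Color RenderPixel(int x, int y, int samples, std::mt19937& rng)
{
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    Color sum;
    for (int s = 0; s < samples; ++s) {
        Ray ray = CameraRay(x, y, uni(rng), uni(rng));  // jittered sub-pixel position
        Color c = TracePath(ray, rng, 0);               // one random path, BRDF-sampled
        sum.r += c.r;  sum.g += c.g;  sum.b += c.b;
    }
    sum.r /= samples;  sum.g /= samples;  sum.b /= samples;
    return sum;
}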
Hymyly <chr### [at] hotmailcom> wrote:
> We are seeing more and more elements from raytracing being brought into the
> graphics cards. First-order reflections, shadows and procedural textures have
> existed for quite a while
Note that those reflections which are rendered with graphics cards are
not physically accurate. The effect is the same as if the rest of the
scene were infinitely far away from the reflecting object. The closer something
in the scene is to the object, the more visible the error. And of course
self-reflection is completely out of the question.
Also, that type of reflection has a quality/memory usage tradeoff: The
higher the quality of the reflection, the higher the memory usage. This
isn't so with raytracing, obviously.
Many high-end renderers out there (such as, AFAIK, 3DMax) support
raytraced reflections for this exact reason: the scanline-rendered
reflections are not always accurate enough. Unfortunately this also means
that the more accurate raytraced reflections cannot be calculated on
the GPU.
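(The root of the inaccuracy is that this style of reflection looks the reflected
colour up by direction only; the position of the point being shaded never enters
into it. A rough sketch of the idea, with a hypothetical SampleEnvironmentMap
standing in for the pre-rendered cube map:)

struct Vec3  { double x, y, z; };
struct Color { double r, g, b; };

static double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Pre-rendered cube map of the surroundings, indexed by direction only (stub).
Color SampleEnvironmentMap(const Vec3& direction) { return Color(); }

// Hardware-style reflection: mirror the view vector about the normal and
// look that direction up in the environment map.  Note that hit_point is
// never used -- the lookup behaves as if everything else in the scene were
// infinitely far away, and the object can never reflect itself.
Color EnvMapReflection(const Vec3& hit_point, const Vec3& view, const Vec3& normal)
{
    (void)hit_point;  // deliberately unused
    double d = dot(view, normal);
    Vec3 r { view.x - 2*d*normal.x, view.y - 2*d*normal.y, view.z - 2*d*normal.z };
    return SampleEnvironmentMap(r);
}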
There are currently two distinct techniques in real-time hardware
rendering for calculating shadows, neither of which has anything to do
with raytracing. One is shadow mapping, which is image-based and suffers
from the same quality/memory usage tradeoff; the other is stencil-buffer-based
shadow volumes, which are built from polygons and thus not suitable for
mathematical surfaces.
Procedural textures have advanced quite a lot with GPU technology, but
they are still limited to what the GPU shader languages have to offer. For example,
if I'm not completely mistaken, looping and recursion are out of the question.
> It is only a matter of time before the graphics cards will be able to generate
> higher-order reflections, as well as simple refraction and caustics. Simple -
> but convincing. If POV-ray doesn't keep up with these developments, staying a
> couple of steps ahead all the time, it will eventually die out.
To me it sounds like hardware is still badly lagging behind in features.
Extremely fast? Yes. Diverse features? Nope.
> I am not familiar with the concept of "brute-force rendering" but I like the
> sound of it. :P
In layman's terms it means that you might be able to get stunningly realistic
scenes if you have the proper BRDFs and can wait a couple of days for the
graininess to go away.
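(The long wait is simple Monte Carlo arithmetic: with N random samples per
pixel the remaining noise only falls off as

    noise_per_pixel ∝ 1 / sqrt(N)

so halving the grain costs four times the render time, and a ten-fold
improvement costs a hundred-fold.)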
--
- Warp
Just what is brute-force rendering? Inverse raytracing (following
the light rays from the light source)?
The following may sound childish, even ridiculous, but... screw you
Indigo! Long live POV-Ray! THE BEST RENDERER EVER!!!
I warned you, didn't I? :D