From: Jim Charter
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 14:07:22
Message: <47237e5a$1@news.povray.org>
Nicolas Alvarez wrote:
>
> I said it on other threads, but why not repeat:
Indeed...repeat. This time I got it.
From: Orchid XP v7
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 15:11:35
Message: <47238d67$1@news.povray.org>
Gail Shaw wrote:
>> Wait... since when does POV not have "true" area lights?
>
> Since always. Area lights are implemented as arrays of point lights
And this is observably different how, exactly?
Also, what about radiosity with a big square on the ceiling set to have
a nonzero ambient figure?
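For what it's worth, the "array of point lights" scheme Gail describes is easy to sketch. A hypothetical Python illustration (names are made up, the occluder test stands in for a real shadow-ray test, and POV-Ray's actual area_light additionally supports jitter and adaptive subdivision):

```python
def soft_shadow_fraction(corner, u_vec, v_vec, nu, nv, point, occluded):
    """Approximate an area light as an nu x nv grid of point lights:
    illumination at `point` is the fraction of sample lights visible
    from it. `occluded(a, b)` returns True if the path a->b is blocked
    (a stand-in for a real shadow-ray test)."""
    visible = 0
    for i in range(nu):
        for j in range(nv):
            # Centre of grid cell (i, j) on the light's surface
            s = [corner[k] + (i + 0.5) / nu * u_vec[k]
                 + (j + 0.5) / nv * v_vec[k] for k in range(3)]
            if not occluded(point, s):
                visible += 1
    return visible / (nu * nv)

# Toy occluder: a wall blocks every light sample with x < 0
blocked = lambda p, s: s[0] < 0

# 4x4 light spanning x in [-1, 1]: exactly half its samples are occluded
frac = soft_shadow_fraction([-1, 0, 5], [2, 0, 0], [0, 2, 0], 4, 4,
                            [0, 0, 0], blocked)
print(frac)  # 0.5 -- a penumbra value between full shadow and full light
```

The observable difference from a "true" area light is exactly this quantisation: the penumbra takes one of nu*nv discrete values per pixel unless jitter smooths it out.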
From: Darren New
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 16:57:30
Message: <4723a63a@news.povray.org>
Warp wrote:
> algorithm called "radiosity", which is related to calculating lightmaps,
...
> calculating light maps which have a certain pixel resolution.
My memory of the algorithm is that you could build it in a way that let
the areas with lots of detail have finer resolution than the areas with
less detail, and still get all the benefits of the "real" radiosity
algorithm. Am I mistaken here?
--
Darren New / San Diego, CA, USA (PST)
Remember the good old days, when we
used to complain about cryptography
being export-restricted?
From: Gilles Tran
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 17:25:09
Message: <4723acb5$1@news.povray.org>
Orchid XP v7 wrote in message <47238d67$1@news.povray.org>:
> Also, what about radiosity with a big square on the ceiling set to have a
> nonzero ambient figure?
You still won't get specular highlights. Unless of course you give every
material the physically correct blurred reflection necessary to obtain
specularity. Good luck with that.
G.
From: Warp
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 19:31:11
Message: <4723ca3f@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> My memory of the algorithm is that you could build it in a way that let
> the areas with lots of detail have finer resolution than the areas with
> less detail, and still get all the benefits of the "real" radiosity
> algorithm. Am I mistaken here?
I suppose that if you are calculating the lightmaps into something other
than bitmaps you could do adaptive supersampling (i.e., if two adjacent
samples differ too much, take an additional sample in between). I also
suppose that if you do that you lose the efficiency of having lightmaps as
bitmaps...
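That "take an additional sample in-between" rule is easy to sketch in one dimension. A hypothetical Python illustration (not actual lightmap code):

```python
def adaptive_samples(f, a, b, threshold, depth=8):
    """Adaptive supersampling: start with samples at the interval ends;
    wherever two adjacent samples differ by more than `threshold`,
    insert a sample in between and recurse (up to `depth` levels)."""
    fa, fb = f(a), f(b)
    if depth == 0 or abs(fa - fb) <= threshold:
        return [(a, fa), (b, fb)]
    m = (a + b) / 2
    left = adaptive_samples(f, a, m, threshold, depth - 1)
    right = adaptive_samples(f, m, b, threshold, depth - 1)
    return left + right[1:]   # drop the duplicated midpoint sample

# A "lightmap" with a sharp shadow edge at x = 0.3
light = lambda x: 0.0 if x < 0.3 else 1.0
samples = adaptive_samples(light, 0.0, 1.0, 0.1)
# The samples cluster tightly around the discontinuity, while the flat
# regions on either side get only their endpoint samples.
```

This is exactly the trade-off Warp notes: the result is an irregular list of (position, value) pairs, not a fixed-resolution bitmap you can index in constant time.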
--
- Warp
From: Warp
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 19:32:37
Message: <4723ca94@news.povray.org>
Gilles Tran <gitran_nospam_@wanadoo.fr> wrote:
> You still won't get specular highlights. Unless of course you give every
> material the physically correct blurred reflection necessary to obtain
> specularity. Good luck with that.
Combining the blurred-reflection trick with 'exponent' should make it at
least partially possible. Why do you need luck with that?
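The way an exponent-style parameter trades blur for sharpness can be sketched in a few lines. This samples a Phong-like cos^n lobe and is only an illustration of the principle, not POV-Ray's actual highlight or reflection code:

```python
import math
import random

def sample_phong_lobe(exponent, rng):
    """Sample an angle off the mirror direction from a cos^n lobe
    (pdf proportional to cos(theta)^n): the larger the exponent, the
    tighter the lobe. Uses the standard inverse-CDF trick
    cos(theta) = u^(1/(n+1)) for uniform u."""
    u = rng.random()
    return math.acos(u ** (1.0 / (exponent + 1)))

rng = random.Random(1)

def mean_spread(n, count=2000):
    """Average scatter angle (radians) of `count` sampled rays."""
    return sum(sample_phong_lobe(n, rng) for _ in range(count)) / count

wide = mean_spread(5)     # low exponent: broad, blurry reflection
tight = mean_spread(200)  # high exponent: near-specular highlight
print(wide > tight)  # True
```

So a high exponent collapses the blurred reflection toward a mirror direction, which is why the combination can fake specular highlights at least partially.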
--
- Warp
Because Indigo has (ONLY) been tested on Windows XP(R) and Windows 2000(R)
I just want to say, give me another SDL!
****** SDL Simple DirectMedia Layer
****** SDL Specification and Description Language (CCITT)
****** SDL Space Dynamics Laboratory
****** SDL Specification and Design Language
****** SDL SAW (Surface Acoustic Wave) Delay Line
****** SDL Security Development Lifecycle
****** SDL Self Directed Learning
***** SDL Service Description Language
***** SDL Systems Development Laboratory (JPL)
**** SDL System Description Language
**** SDL Software Development Library
**** SDL Schematic Driven Layout
**** SDL System Design Language
**** SDL Storage Definition Language
**** SDL System Design Laboratory
**** SDL Sogosogo Duavata Ni Lewenivanua (United Fiji Party, Fiji)
**** SDL Software Development Laboratory
**** SDL Serial Data Link
**** SDL Search Digital Libraries
**** SDL Standard Distribution List
**** SDL Structure Description Language
**** SDL Sundsvall, Sweden - Sundsvall (Airport Code)
**** SDL Soft Defect Localization (scanning laser microscopy methodology)
**** SDL Secure Domain Logon
**** SDL Superimposed Dead Load
*** SDL Signaling Data Link
*** SDL Sample Detection Limit
*** SDL Surface Data Logging
*** SDL Structured Design Language
*** SDL Satellite Data Link
*** SDL Scouts du Liban (Lebanon)
*** SDL Stammdienststelle der Luftwaffe (German)
*** SDL State Designated Level
*** SDL Sensor Data Link
*** SDL Switched Delay Line (fiber optics)
*** SDL Synchronous Delay Line
*** SDL Screen Definition Language
*** SDL Service Delivery Lead
* SDL Shared Distribution List
* SDL Scottsdale, Arizona - Municipal (airport code)
* SDL Supplementary Defect List
* SDL Smart Data Loopback (Hekimian)
* SDL Solution Demonstration Laboratory
* SDL Standard Direct Layer (low level computer graphics)
* SDL Subcontractor Data List
* SDL Supplier Document List
* SDL Solution Defeating Lunacy
* SDL Stomach Damaging Lecture
* SDL Spirit Destroying Life
* SDL Spirit Destroying Location
* SDL Spirit Destroying Linkage
* SDL So Damn Lucky
liquidation
locomotion
lullaby
lipectomy
lesson
liberation
lost_cause
lock
lining
lechery
lamination
Orchid XP v7 <voi### [at] devnull> wrote:
> Tom York wrote:
> > What unbiased methods give you is certainty. If you leave them long enough
> > they *will* approach the true solution.
>
> So will POV-Ray's radiosity system, if you turn the settings up high
> enough. (And wait a damn long time...)
No, being a biased method it definitely isn't guaranteed to (even ignoring
limits on quality settings that others have mentioned). The true solution isn't
the only solution that looks good, of course, so whether or not this is a
problem depends upon the scene; but the main point is that you can put in
additional rendering time using POV's radiosity without necessarily improving
the image in the way you want (some artefacts may never disappear).
> Your point?
My point in the previous message was the rest of that paragraph, the part you
didn't quote. The nice thing (or one of the nice things) about the unbiased
methods is that you can wind them up and let them go and the quality will
definitely increase over time - minimising tweaking/re-rendering to avoid
stubborn artefacts.
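The convergence guarantee described here is just the law of large numbers at work. A toy Python illustration, estimating a known integral rather than a light-transport integral:

```python
import random

def mc_estimate(n, rng):
    """Unbiased Monte Carlo estimate of the integral of x^2 over [0, 1]
    (true value 1/3): correct in expectation for ANY sample count n,
    with error shrinking like 1/sqrt(n) as samples accumulate."""
    return sum(rng.random() ** 2 for _ in range(n)) / n

def avg_error(n, trials, rng):
    """Average absolute error over several independent runs."""
    return sum(abs(mc_estimate(n, rng) - 1 / 3)
               for _ in range(trials)) / trials

rng = random.Random(42)
coarse = avg_error(100, 20, rng)
fine = avg_error(10_000, 20, rng)
# Throwing 100x more samples at it reliably shrinks the error (~10x,
# per the 1/sqrt(n) rate): quality definitely increases over time,
# with no settings to tune. A biased estimator carries no such guarantee.
```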
Tom
From: Darren New
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 20:59:23
Message: <4723deeb$1@news.povray.org>
|
|
|
| |
| |
|
|
Warp wrote:
> I suppose that if you are calculating the lightmaps into something else
> than bitmaps you could do adaptive supersampling
Hmmm. Unless you're talking about how to turn a bunch of radiosity chips
in 3D into a bitmap of the 3D structure as seen from a particular point
in space, I am confused. Either you're talking about something other
than what I learned, or my memory of what I learned doesn't match what
the "radiosity" algorithm really does.
--
Darren New / San Diego, CA, USA (PST)
Remember the good old days, when we
used to complain about cryptography
being export-restricted?
From: John VanSickle
Subject: Re: This is another "free" unbiased engine: Indigo Render
Date: 27 Oct 2007 21:04:19
Message: <4723e013@news.povray.org>
|
|
|
| |
| |
|
|
Darren New wrote:
> Color me unimpressed. Maybe it's because I'm not an expert, but some of
> the sub-surface scattering stuff is the only stuff that looks
> particularly good to me. Balanced against most of their proud gallery
> being obnoxiously grainy, I don't see it as a win just from the photos.
>
> Is it possible to automatically know when a scene is good enough? Or
> does it take human intervention to say "ok, stop now and move on to the
> next frame"?
For animations this is a show-stopper. Picture quality *must* be
consistent from frame to frame, and that rules out any perceptible
degree of graininess. Letting the unbiased renderers go until the grain
is gone is not practical, because that requires a human to monitor the
render, and requires that human to decide consistently from one frame to
the next. The only way an unbiased renderer could be used in animation
work is to let it render the first frame of every shot, decide on an
acceptable quality level, and then allow that much time for each frame,
and hope that the movement of some object or the camera doesn't increase
the time requirement significantly.
(And if you want grain for some reason, other renderers, and
post-processors too, can supply it in a way that is much easier to control.)
Ray-tracing and z-buffering deliver consistency from frame to frame,
which is why animators use those rendering algorithms. Pixar's renderer
uses a z-buffering architecture, combined with ray-tracing for certain
situations; in their docs they say that the only real drawback to
ray-tracing is the requirement that the entire scene be containable in
memory (which for Pixar's work is a show-stopper; their scenes can use
insane amounts of data). To this I'd add that z-buffering handles
displacement mapping much more efficiently than ray-tracing does.
Regards,
John