POV-Ray : Newsgroups : povray.off-topic : New LuxRender web site (http://www.luxrender.net)
New LuxRender web site (http://www.luxrender.net) (Message 1 to 10 of 175)
From: delle
Subject: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 07:40:00
Message: <web.47bc1f7cc8e4f3c827eef15b0@news.povray.org>
LuxRender is a free, open-source renderer: a heavily modified GPL fork of the
PBRT sources that keeps portability between the two codebases intact wherever
possible and implements new features as a superset of PBRT's.

New Web Site is now http://www.luxrender.net

Here are some of the current features:

User Frontends

    * FLTK-based Graphical User Interface with interactive rendering controls,
tonemapping controls and a progressive/linear viewport.

      Realtime engine control, with adding & removing of rendering threads and
start/pause/restart controls.

      Portable and consistent look & feel across platforms.
    * Command Line User Interface for headless/scripted operation.
    * Direct Blender Python User Interface using the C++ API together with
boost::python (no scene file output or engine launch/loading/parsing is
required). (slated for v0.2 release)

Compatibility/Optimization

    * Native 32- & 64-bit binaries for GNU/Linux, MS-Windows(R) and MacOS X(R)
operating systems will be provided for all major releases.
    * Multithreading for multi-core or SMP systems. Thread control is available
in the API, which allows for creation and removal of threads on the fly during
rendering.
    * Intel SSE/SSE2 SIMD code-level optimizations for vector/matrix types.
    * Boost C++ Shared Libraries are used throughout the system.

C/C++ Programming API

    * Flexible and complete C/C++ programming API, which allows for direct
integration of the LuxRender engine core into applications (provided GPLv3
licensing conditions are met).

      Hierarchical graphics state with attribute and transformation stacks, and
active light retention.
    * Plans are in the works to align the API with Blender for direct
integration of the engine into Blender as a shared library. The v0.2 release
will allow a pseudo implementation of this concept through Python > C++ API
calls (using boost::python).

Scene File format

    * Compact RIB-like file format which implements the complete API (.lxs
extension).
    * Fast Flex/Bison file scanner/parser.

Engine Core
Surface Integrators

    * Unbiased
          o path (Path Tracing)
          o bidirectional (Bidirectional Path Tracing)
    * Biased
          o exphotonmap (Photonmapping using final gathering)
          o irradiancecache
          o igi (Instant Global Illumination)
          o directlighting
    * Misc
          o debug (Visual debugging integrator)

Volume Integrators

    * single (Single Scattering)
    * emission (...)

Samplers

    * metropolis (MLT prototype, v0.2, path/bidirectional integrators only)
    * lowdiscrepancy (Quasi Random using (0,2) sequences)
    * random (Pseudo Random using a Mersenne Twister generator)
    * stratified (subdivision into non-overlapping regions)
    * bestcandidate (patterned using dart throwing)

All samplers are adaptive and can provide samples for all integrators, except
the Metropolis-Hastings sampler, which draws samples from any of the other
samplers and provides mutated versions of them to the path/bidirectional
integrators.
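The Metropolis idea can be sketched in miniature. Below is a toy 1D
Metropolis-Hastings loop (an illustration of the principle only, not LuxRender
source; the target function f is a hypothetical stand-in for a path's image
contribution): mutations are accepted with probability min(1, f(y)/f(x)), so
samples pile up where the contribution is large, just as MLT concentrates
effort on bright paths.

```python
import random

def f(x):
    # hypothetical "image contribution" of sample x: a triangle
    # function peaked at 0.5 (stands in for a path's brightness)
    return max(0.0, 1.0 - abs(x - 0.5) * 2.0)

def metropolis(n_samples, mutation_size=0.2, seed=1):
    rng = random.Random(seed)
    x = rng.random()
    fx = f(x)
    samples = []
    for _ in range(n_samples):
        # propose a small mutation of the current sample (wrapping at 1.0)
        y = (x + rng.uniform(-mutation_size, mutation_size)) % 1.0
        fy = f(y)
        # accept with probability min(1, f(y) / f(x))
        if fx == 0.0 or rng.random() < fy / fx:
            x, fx = y, fy
        samples.append(x)
    return samples
```

Run long enough, the sample density becomes proportional to f; the metropolis
sampler applies the same accept/reject logic to whole light paths supplied by
the underlying samplers.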

The image film can be sampled progressively (e.g. watch as the render becomes
less and less noisy) or in the traditional linear/finite fashion.
Geometry
Intersection Acceleration Structures

    * grid (Uniform grid)
    * kdtree (SAH KDtree)

Shapes/Primitives

    * sphere (full or partial with u/v parameters)
    * cylinder (full or partial with angle parameter)
    * disk (full or partial with angle parameter)
    * cone (full or partial with angle parameter)
    * paraboloid (full or partial with angle parameter)
    * hyperboloid (full or partial with angle parameter)
    * trianglemesh
    * heightfield
    * loopsubdiv (loop subdivision from base control mesh with arbitrary levels)
    * nurbs (Non-Uniform Rational B-Splines)

Lightsources
Geometric

    * area

      An emissive light source which can use any Shape as its geometry. Also
called a 'mesh emitter' if used with a trianglemesh Shape. (supports Exit
Portals)

Environment

    * sunsky

      Physically based Sunlight & Daylight model (Preetham/Shirley/Smits) as
used in most modern unbiased renderers. (supports Exit Portals)
    * infinite

      Infinite Area Light, environment lightsource/map using a colour or
texture. (supports Exit Portals)
    * infinitesample

      Same as above but using improved sampling. (supports Exit Portals)

Traditional

    * distant (Parallel lightsource)
    * goniometric (Goniophotometric diagram lightsource)
    * point (Traditional point light source)
    * projection (Texture projection lightsource)
    * spot (Traditional spot light source)

Camera Models

    * perspective

      Supports true DOF (depth of field) using lens radius and focal distance
parameters.

      Supports exposure control using shutter open/close time parameters.
    * orthographic
    * environment

Materials, Volumes & Textures
BRDFs

    * Diffuse
          o Lambertian
          o Oren-Nayar (using sigma parameter)
    * Specular
          o Perfect Specular
          o Glossy Specular using any of the following Microfacet distribution
models:
                + Torrance/Sparrow (Isotropic, linear)
                + Blinn (Isotropic, exponential falloff)
                + Ashikhmin/Shirley (Anisotropic/exponential with separate u/v
parameters)
    * Mixing
          o Fresnel for Dielectrics (using IOR parameter)
          o Fresnel for Conductors (using n/k parameters)
          o Fresnel blending for specular with diffuse substrate
(Ashikhmin/Shirley model)

Materials
Unified Material System (slated for v0.2 release)

An innovative new programmable material system is currently in development and
slated for the v0.2 release.

The new system allows for recursive material definitions down to the BRDF
level, while providing easy-to-use frontends for artistic users and dynamic
frontends in the exporters/GUIs.
Legacy PBRT Material System (current, v0.1 release)

    * matte (Lambertian or Oren-Nayar 'matte paint' diffuse reflection)
    * plastic (Glossy Dielectric Diffuse/Specular using a Blinn Microfacet
distribution)
    * shinymetal (Glossy Conductor Diffuse/Specular using a Blinn Microfacet
distribution)
    * substrate (Glossy Fresnel Blended Diffuse substrate/Specular superstrate
using an Iso/Anisotropic Ashikhmin/Shirley Microfacet distribution)
    * glass (Fresnel Dielectric with perfect specular reflection and
transmission)
    * translucent (Fresnel Dielectric with glossy specular reflection and
transmission using a Blinn Microfacet distribution)
    * mirror (Perfect specular reflection)
    * uber (Flexible base material with control over all reflection/transmission
BRDFs)
    * bluepaint (measured)
    * brushedmetal (measured)
    * clay (measured)
    * felt (measured)
    * primer (measured)
    * skin (measured)

Textures

All textures can be used to modulate any material/BRDF parameter, as well as
material bump maps.
Mappings

    * 2D u,v Mapping
    * Planar Mapping
    * Spherical Mapping
    * Cylindrical Mapping
    * 3D Mapping

Mixing

    * mix (Mix any two textures using a specified amount)
    * scale (Multiply 2 textures)

Image Textures

    * imagemap (HDR or LDR image texture, with MIPMAPs)

      Filtering:
          o bilerp (Bilinear)
          o EWA (anisotropic)
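As a sketch of what the bilerp filtering option computes (assumptions: a plain
2D list of scalar texels addressed in texel coordinates, clamped at the edges;
this is not LuxRender source — EWA additionally filters anisotropically):

```python
def bilerp(texels, u, v):
    # bilinear filtering: weight the four surrounding texels by the
    # fractional distance of (u, v) to each, clamping at the borders
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    x1 = min(x0 + 1, len(texels[0]) - 1)
    y1 = min(y0 + 1, len(texels) - 1)
    top = texels[y0][x0] * (1.0 - fx) + texels[y0][x1] * fx
    bottom = texels[y1][x0] * (1.0 - fx) + texels[y1][x1] * fx
    return top * (1.0 - fy) + bottom * fy
```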

Solid & Procedural Textures

    * checkerboard (2D or 3D parameterizable checks)
    * constant (Constant Spectrum)
    * dots (Polka dots, using Perlin noise)
    * fbm (using Multi octave Perlin noise)
    * marble (using Multi octave Perlin noise)
    * windy (using Multi octave Perlin noise)
    * wrinkled (using Multi octave Perlin noise)

Volumes

    * exponential
    * homogenous
    * volumegrid (3D grid based volume)

Film/Texture Imaging Pipeline
Colour handling

    * Spectra (used throughout the render engine core)
    * CIE XYZ (intermediate, usable from API and for XYZ output formats)
    * RGB(A) (used for colour definitions in scene file format/API and RGB(A)
image output.)

Film / Texture Image file formats

    * OpenEXR RGBA
    * TGA (24-bit RGB or 32-bit RGBA)
    * All formats supported by the FreeImage library (slated for v0.2 release)

Tonemapping Kernels

    * contrast
    * highcontrast
    * maxwhite (Maximum to White)
    * nonlinear (Spatially varying non-uniform scale based on Reinhard2001)
    * reinhard (Implementation of nonlinear/Reinhard2004 with burn, pre- and
postscale parameters)
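For reference, the heart of the reinhard kernel is the global Reinhard
operator sketched below (a simplified illustration, not LuxRender's
implementation; the single l_white parameter is an assumption standing in for
the burn/scale controls), followed by the display gamma step listed under the
pipeline features:

```python
def reinhard(l, l_white=4.0):
    # global Reinhard tonemapping operator:
    #   L_d = L * (1 + L / L_white^2) / (1 + L)
    # luminances well below l_white are compressed roughly like L / (1 + L),
    # while L == l_white maps exactly to 1.0 (full white)
    return l * (1.0 + l / (l_white * l_white)) / (1.0 + l)

def gamma_correct(v, gamma=2.2):
    # display gamma correction, applied after tonemapping
    return max(0.0, v) ** (1.0 / gamma)
```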

Additional Pipeline Features

    * HDR Bloom filter
    * Gamma correction
    * Pre-Quantization output dithering

Pixel Sample Reconstruction Filters

All filter sizes are parameterizable.

    * box
    * gaussian
    * mitchell
    * sinc
    * triangle
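As an illustration of such a filter, here is the classic 1D Mitchell-Netravali
weight with the common B = C = 1/3 choice (a generic sketch of the mitchell
filter family, not LuxRender's exact code; the 2D pixel weight is the product
of the 1D weights in x and y):

```python
def mitchell_1d(x, b=1.0 / 3.0, c=1.0 / 3.0):
    # Mitchell-Netravali reconstruction filter: a piecewise cubic that is
    # nonzero for |x| < 2, with a small negative lobe that sharpens edges
    x = abs(x)
    if x < 1.0:
        return ((12 - 9 * b - 6 * c) * x ** 3
                + (-18 + 12 * b + 6 * c) * x ** 2
                + (6 - 2 * b)) / 6.0
    if x < 2.0:
        return ((-b - 6 * c) * x ** 3
                + (6 * b + 30 * c) * x ** 2
                + (-12 * b - 48 * c) * x
                + (8 * b + 24 * c)) / 6.0
    return 0.0
```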


Attachments: 'luxrender.jpg' (124 KB)

From: Warp
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 07:58:54
Message: <47bc240e@news.povray.org>
It would be nice to see some rendertimes.

-- 
                                                          - Warp



From: delle
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 08:30:05
Message: <web.47bc2ad9b014483d27eef15b0@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> It would be nice to see some rendertimes.
>
> --
>                                                           - Warp

I'm looking for the best quality possible, and LuxRender seems to be on the
right path... CPUs are becoming faster and faster... so...

Even POV-Ray with high quality settings on is not so fast....

Try enabling radiosity (high quality), area lights, anti-aliasing and camera
depth of field...

Speaking about "low quality", my Nvidia card can play Crysis at 30 fps ... the
quality is quite good...

;-)

Delle




Attachments: 'crysis.jpg' (259 KB)

From: scott
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 08:37:23
Message: <47bc2d13$1@news.povray.org>
> Speaking about "low quality", my Nvidia card can play Crysis at 30 fps ... 
> the
> quality is quite good...

Ah but imagine if you came across a big chrome sphere, the reflections 
wouldn't be physically correct...



From: somebody
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 09:57:33
Message: <47bc3fdd$1@news.povray.org>
"Warp" <war### [at] tagpovrayorg> wrote in message
news:47bc240e@news.povray.org...

>   It would be nice to see some rendertimes.

It would also be nice if one of these hundreds of OS renderers that are
popping up could advance beyond alpha 0.1 and also integrate with modelers
other than Blender.



From: Severi Salminen
Subject: Re: Brute force renderers
Date: 20 Feb 2008 10:09:15
Message: <47bc429b$1@news.povray.org>
Warp wrote:
>   It would be nice to see some rendertimes.

I have been simply amazed by this "new breed" of renderers, which use
unbiased methods that make the image converge slowly to the correct
solution. Just look at these amazing images made with Indigo Renderer:

http://www.indigorenderer.com/joomla/index.php?option=com_gallery2&Itemid=26

The rendering times usually vary from a few hours to tens of hours. But it
all depends on quality (i.e. noise) requirements. You get a preview very
quickly, with a lot of noise.
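That "converges slowly to the correct solution" behaviour is just Monte Carlo
integration: the estimator is unbiased and its noise shrinks like 1/sqrt(N). A
minimal sketch (the integrand here is a hypothetical stand-in for one pixel's
radiance, not anything renderer-specific):

```python
import random

def estimate(n, seed=7):
    # unbiased Monte Carlo estimate of the integral of x^2 over [0, 1],
    # whose exact value is 1/3; each sample stands in for one light
    # path's contribution to a pixel
    rng = random.Random(seed)
    return sum(rng.random() ** 2 for _ in range(n)) / n

# more samples -> less noise, exactly like the grain clearing in a render
```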

I've been implementing my own path tracer and my impressions are:

I love the fact that you always get (almost) all the characteristics of
light taken into account. You don't have to figure out whether radiosity,
caustics, indirect shadows, indirect refractions etc. play a significant
role or not. You don't have to tweak tens of parameters and guesstimate
which parameters/values/features give you the effect that the renderer
should handle in the first place. And you don't have to guess if the
artifact you see is really an artifact (because of wrong parameters or
whatever) or a correct result.

I just implemented specular reflections. After doing those I noticed:
"Wow, I just accidentally implemented caustics and phong shading". In the
same way, you don't have to guess which kind of light source and which
parameters you must use to get the desired effect. Every light source can
be of any shape (not just 2D fake sources or physically impossible
points), and every point on the light source surface does indeed emit
light. No parameters, no features to enable, no shadow artifacts. It
Just Works (tm).

The whole purpose of a raytracer is to give you an accurate image.
Otherwise you could use faster scanline renderers. Brute-force renderers
really accomplish this, as can be seen from the images. IMO (and this is
just my opinion) you quite rarely see images of such quality made with
Pov - at least compared to the ones rendered with Indigo. (By "quality"
I mean lighting realism - not necessarily model quality etc.) In many
cases this is not because POV isn't capable of producing almost as good
images. The reason is that it is difficult and time-consuming to
guess/try/tweak which features/parameters you have to enable/set to get
realistic results. Many images made by new users look worse than 3D
games that are 5 years old. Had they used brute force, the results would
be a lot more photorealistic with little effort.

Rendering time is not that big a problem for still images - animations
are a different matter. Brute-force renderers are very easily made
multi-threaded (I know, because I did it just 3 hours ago...) and thus
they scale very well with new multi-core CPUs.
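Band-parallel rendering really is trivially parallel, since every pixel is
independent. A sketch (hedged: shade() is a hypothetical stand-in for tracing
one primary ray, and the band size and worker count are arbitrary):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # stand-in for tracing one primary ray; a real path tracer would
    # integrate radiance here
    return (x * 31 + y * 17) % 256

def render_rows(y0, y1):
    # each worker renders an independent band of scanlines; there is
    # no shared mutable state, which is why parallelising is so easy
    return [[shade(x, y) for x in range(WIDTH)] for y in range(y0, y1)]

def render(n_workers=4):
    # split the image into horizontal bands and render them concurrently;
    # map() preserves band order, so rows reassemble top to bottom
    bands = [(y, min(y + 8, HEIGHT)) for y in range(0, HEIGHT, 8)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(lambda b: render_rows(*b), bands)
    image = []
    for band in results:
        image.extend(band)
    return image
```

The parallel result is bit-identical to a serial render of the same rows; in
CPython a process pool (or a language without a GIL) is what actually buys the
multi-core speedup.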

I waited 3 hours for a small refractive sphere to get drawn on my screen
back in the 90s. I can wait the same 3 hours now to get true global
illumination, caustics and other very important effects of light and I
can be 100% certain that the result is accurate.

I also hope these new methods are considered when the rewriting/design of
Pov4 begins.

This was not a rant against POV. But things go forward, and methods that
were impossible to use 10-20 years ago are now very usable and superior
in many regards. In another 5 years they will be even more usable.

SS



From: Warp
Subject: Re: Brute force renderers
Date: 20 Feb 2008 11:00:05
Message: <47bc4e84@news.povray.org>
Severi Salminen <sev### [at] notthissaunalahtifiinvalid> wrote:
> I waited 3 hours for a small refractive sphere to get drawn on my screen
> back in the 90s. I can wait the same 3 hours now to get true global
> illumination, caustics and other very important effects of light and I
> can be 100% certain that the result is accurate.

  Am I incorrect if I get the impression that these unbiased renderers
offer nothing else than unbiased rendering? In other words, even the
simplest of scenes will take hours to look ungrainy, no matter what
you do?

  Sometimes I use POV-Ray to get simple 3D-looking graphics for diverse
things. The big advantage is that I can do it easily and POV-Ray renders
it very fast. We are talking about a few seconds even for large-sized
images. It would be completely counter-productive to have to wait for
hours for a simple image to look non-grainy, when all you want is something
quick which looks "cool" and "3D'ish".

  For this reason even if POV-Ray in the future supports unbiased rendering,
it should always be an *alternative* method of rendering, not the only
available one. Removing the current phongshading-based rendering would be
a setback in many areas. After all, POV-Ray is not *always* used for
physical simulations of reality.

-- 
                                                          - Warp



From: nemesis
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 11:08:10
Message: <47bc506a@news.povray.org>
yes, I saw luxrender2.org was having issues.

hmm, trying to lure users away from povray or trying to get povray 
developers hard-pressed to integrate unbiased methods into the renderer? ;)

You'll only perhaps succeed with the first goal if you somehow manage to 
get luxrender to parse povray SDL files.  Come on, RIB and all the 
current XML crap sucks very much compared to the high-level povray SDL...

I'll keep watching lux, though...



From: Mike Raiford
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 11:09:36
Message: <47bc50c0@news.povray.org>
delle wrote:
> Warp <war### [at] tagpovrayorg> wrote:
>> It would be nice to see some rendertimes.
>>
>> --
>>                                                           - Warp
> 
> I'm looking for the best Quality possible, LuxRender seems to be on the right
> path... CPU are becoming faster and faster... so...
> 
> Even Povray with high quality settings on in not so fast....
> 
> try to enable: Radiosity (hi Q), area Lights, anti aliasing and camera depth of
> field...
> 
> Speaking about "low quality", my Nvidia card can play Crysis at 30 fps ... the
> quality is quite good...

You didn't answer his question. I'm also curious about render times, but 
you completely dodged the question.



From: Severi Salminen
Subject: Re: New LuxRender web site (http://www.luxrender.net)
Date: 20 Feb 2008 11:13:16
Message: <47bc519c$1@news.povray.org>
Mike Raiford wrote:

> You didn't answer his question. I'm also curious about render times, but
> you completely dodged the question.

See the indigo site gallery. Some captions include render time. Of
course that tells you nothing about how good/bad the image looked in
less time. Remember that with brute force you see a lot even after 1 minute.




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.