POV-Ray : Newsgroups : povray.advanced-users : Reverse engineering pigments and textures
Reverse engineering pigments and textures (Message 1 to 7 of 7)
From: Bald Eagle
Subject: Reverse engineering pigments and textures
Date: 15 Jul 2016 19:10:00
Message: <web.5789343436e895bcb488d9aa0@news.povray.org>
I am having, and have had in the past, difficulty reproducing the colors in
sampled images in my renders.

I was wondering if anyone has had success in sampling rgb values from images or
"rgb color picker" apps and getting good results in their renders.
Using sRGB doesn't seem to help me much.

This also got me thinking about how my overall textures are usually pretty
simplistic and plain.

Has anyone ever thought about or even experimented with scanning images and
trying to work backwards from the color / brightness values and trying to deduce
any patterns?  I would think that someone, somewhere has experimented with this
- universities, Disney, Pixar, IL&M, etc.

Thanks



From: clipka
Subject: Re: Reverse engineering pigments and textures
Date: 15 Jul 2016 20:59:58
Message: <57894ece$1@news.povray.org>
Am 15.07.2016 um 21:06 schrieb Bald Eagle:
> I am having, and have had in the past, difficulty reproducing the colors in
> sampled images in my renders.
> 
> I was wondering if anyone has had success in sampling rgb values from images or
> "rgb color picker" apps and getting good results in their renders.
> using sRGB doesn't seem to help me much.

You have to remember that the colour you see in images is not a
straightforward material colour, but a combination of (a) the material
colour, (b) the surface finish, (c) the geometry, (d) the direct
illumination from light sources, and (e) the indirect illumination via
other objects.

Presuming an overall white illumination, the most important factors will be:

(1) Darkening from shadows and/or the light hitting the surface at a
shallow angle. This corresponds to a mixing with black.

(2) Brightening from (possibly very dull) highlights. This corresponds
to a mixing with white.

In combination, they correspond to a change in total brightness and
desaturation. To counter this, you'll usually have to boost both the
brightness (by multiplying all channels by a constant value) and the
saturation (by subtracting another constant value from all channels).
Exactly how much is something you'll have to figure out by tweaking.

Note that these changes need to be applied _after_ converting the picked
colour values to linear colour space; this is especially true for the
boosting of the saturation, which would otherwise cause a shift in hue.

I haven't tested it, but you should be able to do it like this:

  #declare SB = 0.2; // Saturation boost
  #declare VB = 0.4; // Brightness ("volume") boost
  #declare MyColour = (( srgb <128,255,220>/255 )-SB)*(1+VB);

Note that in this variant the saturation boost also affects the
brightness, but the upside is that the formula remains simple enough to
be written in one declaration statement, or even at the very place where
the colour is actually used.

If you are ok with something more complex but possibly easier to tweak,
try the following:

  #declare SB = 0.2; // Saturation boost
  #declare VB = 0.2; // Brightness ("volume") boost
  #declare RawColour = srgb <128,255,220>/255;
  #declare SatColour = RawColour-SB;
  #declare MyColour  = SatColour*(1+VB)*RawColour.gray/SatColour.gray;

This should keep the general brightness unaffected by tweaks to the
saturation.
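The same arithmetic can be sanity-checked outside SDL. A minimal Python
sketch, assuming the standard sRGB decode and approximating POV-Ray's
`.gray` with a plain channel average (POV-Ray actually weights the
channels, so treat `gray` here as illustrative only):

```python
def srgb_to_linear(c):
    # Standard sRGB decode, per channel, input in 0..1
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def gray(rgb):
    # Plain average; POV-Ray's .gray weights the channels, so this is
    # only an approximation for illustration
    return sum(rgb) / 3.0

def boost(rgb255, sb, vb, stable_brightness=False):
    # Pick an sRGB colour (0..255 per channel), linearise it, then boost
    # saturation (subtract sb from every channel) and brightness (scale
    # by 1+vb). sb must stay below the grey level, or the rescale in the
    # stable-brightness variant divides by a non-positive number.
    raw = tuple(srgb_to_linear(c / 255.0) for c in rgb255)
    sat = tuple(c - sb for c in raw)
    if stable_brightness:
        # second variant: rescale so the grey level changes by exactly 1+vb
        scale = (1 + vb) * gray(raw) / gray(sat)
    else:
        scale = 1 + vb
    return tuple(c * scale for c in sat)
```

With `stable_brightness` the grey level of the result stays at exactly
(1+VB) times the grey level of the picked colour, whatever SB is set to.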


Indirect illumination from nearby colourful surfaces, or a general
non-uniformity in the distribution of colours in the illumination (such
as a sunny sky, with a yellowish sun and blue sky), may introduce
additional challenges by also affecting the hue. Countering these may
prove particularly challenging, and you may end up tweaking each colour
channel individually. However, the above stable-brightness trick may
help a bit with this.


In any case, colours picked from a simple everyday photograph (as
opposed to a photograph taken under carefully chosen lighting
conditions) are typically only useful as a starting point, and will
almost inevitably need a certain degree of tweaking.



From: Thomas de Groot
Subject: Re: Reverse engineering pigments and textures
Date: 16 Jul 2016 06:59:35
Message: <5789db57$1@news.povray.org>
On 15-7-2016 22:59, clipka wrote:
> Am 15.07.2016 um 21:06 schrieb Bald Eagle:
>> I am having, and have had in the past, difficulty reproducing the colors in
>> sampled images in my renders.
>>
>> I was wondering if anyone has had success in sampling rgb values from images or
>> "rgb color picker" apps and getting good results in their renders.
>> using sRGB doesn't seem to help me much.
>
> [snip]
>
> I haven't tested it, but you should be able to do it like this:
>
>   #declare SB = 0.2; // Saturation boost
>   #declare VB = 0.4; // Brightness ("volume") boost
>   #declare MyColour = (( srgb <128,255,220>/255 )-SB)*(1+VB);
>
> Note that in this variant the saturation boost also affects the
> brightness, but the upside is that the formula remains simple enough to
> be written in one declaration statement, or even at the very place where
> the colour is actually used.
>
> If you are ok with something more complex but possibly easier to tweak,
> try the following:
>
>   #declare SB = 0.2; // Saturation boost
>   #declare VB = 0.2; // Brightness ("volume") boost
>   #declare RawColour = srgb <128,255,220>/255;
>   #declare SatColour = RawColour-SB;
>   #declare MyColour  = SatColour*(1+VB)*RawColour.gray/SatColour.gray;
>
> This should keep the general brightness unaffected by tweaks to the
> saturation.

Interesting. I am going to play with this a bit :-)

>
>
> Indirect illumination from nearby colourful surfaces, or a general
> non-uniformity in the distribution of colours in the illumination (such
> as a sunny sky, with a yellowish sun and blue sky), may introduce
> additional challenges by also affecting the hue. Countering these may
> prove particularly challenging, and you may end up tweaking each colour
> channel individually. However, the above stable-brightness trick may
> help a bit with this.

Absolutely.

>
>
> In any case, coloures picked from a simple everyday photograph (as
> opposed to a photograph taken under carefully chosen lighting
> conditions) are typically only useful as a starting point, and will
> almost inevitably need a certain degree of tweaking.
>

I was going to say this. I often 'pick' colours from photographs and 
such but only as a basis for further and very substantial tweaking. An 
example of one such exercise is the following fairly simple roof tile
texture, based on colours picked from photographs of real tiles:

#declare TileScale = 1; // assumed scale; not declared in the original snippet
#declare MedTileTex =
texture {
   pigment {
     cells
     color_map {//colours taken from roof images
       [0.1 srgb <220,158,106>/255]
       [0.2 srgb <217,177,126>/255]
       [0.3 srgb <228,215,173>/255]
       [0.4 srgb <219,152,100>/255]
       [0.5 srgb <166,120, 61>/255]
       [0.6 srgb <225,171,125>/255]
       [0.7 srgb <230,199,142>/255]
       [0.8 srgb <223,187,129>/255]
       [0.9 srgb <236,176,114>/255]
       [1.0 srgb <244,188,131>/255]
     }
     scale <1, 2, 1>*TileScale
   }
   normal {
     average
     normal_map {
       [1 granite 1 scale 0.1]
       [1 cells 2]
     }
     scale <1, 2, 1>*TileScale
   }
   finish {
     specular 0.1
     roughness 0.00286312
     diffuse 0.4
     reflection {0} conserve_energy
   }
}

It can be used for distant views, where each colour represents one tile,
or as a texture for a single tile.


-- 
Thomas



From: Bald Eagle
Subject: Re: Reverse engineering pigments and textures
Date: 16 Jul 2016 16:05:00
Message: <web.578a5aba492309545e7df57c0@news.povray.org>
Thomas de Groot <tho### [at] degrootorg> wrote:

> I was going to say this. I often 'pick' colours from photographs and
> such but only as a basis for further and very substantial tweaking. An
> example of one such exercise is the following fairly simple roof tile
> texture based on colours picked from 'real' tiles photographs:

Indeed.  We've discussed this before, with regard to (IIRC) the Victorian
room's hardwood floor colors.

I'm thinking that there might be a way to automatically create a color map from
an image file using SDL.   Scan all the color values using eval_pigment :)  and
then do a bit of math to see how the color values cluster.

Might need some iterative interaction.   Maybe plot a graph in 3d RGB space.
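As a sketch of the clustering step (plain Python rather than SDL, and a
deliberately naive k-means; the sampling itself, e.g. via eval_pigment,
is assumed to have already produced a list of RGB triples in 0..1):

```python
import random

def kmeans_colors(pixels, k=8, iters=20, seed=1):
    # Cluster sampled RGB triples into k representative colours
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to the nearest centre (squared distance)
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centers[j])))
            buckets[i].append(p)
        # move each centre to the mean of its bucket (keep it if empty)
        centers = [
            tuple(sum(c[ch] for c in b) / len(b) for ch in range(3))
            if b else centers[i]
            for i, b in enumerate(buckets)
        ]
    return centers

def to_color_map(centers):
    # Emit a POV-Ray color_map with evenly spaced entries, dark to light
    n = len(centers)
    lines = ["color_map {"]
    for i, (r, g, b) in enumerate(sorted(centers, key=lambda c: sum(c))):
        lines.append("  [%.2f rgb <%.3f,%.3f,%.3f>]" % ((i + 1) / n, r, g, b))
    lines.append("}")
    return "\n".join(lines)
```

The evenly spaced map positions are a crude choice; weighting positions
by cluster size would be one obvious refinement.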

I might be (probably) reinventing someone's wheel here.

Maybe someone will get to this before I do.    So many experiments.  Need more
round tuits, coffee, and processor cores.



From: Thomas de Groot
Subject: Re: Reverse engineering pigments and textures
Date: 17 Jul 2016 07:17:48
Message: <578b311c$1@news.povray.org>
On 16-7-2016 18:03, Bald Eagle wrote:
> Thomas de Groot <tho### [at] degrootorg> wrote:
>
>> I was going to say this. I often 'pick' colours from photographs and
>> such but only as a basis for further and very substantial tweaking. An
>> example of one such exercise is the following fairly simple roof tile
>> texture based on colours picked from 'real' tiles photographs:
>
> Indeed.  We've discussed this before, with regards to IIRC, the Victorian room's
> hardwood floor colors.
>
> I'm thinking that there might be a way to automatically create a color map from
> an image file using SDL.   Scan all the color values using eval pigment :)  and
> then do a bit of math to see how the color values cluster.
>
> Might need some iterative interaction.   Maybe plot a graph in 3d RGB space.
>
> I might be (probably) reinventing someone's wheel here.
>
> Maybe someone will get to this before I do.    So many experiments.  Need more
> round tuits, coffee, and processor cores.
>

My round tuit generator is over-heated at the moment and spits out 
half-molten artefacts. I am afraid they are not of much use...

-- 
Thomas



From: INVALID ADDRESS
Subject: Re: Reverse engineering pigments and textures
Date: 26 Jul 2016 04:33:01
Message: <1934107526.491199329.263926.gdsHYPHENentropyAThotmaolDOTcom@news.povray.org>
This may or may not be useful for you, but a while back I wrote a simple
.NET app for ripping unique colors from images into include files as
materials and pigments, and an example can be seen here:

http://news.povray.org/povray.binaries.images/thread/%3Cop.vo4db60c0819q0@gdsentropy.nc.rr.com%3E/

The colors were torn from a picture of aluminum corrosion (I used the same
sampling technique in my corrosion macros), and then used to make an
agate-like texture by layering scaled copies of the object with those
materials.

The original goal was to emulate depth, as one would find in a gas giant
cloud deck, which worked fairly well. Then life happened, as it is wont to
do, and I largely had to drop the POV projects I had going. Anyway, it has a
variety of applications, as I am sure you can imagine.

I am confident that there is a better way to do this specific thing (the
physical depth of the texture such that you can see through to lower
layers) but that is neither here nor there.

I can dump the source somewhere; it was the product of maybe 15 minutes'
work, so don't expect much in the way of "production"-worthy code. If I recall
correctly it does have substantial support for various texture and pigment 
properties.

I do not check here often so you can shoot me a mail at (sorry for this):

gds
-
entropy

at hotmail.

The hyphen isn't an accident.

If enough interest exists I can port it up to .NET 4.6.1 and implement it
in WPF, improve threading, and post it in binaries.utilities as source and
binaries.

Ian

Bald Eagle <cre### [at] netscapenet> wrote:
> [snip]



From: scott
Subject: Re: Reverse engineering pigments and textures
Date: 28 Jul 2016 08:05:05
Message: <5799bcb1$1@news.povray.org>
>> I was going to say this. I often 'pick' colours from photographs and
>> such but only as a basis for further and very substantial tweaking. An
>> example of one such exercise is the following fairly simple roof tile
>> texture based on colours picked from 'real' tiles photographs:
>
> Indeed.  We've discussed this before, with regards to IIRC, the Victorian room's
> hardwood floor colors.
>
> I'm thinking that there might be a way to automatically create a color map from
> an image file using SDL.   Scan all the color values using eval pigment :)  and
> then do a bit of math to see how the color values cluster.

You still need to "undo" the lighting, as clipka mentioned.

Otherwise, even with simple lighting in the image, you will not get what 
you expect. Imagine sampling the "bright orange" and "brown" central 
squares from the image below:

https://phenomenalqualities.files.wordpress.com/2009/09/rubiks-cube-2.jpg

Both have the same RGB value, yet this is not what you want to use in
your pigments!
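A crude illustration of what "undoing" means, assuming you could somehow
estimate the diffuse shading factor at the sampled point (which is, of
course, the hard part):

```python
def unshade(sample_linear, shading):
    # Divide an estimated diffuse shading factor out of a picked pixel.
    # sample_linear: linear-light RGB of the sampled pixel (0..1).
    # shading: estimated illumination at that point, a scalar or a
    # per-channel triple (e.g. an N.L term times the light colour).
    # Purely illustrative: estimating the shading is the real problem.
    if isinstance(shading, (int, float)):
        shading = (shading,) * 3
    # clamp to 1.0 and guard against division by (near) zero
    return tuple(min(s / max(l, 1e-6), 1.0)
                 for s, l in zip(sample_linear, shading))
```

In the Rubik's-cube example the two squares have the same sampled value
but very different shading estimates, so this would recover two very
different material colours.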



Copyright 2003-2008 Persistence of Vision Raytracer Pty. Ltd.