The majority of that went straight over my head. I tried a few of Bagginsbill's methods in Poser, nice results, but boy did they take some setting up!
CHEERS!
Sorry. Basically I'm saying that I can make my products handle the setup for GC by default, so you don't have to do anything. But you can still make adjustments if you want.
Part of that is the fault of the map... I've found that ones that are RGB as opposed to greyscale are more likely to show up 'wispy'. And for a control map, does it really need to be 'color'?
Hmmm... that seems like a good tip, thank you. I need to test that.
I don't know if it's that a non-color map is more likely to be seen by Studio as a control map or what, but it's just something I've noticed. I kind of stumbled on it when working with an older hair. The bump map was not color (no, not just black and white, but actually saved as greyscale)... and so was the transmap. I was trying to enhance the detail by pasting in one of the light-colored maps, but it was always pasting in as a greyscale image, not color. I also noticed that when rendered, this particular hair didn't get all wispy when GC was on. So I started looking and playing. As for the few hairs that did go wispy: while their transmaps were black and white, they were actually 'color' images, and manually correcting them did help. But the true non-color ones loaded as 'control' maps...
What I'd like to know is exactly what data is being used when the maps are being fed to tdlmake. Because, that color information, bit depth and what not are all in the image file, so I'm guessing that the correction is more likely to be correct, if the expected map type is being fed to it...a true non-color map for a control map.
Also, map location seems to have some impact as to whether or not the 'guess' is correct. Ones in the strength slots tend, in my experience, to be more often correctly assigned. While ones in 'color' slots aren't. Normal maps seem to be one exception...it seems to me to be totally random as to how they are treated.
Yep, two things always seem weird to me: RGB greyscale images as bump or specular maps, which does not make any sense, and specular maps applied in the color slot instead of the strength slot. It is easy to see why some presets look so weird when you use GC and gamma 2.2: the presets were adjusted without GC, to compensate for these wrong behaviors. Turn on GC and you get correct calculations over wrong values. That is it!
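The "correct calculations over wrong values" point can be sketched numerically. This is an illustration only, not DS internals, and the preset values are hypothetical: switching GC on re-encodes output the preset author had already tuned for direct display, lifting every midtone.

```python
GAMMA = 2.2

# Hypothetical shading results the preset author tuned to look right
# when sent to the display as-is (i.e. without gamma correction)
preset_values = [0.2, 0.5, 0.8]

# Turn GC on without re-tuning the inputs: the renderer now gamma-encodes
# the output for display, brightening every already-compensated value
with_gc = [v ** (1 / GAMMA) for v in preset_values]

print([round(v, 2) for v in with_gc])  # [0.48, 0.73, 0.9]
```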
1) GC in Render Settings: Map settings get saved with Material(s) Presets. But if a user changes maps then I think they have to update the settings for the new map.
I just tested this. The gamma for opacity maps will stay at 1 even if you change the maps manually. Unless you change it to a map that already has a different value assigned to in this particular scene, of course.
// I loaded A3 and her Poser mat that made DS give the eyelash opacity map the "0" value, then set it to 1 manually. Then loaded a few other maps for opacity via the Surfaces tab, and DS assigned 1 to all of them, even though they were all RGB //
Wow, thanks! Yep, you called it (not full GI). My typical light setup in these renders is a key (either spot or distant light), a rim (same), and a low intensity AoA Ambient or UberEnvironment2 as a fill. This hopefully demonstrates one of the ways that GC makes things easier.
Maybe it would be good to show one with full GI though. Do you have a recommended setup?
Well, I guess I don't really have a preferred setup for GI because every scene will be different.
What makes using full GI feasible, though, is 3Delight's full raytracer module which is specifically optimised for intensive tasks like that (this is available in the "default" 3Delight render settings pane via the "progressive" switch), and then, to speed things up even more, a special RiOption that tells 3Delight to use its GI caching - the latter requires using the "scripted 3Delight" (which has its caveats).
If you scroll down this lengthy post of mine, there is my take on the Fiery Genesis scene attached (with some explanations), and the second link points to the post where the scene file can be downloaded and my mini-tutorials on "scripted rendering" accessed:
UPD: you can safely decrease diffuse bounces from 5 to 2. It won't make a visible difference for this scene, but will cut down on render time. I was just testing how high we could go.
There is a kit with better versions of the scripts and with new shaders in the works, but I have apparently contracted the "DAZ S**n" virus =)
-------------
What I'd like to know is exactly what data is being used when the maps are being fed to tdlmake.
Here are the tdlmake options that concern GC (source: 3Delight PDF manual):
-gamma n
Indicates the gamma of the input image. This allows tdlmake and 3Delight to convert the texture to a gamma of 1.0 before performing computations on it.
-rgbagamma n n n n
Indicates the gamma of each red, green, blue and alpha channel of the input image. This allows tdlmake and 3Delight to convert the texture to a gamma of 1.0 before performing computations on it.
-colorspace
Indicates the color space of the input image (linear, sRGB or BT.709). This allows tdlmake and 3Delight to convert the texture to a linear space before performing computations on it.
I assume that the first option is used: DS just tells tdlmake the general gamma.
Yeah, that's what I'm leaning towards, too. But...like all the rest it's hard to say for sure what Studio is passing...like my discovery of it embedding the source...encrypted, as the default, when passing it off to shaderdl.
Hmmm... I'm not sure I understand the contradiction, nor the severity of the problem that makes you say shader GC should be avoided entirely. Can you explain further? Could you demonstrate it with a picture? Note that all of my renders shared in this thread use shader GC.
Here is the simple shader that I used to build gamma correction into the shader: basically it is the same as yours. I rendered a bunch of triangles pointing at a common center (similar to those images used to test printers) with this and with the DefaultShader. The triangles are 50% grey against a black background. The result is below. On the left is the render with the Default Shader and Gamma 2.2, on the right is the "Gamma Encoding" material with a render exposure of 1. Especially in the center of the image it can be noticed that the shader gamma is considerably darker than the default render.

The reason for this is that 3Delight implements anti-aliasing by averaging pixel values (this is the "pixel samples" setting in the render settings). The average value is then gamma encoded. E.g. if there are 3 sampling points with a grey value of 0.5 and one with a value of 0, it results in ((0.5 * 3 + 1 * 0) / 4) ** (1/2.2) = 0.64; as an 8-bit pixel value this is 163. The gamma shader instead applies the exponentiation to the individual samples, and then 3Delight averages the result: (3 * (0.5 ** (1/2.2)) + 1 * 0) / 4 = 0.55; => 140 in 8-bit.
Because the gamma encoding function x -> x ** (1/2.2) is concave, the average of the encoded values will always be less than the encoding of the average, i.e. mixed pixels will come out too dark.
In short, this setup has violated the rule of linear workflow:
All calculations need to happen in linear gamma.
Antialiasing is a calculation, not viewing; therefore this gamma encoding shader must not be used with anti-aliasing. In practice this means: when using this shader, always render with pixel samples = 1. A lot of people did (and still do) it that way, but this usually means they have some kind of post-work after the render, so they do not need the gamma-corrected output anyway.
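The averaging arithmetic above can be reproduced with a quick sketch (illustrative only; 3Delight's actual sampler and pixel filter are more involved):

```python
GAMMA = 2.2
samples = [0.5, 0.5, 0.5, 0.0]   # three mid-grey hits, one background hit

def encode(x):
    return x ** (1 / GAMMA)

# correct linear workflow: average the samples first, then gamma-encode
correct = encode(sum(samples) / len(samples))           # ~0.640

# gamma-in-the-shader: each sample is encoded before the renderer averages
wrong = sum(encode(s) for s in samples) / len(samples)  # ~0.547

print(round(correct * 255), round(wrong * 255))  # 163 140
```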
Here is another example: I placed a single plane on the left with a color of 50% grey as a reference. To the right of it I put a plane with a color of pure white but with 50% opacity. The background is black, so the resulting color of both planes is 50% grey, because for the plane on the right it's 50% white + 50% black background => 50% grey. I put a copy of that 50% opaque plane below the other one, to see what happens where the planes overlap (think of opacity used for something like leaves, transmapped hair, etc.). I can predict that the overlapping area will be 75% grey, because it is 50% white, and for the remaining 50% there is the 50% white of the other plane. The first render (with the Default Shader and an exposure of 1.0) shows exactly these values. In the center is the same image rendered with Exposure = 2.2. As expected, the overlapping area is 0.75 ** (1/2.2) = pixel value 224.
Rendering the same with the gamma shader, the result is interesting: the overlapping area is pure white! The reason for this lies in the opacity of the gamma shader. Since only the color channel is gamma encoded, the color channel is no longer premultiplied by the opacity, which is a requirement of RenderMan renderers in general. The renderer adds color values until it has a total opacity of 100%. In a linear model it is not possible for a 50% transparent surface to be brighter than 50% grey against a black background. Again, here the color values are used in a calculation (alpha blending), not for viewing, so they should be linear.
The stupid thing is: this unpremultiplied color is exactly what is required for gamma-corrected output (only the color is gamma encoded, not the alpha channel). So there is no easy solution for this problem.
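A toy version of the compositing step shows the same failure. This is a sketch of the "over" operator under the assumption stated above, that the renderer expects premultiplied colour; it is not 3Delight's actual compositor:

```python
GAMMA = 2.2

def over(front_color, front_alpha, back_color):
    # RenderMan-style "over": front colour must already be premultiplied
    return front_color + (1.0 - front_alpha) * back_color

# Correct: white plane at 50% opacity, premultiplied colour = 1.0 * 0.5
one_plane = over(0.5, 0.5, 0.0)       # 0.5 against black
overlap = over(0.5, 0.5, one_plane)   # 0.75: the expected 75% grey

# Broken: the shader gamma-encodes the (premultiplied) colour channel,
# so 0.5 arrives as ~0.73 and is no longer color * opacity
enc = 0.5 ** (1 / GAMMA)                               # ~0.7297
bad_plane = over(enc, 0.5, 0.0)                        # already ~0.73
bad_overlap = min(1.0, over(enc, 0.5, bad_plane))      # ~1.09, clamps to white
```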
In general, there will be problems with the gamma shader whenever pixel values are added together, because only linear values should be added, and the gamma shader does not provide them. Pixel values are added in the surface shader (you probably have that covered, if you have read bagginsbill's posts), i.e. rip the DefaultMaterial node apart into a big sum of components, gamma decode each component, and then gamma encode the result. All other places where pixel values are added will cause problems, because they are not in the surface shader and rely on linear color values: antialiasing, pixel filters, alpha blending. As soon as indirect lighting is to be used, it is probably based on light shaders that sample the scene and take averages of surrounding surfaces, etc.
The previous examples basically show that it is difficult to build the render exposure into the shader; they say nothing against gamma correction of the inputs. With the inputs, the problem (at least my problem) is more the usability of these shaders.
Take for example the Aikobot-2 shader that you mentioned. It has some kind of GC built in, e.g. the reflection map is gamma-decoded. One day I decide to use an HDR image for the reflection map (not an uncommon use of HDRI). HDRIs are usually linear images, so no gamma is to be applied. The guessing algorithm of DS4 already knows this, so it works for the most part. The Aikobot2 shader does not, and so it decodes the image anyway. I also cannot simply set the GC parameter to 1.0, because the shader tightly couples the input gamma with the exposure gamma. What I have to do is use the Image Editor to assign a gamma of 0.4545 (= 1/2.2) to the image, so that DS encodes the image, so that the shader can decode it again (having to do these kinds of workarounds is the classic quirk of a non-linear workflow).
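The 0.4545 workaround can be sanity-checked with a couple of lines. This sketch assumes DS "linearises" a map by raising it to the assigned gamma as a simple power law; the value is a hypothetical HDR texel:

```python
GAMMA = 2.2
hdr_value = 0.5                       # linear HDR texel, no gamma burned in

# Assigning a gamma of 1/2.2 in the Image Editor makes DS raise the map
# to that power, which actually gamma-ENCODES the linear data...
after_ds = hdr_value ** (1 / GAMMA)   # ~0.73

# ...so the shader's hard-wired decode brings it back to linear
after_shader = after_ds ** GAMMA      # 0.5 again: the round trip cancels

print(round(after_shader, 6))
```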
The Aikobot2 shader is a rather primitive shader, where this is easily possible, but try that with one of bagginsbill's more complex shaders and it can be very difficult. There is even a version that tries to guess whether the user is using GC or not and acts differently. The problem was (or is) that it could only determine what the renderer was doing, not what the user wanted to do, leading to endless confusion (imagine: I rendered an image which finally looked OK, so I wanted to get a linear render of it for postwork, so I changed the render settings gamma to 1, rendered, and... exactly nothing changed :-). Using it has the feeling of playing with a Rubik's Clock, where changing one parameter changes the value of every parameter. Not bagginsbill's fault, though: he clearly wrote that he never did any postwork on his images, so there was no reason for him to ever render something without gamma correction. But I think many users do something with their images apart from posting them directly on the internet, photoshopping for example.
I'm curious, is gamma correction taken care of in Reality 4? Is it only something that needs changing in 3Delight?
Also, regarding the reduction of SSS to physically correct levels, how would I go about doing that? Reality has options for surface colour, interior colour, absorption scale, scattering scale and surface thickness.
I am not sure if Reality pre-processes image maps for gamma correction, but by default it does not set the resultant Gamma to 2.2, you have to do that yourself.
Luxus DOES process the image maps for gamma.
I'm not really sure how to address your second question. AoA's subsurface shader uses some scientifically observed characteristics, if you set the material in the shader. UberSurface and uberSurface 2... well that's a whole 'nother ball of wax.
Right.
And to the second question, I probably should be more careful about how I use the term "physically correct". What I should have said is "to get closer to physically correct levels" or "close enough". Close enough may in fact be different for you than it is for me. But if you were working with AoA's Subsurface shader and Gamma Correction, you could try setting SSS Strength to 25% rather than the default value of DAZ presets (typically 75%) and see if you like the results. Sorry, I know your questions were focusing on Reality, but maybe someone who knows Lux/Reality well can chime in.
Anyway, what I'm doing is looking at photographs and real life and trying to get my renders closer to my reference. Fortunately, the science supports my supposition that less SSS is needed if you use GC, since both are essentially adding light. The goal is to make the surface look as translucent as skin, but not as translucent as a candle or a wax museum figure. And basically, if you use SSS without GC, then the parts of the skin that don't have a strong SSS influence, due to said "characteristics", will be subject to the problem of possibly being too dark. I.e., in the absence of GC the baseline is just ordinary diffuse. I hope that makes sense. :)
Why are 99% of control maps saved in RGB when they should be grayscale?
Probably people are looking for better resolution (i.e. less banding in gradient fills). Although as far as I understand, RGB won't help with that - it's still the same 8 bits per channel, the same 256 levels as grayscale. With RGB, we just get three channels set to the same value, all using the same 8-bit depth.
Now, something like 16-bit TIFF will retain more resolution. MCasual wrote about it in one of his threads regarding displacement and posted examples: the 8-bit gave very jagged, stepped displacements, while 16-bit map created a smooth neat slope. But not many people create 16-bit control maps in the DS community, I believe.
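The point that channel depth, not channel count, governs banding can be illustrated with a quick sketch (assumes simple rounding quantisation; duplicating the same channel into R, G and B would not add a single extra level):

```python
# Quantise a smooth 0..1 ramp the way an 8-bit and a 16-bit map store it
N = 4096
ramp = [i / (N - 1) for i in range(N)]

levels_8 = len({round(v * 255) for v in ramp})     # capped at 256 steps
levels_16 = len({round(v * 65535) for v in ramp})  # every sample stays distinct

print(levels_8, levels_16)  # 256 4096
```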
...What really makes me wonder is how a normal map should be treated in the context of gamma correction.
Lightening/darkening the map changes the amount of apparent surface movement...so don't change/correct them.
I imagine we could compile an FAQ of sorts as these helpful things get shared, and I could add it to one of my front page posts. What do you guys think?
Probably people are looking for better resolution (i.e. less banding in gradient fills). Although as far as I understand, RGB won't help with that - it's still the same 8 bits per channel, the same 256 levels as grayscale. With RGB, we just get three channels set to the same value, all using the same 8-bit depth.
The problem isn't RGB, it's using .JPG for the distribution format which limits the color channels to 8 bit. TIFF has support for 16 bit color channels, so this is preferable. For normal maps, DDS is probably the best option since the usage is pretty widespread.
Gamma correction shouldn't be done via the shader. True, it takes some care setting up all the maps so they use the correct gamma, but that should be common practice anyway. DAZ needs to standardize the control map formats (for instance, have all control maps be grayscale rather than RGB, but with much higher bit depth) and set a different gamma correction profile for each type.
I agree with Parris, having the correct inputs makes things a lot more 'correct' and it will help not only those using iray, but also 3delight as well. We also need a more robust shader than what's available now.
Jpeg is a pretty crappy format for any serious texture work. For 'final' versions I pack, I usually use png (somewhat better... too often tif will freak people out or is too big... yeah, yeah, you can compress them, but why screw up a good thing?). Jpeg is especially bad considering that most image programs default to something less than 'top quality'... so in addition to the inherent bit limits and such, you get compression that reduces things even further.
Daz...standardize? Okay...yeah...umm...what's the temp in the Underworld this week? I've been waiting/wanting one very simple (hell it can be a couple of lines of code in the DIM) 'standardization' forever...case for all the file names. It doesn't have to be caps, lower or any particular one...but it does have to be JUST ONE! That's why I can't use the DIM...on a case-sensitive file system you end up with duplicated content folders after almost EVERY install. So going in and cleaning up/merging everything takes as much time, even with bulk renamers, as installing manually to begin with.
But yes, mandating those kinds of changes to control maps would be very helpful... and solve a bunch of problems with almost all the currently usable renderers in Studio (I don't think the OpenGL one cares/matters). I'm sure there are even ways of batch-'correcting' the files... and maybe even 'normalizing' them. I'm pretty sure that ImageMagick could do any conversions needed on a batch basis.
The basic/default Studio surface shader for 3DL is essentially the same as it was back in DS 3 and before. It does not leverage any of the more recent improvements/additions to 3DL. It's pretty much 'oldschool'... and therein lies half the problem. By current standards it's inefficient at best and just plain 'wrong' at worst. One of the biggest areas where this is noticeable is SSS (this includes the US/US 2 shaders, too). They are all using old algorithms/methods which are much slower and more prone to 'overdoing' it than the newer ones. But even still, outside of this thread and the Laboratory thread, there's very little being done to push it to the max it is capable of. Of course, getting the most out of it starts with correct inputs... so it's all sort of circular. (GIGO applies here.)
And yes, I'm too lazy to do all the corrections, all the time...but I'm trying to. I am trying to make sure the stuff I'm currently making is 'correct'...and what renders I do as 'final' renders are all correct. There's just too much not 'correct' stuff to get it all done...although having Studio itself setup properly does help.
Jpeg is a pretty crappy format for any serious texture work. For 'final' versions I pack, I usually use png (somewhat better... too often tif will freak people out or is too big... yeah, yeah, you can compress them, but why screw up a good thing?).
Well, you're assuming users want to do serious work with DAZ Studio and 3delight. :)
More seriously though, if realism is the goal, as per the title thread, the basics like linear workflow, gamma correction, good textures and control maps are a must. That's why I really appreciate when vendors put extra effort in supplying good, workable textures. I've seen sets that look good even with just the default DAZ materials (and some good lighting setup).
The second hurdle is, of course, materials. I still see illogical things like setting pure white for diffuse. Probably because it was set up without gamma correction, either in the texture or via the renderer. Again, some have set up their materials more sensibly, but that's sadly not the norm.
Third, materials and lighting are tied together. If you set up your materials incorrectly, or perhaps more accurately, illogically, you end up setting your lights the wrong way. Unfortunately, most just look at renders with gamma correction enabled, say it looks bad with their current setup, and go back to not using a linear workflow.
The fourth one, as you noted, is the shaders used. Even advanced shaders like US2 don't keep up with advancements in 3delight. Technically you can make your own shaders with Shader Mixer, but the bricks are generally outdated. The last resort is Shader Builder, but that has its own set of problems.
But even with those limits, you can do some pretty cool things. It might not be accurate or even correct, but they can be close. I've cobbled a shader network in Shader Mixer quite quickly using clay for diffuse, glossy with fresnel based IOR for specular and the results are very close to what I've gotten with US2. I'll probably do more with it when I have the time.
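For anyone curious about the Fresnel-based IOR specular mentioned above, the usual cheap approximation is Schlick's. This is a generic sketch of that formula, not the actual Shader Mixer network described in the post:

```python
def schlick_fresnel(ior, cos_theta):
    """Schlick's approximation of Fresnel reflectance for a dielectric."""
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2   # reflectance at normal incidence
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Glass (IOR 1.5): ~4% reflectance head-on, rising to ~100% at grazing angles
print(round(schlick_fresnel(1.5, 1.0), 2))  # 0.04
print(round(schlick_fresnel(1.5, 0.0), 2))  # 1.0
```

This is why a single fixed "reflection strength" value looks wrong: real reflectance varies strongly with viewing angle.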
Last, but not least, is to document and simplify things so the entire workflow is accessible/understandable for most. I'm willing to bet those who post here know things like IOR values for glass or water. But most will not, and having too many controls to work with is daunting. So, we need good tutorials and FAQs for every step/part of the workflow.
Call me hopeful, but if there's enough time and effort put into this, then I think users will gladly follow.
I imagine we could compile an FAQ of sorts as these helpful things get shared, and I could add it to one of my front page posts. What do you guys think?
...What really makes me wonder is how a normal map should be treated in the context of gamma correction.
Lightening/darkening the map changes the amount of apparent surface movement...so don't change/correct them.
So they are control maps, and hence linear?
--------
The problem isn't RGB, it's using .JPG for the distribution format which limits the color channels to 8 bit. TIFF has support for 16 bit color channels, so this is preferable. For normal maps, DDS is probably the best option since the usage is pretty widespread.
That's what I essentially meant by "RGB", 8 bit per channel. "Low dynamic range".
I believe we could also use HDR/EXR these days in DS.
So they are control maps, and hence linear?
They may be RGB, but yes, they should be linear. Most of the programs that you can actually bake them in output linear maps already. Those made by converting other maps, in PS, Gimp, etc., should also be linear. But in any case, the overall color, as it appears in the image, is sort of the 'base' for the map... like the mid-grey or black on displacement maps. Changing the gamma on that is going to shift the baseline, just like over-correcting any other map will... so basically, assume they are linear unless you know otherwise for sure.
Remember they are essentially a 3d color bump map.
...What really makes me wonder is how a normal map should be treated in the context of gamma correction.
Normal maps encode in RGB values the xyz components of the vector normal to the surface. The values have no meaning at all in terms of colors (since they are not colors!) and therefore normal maps are to be used with a gamma of 1.
The rule is simple: is it a "color"? If so, check the color space in which the image is saved and verify whether there is a burned-in gamma value; if not, use a gamma of 1. E.g., a JPG diffuse saved as sRGB uses, by sRGB specification, a gamma of 2.2; a PNG file could have all kinds of "weird" gamma, as specified in its gAMA chunk. An HDR file, saved with a gamma of 1 and used not only as IBL data but also as a background, would require a gamma of 1. Bump or normal maps are non-color maps, and therefore you can assume a gamma of 1 (unless the person who created them was smoking "strong stuff", in which case the recycle bin is a vital tool).
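A sketch of why gamma correction would corrupt a normal map, assuming the common packing where each 0..1 channel maps to a -1..1 vector component (the 2*x-1 convention; illustrative, not DS's actual decode path):

```python
GAMMA = 2.2

def decode_normal(r, g, b):
    # standard packing: channel 0..1 -> vector component -1..1, no gamma
    return (2 * r - 1, 2 * g - 1, 2 * b - 1)

flat = decode_normal(0.5, 0.5, 1.0)   # (0.0, 0.0, 1.0): undisturbed surface

# Wrongly "linearising" the map as if it were sRGB colour bends that normal:
r = 0.5 ** GAMMA                      # ~0.218 after a 2.2 decode
bent = decode_normal(r, r, 1.0)       # x and y pushed to ~-0.56: a tilted surface
```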
Linear workflow is not about doing something "strange"; it is about stopping using old hacks and doing the things in the right way (i.e. 2+2=4).
I was simply unsure if the normal maps will get gamma-encoded for "easy previews" or something. Kinda shows I don't use them much, what with 3Delight's efficient displacement.
This shows the uncorrected 3Delight version of a render with the Hachiro skin and Advanced Ambient lights, the corrected version with render settings only, and a version where I went to templates 1, 2 and 3 and applied gamma 2.2 to the textures as well.
The uncorrected one looks better to me. What did I do wrong?
Along with what Mjc said, the lighting as a rule will also need to be redone when it comes to intensities and falloffs.
The mantra is: no falloff on distant lights (at infinite distance), quadratic falloff on spotlights, pointlights and area lights (those at a finite distance). The physically based settings.
(and if by any chance there is "velvet" enabled on skin, it will also need to be dialed down)
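The quadratic falloff in the mantra above is just the inverse-square law; a minimal sketch (intensity units are arbitrary):

```python
def irradiance(intensity, distance):
    # physically based falloff for lights at a finite distance
    return intensity / distance ** 2

# Doubling the distance quarters the received light
print(irradiance(100.0, 1.0), irradiance(100.0, 2.0))  # 100.0 25.0
```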
UPD: reflection maps should also be checked, when it comes to reflection intensity. Less so with raytraced reflections, but still a point as well. Your character appears to use a lot of reflection for the eye surfaces.
...what you are doing when gamma-correcting the output (which, again, is only part of the parcel) is similar to using those "curves" in Photoshop. Take a photo you like and apply an inverse 2.2 (approx. 0.45) gamma curve to it in an image editor. See how different it looks.
Now, if you were to set up physical lights for a new photo session so as to make the scene in such a "corrected" photo look the way it did in the original "normal" photo - imagine how different it would look to your naked eye.
The eye that essentially does the "2.2 correction" to all the linear luminosities of the real world.
John Hable has more on that, with examples: http://filmicgames.com/archives/299

Kettu, as much as we have gone rounds about how I should be using GC, I agree with that 100%. Just look at FW Eve's face, the glowing edges, the overdone peach-fuzz, etc.
The velvet, glossiness, reflection, etc. They have ALL got to be adjusted properly for an image to render correctly, and not look like this test with "GC on".
The lights that are set up, YES, they must also be adjusted, as the falloff is also affected heavily by GC (Off = Gamma 1.0 / On = Gamma 2.2), as I also found out just trying to get the one test render done.
The original GC off renders were for a different reason, and thus altering the figures default mats in any way, was a No Go.
[Daz Studio 4.7] [3delight] [No Post Processing]
I was simply unsure if the normal maps will get gamma-encoded for "easy previews" or something. Kinda shows I don't use them much, what with 3Delight's efficient displacement.
You and me both, lol. I had seen the maps around, and thought the image headers were corrupted, lol. I even thought for a long time, that Normal was just some form of Hue/saturation or something adjustment for the surface shader, lol.
That would be wonderful. Would really, really appreciate that.
This feels like a prayer being answered.
I decided to start using DS from now on after seeing such realistic renders.
I am right now rendering my very first image in DS using Iray.
Will post it for your constructive criticism once its done.
Today is the start of the DS journey.
Ok, sounds good. Although, it might be best to post Iray Renders elsewhere (and let me know where you post), since this thread focuses on 3Delight and Gamma Correction. I haven't yet delved into Iray so I won't be able to offer Iray specific help, just yet. I think we have yet to establish (in this thread at least) how to properly implement GC in Iray.
The few tests that I have managed to get done (CPU-bound before last week) appear to demonstrate that the default settings of "Tone Mapping ON" and "Gamma 2.2" work best with the lights and distances. As for surfaces, well, skin is still in the works for many other reasons (nothing to do with gamma).
The most painful bit with Iray that I have found is that turning "Tone Mapping" off removes the Gamma setting entirely from the loop. The one render I tried it with looked horrible till I opened it up in an image viewer (IrfanView) and applied a gamma of 2.2 to the image.
Oh, and the default in Iray is Gamma 2.2, by the way.
I know it can be a pain to learn all them ISO, f-stop, etc. settings, and I know it is way more than what 3Delight has in the render tab for settings. Though for the time being, "Tone Mapping" must be left ON (the default, by the way) in order to have access to the Gamma controls.
Linear workflow is not about doing something "strange"; it is about stopping using old hacks and doing the things in the right way (i.e. 2+2=4).
Yes, but it seems that there are many out there that think it's something akin to making a Philosopher's Stone and turning lead into gold... For some of us, it is kind of a 'black art' to run with the Gamma settings set to something other than the default, especially for individuals new to DAZ like me.
The entire thing regarding linear vs. logarithmic (or whatever it's called) workflow can be incredibly confusing. I have lost track of how many times my color chart cube was called into question, simply because of the various file gamma settings, and getting the 0.45/1.0/2.2 curves vs. mid-grey (127.5 in 8-bit) confused.
I do not have anything against using Gamma control as far as the workflow of others. My issue tends to be more that I want to have fun and be able to see what I'm doing, without shooting blind or breaking the bank on a four-socket µ-supercomputer just to run IPR.
For some stupid reason, turning GC on and setting Gamma to 2.2 in DAZ Studio 4.6, 4.7, and 4.8 tends to make the viewport so dark that I can't use it for many aspects of setting up a scene. It is painful. The solution would be to run IPR, if it were not so slow on this computer, and if it didn't make this computer so unresponsive.
I (and many others, probably) would love to have a four-socket (32-core total) computer with 128GB of RAM. I can't afford the electric bill, lol.
Renderers are created by 'geeks', not artists. So, by default they are going to be techie/scientific and are going to operate in 'the real world'...the place where math rules and the calculations are logical (scientific sense of the word), linear and predictable. Computers are great at crunching numbers, but to get the expected results, they must have the correct input. Linear workflow IS that correct input. Everything else is pretty much like putting in random data and expecting a sensible result...
Sorry. Basically I'm saying that I can make my products handle the setup for GC by default, so you don't have to do anything. But you can still make adjustments if you want.
I see, sounds good.
CHEERS!
I don't know if it's that a non-color map is more likely to be seen by Studio as a control map or what, but it's just something I've noticed. I kind of stumbled on it when working with an older hair. The bump map was not color (no, not just black and white, but actually set to greyscale)...so was the transmap. I was trying to enhance the detail by pasting in one of the light-colored maps, but it was always pasting in as a greyscale image, not color. But I also noticed that when rendered, this particular hair didn't get all wispy when GC was on. So I started looking/playing. With the few that were wispy, while the transmaps were black and white, they were actually 'color' images. And manually correcting them did help. But the true non-color ones loaded as 'control' maps...
What I'd like to know is exactly what data is being used when the maps are being fed to tdlmake. That color information, bit depth and whatnot are all in the image file, so I'm guessing that the correction is more likely to be correct if the expected map type is being fed to it...a true non-color map for a control map.
Also, map location seems to have some impact on whether or not the 'guess' is correct. Ones in the strength slots tend, in my experience, to be correctly assigned more often, while ones in 'color' slots aren't. Normal maps seem to be one exception...it seems to me to be totally random how they are treated.
Yep, two things have always seemed weird to me: RGB greyscale images as bump or specular maps, which does not make any sense, and specular maps applied in the color slot instead of strength. It is easy to see why some presets look so weird when you use GC and gamma 2.2, as the presets were adjusted without GC and to compensate for these wrong behaviors. Turn on GC and you get correct calculations over wrong values. That is it!
Well, I guess I don't really have a preferred setup for GI because every scene will be different.
What makes using full GI feasible, though, is 3Delight's full raytracer module which is specifically optimised for intensive tasks like that (this is available in the "default" 3Delight render settings pane via the "progressive" switch), and then, to speed things up even more, a special RiOption that tells 3Delight to use its GI caching - the latter requires using the "scripted 3Delight" (which has its caveats).
If you scroll down this lengthy post of mine, there is my take on the Fiery Genesis scene attached (with some explanations), and the second link points to the post where the scene file can be downloaded and my mini-tutorials on "scripted rendering" accessed:
http://www.daz3d.com/forums/discussion/21611/P570/#650436
http://www.daz3d.com/forums/discussion/21611/P570/#650631
UPD: you can safely decrease diffuse bounces from 5 to 2. It won't make a visible difference for this scene, but will cut down on render time. I was just testing how high we could go.
There is a kit with better versions of the scripts and with new shaders in the works, but I have apparently contracted the "DAZ S**n" virus =)
-------------
Here are the tdlmake options that concern GC (source: 3Delight PDF manual):
-gamma n
Indicates the gamma of the input image. This allows tdlmake and 3Delight to convert the texture to a gamma of 1.0 before performing computations on it.
-rgbagamma n n n n
Indicates the gamma of each red, green, blue and alpha channel of the input image. This allows tdlmake and 3Delight to convert the texture to a gamma of 1.0 before performing computations on it.
-colorspace
Indicates the color space of the input image (linear, sRGB or BT.709). This allows tdlmake and 3Delight to convert the texture to a linear space before performing computations on it.
I assume that the first option is used: DS just tells tdlmake the general gamma.
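As a sketch of what that conversion means per channel (my assumption about the math implied by "-gamma n", not 3Delight's actual tile-based implementation):

```python
# Hypothetical sketch of the linearization that "-gamma n" implies:
# divide the burned-in gamma out before computation, and re-encode
# only when the result is sent to a display.

def linearize(encoded, gamma=2.2):
    """Gamma-encoded 0..1 channel value -> linear (gamma 1.0)."""
    return encoded ** gamma

def encode_for_display(linear, gamma=2.2):
    """Linear 0..1 channel value -> gamma-encoded for a 2.2 display."""
    return linear ** (1.0 / gamma)

# The round trip is lossless in float; the renderer's math happens on
# the linear value in between:
linear = linearize(0.5)            # ~0.218
back = encode_for_display(linear)  # ~0.5
```

The point of handing tdlmake the correct input gamma is that `linearize` must be skipped for maps that are already linear (control maps, HDRIs), or they get darkened twice.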
Yeah, that's what I'm leaning towards, too. But...like all the rest it's hard to say for sure what Studio is passing...like my discovery of it embedding the source...encrypted, as the default, when passing it off to shaderdl.
Antialiasing is a calculation, not viewing; therefore this gamma-encoding shader must not be used with anti-aliasing. In practice this means: when using this shader, always render with pixel-samples = 1. A lot of people did (and still do) it that way, but this usually means they have some kind of post-work after the render, so they do not need the gamma-corrected output anyway.
Here is another example: I placed a single plane on the left with a color of 50% grey as a reference. Right of it I put a plane with a color of pure white but with 50% opacity. The background is black, so the resulting color of both planes is 50% grey, because for the plane on the right it's 50% white + 50% black background => 50% grey. I put a copy of that 50% opaque plane below the other one, to see what happens where the planes overlap (think of the opacity for something like leaves, transmapped hair, etc.). I can predict that the overlapping area will be 75% grey, because it is 50% white, and for the remaining 50% there is 50% white of the other plane. The first render (with the Default Shader and an exposure of 1.0) shows exactly these values. In the center is the same image rendered with Exposure = 2.2. As expected, the overlapping area is 0.75 ** (1/2.2) = pixel value 224.
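The arithmetic in that example can be reproduced directly; this is a sketch assuming straight linear "over" blending of the planes, which is what the prediction above uses:

```python
# Reproducing the plane-overlap numbers: linear "over" compositing of a
# pure-white plane at 50% opacity against a black background.

def over(src, src_alpha, dst):
    """Composite one linear channel value over a background value."""
    return src * src_alpha + dst * (1.0 - src_alpha)

one_plane = over(1.0, 0.5, 0.0)      # white over black -> 0.5 (50% grey)
overlap = over(1.0, 0.5, one_plane)  # second 50% plane -> 0.75 (75% grey)

# Encoding the 75% grey for a gamma-2.2 display gives pixel value 224:
pixel = round((overlap ** (1 / 2.2)) * 255)  # -> 224
```

Note the blending itself happens on the linear values (0.5, 0.75); the 2.2 encoding is applied only once, to the final result.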
Rendering the same with the Gamma Shader, the result is interesting: the overlapping area is pure white! The reason for this is in the opacity of the Gamma Shader. Since only the color channel is gamma-encoded, the color channel is no longer premultiplied by the opacity, which is a requirement of RenderMan renderers in general. The renderer adds color values until it has a total opacity of 100%. In a linear model it is not possible that a 50% transparent surface can be brighter than 50% grey against a black background. Again, here the color values are used in a calculation (alpha blending), and not for viewing, so they should be linear.
The stupid thing is: this unpremultiplied color is exactly what is required for gamma-corrected output (only the color is gamma-encoded, not the alpha channel). So there is no easy solution for this problem.
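A numeric sketch of that failure mode (my own illustration, not the shader's actual code): once only the color is gamma-encoded, it no longer equals surface color times alpha, and stacking layers overshoots white:

```python
# Premultiplied alpha expects color == surface_color * alpha.
# Gamma-encoding only the color channel breaks that invariant,
# so the renderer's layer sums can exceed 1.0 and clamp to white.

alpha = 0.5
premultiplied = 1.0 * alpha            # 50% white, correctly 0.5

encoded = premultiplied ** (1 / 2.2)   # ~0.73 -- no longer equals 0.5

# Two such layers stacked until opacity reaches 100%: the second layer
# contributes through the remaining (1 - alpha) of coverage.
summed = encoded + (1.0 - alpha) * encoded  # ~1.09 -> clamps to pure white
```

With correctly premultiplied linear values the same sum is 0.5 + 0.5 * 0.5 = 0.75, the expected 75% grey.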
Why are 99% of control maps saved in RGB when they should be grayscale?
They are wonky enough as it is, but I'm sure this doesn't help. Ever try to change them and see even more wonky behavior?
In general, there will be problems with the Gamma Shader whenever pixel values are added together, because only linear values should be added, and the Gamma Shader does not provide them. Pixel values are added in the surface shader (you probably have that covered, if you have read bagginsbill's posts), i.e. rip the DefaultMaterial node apart into a big sum of components, gamma-decode each component, and then gamma-encode the result. All other places where pixel values are added will cause problems, because they are not in the surface shader and rely on linear color values: antialiasing, pixel filters, alpha blending. As soon as indirect lighting is to be used, it is probably based on light shaders that sample the scene and take averages of surrounding surfaces, etc.
The previous examples basically show that it is difficult to build the render exposure into the shader (this is separate from the gamma correction of inputs). With the inputs, the problem (at least my problem) is more the usability of these shaders.
Take for example the Aikobot-2 shader that you mentioned. It has some kind of GC built in; e.g. the reflection map is gamma-decoded. One day I decide to use an HDR image for the reflection map (not an uncommon use of HDRI). HDRIs are usually linear images, so no gamma is to be applied. The guessing algorithm of DS4 already knows this, and so it works for the most part. The Aikobot2 shader does not, and so it decodes the image anyway. I can also not simply set the GC parameter to 1.0, because the shader tightly couples the input gamma with the exposure gamma. What I have to do is use an image editor to assign a gamma of 0.4545 (= 1/2.2) to the image, so that DS encodes the image, so that the shader can decode it again (having to do these kinds of workarounds is the classic quirk of a non-linear workflow).
The Aikobot2 shader is a rather primitive shader, where this is easily possible, but try that with one of bagginsbill's more complex shaders and it can be very difficult. There is even a version that tries to guess whether the user is using GC or not, and acts differently. The problem was (or is), it could only determine what the renderer was doing, not what the user wanted to do, leading to endless confusion (imagine: I rendered an image which finally looked OK, so I wanted to get a linear render of it for postwork, so I changed the render settings gamma to 1, rendered, and...exactly nothing changed :-). Using it has the feeling of playing with a Rubik's Clock, where changing one parameter changes the value of every parameter. Not bagginsbill's fault, though: he clearly wrote that he never did any postwork on his images, so there was no reason for him to ever render something without gamma correction. But I think many users do something with their images apart from posting them directly on the internet - Photoshopping, for example.
I'm curious, is gamma correction taken care of in Reality 4? Is it only something that needs changing in 3Delight?
Also, regarding the reduction of SSS to physically correct levels, how would I go about doing that? Reality has options for surface colour, interior colour, absorption scale, scattering scale and surface thickness.
I am not sure if Reality pre-processes image maps for gamma correction, but by default it does not set the resultant Gamma to 2.2, you have to do that yourself.
Luxus DOES process the image maps for gamma.
I'm not really sure how to address your second question. AoA's subsurface shader uses some scientifically observed characteristics, if you set the material in the shader. UberSurface and uberSurface 2... well that's a whole 'nother ball of wax.
Right.
And to the second question, I probably should be more careful about how I use the term "physically correct". What I should have said is "to get closer to physically correct levels" or "close enough". Close enough may in fact be different for you than it is for me. But if you were working with AoA's Subsurface shader and Gamma Correction, you could try setting SSS Strength to 25% rather than the default value of DAZ presets (typically 75%) and see if you like the results. Sorry, I know your questions were focusing on Reality, but maybe someone who knows Lux/Reality well can chime in.
Anyway, what I'm doing is looking at photographs and real life and trying to get my renders closer to my reference. Fortunately, the science supports my supposition that less SSS is needed if you use GC, since both are essentially adding light. The goal is to make the surface look as translucent as skin, but not as translucent as a candle or a wax-museum figure. And basically, if you use SSS without GC, then the parts of the skin that don't have a strong SSS influence, due to said "characteristics", will be subject to the problem of possibly being too dark. I.e., in the absence of GC the baseline is just ordinary diffuse. I hope that makes sense. :)
Probably people are looking for better resolution (i.e. less banding in gradient fills). Although, as far as I understand, RGB won't help with that - it's still the same 8 bits per channel, same as 256-level grayscale. With RGB, we just get three channels set to the same value, all using the same 8-bit depth.
Now, something like 16-bit TIFF will retain more resolution. MCasual wrote about it in one of his threads regarding displacement and posted examples: the 8-bit gave very jagged, stepped displacements, while 16-bit map created a smooth neat slope. But not many people create 16-bit control maps in the DS community, I believe.
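The banding argument can be made concrete with a rough illustration (the 10 mm displacement range here is an arbitrary example value of mine, not from MCasual's thread):

```python
# Smallest height step an 8-bit vs. 16-bit grayscale map can encode
# across an assumed 10 mm displacement range. Fewer levels means each
# step is bigger, which shows up as visible stair-stepping.

levels_8bit = 2 ** 8 - 1     # 255 usable steps between black and white
levels_16bit = 2 ** 16 - 1   # 65535 usable steps

step_mm_8bit = 10.0 / levels_8bit    # ~0.039 mm per level -> jagged
step_mm_16bit = 10.0 / levels_16bit  # ~0.00015 mm per level -> smooth
```

So a 16-bit map has 256 times finer steps over the same range, which matches the smooth-vs-stepped slopes MCasual demonstrated.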
...What really makes me wonder is how a normal map should be treated in the context of gamma correction.
Lightening/darkening the map changes the amount of apparent surface movement...so don't change/correct them.
I imagine we could compile an FAQ of sorts as these helpful things get shared, and I could add it to one of my front page posts. What do you guys think?
The problem isn't RGB, it's using .JPG for the distribution format which limits the color channels to 8 bit. TIFF has support for 16 bit color channels, so this is preferable. For normal maps, DDS is probably the best option since the usage is pretty widespread.
Gamma correction shouldn't be done via the shader. True, it takes some care setting up all the maps so they will use the correct gamma, but those should be the common practice anyway. DAZ needs to standardize the control maps format (for instance have all control maps be grayscale rather than RGB, but with much larger resolution) and set a different gamma correction profile for each type.
I agree with Parris, having the correct inputs makes things a lot more 'correct' and it will help not only those using iray, but also 3delight as well. We also need a more robust shader than what's available now.
JPEG is a pretty crappy format for any serious texture work. For 'final' versions I pack, I usually use PNG (somewhat better...too often TIF will freak people out, or the files are too big...yeah, yeah, you can compress them, but why screw up a good thing?). It's especially bad considering that most image programs default to something less than 'top quality'...so in addition to the inherent bit limits and such, you get compression that reduces things even further.
Daz...standardize? Okay...yeah...umm...what's the temp in the Underworld this week? I've been waiting/wanting one very simple (hell it can be a couple of lines of code in the DIM) 'standardization' forever...case for all the file names. It doesn't have to be caps, lower or any particular one...but it does have to be JUST ONE! That's why I can't use the DIM...on a case-sensitive file system you end up with duplicated content folders after almost EVERY install. So going in and cleaning up/merging everything takes as much time, even with bulk renamers, as installing manually to begin with.
But yes, mandating those kind of changes to control maps would be very helpful...and solve a bunch of problems, with almost all the currently usable renders in Studio (I don't think the OpenGL one cares/matters). I'm sure that there's even ways of batch 'correcting' the files...and maybe even 'normalizing' them. I'm pretty sure that ImageMagick could do any conversions needed on a batch basis.
The basic/default Studio surface shader for 3DL is essentially the same as it was back in DS 3 and before. It does not leverage any of the more recent improvements/additions to 3DL. It's pretty much 'oldschool'...and therein lies half the problem. It's inefficient, with current standards, at best and just plain 'wrong' at worst. One of the biggest areas that it is noticeable in is when doing SSS (this includes the US/US 2 shaders, too). They are all using old algorithms/methods which are much slower and more prone to 'overdoing' it than the newer ones. But even still, outside of this thread and the Laboratory thread, there's very little being done to push it to the max it is capable of. Of course, getting the most out of it, starts with correct inputs...so it's all sort of circular. (GIGO applies here.)
And yes, I'm too lazy to do all the corrections, all the time...but I'm trying to. I am trying to make sure the stuff I'm currently making is 'correct'...and what renders I do as 'final' renders are all correct. There's just too much not 'correct' stuff to get it all done...although having Studio itself setup properly does help.
Well, you're assuming users want to do serious work with DAZ Studio and 3Delight. :)
More seriously though, if realism is the goal, as per the title thread, the basics like linear workflow, gamma correction, good textures and control maps are a must. That's why I really appreciate when vendors put extra effort in supplying good, workable textures. I've seen sets that look good even with just the default DAZ materials (and some good lighting setup).
The second hurdle is, of course, materials. I still see illogical things like setting up pure white for diffuse. Probably because it was set up without gamma correction, either in the texture or via the renderer. Again, some have set up their materials more sensibly, but that's sadly not the norm.
Third, materials and lighting are tied together. If you set up your materials incorrectly, or perhaps the more accurate term, illogically, you end up setting your lights the wrong way. Unfortunately, most just look at renders with gamma correction enabled, say it looks bad with their current setup, and go back to not using a linear workflow.
The fourth one, as you noted, is the shaders used. Even advanced shaders like US2 don't keep up with advancements in 3Delight. Technically you can make your own shaders with Shader Mixer, but the bricks are generally outdated. The last step is Shader Builder, but that has its own set of problems.
But even with those limits, you can do some pretty cool things. It might not be accurate or even correct, but it can be close. I've cobbled together a shader network in Shader Mixer quite quickly, using clay for diffuse and glossy with Fresnel-based IOR for specular, and the results are very close to what I've gotten with US2. I'll probably do more with it when I have the time.
Last, but not least, is to document and simplify things so the entire workflow is accessible/understandable for most. I'm willing to bet those who post here know things like IOR values for glass or water. But most will not, and having too many controls to work with is daunting to most. So we need good tutorials and FAQs for every step/part of the workflow.
Call me hopeful, but if there's enough time and effort put into this, then I think users will gladly follow.
That's what I essentially meant by "RGB", 8 bit per channel. "Low dynamic range".
I believe we could also use HDR/EXR these days in DS.
Call me hopeful, but if there's enough time and effort put into this, then I think users will gladly follow.
I agree. It's all about dedication.
So they are control maps, and hence linear?
They may be RGB, but yes, they should be linear. I know that most of the programs you can actually bake them in produce linear output already. Those where you make them by converting other maps, like PS, GIMP, etc., should also be linear. But in any case, the overall color, as it appears in the image, is sort of the 'base' for the map...like the mid-grey or black on displacement maps. Changing the gamma on that is going to change the baseline, just like over-correcting any other map will...basically, assume they are linear unless you know otherwise for sure.
Remember they are essentially a 3d color bump map.
Normal maps encode in RGB values the XYZ components of the vector normal to the surface. The values have no meaning at all in terms of colors (since they are not colors!) and therefore normal maps are to be used with a gamma of 1.
The rule is simple: is it a "color"? If so, check the color space in which the image is saved and verify if there is a burned-in gamma value; if not, use a gamma of 1. E.g., a JPG diffuse saved as sRGB uses, by the sRGB specification, a gamma of 2.2; a PNG file could have all kinds of "weird" gamma, as specified in its gAMA chunk. An HDR file, saved with a gamma of 1 and used not only as IBL data but also as background, would require a gamma of 1. Bump or normal maps are non-color maps, and therefore you can assume a gamma of 1 (unless the person who created them was smoking "strong stuff", in which case the recycle bin is a vital tool).
Thank you folks.
Along with what Mjc said, the lighting as a rule will also need to be redone when it comes to intensities and falloffs.
There is linear workflow and then there is the wrong workflow. Okay I don't like using words this harsh, but it's what I have read people say =)
Brilliant, just brilliant.
I could say more, but I won't.