The Official aweSurface Test Track


Comments

  • wowie said:

    That's something that doesn't need effort from the 3delight devs or TotalBiscuit. Only DAZ can do it. So I encourage people to put in a feature request for DAZ to include a ray cache/radiosity caching option in the render settings.

    And a switch to use the raytracer without being locked to the "progressive" mode, with its quirks and lack of filters.

  • wowie said:

     8x8 pixel samples, but really, really high irradiance samples.

    How high? =) 2048? 4096?

  • wowie Posts: 2,029

    How high? =) 2048? 4096?

    Higher. ;) Just in case anyone is wondering, AWE Surface does not have a clamp/limit on irradiance samples. But it's probably best to wait for the update regardless.

  • wowie Posts: 2,029

    That figures;) This is the first actual comparison I've seen, very interesting! I suggested someone should start a "show us your low light interior grain-free non-post-processed Iray renders" thread, still haven't seen one.

    We probably never will. From what I see, with Iray it's perhaps best to overexpose the scene in the render and readjust exposure in postwork. You could do that with 3delight too, but the difference in both speed and noise levels stays. I did dial back some of the optimizations I used, though. Went too aggressive at some point and cut render times in half. :D I guess 1/4 of Iray render times will have to do for now. Probably best to just use up-to-date hardware.

  • Sven Dullah Posts: 7,621
    edited November 2018
    wowie said:

    How high? =) 2048? 4096?

    Higher. ;) Just in case anyone is wondering, AWE Surface does not have a clamp/limit on irradiance samples. But it's probably best to wait for the update regardless.

    I will have to find out which parameters are limited and which can be "abused", the end justifies the means every now and then:) I already knew about the irradiance samples, I tried 8192 at some point. Also diffuse strength, opacity strength, SSS samples if I'm not wrong, and maybe something else... I know I'm not supposed to do that if I'm to stay physically plausible, but... well... curiosity killed the cat :D

  • Oso3D Posts: 15,011

    I plan on doing it, it's just paying work takes priority. ;)

    And yeah, you use the same tricks as real-world photography: use more diffuse light and then adjust in post. If you are going to adjust tone a lot, it helps to have higher color depth, which Iray does easily with the Beauty canvas.

    A lot of people doing CGI work get weirdly hung up on simulating things exactly rather than using tools to produce effective results.

     

  • Sven Dullah Posts: 7,621
    Oso3D said:

    I plan on doing it, it's just paying work takes priority. ;)

    I hear you:)

    Oso3D said:

    A lot of people doing CGI work get weirdly hung up on simulating things exactly rather than using tools to produce effective results.

    This I totally agree with, been there, done that; as I said in my previous post, the end justifies the means.

  • wowie Posts: 2,029
    edited November 2018

    I will have to find out which parameters are limited and which can be "abused", the end justifies the means every now and then:) I already knew about the irradiance samples, I tried 8192 at some point. Also diffuse strength, opacity strength, SSS samples if I'm not wrong, and maybe something else... I know I'm not supposed to do that if I'm to stay physically plausible, but... well... curiosity killed the cat :D

    Diffuse/specular/reflection strength is actually limited to 1. You can overdrive values from textures, but the actual input will never go past 1. :) Opacity strength is actually a color, but yes, it is not clamped when viewed directly. SSS and translucency strength aren't limited, though using more than 100% isn't physically plausible.

    Irradiance and SSS samples don't have limits, as you've discovered.
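
    (Roughly, the clamping behaviour above amounts to something like this - a little Python sketch for illustration only, not the actual shader code, and the names are made up:)

        def effective_strength(strength_dial, map_value=1.0):
            # the strength dial itself is clamped to [0, 1], but a map plugged into
            # the channel can still push the final value past 1
            return min(max(strength_dial, 0.0), 1.0) * map_value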

    Here's the quick and dirty postworked 3delight shot, to get it somewhere close to the Iray render with default settings. I used the default AWE emitter prop 1 at EV 7, which seems to translate closely to around 1000 - 1250 luminance in Iray. The Iray shot is the one without reflections.

     

    [Attachments: postwork 3delight 29 minutes 51.54 seconds.jpg; 2 hours 1.79 seconds.jpg]
  • Oso3D said:

    If you are going to adjust tone a lot, it helps to have higher color depth

    You meant higher dynamic range, right?

    Which reminds me I need to package one little scary arcane "render script" for my cozy little freebie thread... in the tradition of "DAZ Soon (tm)", I expect this to happen around mid-December. Cuz paid work and all that.

  • Oso3D Posts: 15,011

    Yes. It's a lot easier to darken/lighten/whatever if the image isn't going to get all jagged because you're trying to expand 3 shades of black into an entire image.
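
    (A quick way to see that - a tiny numpy sketch with made-up numbers, nothing to do with Iray's internals: lift a deep-shadow gradient by 32x, once kept as float and once after an 8-bit round trip.)

        import numpy as np

        dark = np.linspace(0.0, 0.01, 1000)           # deep-shadow values, linear float
        as_float = dark * 32.0                        # high-bit-depth canvas: ~1000 distinct levels survive
        as_8bit = np.round(dark * 255) / 255 * 32.0   # 8-bit round trip first: only a few levels survive
        print(len(np.unique(as_float)), len(np.unique(as_8bit)))  # e.g. 1000 vs 4 -> visible banding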

     

  • wowie Posts: 2,029
    edited November 2018

    Nice to see progress. Draft settings with default 128 irradiance samples. Still in DS 4.7 with a 3delight build that's now around 3 years out of date. :D

    Current release build - 7 minutes 34.88 seconds

    Developmental build - 4 minutes 42.22 seconds

    Still amazes me to see 3delight's rendering potential unleashed.

    Edit:

    Finished more extensive testing.

    With 2048 irradiance samples: Old - 32 minutes 25.45 seconds. Dev build - 15 minutes 15.10 seconds.

    With 2048 irradiance samples and 8x8 pixel samples: Old - 36 minutes 41.9 seconds. Dev build - 20 minutes 27.12 seconds.

    [Attachments: 4 minutes 42.22 seconds.jpg; 7 minutes 34.88 seconds.jpg]
  • Sven Dullah Posts: 7,621
    wowie said:

    Nice to see progress. Draft settings with default 128 irradiance samples. Still in DS 4.7 with a 3delight build that's now around 3 years out of date. :D

    Current release build - 7 minutes 34.88 seconds

    Developmental build - 4 minutes 42.22 seconds

    Still amazes me to see 3delight's rendering potential unleashed.

    Edit:

    Finished more extensive testing.

    With 2048 irradiance samples: Old - 32 minutes 25.45 seconds. Dev build - 15 minutes 15.10 seconds.

    With 2048 irradiance samples and 8x8 pixel samples: Old - 36 minutes 41.9 seconds. Dev build - 20 minutes 27.12 seconds.

    Well, render times cut down by almost half is really impressive!

  • Sven Dullah Posts: 7,621
    edited November 2018

    ...meanwhile, playing around with transmission and SSS settings...


    [Attachment: DreamClouds.png]
  • wowie said:

    Nice to see progress. Draft settings with default 128 irradiance samples. Still in DS 4.7 with a 3delight build that's now around 3 years out of date. :D

    Current release build - 7 minutes 34.88 seconds

    Developmental build - 4 minutes 42.22 seconds

    Still amazes me to see 3delight's rendering potential unleashed.

    Edit:

    Finished more extensive testing.

    With 2048 irradiance samples: Old - 32 minutes 25.45 seconds. Dev build - 15 minutes 15.10 seconds.

    With 2048 irradiance samples and 8x8 pixel samples: Old - 36 minutes 41.9 seconds. Dev build - 20 minutes 27.12 seconds.

    Ah neat.

    The ball's emissive in the second render? Or is this the no-extra-darkening in play?

  • kyoto kid Posts: 41,057
    wowie said:

    Nice to see progress. Draft settings with default 128 irradiance samples. Still in DS 4.7 with a 3delight build that's now around 3 years out of date. :D

    Current release build - 7 minutes 34.88 seconds

    Developmental build - 4 minutes 42.22 seconds

    Still amazes me to see 3delight's rendering potential unleashed.

    Edit:

    Finished more extensive testing.

    With 2048 irradiance samples: Old - 32 minutes 25.45 seconds. Dev build - 15 minutes 15.10 seconds.

    With 2048 irradiance samples and 8x8 pixel samples: Old - 36 minutes 41.9 seconds. Dev build - 20 minutes 27.12 seconds.

    ...OK, this is getting me excited. 

  • wowie Posts: 2,029
    edited November 2018

    The ball's emissive in the second render? Or is this the no-extra-darkening in play?

    No. It's just having its global illumination exposure driven up. Both GI and specular exposure on the shader can now be used to over/underexpose the material even more. So I guess it has more extra non-darkening now. :)

    Technically it breaks physical plausibility, but I figured I'd leave it up to users to experiment with. As the saying goes, it's better to have it and not need it than to need it and not have it.
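
    (If the dials follow the usual photographic convention of stops - which is an assumption here, not something pulled from the shader code - they would do something along these lines:)

        def apply_exposure(value, exposure_stops):
            # sketch only: each +1 stop doubles the contribution, each -1 stop halves it
            return value * (2.0 ** exposure_stops)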

    Well, render times cut down by almost half is really impressive!

    kyoto kid said:

    ...OK, this is getting me excited. 

    I've figured out how to implement some optimizations I put off for the release build. Technically, it's even more brute force and relies more heavily on multiple importance sampling. In effect, you can have twice the number of samples for equivalent render times. There'll be no change to the irradiance samples dial, but you can always deactivate the limits and push it as high as you need.

    Right now, I see close to 1:1 performance between specular and diffuse. The biggest performance hit that's left is due to using more pixel samples, which is necessary to minimize noise from reflections. I have some more crazy ideas to tackle that, and perhaps optionally speed up some of the GI even more, but those are more difficult to work into the shader. I'll probably have to work on imager shaders now.
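
    (For anyone wondering what multiple importance sampling means in practice: each sample is weighted by how likely the different sampling strategies were to generate it. The textbook balance heuristic is

        w_i(x) = \frac{n_i \, p_i(x)}{\sum_j n_j \, p_j(x)}

    so a direction that both the BSDF sampling and the light sampling would have picked doesn't get counted at full weight twice, which is what keeps variance down when combining specular and diffuse/GI sampling. Whether AWE uses exactly this weighting is a detail not spelled out here.)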

  • Sven Dullah Posts: 7,621
    wowie said:

    The ball's emissive in the second render? Or is this the no-extra-darkening in play?

    No. It's just having its global illumination exposure driven up. Both GI and specular exposure on the shader can now be used to over/underexpose the material even more. So I guess it has more extra non-darkening now. :)

    Technically it breaks physical plausibility, but I figured I'd leave it up to users to experiment with. As the saying goes, it's better to have it and not need it than to need it and not have it.

    Thank you ;)

    wowie said:

    Well, render times cut down by almost half is really impressive!

    kyoto kid said:

    ...OK, this is getting me excited. 

    I've figured out how to implement some optimizations I put off for the release build. Technically, it's even more brute force and relies more heavily on multiple importance sampling. In effect, you can have twice the number of samples for equivalent render times. There'll be no change to the irradiance samples dial, but you can always deactivate the limits and push it as high as you need.

    Right now, I see close to 1:1 performance between specular and diffuse. The biggest performance hit that's left is due to using more pixel samples, which is necessary to minimize noise from reflections. I have some more crazy ideas to tackle that, and perhaps optionally speed up some of the GI even more, but those are more difficult to work into the shader. I'll probably have to work on imager shaders now.

    Take your time :), things are getting better and better!

  • Sven Dullah Posts: 7,621
    edited November 2018

    Here are a couple of experiments, trying to create fog with awe:) Did not let the renders finish.


    I used 40 parallel planes with opacity maps, with the same maps also inserted in the translucency strength channel, and both translucency strength and translucency shadows at 100%.

    Having some problems getting the backlit version to work; with one plane I get the shadow right, not so much with 40:) Would it be better to go with transmission and transmission roughness? Render times are acceptable as long as I stick to opacity/translucency; I did a quick test with transmission (IoR at 1), didn't finish it, it naturally takes much longer. I turned off indirect lighting and shadows for the planes. Thoughts?
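
    (One back-of-the-envelope way to pick the per-plane opacity, assuming evenly spaced planes and treating the fog as purely absorbing - a Python sketch, not anything built into AWE or the Fog Tool:)

        import math

        def per_plane_opacity(total_optical_depth, n_planes=40):
            # opacity each plane needs so that n stacked planes together give a
            # transmittance of exp(-total_optical_depth):
            # (1 - alpha) ** n == exp(-tau)  =>  alpha = 1 - exp(-tau / n)
            return 1.0 - math.exp(-total_optical_depth / n_planes)

        print(per_plane_opacity(2.0))  # fairly thick fog over 40 planes: ~0.049, i.e. about 5% per plane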

    [Attachments: A Foggy Tree awe.png; Fog awe1.png; Fog2 awe.png]
  • wowie Posts: 2,029

    I confess that I never looked into fog/interior/atmosphere shaders. I believe they currently don't work with mustakettu's render script.

    This is rather handy if you're working with the planes method - https://sites.google.com/site/mcasualsdazscripts/mcjjet

     

  • kyoto kid Posts: 41,057

    ...I wonder if Nerd3D's old Fog Tool Deluxe would work as it uses planes as well.

  • Sven Dullah Posts: 7,621
    kyoto kid said:

    ...I wonder if Nerd3D's old Fog Tool Deluxe would work as it uses planes as well.

    That's what I used, just modified the opacity maps and converted to awe. There are 40 planes and 2 material zones + a nice bunch of morphs;)

  • Sven Dullah Posts: 7,621
    wowie said:

    I confess that I never looked into fog/interior/atmosphere shaders. I believe they currently don't work with mustakettu's render script.

    This is rather handy if you're working with the planes method - https://sites.google.com/site/mcasualsdazscripts/mcjjet

     

    That's what she told me, so I decided to try this. Kind of works, doesn't it? For a smoother fog, one could probably instance the whole fog prop. And I'm also thinking of starting a nice forest fire :)

  • wowie Posts: 2,029

    Kind of works, doesn't it? For a smoother fog, one could probably instance the whole fog prop. And I'm also thinking of starting a nice forest fire :)

    "Kind of" is very apt. :D But that's not using the renderer to its best potential. 3delight can do proper volumetric rendering, even with .vdb, but the host app needs to expose that.

    https://twitter.com/jcubeinc/status/1056834832037113857

    It should be possible to use a volumetric/interior/atmosphere shader to do proper fog/volumetric rendering (at scripted renderer speeds) if DAZ had the path tracer and ray caching enabled for the standard renderer.

    https://twitter.com/pberto/status/756244464817934341

    Just goes to show how out of step DAZ is compared to the 3delight main branch. We're in 2018 and DAZ Studio still doesn't use the path tracer by default. :D

  • Sven Dullah Posts: 7,621
    edited November 2018
    wowie said:

    Kind of works, doesn't it? For a smoother fog, one could probably instance the whole fog prop. And I'm also thinking of starting a nice forest fire :)

    "Kind of" is very apt. :D But that's not using the renderer to its best potential. 3delight can do proper volumetric rendering, even with .vdb, but the host app needs to expose that.

    https://twitter.com/jcubeinc/status/1056834832037113857

    It should be possible to use a volumetric/interior/atmosphere shader to do proper fog/volumetric rendering (at scripted renderer speeds) if DAZ had the path tracer and ray caching enabled for the standard renderer.

    https://twitter.com/pberto/status/756244464817934341

    Just goes to show how out of step DAZ is compared to the 3delight main branch. We're in 2018 and DAZ Studio still doesn't use the path tracer by default. :D

    Wow, nice render! Yeah, can't help it, I feel the same way as that Twitter dude:) It's unbelievable really, I would very much love to forget all kinds of old-school approaches.

  • wowie Posts: 2,029
    edited November 2018

    Thanks to mustakettu, one more bug resolved.

    Current public build.

    Development build.

    Anisotropy with polygon meshes (using a sphere primitive in this example) now renders without jagged artifacts. Obviously, with subD the artifact would be less noticeable.

    [Attachments: internal.jpg; ds.jpg]
  • Sven Dullah Posts: 7,621
    wowie said:

    Thanks to mustakettu, one more bug resolved.

    Current public build.

    Development build.

    Anisotropy with polygon meshes (using a sphere primitive in this example) now renders without jagged artifacts. Obviously, with subD the artifact would be less noticeable.

    Very good news!

  • Sven Dullah Posts: 7,621
    edited November 2018

    Found a nice M4 character in Fast Grab (https://www.daz3d.com/angus), had to try an awe conversion;) Made a G1 character to go with the skin. Render time: 15 min with 16x16 pixel samples.


    Had to chop his feet off, still no shadowcatcher for awe, and didn't want to use IBLM:)

    [Attachment: Tunnel East awe.png]
  • wowie Posts: 2,029

    Had to chop his feet off, still no shadowcatcher for awe, and didn't want to use IBLM:)

    Out of curiosity, is the scene lit with just an HDRI or are there other lights?

  • Sven Dullah Posts: 7,621
    wowie said:

    Had to chop his feet off, still no shadowcatcher for awe, and didn't want to use IBLM:)

    Out of curiosity, is the scene lit with just an HDRI or are there other lights?

    Just HDRI;)

  • Sven Dullah Posts: 7,621
    edited November 2018

    ...and a WIP, 5 area lights and HDRI... render time with 8x8 pixel samples: 1h 40min. No issues converting this stuff, but I need to see what could still be optimized to keep render times down;)


    But when I made a couple of pool scenes using RR3 by Marshian I was looking at 12 h IIRC:)

    [Attachment: A pool table awe.png]