Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
I know that feeling!
It doesn't help that different programs call the same things by different names. Usually "shader" and "material" are interchangeable terms that refer to the collection of settings in the software that tell the renderer how light interacts with part or all of an object (i.e. the particular combination and settings of diffuse, reflection, refraction, emission, etc.), although in some programs a material might also mean the part of the object the shader applies to (what DAZ Studio calls "surfaces"). Usually a texture is the actual image file that gets assigned to a particular channel in the material; the word is often interchangeable with "map", as in a bump map.
So in DS you apply a shader, which may or may not load various textures, to a surface.
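To make that relationship concrete, here is a minimal sketch in plain Python (no Blender or DS API; all names and file paths are made up for illustration): a "surface" names part of a mesh, a shader/material is the settings bundle applied to it, and textures/maps are image files plugged into specific channels of that material.

```python
# Illustrative data model only: surface -> shader -> texture channels.
mesh = {
    "surfaces": {
        "Torso": None,   # each surface can hold one material at a time
        "Limbs": None,
    }
}

def apply_shader(mesh, surface, shader_name, textures=None):
    """Assign a shader to a surface; textures (channel -> image file) are optional."""
    mesh["surfaces"][surface] = {
        "shader": shader_name,
        "textures": textures or {},   # a shader may load no maps at all
    }

# Same shader on two surfaces: one loads maps, the other uses defaults.
apply_shader(mesh, "Torso", "Iray Uber",
             textures={"diffuse": "skin_diffuse.jpg", "bump": "skin_bump.jpg"})
apply_shader(mesh, "Limbs", "Iray Uber")

print(mesh["surfaces"]["Torso"]["textures"]["bump"])  # skin_bump.jpg
```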
OK - I wasn't too far away from an understanding then. Thanks.
No offence taken.
...aaand now for something completely different. Some modeling fun from me. Recently I've been pretty charmed by low-poly stuff, so I figured I'd try my hand at it. Given the weather outside, snow seemed appropriate.
Rendered in Cycles, obviously. The signature and border were done in Photoshop, as well as a bit of final color tweaking, but the rest, including those sun rays, was done in Blender's compositor. The bit of DOF blurring the horizon was also done via the Defocus node in the compositor, which I love. I love DOF, but it significantly increases render times; the Defocus node can honestly replace true DOF 9 times out of 10, and you can do all sorts of fun control tweaks with it that you can't with proper DOF. (The falloff of the DOF here is in no way physically accurate. If it were, the clouds would have some blurring, but I didn't want the clouds to have any blurring.)
It's so much fun! I think I'm going to be rendering more in this and similar styles soon.
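The core idea behind a compositor defocus pass can be sketched very simply: blur radius grows with a fragment's distance from the focal plane, clamped to a maximum. This is a deliberate simplification, not Blender's actual Defocus node math, and the numbers are illustrative.

```python
def defocus_radius(z, focus_z, falloff, max_radius):
    """Blur radius (pixels) for a fragment at depth z: grows away from the
    focal plane, clamped so the artist controls the maximum blur."""
    r = abs(z - focus_z) / falloff * max_radius
    return min(r, max_radius)

# The distant horizon blurs fully; the focal plane stays sharp.
print(defocus_radius(z=100.0, focus_z=10.0, falloff=50.0, max_radius=8.0))  # 8.0
print(defocus_radius(z=10.0,  focus_z=10.0, falloff=50.0, max_radius=8.0))  # 0.0
```

The clamp is exactly the kind of non-physical control mentioned above: a true camera model wouldn't let you cap the blur on distant clouds at zero.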
I looove it! I want to render things like that but while I can do the modelling part, setting up materials etc. to render like that is entirely alien to me (as is the description, no offence~)
Other than the sun, the materials in this are all disgustingly simple: literally just plain diffuse set to different colors. The sun is diffuse mixed 50/50 with an emission shader (both yellow). The lighting is slightly more complex. There's a sun lamp parented to the sun object with an emission strength of 55, another sun lamp pointing towards the scene from the right of the camera at a low strength (0.700), and an area light pointing directly at the sun from just to the left of it (that's how I got the color to go from white to yellow on the left edge; it is probably 99% unnecessary in the grand scheme of things). The sky is just the Sky Texture node plugged into the background. The attached render is the scene with just a tweak of the HSV in the compositor, no fancy noding, no postwork.
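For anyone new to mix shaders: the 50/50 diffuse/emission mix above behaves roughly like per-channel linear interpolation between the two shaders' contributions. A sketch with illustrative RGB values in 0..1 (not Blender's actual shader evaluation):

```python
def mix(a, b, fac):
    """Linear interpolation per channel, like a Mix Shader's Fac slider."""
    return tuple((1 - fac) * x + fac * y for x, y in zip(a, b))

yellow_diffuse  = (1.0, 0.9, 0.2)
yellow_emission = (1.0, 0.8, 0.0)
sun_color = mix(yellow_diffuse, yellow_emission, 0.5)
print(tuple(round(c, 2) for c in sun_color))  # (1.0, 0.85, 0.1)
```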
But it's so pretty! Thanks for the tips - I really should spend more time messing around with rendering in Blender instead of always dragging everything back to DS every time.
I like that, reminds me of the Christmas shows I used to watch as a kid, like Rudolph the Red-Nosed Reindeer, which had Burl Ives as the snowman. Now you just need the snowman. :)
Oh, btw JC, do you mind if I point out you have a typo in your sig? It could be dangerous. :)
I posted this in the blender thread over in the Art Studio forums....but I thought I would toss it out here as well.
Let's talk about the philosophy of material zones (which directly correlate to surfaces in Daz Studio)
Let's say you have an object with sub-meshes of multiple material types, for example, a stone building with wooden shutters with metal hinges.
Do you assign materials based on material type (stone, metal, wood, etc.)?
Or do you assign materials based on sub-object (building & shutters)? With the use of 3D painting tools you can get multiple material types in the same Daz Studio surface.
I've seen both in use in different sets, so I'm thinking it comes down to a philosophical choice.
I'm inclined toward the latter, but that makes it harder for people to re-texture down the road.
What are the advantages and disadvantages of each approach?
Personally, I would go with the latter so the mats on objects can be adjusted independently. In Studio, you can still select multiple objects and filter for "stone" to get all the "stone" mats selected at once for simultaneous editing. The other way, you'd have to create new surface groups to edit the "stone" surfaces on house A and house B independently. Hope that makes sense, and I was understanding your question correctly.
- Greg
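The two zoning philosophies above can be sketched as data. With per-object zones, materials stay independently editable per object, and filtering by name still grabs all "stone" zones across objects at once (analogous to Studio's surface filter Greg describes). Object and zone names here are made up for illustration.

```python
# Per-object zoning: each object carries its own named material zones.
scene = {
    "house_A": ["stone_walls", "wood_shutters", "metal_hinges"],
    "house_B": ["stone_walls", "wood_shutters"],
}

def select_by_type(scene, material_type):
    """All (object, zone) pairs whose zone name contains material_type,
    so 'stone' zones can still be edited together across objects."""
    return [(obj, zone)
            for obj, zones in scene.items()
            for zone in zones
            if material_type in zone]

stone = select_by_type(scene, "stone")
print(stone)  # [('house_A', 'stone_walls'), ('house_B', 'stone_walls')]
```

The trade-off runs the other way with type-based zones: one shared "stone" zone is easy to re-texture globally but can't be adjusted per building without creating new surface groups.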
Totally off topic, but just an FYI for anyone interested: I have a couple of Facebook pages, one called Technology Today and another called The 3D Dimension.
Hey Gedd - have you had any luck exporting any alembic files from DS and importing into Blender 2.78?
- Greg
Just to be technically accurate.....from an application-agnostic point of view (i.e., just in terms of 3D modelling/rendering IN GENERAL).....
A "Shader" is a small program that defines how light reacts to and reflects/refracts off/through a surface. Shaders have inputs (colors, textures, values) but those values may not be fixed. A shader MAY be applied to a surface, and will use default values, as a sort of 'default' material for that shader.
A "Material" is a Shader with preset values (textures, colors, etc.) that can be applied to a surface. In DS, this is usually referred to as a "Shader Preset" or "Material Preset".
A "Texture" is simply a color pattern, either generated programmatically (procedural texture) or from an image file (a texture map). Sometimes, we simply refer to images used for this as 'Maps', instead of Texture Maps. Therefore a "Bump Map" is a texture map (image) which is applied to the 'bump' input of a Shader (that uses such an input.)
A "Surface" is a section of a mesh which can have a Material/Shader applied to it. It has polygons and may have UV coordinates and such.
A "Mesh" is a collection of polygons with one or more Surfaces defined over it. A Mesh may be "Open" or "Closed". "Open" meshes have holes which expose the interior of the geometry. "Closed" Meshes are solid with no holes exposing the interior sides of the polygons of the mesh.
Depending on the particular program one is using, they may refer to any of these with different terms, or even interchange them. But IN GENERAL, these are the correct definitions (based on 3D Analytic Geometry and how it applies to discrete operations, like polygonal models of smooth surfaces.)
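One way to make the "open vs closed" distinction above concrete: in a closed (watertight) mesh, every edge is shared by exactly two faces. A sketch, assuming faces are given as tuples of vertex indices:

```python
from collections import Counter

def is_closed(faces):
    """True if every edge is shared by exactly two faces (watertight mesh)."""
    edges = Counter()
    for face in faces:
        for i in range(len(face)):
            a, b = face[i], face[(i + 1) % len(face)]
            # Store edges with sorted endpoints so (a, b) == (b, a).
            edges[(min(a, b), max(a, b))] += 1
    return all(count == 2 for count in edges.values())

# A tetrahedron is closed; drop one face and it has a hole.
tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(is_closed(tetra))       # True
print(is_closed(tetra[:-1]))  # False
```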
I haven't played with exporting Alembic out of DS as I don't typically do animation in it. I've mostly done stills and the animation I've done I haven't tried to export.
Also to add: a material zone is a ..zone.. on the surface of the mesh that designates which shaders/materials affect those specific polygons, allowing you to have more than one on a single mesh.
Just adding this one because I feel like it's another one that gets confused because people expect a zone to be a preset material or whatever else when it's actually just a designation.
Thanks for responding, Gedd. Just recently noticed that import was finally added in 2.78, which is something I've been waiting for.
I've been busy building upon the cloth sim capabilities in DS (to automate the creation of JCMs), but it's like trying to pound a square peg into a round hole. With the recent improvements to Blender's cloth sims, I was excited to find out that alembic import had also been implemented. Exporting vertex animation out of DS (both the figure and cloth) for simulation in blender would have been great!
In case anyone else is interested and trying to get it to work, here is the error message in the Blender console:
"Could not open as ogawa file from provided streams."
I have found references in Blender dev discussions that talk about Ogawa being a more efficient Alembic back-end, and I fear that the output from the Alembic plugin in DS hasn't been updated in many years.
Anyway, I follow this thread and appreciate all of the posts you make and info you provide about Blender - cheers!
- Greg
Thank you for the feedback. There is so much to do in the 3D environment that it's impossible to test every use case, so it's good when anyone lays out their experience as well as you did, for anyone else who wants to follow up on it. :)
I would agree it is a philosophical choice, as well as a matter of designing for your target render engine.
I make my own clothing content for the Genesis figures, with the intent to render the content either in Blender for stills or in Maxon Cinema 4D for animation, with V-Ray or C4D native procedural shaders.

I assign basic "material zones" (or "surfaces", as DS users would say) during the modeling process; the zones will ultimately be occupied by a native shader of the specific renderer I am using. In general it is always better & faster to use the shader system of the native render engine one is using. I therefore do not bother with the absurd 4K texture-mapped UVs like the ones used in many Daz products, which make Iray even slower & more hardware intensive.

Here are some examples of my recent clothing models for the Genesis 2 figures, modeled in C4D but rendered in Blender via Daz Studio & the MCJ teleblend script.
.......
Here's another way to use Blender that folks might not think about; it's now part of my compositing/postwork workflow.
Render an EXR out of Iray (beauty pass and depth map) > tonemap and composite in Blender (Filmic comes into play here) > save out a 16-bit .tif for final tweaking in the photo-editing program of choice, mostly the NIK Collection tools in this instance.
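The "tonemap" step in that pipeline can be sketched with a simple Reinhard operator. This is only a stand-in to show the idea; Blender's Filmic transform is considerably more sophisticated. It maps unbounded HDR radiance values from an EXR into 0..1 for export.

```python
def reinhard(value):
    """Simple global tonemap: compresses bright HDR values toward 1
    instead of clipping them, preserving highlight detail."""
    return value / (1.0 + value)

hdr_pixels = [0.1, 1.0, 4.0, 100.0]   # raw linear radiance values
ldr_pixels = [reinhard(v) for v in hdr_pixels]

# Even a value of 100 lands below 1.0 rather than blowing out:
print(ldr_pixels[-1] < 1.0)  # True
```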
For compositing, Blender has a nice built-in compositor that offers a lot. DaVinci Resolve has great color tools and some nice effects, and DaVinci Fusion has a lot of tools and effects for compositing. The DaVinci tools are professional grade, used by major studios, and are free for anything that isn't geared specifically towards a large studio. As J Cade mentioned, Photoshop has many plugins that can be used on the various passes of a comp.
Getting into composition opens up a whole new world of possibilities and is well worth the time if one is at the level where it makes sense. :)
Another super quicko example of The Power of Compositing™.
Super quicko render (about 10 minutes just loading and rendering; I didn't bother letting it bake long enough to fully clean up since it was a test), likewise probably about 5 minutes of compositing. Compare: the sky's no longer super blown out and blends much more nicely into the buildings where it's brightest, and the shadows are much less dark. There's also some mist I added using the zdepth pass, which gives it some nice atmosphere. (Sidenote: Iray's zdepth pass is somewhat borked. It's still usable, you just have to tweak and blur it a bit to kill its graininess.)
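The zdepth mist trick can be sketched per pixel: blend each pixel toward a mist color by a factor derived from its depth. All values here are illustrative; a real comp does this over whole image buffers.

```python
def add_mist(color, depth, mist_start, mist_end, mist_color):
    """Blend a pixel toward mist_color: factor is 0 before mist_start,
    1 at/after mist_end, linear in between."""
    t = (depth - mist_start) / (mist_end - mist_start)
    t = max(0.0, min(1.0, t))
    return tuple((1 - t) * c + t * m for c, m in zip(color, mist_color))

near = add_mist((0.2, 0.3, 0.4), depth=1.0,
                mist_start=5.0, mist_end=50.0, mist_color=(0.8, 0.8, 0.9))
far  = add_mist((0.2, 0.3, 0.4), depth=60.0,
                mist_start=5.0, mist_end=50.0, mist_color=(0.8, 0.8, 0.9))
print(near)  # unchanged: (0.2, 0.3, 0.4)
print(far)   # fully fogged: (0.8, 0.8, 0.9)
```

A noisy zdepth pass (like Iray's) makes t jitter per pixel, which is why blurring the depth map first helps.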
Love the second one. Looks like a photograph!
OK.......I'm gonna have to add compositing to the list of things to experiment with in blender.........
That list just keeps getting longer.
Those are beautiful examples, but one thing that hasn't really been showcased yet in compositing is the ability to merge various parts of a scene together in post. This can be very handy in many cases. Perhaps part is rendered out in DS, part in Blender. If we match lighting setups in the two programs, we can render out with zdepth, etc., and stitch the parts together in a post-production environment. Or, as another example, perhaps the overall scene is too heavy to render in a single pass without it failing over to CPU and going much slower; the scene can be split up into parts that will still render on the GPU and again be stitched together in post.
It's really nice that J Cade is taking the time to post these examples, because it takes time to render stuff out, or one has to have handy scenes to show as examples, and unfortunately that isn't always workable when we are busy with projects. Pictures really do speak more than just words in situations like this.
One thing I like in the example above is that not only does it showcase adding an atmospheric effect, but it also shows how we can totally change the whole overall tone and colorspace after the fact. This can be very handy if we are doing work for a client and they decide that the colors aren't going to work for a given project. With exr files and a bit of post work we can rework the piece subtly or massively without having to rerender the entire scene, even to the point of inserting things that weren't in the original render. :)
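The stitch-in-post idea above can be sketched per pixel: given two renders from the same camera with matching zdepth passes, keep whichever layer is nearer at each pixel. Pixels here are (color, depth) pairs for illustration; real comps operate on full image passes.

```python
def depth_merge(layer_a, layer_b):
    """Per-pixel Z-combine: keep the fragment with the smaller depth."""
    return [a if a[1] <= b[1] else b for a, b in zip(layer_a, layer_b)]

# Hypothetical 3-pixel strips: an Iray scenery pass and a Cycles car pass.
iray_pass   = [("street", 10.0), ("sky", 1e9), ("wall", 3.0)]
cycles_pass = [("car", 6.0),     ("car", 6.0), ("bg", 1e9)]

merged = depth_merge(iray_pass, cycles_pass)
print([c for c, _ in merged])  # ['car', 'car', 'wall']
```

This only works cleanly when the two renders share camera and lighting, which is why matching the setups (or reusing one render as an environment map, as J Cade describes below) matters.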
Funny you should mention that...
I'm currently proof-of-concepting something in this vein. I really wanted some nice motion-blurring cars in the new urban future set. Of course, you can't do that in Iray, and redoing the whole set's materials in Blender would take way too long, so... render the cars and motion in Cycles, the scenery in Iray, and stick them together. Sadly, Blender's new shadow catcher is still sooper experimental, and AFAIK it doesn't work for reflections (look at the reflection of the tail lights on the pavement; had to have that), so I'm using render layers and compositing node tricks instead. The lighting in Blender was super easy, though: I rendered out my scene in Iray as a (tiny) 32-bit environment map, so pretty much identical lighting.
Things don't actually match up 100% perfectly, mostly because I was retrofitting things and my initial render had some camera distortion that had no blender analogue.
Very nice :)
As you guys are talking about compositing and color grading, it could be a good idea to have a look at Natron, Nuke personal edition (free), or Blackmagic Fusion personal edition (free).
Just saw this and had this article marked for info. Page two goes into shaders/textures/materials. http://www.informit.com/articles/article.aspx?p=2162089