Simple answer, a "shader", "shader preset", or sometimes called "materials," depending on the program being used, is a texture you can place on most items. The model would be the object that the shader gets applied to. Example, you have a model of a house, and you use a brick wall shader preset to change the color/texture of the walls.
Hope that helps.
To clarify somewhat. A shader tells studio what the surfaces are capable of and different shaders give different capabilities. A shader preset will tell the surfaces the settings to use with a shader. You have to have the shader for the presets to work. Studio comes with several (Human surface shader, default shader, ubershader and the Age of Armour surbsurface shade) and then there are others that are parts of products.
You can also broaden that definition to 'a bit of code' that tells the renderer how to handle an object.
That means you can have shaders for cameras and shaders for lights as well as surface shaders.
It is actually good documents for new user, about surface and shaders of daz studio.
(And I believe, vndors recommend it too ^^b)
the section of doucments, 3.4 - Surface Shaders may give you clear knowledge ,,(though not say about all,, it seems imposibble)
but you may better to read all , I recommend.
III - Textures, Surfaces and Materials
I did not get such good documents when I had same problem before ^^;
A texture map would be an image (one example would be a .jpg) that gets wrapped around your 3D object. The texture is painted onto the flat image, and various parts of the flat 2D image correspond (are mapped) to various pieces of the 3D object. The 2D image may appear distorted depending on how it is mapped, although sometimes one can look at it and see what piece goes where, other times it's not very obvious. You can modify (a copy of) this image to change how your object looks, or completely replace it with a different image. Plus there may be multiple images; one for the diffuse channel (the colors), one for bump map, one for opacity, etc. For example, the standard Genesis figure does this.
You can also have a procedural shader. There is no image file. Patterns are generated mathematically and applied to the object.
Or, you can have sort of a mix, for example a shader that takes an image file that isn't mapped to the the object, but rather uses it as sort of a rubber stamp to stamp all over your object, perhaps also doing additional things with that image.
It is code that defines how light will interact with an object. As others have said a shader does not necessarily need a texture map. What becomes confusing is people and vendors that should know better will often call shader presets just shaders.
So it is safe to say that a Shader is a bit of software that tweaks a texture map in one way or another?
No. A software shader would be used INSTEAD OF a texture map.
The texture map might make sense for an object where the texture is a photo or a few of them, and you would map each piece of the photo to some polygons on the object.
However if you wanted a streambed consisting of ten thousand pebbles, it would be really hard to take a photo of every side of every one of those 10,000 unique pebbles and map each piece of each photo to each piece of each 3D model. However, it might be possible and very easy for the computer to easily fake it with software that generates sort of a random looking stone material and just paints that with unique variations on each of the 10,000 pebble objects in your scene. This would be a good place to use a software shader. Mapping is not used on your object. (although you could apply the software shader to an object that has been texture mapped, and just not use that supplied texture map. The software shader MAY make use of a texture that is not mapped, or may not use any texture.
EDITED TO ADD: THIS MIGHT BE TOTALLY WRONG-> To be clear, although a texture map makes use of an image, the image itself is not called a texture map, just a texture. A texture map (somebody who makes them help me out here) is sort of like the instructions used to determine which part of that image gets applied (mapped) to which polygons on the object, and is encoded into the object itself in some way.
This is a bit more confusing as "shader" seems interchangeable with "texture map."
That's done because with the figures we use in Studio the texture map (the skin) is the most obvious part.
As you can see from those pages linked earlier, its really only one part of the 'diffuse' component, out of several components that make up a surface shader.
It would be more accurate to say:
The texture applied to Genesis uses the Lana texture map.
or
The material applied to Genesis uses the Lana texture map.
or
The shader applied to Genesis uses the Lana texture map.
A texture map (somebody who makes them help me out here) is sort of like the instructions used to determine which part of that image gets applied (mapped) to which polygons on the object, and is encoded into the object itself in some way.
You're drifting dangerously close to the subject of UV mapping! :grrr:
Though that process is also called 'texture mapping' I think you were right earlier when you said that a texture map, like a bump map, displacement map or a reflection map is an actual file like a jpg.
Agree, there seems to be some confusing information in this thread. A shader may of course use an image and then interpret it on UV map basis in relation to an object - or whatever mapping as we have even some projection mapping surface shaders in the DAZ store.
A decent summary of shaders is this link on the Pixar site. 3Delight, the DAZ Studio render engine, is close enough since it is RenderMan-compliant, though this whole Maya babbling might confuse a bit ;)
Hmm, ok, maybe I need to take a step back and re-review my terms. From the "Glossary of Terms - for those new to Digital Art and DAZ 3D" thread http://www.daz3d.com/forums/discussion/46/ :
Map: In 3D graphics, a map is an image used within a material. The purpose of a map is to vary some material attribute across a surface. For example, a texture map alters the color of an object, and a bump map simulates roughness. 2D maps, including all bitmaps, require mapping coordinates, which tell the renderer how to project the map onto the 3D object. 3D procedural textures do not require mapping coordinates, because they are volumetric.
Texture mapping: The process of assigning (mapping) an image (texture) to a 3D surface. This allows a complicated colouring of the surface without requiring additional polygons to represent minute details.
So I may have been confused. I'm going to sit back and just listen now. :-)
A shader, like a plug-in or a script, is something that provides or adds features - in this case, it alters the way things behave in a render. The surface shader used is what determines the properties that show in the Surfaces pane, and what effect they have; a light or camera shader determines the settings available in the Light and Camera panes, and how the lights and cameras affect the render.
Thanks Richard for the succinct summation. I suppose it's best to think of Shader not as the kind of shade you find under a tree, but as the "shade" of effect (like a shade of a color) that rendering will have on the surface of a model.
BTW, never understood why they called textures that when they have no texture. I think wrapper is a more obvious word for them.
If you Check you will find that on open most content will default to the DAZ Studio Default Shader. That provides the Surfaces settings such as Matt, Skin, Metal and the others as well as many of the settings. Just about everything we do in DAZ Studio involves a shader in some way to send the information to the Render Engine. Some shaders just do more than others and some are very specialized in what they to. But basically they all are either adding more commands to the Render engine or stacking them to effect other effects at the same time.
Its easy to get confused by some of the terminology since it often gets bandied about without regard to strict definitions, unfortunately. When working with 3D I always find it useful to use the real world as reference.
For example, lets consider an orange. With care, it is possible to peel the orange so that the peel is removed in one piece. Tricky, but possible. If you then placed the peel on a flat surface and tried to flatten it, you would find it necessary to create splits in oder to do so. You may even need to break it into sections to flatten it. The resultant, flattened orange peel would be equivalent to what in the virtual 3D world is called a UV map. (BTW, UV does not refer to "ultraviolet", but to the co-ordinates of the map, U and V). This UV map, or in the real world our orange peel, could be repositioned on the orange (if you didn't eat it. Did I tell you to eat it?) assuming you remembered exactly which part went where. Once we have this UV map in our virtual world, we can recreate it using any colours (Canadian, eh?) we like, and therefore have a purple orange, for example. (would a purple orange still be an orange?) So the purple colour we applied to the UV map would be a new texture. The variations of this, such as bump and displacement maps, can be used to add surface irregularities to our object. In the real world our orange is not smooth, but has a kind of pebbled surface. A greyscale bump map could be used to simulate this (I.E. fake it) by combining it with our orange texture, since light and dark areas are interpreted as being closer or farther away from the camera in the virtual world. A displacement map is similar, except it is used by the 3D software to actually deform the surface to produce the irregularities (and therefore consumes more system resources).
If we took a photograph of our real world orange, it would appear not only orange in colour, but there would be slight shadows indicating the irregular surface, so we would see that it was not smooth. In the virtual 3D world rendering is the equivalent of taking a photograph, and we could recreate our orange using an orange texture applied to a sphere, coupled with either a bump map or displacement map to add the surface irregularity.
So where do shaders come in? Well, lets consider our real world orange again. What if we had a real orange and two fake oranges of the sort some people like to display in bowls on tables (for some inexplicable reason). One is made from hard plastic and another made from glass. All three would presumably be more-or-less spherical, all three would presumably be orange and all three would presumably have the familiar orange rind surface irregularity. Nevertheless, being made from three different real world materials, they probably wouldn't look identical. Why? Well glass is very shiny, plastic nearly so or possibly not, and orange rind not so much. Each real world material has different properties relative to light falling upon it. Some light might be absorbed, some reflected and some refracted, for example. In the real world the materials have these properties inately, and a real world camera simply captures this when used to take a photograph. In the virtual 3D world where materials have only the properties we give them, how does the render engine know how to deal with variables like reflection, refraction and absorption in order to produce photorealistic results. Well, we tell it. And we tell it with a set of instructions called shaders or shader presets. There are metal shaders, glass shaders, skin shaders, leather shaders, latex shaders, vinyl shaders, etc., etc., etc. And it is all in the name of creating redered effects. I have never seen a barbequed steak shader, but as soon as someone needs to render one, they'll need it.
Hopefully this has been more illuminating (see what I did there?), than confusing. :)
Is the orange peel really a UV Map? Or is it a texture, and the "map" is the coordinates that link that texture to the model geometry? As I noted before, it would be more logical for theses "skins" to be called something like a wrapper, i.e. a 2D image that is wrapped upon the model.
And thanks for the explanation of shaders. Is another aspect of a shader that they kick in only when rendering? We don't see their impact in the program until the image is rendered? Whereas we do see textures and various maps as we work in the program (assuming we're not in some wire-frame view).
Depending on the program and the shader, you can sometimes see a rough preview of the shader's effect in the viewport prior to rendering, but not always.
And thanks for the explanation of shaders. Is another aspect of a shader that they kick in only when rendering? We don't see their impact in the program until the image is rendered? Whereas we do see textures and various maps as we work in the program (assuming we're not in some wire-frame view).
To an extent -- the OpenGL hardware render that you see in the viewport has some limited capabilities, but the more advanced shaders only work in the software renderer.
And thanks for the explanation of shaders. Is another aspect of a shader that they kick in only when rendering? We don't see their impact in the program until the image is rendered? Whereas we do see textures and various maps as we work in the program (assuming we're not in some wire-frame view).
Most graphics card have inbuilt shader support for a variety of effects. Since a 'shader' is basically just a bit of information which tells the computer how to display a surface, it also applies to 3D objects in most video games. These often use hardware supported shaders, making them incredibly fast to display.
The shaders used in 3DL are rendered using the software though, so they don't benefit from this speed. That said, they can also be a lot more complex as a result. Since they tend to be a bit more intensive to calculate, it would be inefficient to run these myriad calculations just for display purposes hence why you see a far more basic version.
This 'basic' version, by the way, does use hardware shaders.
Comments
Simple answer, a "shader", "shader preset", or sometimes called "materials," depending on the program being used, is a texture you can place on most items. The model would be the object that the shader gets applied to. Example, you have a model of a house, and you use a brick wall shader preset to change the color/texture of the walls.
Hope that helps.
To clarify somewhat: a shader tells Studio what the surfaces are capable of, and different shaders give different capabilities. A shader preset tells the surfaces which settings to use with a shader. You have to have the shader for the presets to work. Studio comes with several (the Human Surface Shader, the default shader, UberSurface and the Age of Armour Subsurface Shader), and then there are others that are parts of products.
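To make that split concrete, here is a minimal sketch in Python. The class and parameter names are invented for illustration, not DAZ Studio's actual internals: the shader defines which settings exist, while a preset is just a saved bundle of values for those settings.

```python
# Sketch only: a "shader" defines WHICH settings a surface exposes and how
# they are used; a "preset" is just a saved set of values for those settings.
# All names here are invented for illustration.

class SurfaceShader:
    """Stands in for a shader: it defines the capabilities (the parameters)."""
    parameters = {"diffuse_color": (1.0, 1.0, 1.0),
                  "glossiness": 0.5,
                  "bump_strength": 0.0}

    def __init__(self, **overrides):
        # Start from the shader's defaults, then apply whatever the preset supplies.
        self.settings = {**self.parameters, **overrides}

# A shader preset: no code of its own, only values. It is useless without
# the shader whose parameters it was written for.
brick_preset = {"diffuse_color": (0.55, 0.25, 0.20),
                "glossiness": 0.1,
                "bump_strength": 0.8}

wall_surface = SurfaceShader(**brick_preset)   # the preset applied to the shader
print(wall_surface.settings)
```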
You can also broaden that definition to 'a bit of code' that tells the renderer how to handle an object.
That means you can have shaders for cameras and shaders for lights as well as surface shaders.
I think that is a bit confusing for someone who is asking questions in the New Users Forum. We try to help people in this forum, not confuse them.
Thanks
http://docs.daz3d.com/doku.php/public/software/dazstudio/4/userguide/chapters/textures_surfaces_and_materials/start
That is actually a good document for new users about the surfaces and shaders of DAZ Studio.
(And I believe vendors recommend it too ^^b)
The section 3.4 - Surface Shaders may give you the clearest picture (though it doesn't cover everything; that seems impossible), but you may do better to read all of it, I recommend:
III - Textures, Surfaces and Materials
I did not have such a good document when I had the same problem before ^^;
Thanks for the help. I'll check out that document. This is a bit more confusing as "shader" seems interchangeable with "texture map."
A texture map would be an image (for example, a .jpg) that gets wrapped around your 3D object. The surface detail is painted onto the flat image, and various parts of the flat 2D image correspond (are mapped) to various pieces of the 3D object. The 2D image may appear distorted depending on how it is mapped; sometimes you can look at it and see which piece goes where, other times it's not very obvious. You can modify (a copy of) this image to change how your object looks, or completely replace it with a different image. There may also be multiple images: one for the diffuse channel (the colors), one for the bump map, one for opacity, etc. The standard Genesis figure works this way, for example.
You can also have a procedural shader. There is no image file. Patterns are generated mathematically and applied to the object.
Or you can have a sort of mix, for example a shader that takes an image file that isn't mapped to the object, but rather uses it as a sort of rubber stamp to stamp all over your object, perhaps also doing additional things with that image.
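A minimal Python sketch of the first two approaches above (a tiny hand-made "image" stands in for a real .jpg; all names are illustrative): the mapped texture looks colors up in an image via UV coordinates, while the procedural version needs no image at all.

```python
# A "texture map" workflow: a flat 2D image is looked up via UV coordinates
# that tie each point on the 3D surface to a spot on the image.
image = [[(200, 40, 40), (220, 220, 220)],      # a 2x2 stand-in for a .jpg
         [(220, 220, 220), (200, 40, 40)]]

def sample_texture(u, v):
    """Return the image color at UV coordinates in the range 0..1."""
    x = min(int(u * len(image[0])), len(image[0]) - 1)
    y = min(int(v * len(image)), len(image) - 1)
    return image[y][x]

# A "procedural" shader: no image file at all, the pattern is pure math.
def procedural_checker(u, v, tiles=8):
    black, white = (0, 0, 0), (255, 255, 255)
    return white if (int(u * tiles) + int(v * tiles)) % 2 == 0 else black

print(sample_texture(0.1, 0.1))       # color comes from the image
print(procedural_checker(0.1, 0.1))   # color comes from a formula
```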
So it is safe to say that a Shader is a bit of software that tweaks a texture map in one way or another?
It is code that defines how light will interact with an object. As others have said, a shader does not necessarily need a texture map. What becomes confusing is that people and vendors who should know better will often call shader presets just "shaders."
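As a toy illustration of "code that defines how light will interact with an object", here is a classic Lambertian diffuse term in Python. This is only a sketch of the general idea, not how 3Delight or any DAZ shader is actually written.

```python
import math

def lambert_diffuse(normal, light_dir, surface_color, light_color=(1.0, 1.0, 1.0)):
    """Classic N.L diffuse shading: the more grazing the light, the darker the result."""
    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    n, l = normalize(normal), normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))   # clamp light from behind
    return tuple(sc * lc * n_dot_l for sc, lc in zip(surface_color, light_color))

# Light straight overhead vs. nearly grazing: same surface color, different result.
print(lambert_diffuse((0, 1, 0), (0, 1, 0), (0.8, 0.4, 0.2)))    # fully lit
print(lambert_diffuse((0, 1, 0), (1, 0.2, 0), (0.8, 0.4, 0.2)))  # mostly dark
```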
No. A software shader would be used INSTEAD OF a texture map.
The texture map might make sense for an object where the texture is a photo or a few of them, and you would map each piece of the photo to some polygons on the object.
However, if you wanted a streambed consisting of ten thousand pebbles, it would be really hard to take a photo of every side of every one of those 10,000 unique pebbles and map each piece of each photo to each piece of each 3D model. It might be very easy, though, for the computer to fake it with software that generates a random-looking stone material and paints it, with unique variations, onto each of the 10,000 pebble objects in your scene. This would be a good place to use a software shader; mapping is not used on your object. (Although you could apply the software shader to an object that has been texture mapped and just not use that supplied texture map. The software shader MAY make use of a texture that is not mapped, or may not use any texture at all.)
EDITED TO ADD: THIS MIGHT BE TOTALLY WRONG-> To be clear, although a texture map makes use of an image, the image itself is not called a texture map, just a texture. A texture map (somebody who makes them help me out here) is sort of like the instructions used to determine which part of that image gets applied (mapped) to which polygons on the object, and is encoded into the object itself in some way.
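Going back to the streambed example a few lines up, here is a purely illustrative Python sketch of such a "software shader": one small piece of code gives every pebble its own repeatable variation, with no photographs and no UV-mapped images involved.

```python
import random

def pebble_color(object_id, u, v):
    """Procedurally shade pebble number `object_id` at surface point (u, v).

    A per-pebble seed gives each of the 10,000 pebbles its own base grey,
    and a little positional speckle breaks up the surface. No image files.
    """
    base_rng = random.Random(object_id)            # repeatable, unique per pebble
    base = 0.35 + base_rng.random() * 0.4          # each pebble a different grey
    speckle_rng = random.Random(hash((object_id, round(u, 2), round(v, 2))))
    value = min(1.0, base + speckle_rng.random() * 0.1)
    return (value, value, value)

# Three different pebbles, sampled at the same surface point: three different greys.
for pebble in range(3):
    print(pebble, pebble_color(pebble, 0.5, 0.5))
```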
That's done because with the figures we use in Studio the texture map (the skin) is the most obvious part.
As you can see from those pages linked earlier, it's really only one part of the 'diffuse' component, out of several components that make up a surface shader.
It would be more accurate to say:
The texture applied to Genesis uses the Lana texture map.
or
The material applied to Genesis uses the Lana texture map.
or
The shader applied to Genesis uses the Lana texture map.
You're drifting dangerously close to the subject of UV mapping! :grrr:
Though that process is also called 'texture mapping', I think you were right earlier when you said that a texture map, like a bump map, displacement map or reflection map, is an actual file like a jpg.
Agreed, there seems to be some confusing information in this thread. A shader may of course use an image and then interpret it on a UV-map basis in relation to an object, or whatever mapping; we even have some projection-mapping surface shaders in the DAZ store.
A decent summary of shaders is this link on the Pixar site. 3Delight, the DAZ Studio render engine, is close enough since it is RenderMan-compliant, though this whole Maya babbling might confuse a bit ;)
Hmm, ok, maybe I need to take a step back and re-review my terms. From the "Glossary of Terms - for those new to Digital Art and DAZ 3D" thread http://www.daz3d.com/forums/discussion/46/ :
Map: In 3D graphics, a map is an image used within a material. The purpose of a map is to vary some material attribute across a surface. For example, a texture map alters the color of an object, and a bump map simulates roughness. 2D maps, including all bitmaps, require mapping coordinates, which tell the renderer how to project the map onto the 3D object. 3D procedural textures do not require mapping coordinates, because they are volumetric.
Texture mapping: The process of assigning (mapping) an image (texture) to a 3D surface. This allows a complicated colouring of the surface without requiring additional polygons to represent minute details.
So I may have been confused. I'm going to sit back and just listen now. :-)
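Those two glossary entries boil down to: one set of mapping coordinates, several maps, each varying a different material attribute. A small Python sketch of that idea (the attribute names and the stand-in "maps" are invented for illustration):

```python
# One set of mapping coordinates (u, v), several "maps", each varying a
# different attribute of the material. Plain functions stand in for the
# image files a real material would load.

def texture_map(u, v):            # varies the color across the surface
    return (0.7, 0.5, 0.3) if u < 0.5 else (0.3, 0.5, 0.7)

def bump_map(u, v):               # varies apparent roughness (greyscale)
    return 0.5 + 0.5 * ((int(u * 20) + int(v * 20)) % 2)

def shade_point(u, v):
    """Gather every attribute the renderer needs for one surface point."""
    return {"diffuse_color": texture_map(u, v),
            "bump_height": bump_map(u, v)}

print(shade_point(0.25, 0.60))
print(shade_point(0.75, 0.65))
```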
A shader, like a plug-in or a script, is something that provides or adds features - in this case, it alters the way things behave in a render. The surface shader used is what determines the properties that show in the Surfaces pane, and what effect they have; a light or camera shader determines the settings available in the Light and Camera panes, and how the lights and cameras affect the render.
Thanks Richard for the succinct summation. I suppose it's best to think of Shader not as the kind of shade you find under a tree, but as the "shade" of effect (like a shade of a color) that rendering will have on the surface of a model.
BTW, never understood why they called textures that when they have no texture. I think wrapper is a more obvious word for them.
If you check, you will find that on opening, most content defaults to the DAZ Studio Default Shader. That provides the Surfaces settings such as Matte, Skin, Metal and the others, as well as many of the other settings. Just about everything we do in DAZ Studio involves a shader in some way to send information to the render engine. Some shaders just do more than others, and some are very specialized in what they do. But basically they are all either adding more commands for the render engine or stacking them to achieve several effects at the same time.
It's easy to get confused by some of the terminology, since it often gets bandied about without regard to strict definitions, unfortunately. When working with 3D I always find it useful to use the real world as a reference.
For example, let's consider an orange. With care, it is possible to peel the orange so that the peel is removed in one piece. Tricky, but possible. If you then placed the peel on a flat surface and tried to flatten it, you would find it necessary to create splits in order to do so. You might even need to break it into sections to flatten it. The resultant flattened orange peel would be equivalent to what in the virtual 3D world is called a UV map. (BTW, UV does not refer to "ultraviolet", but to the coordinates of the map, U and V.) This UV map, or in the real world our orange peel, could be repositioned on the orange (if you didn't eat it. Did I tell you to eat it?), assuming you remembered exactly which part went where.

Once we have this UV map in our virtual world, we can recreate it using any colours (Canadian, eh?) we like, and therefore have a purple orange, for example. (Would a purple orange still be an orange?) So the purple colour we applied to the UV map would be a new texture. The variations of this, such as bump and displacement maps, can be used to add surface irregularities to our object. In the real world our orange is not smooth, but has a kind of pebbled surface. A greyscale bump map could be used to simulate this (i.e. fake it) by combining it with our orange texture, since light and dark areas are interpreted as being closer to or farther from the camera in the virtual world. A displacement map is similar, except it is used by the 3D software to actually deform the surface to produce the irregularities (and therefore consumes more system resources).
If we took a photograph of our real world orange, it would appear not only orange in colour, but there would be slight shadows indicating the irregular surface, so we would see that it was not smooth. In the virtual 3D world rendering is the equivalent of taking a photograph, and we could recreate our orange using an orange texture applied to a sphere, coupled with either a bump map or displacement map to add the surface irregularity.
So where do shaders come in? Well, let's consider our real-world orange again. What if we had a real orange and two fake oranges of the sort some people like to display in bowls on tables (for some inexplicable reason), one made from hard plastic and another made from glass? All three would presumably be more or less spherical, all three would presumably be orange, and all three would presumably have the familiar orange-rind surface irregularity. Nevertheless, being made from three different real-world materials, they probably wouldn't look identical.

Why? Well, glass is very shiny, plastic nearly so (or possibly not), and orange rind not so much. Each real-world material has different properties relative to light falling upon it: some light might be absorbed, some reflected and some refracted, for example. In the real world the materials have these properties innately, and a real-world camera simply captures this when used to take a photograph. In the virtual 3D world, where materials have only the properties we give them, how does the render engine know how to deal with variables like reflection, refraction and absorption in order to produce photorealistic results? Well, we tell it. And we tell it with a set of instructions called shaders or shader presets. There are metal shaders, glass shaders, skin shaders, leather shaders, latex shaders, vinyl shaders, etc., etc., etc. And it is all in the name of creating rendered effects. I have never seen a barbecued steak shader, but as soon as someone needs to render one, they'll need it.
Hopefully this has been more illuminating (see what I did there?), than confusing. :)
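The three-oranges comparison, reduced to a Python sketch (the parameter names are made up for illustration, not actual DAZ Studio surface channels): the geometry and the colour stay the same, and only the light-handling instructions differ.

```python
# Same sphere, same orange colour; only the light-handling parameters differ.
# The parameter names are invented, not real DAZ Studio channels.
ORANGE = (1.0, 0.55, 0.0)

materials = {
    "real orange rind": {"diffuse": ORANGE, "glossiness": 0.15, "reflectivity": 0.02, "refraction": 0.0},
    "hard plastic":     {"diffuse": ORANGE, "glossiness": 0.60, "reflectivity": 0.10, "refraction": 0.0},
    "glass":            {"diffuse": ORANGE, "glossiness": 0.95, "reflectivity": 0.30, "refraction": 0.9},
}

def describe(name, m):
    shine = ("very shiny" if m["glossiness"] > 0.8
             else "somewhat shiny" if m["glossiness"] > 0.4
             else "not very shiny")
    light = "bends light passing through it" if m["refraction"] > 0 else "is opaque"
    return f"{name}: {shine}, {light}"

for name, props in materials.items():
    print(describe(name, props))
```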
Is the orange peel really a UV map? Or is it a texture, and the "map" is the coordinates that link that texture to the model geometry? As I noted before, it would be more logical for these "skins" to be called something like a wrapper, i.e. a 2D image that is wrapped onto the model.
And thanks for the explanation of shaders. Is another aspect of shaders that they kick in only when rendering? We don't see their impact in the program until the image is rendered, whereas we do see textures and various maps as we work in the program (assuming we're not in some wire-frame view)?
Depending on the program and the shader, you can sometimes see a rough preview of the shader's effect in the viewport prior to rendering, but not always.
To an extent -- the OpenGL hardware render that you see in the viewport has some limited capabilities, but the more advanced shaders only work in the software renderer.
Most graphics cards have built-in shader support for a variety of effects. Since a 'shader' is basically just a bit of information that tells the computer how to display a surface, the term also applies to 3D objects in most video games. Those often use hardware-supported shaders, making them incredibly fast to display.
The shaders used in 3DL are rendered in software, though, so they don't benefit from this speed. That said, they can also be a lot more complex as a result. Since they tend to be more intensive to calculate, it would be inefficient to run these myriad calculations just for display purposes, which is why you see a far more basic version in the viewport.
This 'basic' version, by the way, does use hardware shaders.
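To picture why the viewport shows a simpler version, here is a small Python sketch (nothing DAZ- or OpenGL-specific): the "final" function grinds through many samples the way a software renderer might, while the "preview" function uses one cheap formula that only approximates it.

```python
import math

def final_shade(normal_angle, samples=256):
    """'Software renderer' stand-in: average the diffuse term over many
    light directions spread across a 90-degree arc (slow but smooth)."""
    total = 0.0
    for i in range(samples):
        light_angle = -math.pi / 4 + (math.pi / 2) * i / (samples - 1)
        total += max(0.0, math.cos(normal_angle - light_angle))
    return total / samples

def preview_shade(normal_angle):
    """'Viewport' stand-in: a single light direction, one cheap cosine."""
    return max(0.0, math.cos(normal_angle))

for angle in (0.0, 0.6, 1.2):
    print(f"normal at {angle:.1f} rad -> final {final_shade(angle):.3f}, "
          f"preview {preview_shade(angle):.3f}")
```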