This newest project is coming along very nicely, Casual! It will be so cool to make my own HDR files. Now, will I be able to export in the native .hdr format, or just an image file?
important note: the results obtained using this script are not perfect; in fact there are image distortions that may become very apparent. in the future i may return and improve this ... but not soon
Comments
Blender seems to have the ability to do HDR renders
normally an HDR is made from many photos taken at different exposures,
which are then merged to obtain a higher dynamic range,
and the HDR image has more than 8 bits per color channel, probably 16 bits or more
i'm not too sure how you'd set up your renders
but you'd end up with 4, 8 or 16 8192x4096 spherical panoramas
and combine them into one ( using tools with names like "panotools", which i never really tried )
( i vaguely remember using a program like that related to QuickTime VR )
unless 3Delight can render 16-bit TIF, Targa or PNG images
note that i mentioned this earlier: eventually i ( or someone ) will create a camera "shader" that lets 3Delight render a perfect spherical image in one big render, no scripts, nothing
incidentally i should check if Blender Cycles doesn't already have that option!!!
OH MY BOB !!!
Blender does have panoramic equirectangular renders !!
which means i could remove the Blender camera exporter from my script
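for the record, here's roughly how that panoramic camera can be set from Blender's Python console ( a minimal sketch, assuming Cycles is the active render engine and using the 2.7x-era property names ):

# minimal sketch: switch the active camera to Cycles' equirectangular panoramic mode
# assumes Blender with the Cycles engine; run from Blender's Python console or a script
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

cam = scene.camera.data
cam.type = 'PANO'                             # panoramic camera
cam.cycles.panorama_type = 'EQUIRECTANGULAR'  # full 360 x 180 projection

# equirectangular panoramas want a 2:1 render size
scene.render.resolution_x = 4096
scene.render.resolution_y = 2048
scene.render.resolution_percentage = 100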
I'm referring to the need for .hdr files for use in iRay in the newest DAZ Studio.
i could add an option so that when imagemagick assembles the rendered jpg/png tiles
into the big spherical panorama, the resulting image gets saved in hdr format ( not sure all versions of imagemagick will accept to do it though )
it would be hdr but would not have true high dynamic range ( finely graded light intensities )
but i think you'll get the same result by applying the jpg or png version to IRay's environment sphere
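just to illustrate the idea ( this is not the script's actual code, and the tile file names are made up ), the assembly step could look something like this, with imagemagick's montage gluing the tiles into the panorama and a second pass writing the Radiance .hdr:

# sketch only, not mcjSphericalCam's actual code; tile names are hypothetical
# assumes ImageMagick's "montage" and "convert" are on the PATH
import glob, subprocess

tiles = sorted(glob.glob("tile_*.png"))   # assumes the names sort in row-major order
subprocess.check_call(["montage", *tiles,
                       "-tile", "16x8",        # 16 columns x 8 rows of tiles
                       "-geometry", "+0+0",    # no spacing between tiles
                       "panorama.png"])

# second pass: ask imagemagick for a Radiance .hdr version
# ( still only 8 bits of real dynamic range, as noted above )
subprocess.check_call(["convert", "panorama.png", "panorama.hdr"])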
--
i did a test just now, imagemagick took the 128 512x512 tiles and created a 50 MB hdr image in a few seconds
so it must be 48 or 64 bits per pixel at 8192x4096
if the hdr is only used as a light source, not as the visible backdrop
then you can use a smaller and even blurred version of the panorama for iray's sphere
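and if the panorama only feeds the lighting, the small blurred copy is one imagemagick call away ( again just a sketch, file names made up ):

# sketch: make a small, blurred copy of the panorama for use as a pure light source
import subprocess
subprocess.check_call(["convert", "panorama.hdr",
                       "-resize", "1024x512",     # much smaller than the 8192x4096 original
                       "-gaussian-blur", "0x6",   # fine detail doesn't matter for lighting
                       "panorama_light.hdr"])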
You can take the results from the script and fake some 'stops' by adjusting things to get other 'exposures', then use something like Picturenaut or a Photoshop/GIMP plugin to create a 'higher' range image. Or you can repeat the render several times, adjusting the gain setting each time, and then do the same processing to assemble an hdr... you'd probably be closer to a real 'several stops up/several down' image doing the multiple renders at different gain settings.
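to make the 'multiple renders at different gain settings' idea concrete, here's a rough numpy sketch of the merge step; it assumes plain 8-bit sRGB renders, one per known stop offset, and uses a simple hat-function weighting ( the file names and stop values are made up ):

# rough sketch: merge bracketed renders into one HDR estimate
# assumes 8-bit sRGB inputs, one per exposure, at known stop offsets
import numpy as np
import imageio   # any image i/o library returning arrays will do

brackets = {                      # hypothetical file names -> exposure in stops
    "render_minus2.png": -2.0,
    "render_0.png":       0.0,
    "render_plus2.png":  +2.0,
}

acc, wsum = None, None
for name, stops in brackets.items():
    img = imageio.imread(name).astype(np.float64) / 255.0
    lin = img ** 2.2                        # crude sRGB -> linear
    w = np.clip(1.0 - np.abs(img - 0.5) * 2.0, 1e-3, None)   # trust mid-tones, not clipped pixels
    radiance = lin / (2.0 ** stops)         # undo the exposure
    acc = radiance * w if acc is None else acc + radiance * w
    wsum = w if wsum is None else wsum + w

hdr = (acc / wsum).astype(np.float32)       # weighted average = HDR estimate
imageio.imwrite("merged.hdr", hdr)          # .hdr output may need imageio's freeimage plugin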
Also, the Blender route...you can create a full environment, light it with the sun/sky and then render that...similar to what you can do in Vue, Terragen and several others.
Some of the other interesting uses for this...
Take one of the big scenery sets (Stonemason's sets come to mind) and generate a decent full 'environment' backdrop from the scene to use. That way you can remove the 'set' and either use more characters/greater detail/higher res textures, without having to count the set's geometry...this is very useful if you don't have a 12 GB Titan to do Iray renders with.
fig 1 using the jpg spherical panorama as the iray infinite sphere environment
fig 2 using the hdr spherical panorama as the iray infinite sphere environment
note that most of Amy and her clothes have a non-zero ambient color
the hdr render seems to have tints closer to what was in the Mount Washington panorama
but maybe it was a "non-infinite" sphere?
fig 3 - with the infinite sphere ( this time i'm sure it was ) and the .hdr image
fig 4 - i figured out i had to remove the "environment" map from daz studio's Environment tab
to get the hdr dome to render
then i had to reduce the environment intensity in IRAY from the default of 1.0 to 0.25
... rendering it 1280x720 ...
there's the HDR version of the spherical/equirectangular panorama in IRay
( intensity 0.5 )
i'm not sure why it's blurry and has bad contrast
i don't think i had dome blurring on
Shhh...
It's intentional...you were doing 'soft focus' on the background...sort of a DoF effect.
now i'll write ( already did that for camera tracking a while back ) the Poser camera exporter
in pz3 format if i remember well
which is also usable in Carrara but there's tricky maths involved
// Daz Studio uses a filmback with a height of 35mm ( DS4 made this modifiable ! )
// while Carrara uses a filmback with a width of 43.354mm
// and Poser uses a filmback with a width of 25.4mm
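since the field of view is what must be preserved when the camera hops between programs, the focal length just gets rescaled by the filmback ratio; a small sketch using the numbers quoted above ( note the Daz Studio value is a height while the other two are widths, so the render aspect ratio also comes into play in practice ):

# sketch: convert a focal length so the field of view stays the same
# on a different filmback ( "backplate" ) size
# fov = 2 * atan( filmback / ( 2 * focal ) )  =>  focal scales linearly with the filmback
import math

DS_FILMBACK_HEIGHT = 35.0        # mm, Daz Studio ( DS4 made this modifiable )
CARRARA_FILMBACK_WIDTH = 43.354  # mm, as quoted above
POSER_FILMBACK_WIDTH = 25.4      # mm, as quoted above

def convert_focal(focal_mm, filmback_from_mm, filmback_to_mm):
    # focal length giving the same field of view on the target filmback
    fov = 2.0 * math.atan(filmback_from_mm / (2.0 * focal_mm))
    return filmback_to_mm / (2.0 * math.tan(fov / 2.0))

# e.g. a 65 mm Daz Studio lens expressed for Poser's smaller backplate
print(convert_focal(65.0, DS_FILMBACK_HEIGHT, POSER_FILMBACK_WIDTH))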
and that's it
oops and add the HDR output option
Awesome, thanks Casual!
yay the export camera to Poser button works
there will be an export camera to Carrara button too
in both cases it's a PZ3 file, so it's an animated camera object that actually gets exported
the only difference is the size of the camera "backplates" in Carrara and Poser
since this directly affects the focal-length vs field-of-view relation, it's quite important
on to Poser !! ( which for me is always a scary place )
hey we'll render the modern age-old office ! as a skyball
oops i removed indentations in the pz3 and that's apparently a no-no
oh the fun ( not ), poser's animation rendering can't be aborted
but as you can see the animated camera in Poser(9) is animated correctly
but ( Fig 1 ) it's rendering in some sketch mode
Fig 2 - rerendering, but with 128x128 tiles, in preview mode
which gives us a 2048x1024 image map
shoulda used 125
first poser-rendered skyBall!
not a success but not a horrible failure
C:\Users\Public\Documents\SkyBalls\poser
=======
fig 2, i was suspecting the crop-and-stretch processing was at fault
but it seems to do its job as expected
maybe it's a focal-length issue
note that the 2 top and 2 bottom rows are the problem
and there seems to be a curving effect
maybe imagemagick's deformer is not appropriate
hmm indeed imagemagick takes a straight line and comes out with a curve !
i found and solved the issue :)
i just have to tell imagemagick to use the BilinearReverse distortion instead of BilinearForward
and there you have it, excellent fit
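for reference, the kind of call involved looks roughly like this ( a sketch; the four control-point pairs, source x,y followed by destination x,y, are made up here, while the real script computes one set per tile ):

# sketch: warping one rendered tile with imagemagick's BilinearReverse distortion
import subprocess

points = "0,0 10,0  127,0 117,0  127,127 127,127  0,127 0,127"   # made-up control points
subprocess.check_call(["convert", "tile_03_05.png",              # hypothetical tile name
                       "-virtual-pixel", "edge",
                       "-distort", "BilinearReverse", points,
                       "tile_03_05_warped.png"])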
the apparent tiling is due to the way the preview renderer ( basically opengl ) computes light diffusion
but using Firefly it should be less apparent
expect the release Monday evening!!!
Amy's dance audition for the AVICII 360° video
i was supposed to complete the script but i digressed sorry !
not to mention the midi animation kit that's waiting !
so i animated Amy in daz studio
i exported 1 frame to Blender using mcjTeleBlender
and i specified an equirectangular image for Blender's sky sphere
i changed the camera type to Panoramic - equirectangular
i deleted everything in the scene except the camera
i selected the camera
i saved this as yt360.blend
i closed blender
in daz studio i set the render size to 4096x2048 ( but maybe you could save yourself some trouble by making it 1920x1080 )
i exported a 30-frame animation from Daz Studio using mcjTeleBlender
but i carefully specified i didn't want the camera to be exported
and i told mcjTeleBlender to re-use YT360.blend
i rendered my 30 panoramas using the batch file
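( the batch file essentially calls Blender in background mode; if you'd rather drive it from Python, one frame range can be rendered with something like the sketch below, though the batch file mcjTeleBlender writes may instead loop over per-frame scene files ):

# sketch: render frames 1..30 of yt360.blend in background mode
# assumes "blender" is on the PATH
import subprocess

subprocess.check_call(["blender", "-b", "yt360.blend",  # -b : no UI
                       "-o", "//pano_####",             # output pattern, relative to the .blend
                       "-F", "PNG",
                       "-s", "1", "-e", "30",           # start / end frames
                       "-a"])                           # render the animation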
then i found a way to convert that into an mp4, unfortunately only at 2048x1024
( VirtualDub + ffmpeg )
i used the special "metadata injector" supplied by youtube
it injected something labeling the mp4 as a youtube360 video
uploaded to YT and there ya go
https://www.youtube.com/watch?v=wehmHReHk_M
maybe using the 64 bit version of ffmpeg or a newer/better mpeg4 codec i could get the 4096x2048 format
---
Yay ! the latest 64-bit ffmpeg build accepts to do the avi-to-mp4 conversion for my 4096x2048 video
( though windows explorer and/or media player can't touch it )
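for what it's worth, the conversion boils down to one ffmpeg call along these lines ( file names made up; the 64-bit build is the one that handled 4096x2048 here ):

# sketch: convert the VirtualDub avi to an mp4 that keeps the 4096x2048 size
# assumes a recent 64-bit ffmpeg build on the PATH
import subprocess

subprocess.check_call(["ffmpeg",
                       "-i", "amy360.avi",     # hypothetical input name
                       "-c:v", "libx264",
                       "-pix_fmt", "yuv420p",  # keeps picky players happy
                       "-crf", "18",           # near-lossless quality
                       "amy360.mp4"])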
---
3rd attempt, success !
https://www.youtube.com/watch?v=kJaiETGp9b8&feature=youtu.be
the video had to be 3840x2160 to be considered 4K
otherwise it was only 1440p, so not the full resolution
tonight, no buts, i'll post the script
i don't know if i'll be able to make the Poser renders pixel-perfect
in the case of the Blender tests, which had tiles rendered at a higher resolution of 512x512, i didn't see that issue
maybe with smaller 128x128 tiles the "half-pixel" issues become more visible
i may try a Firefly render also
fig 3 was a Daz Studio render with 512x512 tiles
Fig 4, using a finely tessellated sphere object ( which will be a free Prop ! )
improves UV coordinate fidelity
fig 5 - that's the UV map of the 128x128-facet sphere; i also inverted the normals so they point inward, and inverted the U coordinates since you'll be looking from the inside; i'll make sure the azimuth matches the one in blender
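for anyone curious what such a prop amounts to, here's a bare-bones sketch of writing an inward-facing UV sphere as an obj with equirectangular UVs and a flipped U ( this is not the actual free prop's generator, just an illustration; flip the winding if your app disagrees about which side is "inward" ):

# sketch: write a simple inward-facing "skyBall" obj with equirectangular UVs
import math

def write_skyball(path, radius=100000.0, cols=32, rows=16):   # radius in cm, i.e. 1 km
    verts, uvs, faces = [], [], []
    for j in range(rows + 1):
        lat = math.pi * (0.5 - j / rows)                  # +90 degrees at the top pole
        for i in range(cols + 1):
            lon = 2.0 * math.pi * i / cols
            verts.append((radius * math.cos(lat) * math.sin(lon),
                          radius * math.sin(lat),
                          radius * math.cos(lat) * math.cos(lon)))
            uvs.append((1.0 - i / cols, 1.0 - j / rows))  # U flipped: seen from inside
    def idx(i, j):
        return j * (cols + 1) + i + 1                     # obj indices are 1-based
    for j in range(rows):
        for i in range(cols):
            a, b = idx(i, j), idx(i + 1, j)
            c, d = idx(i + 1, j + 1), idx(i, j + 1)
            faces.append((a, d, c, b))                    # reversed winding, faces point inward
    with open(path, "w") as out:
        for x, y, z in verts:
            out.write("v %f %f %f\n" % (x, y, z))
        for u, v in uvs:
            out.write("vt %f %f\n" % (u, v))
        for a, b, c, d in faces:
            out.write("f %d/%d %d/%d %d/%d %d/%d\n" % (a, a, b, b, c, c, d, d))

write_skyball("skyball.obj")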
Fig 1.
daz studio render, 12x6 tiles, the tiles are 512x512
there are no visible cracks,
i just need to shift the entire panorama so the zero azimuth is on the left side,
so the first render Y Rotation should be 15 degrees i think (360/12)/2
hmm something's not right, the poles are supposed to be at the top and the bottom
Fig 2 -
Now that's more like it !
in Fig1 the camera was not at 0,0,0
fig 3, once applied to the sphere object, we can see that the results are not very exact
smaller render tiles would yield less distortion ( or higher noise frequencies )
i think i'll release this as-is anyway and add a big caveats section in the manual
the real solution would be to replace the imagemagick tile processing with something mathematically correct
basically, taking each pixel from the tiles and projecting them on the sphere
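to make that concrete, the "mathematically correct" version is basically this per output pixel ( a sketch that assumes pinhole tiles on a regular azimuth/elevation grid; in practice the tiles need a little overlap, especially near the poles ):

# sketch: find which tile, and where in it, a given panorama pixel comes from
# instead of warping whole tiles, every output pixel is projected individually
import math

TILES_X, TILES_Y = 12, 6      # tiles around / top-to-bottom ( as in the 12x6 test )
TILE = 512                    # tile resolution in pixels

def panorama_pixel_to_tile(px, py, out_w, out_h):
    # output pixel -> direction on the sphere
    lon = (px / out_w) * 2.0 * math.pi - math.pi            # -pi .. +pi
    lat = math.pi / 2.0 - (py / out_h) * math.pi             # +pi/2 .. -pi/2

    # which tile's camera looks that way ( regular azimuth/elevation grid assumed )
    ti = int((lon + math.pi) / (2.0 * math.pi) * TILES_X) % TILES_X
    tj = min(int((math.pi / 2.0 - lat) / math.pi * TILES_Y), TILES_Y - 1)
    cam_lon = (ti + 0.5) / TILES_X * 2.0 * math.pi - math.pi
    cam_lat = math.pi / 2.0 - (tj + 0.5) / TILES_Y * math.pi

    # gnomonic ( pinhole ) projection of that direction into the tile's image plane
    cos_c = (math.sin(cam_lat) * math.sin(lat)
             + math.cos(cam_lat) * math.cos(lat) * math.cos(lon - cam_lon))
    x = math.cos(lat) * math.sin(lon - cam_lon) / cos_c
    y = (math.cos(cam_lat) * math.sin(lat)
         - math.sin(cam_lat) * math.cos(lat) * math.cos(lon - cam_lon)) / cos_c

    # each tile covers 360/TILES_X degrees horizontally and 180/TILES_Y vertically
    half_w = math.tan(math.pi / TILES_X)
    half_h = math.tan(math.pi / (2.0 * TILES_Y))
    u = (x / half_w * 0.5 + 0.5) * TILE
    v = (0.5 - y / half_h * 0.5) * TILE
    return ti, tj, u, v           # sample tile (ti, tj) at (u, v), e.g. bilinearly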
3Delight has spherical projections in its to-do list this summer ... so there's that
but
3Delight supports the following projection types in RiProjection:
perspective,
orthographic,
cylindrical,
fisheye
2 fisheye renders could probably be massaged into a good spherical
there's that
if the sphere object tiling matches the rendering tiling it will probably look okay-ish
imperfect but hey
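and to flesh out the "two fisheye renders" idea: turning a front/back pair of 180° fisheyes into an equirectangular map is the same per-pixel game; the sketch below assumes an ideal equidistant fisheye, which 3Delight's fisheye projection may or may not be, so take it only as the principle:

# sketch: where to sample a front/back pair of 180-degree equidistant fisheyes
# for a given equirectangular output pixel
import math

def equirect_to_fisheye(px, py, out_w, out_h, fish_size):
    lon = (px / out_w) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (py / out_h) * math.pi
    # direction vector, with z pointing into the "front" fisheye
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    use_front = z >= 0.0
    if not use_front:
        x, z = -x, -z                                # mirror into the back camera's frame
    theta = math.acos(max(-1.0, min(1.0, z)))        # angle off the lens axis
    r = theta / (math.pi / 2.0) * (fish_size / 2.0)  # equidistant mapping, 180 degree lens
    phi = math.atan2(y, x)
    u = fish_size / 2.0 + r * math.cos(phi)
    v = fish_size / 2.0 - r * math.sin(phi)
    return use_front, u, v                           # which image to sample, and where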
Very cool! Looking forward to seeing what this can do in DS!
so here as you can see when i project the mosaic which was made on 12x6 tiles
onto a sphere object which is also 12x6, we get less distortion
Fig 3 - well in this test things look great, the ball, cube, cone and cylinder are sitting at Y = 0
if you look at the sea horizon, that makes sense
since the photographer was not holding the camera at sea level, the primitives don't look like they're grounded
to get that, we need to move them down by the altitude of the photographer
one of the upcoming free sphere props with a sIBL equirectangular image, in daz studio
the same scene transferred to Blender using mcjTeleblender
using the same map on Blender's "World Environment"
making sure the angles match ( which was not the case before ! )
turns out making the sphere objects 100 meters diameter wasn't enough, so they will be 1000 meters
next i make sure the maps created by mcjSphereCam come out aligned the same
-------
Fig 2 - the mcjSphericalCam script doesn't unduly rotate the universe
on the left the original sIBL map, on the right the map produced by mcjSphericalCam
it's normal that Amy and the primitive are not grounded
in fig 1 the camera was 1 meter above the SkyBall's center, in fig 2 it's at 0,0,0
Fig 3 - that's good
let's call it the early-nerd-gets-the-script edition
( and let's pretend "script" rhymes with "worm" )
https://sites.google.com/site/mcasualsdazscripts5/mcjsphericalcam
there will be some changes tomorrow, so consider it a pre-release
but i think it's usable
oh i'll add a few skyBall props as obj files, to be imported using the Daz preset ( scale = 100% )
those spheres have a 1 km radius
their normals and UV mappings are ready for equirectangular panoramas
since the skyBall facets are not straight-edged
and imagemagick seems to only do bilinear deformations
i'll write a little exe that does bi-curve deformations
:-) You think of everything, you coder you! lol
it may extend its usefulness :)
though 3Delight and Daz Studio 5 will possibly have spherical cameras
the bi-arc deformer works, if all goes well this means the panoramas will look swell
it's like taking a pizza slice and stretching it to fill a square box
Won't that cause distortion though?
it's the previous method that was distorting
it was taking, for example, a straight-edged triangle and stretching it into a square
but what it really needs to do is
stretch a pizza slice ( with the round edge ) into a square
and where it was stretching a trapezoid into a square
it was supposed to stretch an arc-edged trapezoid into a square
so i wrote the special stretcher utility
and the results are splendid ( see below )
expected release tonight
I have an off topic question... I have a seamless tile running around a "horizon" globe I created using a sphere... it's showing a seam though when rendering. I used spherical UV mapping so that shouldn't be happening! Any suggestions?