Comments
Agreed. I'd prefer not to have to post to the Gallery just to share the fruit of technical tests.
Just an aside, and nothing to do directly with the thread, but I thought it was interesting. This is invariably going to be an issue with AI-generated content at this point, I guess.
When Wendy mentioned https://www.songr.ai/ , I popped on over to have a look-see for fun. The second thing I quickly tried, with the prompt "Summer love", came back with this as the lyrics... Hmm, rather familiar...
Below that is a capture from their terms and conditions page. They allow their output to be used for commercial purposes, but it's on the user to make sure there are no copyright infringements. Eek!
That is what you call a big can of worms. There will be so many problems with this tech in the long term. Unfortunately, the end result will probably be that no copyright can be protected; there will not be enough courthouses to deal with all the infringement claims.
you don't need an AI to infringe copyright with your art or music
just drawing an iconic Mouse or playing a distinctive tune by ear can do it
You're absolutely right, but at least back then the person had to sit down and draw it or play it
but now you just have to press the "make art" button and the AI will offer the lyrics from "Grease" as if it had created it. But it didn't, it stole it. That's the difference.
Google existed
rightclick save image existed
we are straying off the remixing art though and into debates that get threads locked
FYI, I was finally able to upload my test images to my post above, if anyone is interested. Thank you for reading!
Wendy, you're right again.
very cool RenderPretender
Thank you, Wendy. I'm really quite happy with her. I think she really punches a big hole in that realism barrier I've been trying so hard to crack. BTW, in the image where she's seated, see how the inside of her left elbow, left knee, the slight fold of flesh over the waistband of her shorts, and buttock (against the stool) are naturally compressed? That's actually the AI saying "The mesh can't deal with these infuriating intersections, but I can!" JCMs, take a break?
this looks awesome
Great catch, @Wendy
https://github.com/princeton-vl/infinigen
Just tried to follow the instructions from https://github.com/princeton-vl/infinigen
but failed with the error: No module named 'gin'
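In case anyone else hits the same wall: my guess is that 'gin' here is Google's gin-config package, which Infinigen's configuration files appear to import under that name, so installing it by hand is the first thing I will try. A minimal check along these lines, though I can't promise it is the whole story:

    # Quick check for the missing module. Infinigen's configs import "gin",
    # which comes from Google's gin-config package, so the ModuleNotFoundError
    # usually means that package is missing from the current environment.
    import importlib.util
    import subprocess
    import sys

    if importlib.util.find_spec("gin") is None:
        # install gin-config into the interpreter that is currently running
        subprocess.check_call([sys.executable, "-m", "pip", "install", "gin-config"])

    import gin  # should import cleanly once gin-config is installed
    print("gin imported OK")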
WendyLuvsCatz,
Fascinating! I watched the video and something at 1:46 caught my eye:
"Infinigen is 100% procedural ... Math rules only. Zero AI. Just graphics."
And this stuff runs on Blender? Wow!
Cheers!
The only problem is that it requires a special version of Blender that needs to be built from scratch.
They promise better instructions, and maybe a ready-built version of Blender, later on.
While searching for Infinigen, I found references going back seven years, so it is a really long-running research project.
For those interested, here are four more experimental DAZ-through-SD shots. Even if I had enough VRAM to use ControlNet, I still wouldn't be able to control poses or camera aspects as I can in DAZ. So the supine pose and horizontal aspect in the second image worked out well in this DAZ-through-SD effort. SD does not do well with facial profiles or with drawing portions of the human face that are not visible in the original render, so I have to present as full a face to the AI as I can while composing. The AI respected the background and props very well.
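For anyone curious, the core of a pass like this is just an img2img run. Here is a minimal sketch with the diffusers library, assuming a Stable Diffusion 1.5 checkpoint; the file names, prompt, and strength are placeholders rather than my exact settings:

    # Minimal sketch of an img2img pass over a DAZ render, using Hugging Face
    # diffusers with a Stable Diffusion 1.5 checkpoint. Paths, prompt and
    # strength are placeholders -- tune them to your own render and style.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    init_image = Image.open("daz_render.png").convert("RGB").resize((512, 768))

    result = pipe(
        prompt="photorealistic portrait, natural skin, soft studio lighting",
        image=init_image,
        strength=0.45,       # low strength keeps the DAZ pose and composition
        guidance_scale=7.0,
        num_inference_steps=30,
    ).images[0]

    result.save("daz_through_sd.png")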
Trying ControlNet shuffle and reference to get camera and scenery variation. The model I've used doesn't seem to really pick up the old western style from my DAZ-to-Blender render. But that's OK. I bet for a modern or futuristic type of 3D render, those ControlNets will be just fine.
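If anyone wants to try the shuffle half of that, here is a rough sketch with the diffusers and controlnet_aux libraries (the reference technique lives in the Automatic1111 ControlNet extension, as far as I know, so it isn't shown); the model IDs, file name, and prompt are examples, not my exact setup:

    # Rough sketch of the "shuffle" ControlNet idea: the detector scrambles the
    # source render, and the ControlNet reuses its colours and content while
    # the prompt steers the style. Everything here is illustrative only.
    import torch
    from PIL import Image
    from controlnet_aux import ContentShuffleDetector
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    source = Image.open("daz_blender_render.png").convert("RGB").resize((512, 512))
    control_image = ContentShuffleDetector()(source)

    controlnet = ControlNetModel.from_pretrained(
        "lllyasviel/control_v11e_sd15_shuffle", torch_dtype=torch.float16
    )
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet,
        torch_dtype=torch.float16
    ).to("cuda")

    out = pipe(
        prompt="old western town street, weathered wood, dusty ground",
        image=control_image,
        num_inference_steps=30,
    ).images[0]
    out.save("shuffled_variation.png")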
it is crazy how one can reimagine renders with AI
a death wizard casting spells in a cave with flight of steps with a burning torch by the entrance, lots of vegetation and lichen
this will be an animation when done and you should be able to see traces of the original in it
So far the biggest game changer for me has been the free 360 AI-generated environments by “Blockadelabs”
I convert them to HDRI with Affinity Designer
and am using them as the pannable backgrounds for my current animated film “Ghost Origins”
https://vimeo.com/835802837
@juvesatriani
Have you tried the free 360 generated environments from Blockade labs?
They have changed my entire animation workflow
and they work in blender (with cycles only)
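If it helps anyone, wiring one of the 360 images in as the world background only takes a few lines in Blender's Python console. A minimal sketch, assuming an equirectangular image exported from Blockade Labs (the file name is just an example):

    # Hook a Blockade Labs 360 panorama in as the Cycles world background.
    # Run in Blender's Python console or Text editor; the path is an example.
    import bpy

    world = bpy.context.scene.world
    world.use_nodes = True
    nodes = world.node_tree.nodes
    links = world.node_tree.links

    env = nodes.new(type="ShaderNodeTexEnvironment")
    env.image = bpy.data.images.load("//blockade_skybox.png")

    background = nodes["Background"]
    links.new(env.outputs["Color"], background.inputs["Color"])

    # the panorama then pans naturally as the camera rotates
    bpy.context.scene.render.engine = "CYCLES"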
The 360 generator is cool. What is Infinity Designer?
You have definitely NOT wasted time or money. You've had a hobby that you've enjoyed for many years and that has been rewarding for you. Does AI change the landscape going forward? Of course. But that doesn't in any way devalue what you've already accomplished.
I agree completely. Don't devalue your efforts. I have been struggling with the same feelings, but (see my posts above) I'm trying to address them by using AI as a tool in my workflow, with the objective of pushing my DAZ characters closer to photorealism, which has been a constant frustration with DAZ alone. Don't give up. Try to see AI as a tool in a creative continuum, as opposed to a replacement that threatens to cancel your chosen medium.
We can't second guess or change the past, we can only learn from it. You used the tools available to you then, and now the tool options are expanding. I would say you learned significantly more doing it the more traditional way of storyboarding, easier is not necessarily better.
Ah, Affinity Designer. How well does it convert? It won't have the same dynamic range that a true HDRI has, right? Or do you further edit to add more to the end result? Since it's animated, maybe it's not as much of a concern.
Looks great. I use such 360 environments in Unity as skyboxes.
They look great, but I miss the possibility of getting light from them,
so there are no shadows as a result.
I have taken my Daz Studio render from my Gallery...
and processed it in Control Net in Stable Diffusion
First example
Second example
Third example
This was easy, because no fingers were involved.
Then I took another Daz Studio render
and processed it in Control Net in Stable Diffusion as well
Another example
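For anyone who wants to reproduce this kind of Control Net pass outside a web UI, here is a minimal sketch with the diffusers library, using a canny edge ControlNet to hold the composition of the Daz Studio render; the model IDs, file names, and prompt are examples, not my exact settings:

    # Sketch of a Control Net pass over a Daz Studio render: canny edges of
    # the render constrain the composition while the prompt restyles it.
    # File names, model IDs and settings are illustrative only.
    import cv2
    import numpy as np
    import torch
    from PIL import Image
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    render = Image.open("daz_render.png").convert("RGB").resize((512, 512))
    gray = cv2.cvtColor(np.array(render), cv2.COLOR_RGB2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    control_image = Image.fromarray(np.stack([edges] * 3, axis=-1))

    controlnet = ControlNetModel.from_pretrained(
        "lllyasviel/control_v11p_sd15_canny", torch_dtype=torch.float16
    )
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet,
        torch_dtype=torch.float16
    ).to("cuda")

    out = pipe(
        prompt="dramatic cinematic portrait, detailed skin, volumetric light",
        image=control_image,
        num_inference_steps=30,
    ).images[0]
    out.save("daz_controlnet_result.png")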