Comments
Well, I'm not a traditional Daz user, and although the specs are impressive, I've never had issues harvesting what I need from the Genesis framework. I'm a traditional artist who wants to make my own stuff, not have it handed to me on a silver platter. The gem for me here, though, is the materials: as far as looks go, if I can convert the materials to use with Genesis, then Genesis can be made to look just as good in UE4.
Everything is in the paint job, not in how the geometry is shaped.
Granted, if I were making a movie or animated film that would be different, but for my needs G3-8 offers the base framework I can build my stuff on.
Once again, the benefit is still that it has the full support of Epic, and as such it's considered gospel.
If someone else finds the assets useful, that's fine, but a Daz killer... I don't think so.
For me, though, I don't need a Ferrari when a pickup truck does the job of helping a friend move on the weekend. :D
All Daz has to do is provide studio-quality materials.
I forgot to say: it's actually the best outcome you can hope for if it isn't free, because having to pay means you're doing quite well financially from the use of the UE engine and its free assets. That's the practical truth of the matter for 99%+ of the population that uses Epic's assets. If you don't make money, they couldn't care less what you're doing, really, as they can't squeeze blood from a stone. Practically speaking, they'd squander a fortune trying to recoup 'profits' from people who don't have the money to pay for those 'lost profits'.
They are just making reasonable stipulations to keep their free assets from being resold by middlemen. Almost every digital product, sold or given away free, has a similar stipulation.
Someone, over on the Reallusion forums, posted confirmation from a youtube comment by Epic games.
There will be no paid subscription to use the web based MH Figure generator in UE4.
IMHO this is all about getting more professional 3DCG/VFX/film companies into the Epic ecosystem, where they will eventually spend money on assets, perhaps even paid 24-hour tech support services, as Autodesk offers.
Remember, you start paying for UE after your income gets over a certain high dollar amount. This is the target demographic of the MH figure platform: companies with high revenues.
This is not about stealing away existing portrait/pinup still artists from the Poser/Daz demographic, who will likely never become paying users of UE4.
The obvious major beneficiary will be Autodesk Maya, as it and MoBu are still the primary animation software used by game dev companies and Hollywood filmmakers.
And I don't see Epic trying to turn UE into Maya or MoBu.
Reallusion should add support for the MH figures to their new online ActorCore motion library system immediately, as they have already done for Maya, Blender, C4D, etc.
But frankly, anyone else who had hopes of getting the big-dollar Hollywood/VFX companies to adopt their figure platform should probably abandon those hopes.
Apparently you can import it into 3DXchange too if you set the LOD to 1; Bassline, an iClone 7 user, posted about it on their forum.
I'd hate to do all those bones manually in the Expression Editor, though.
I did do G3 & G8 so I can use entirely bone-based facial animations; it took a while.
Well... this pretty much has curb-stomped what remains of Poser, and as for Daz, well... hard to say.
I do think a lot of Daz's business practices over the years will come back to bite them in the behind when this finally drops, if they don't get it together, because this could be a game changer for those who dabble in realism. My biggest concern is that a lot of games will become very same-y looking asset flips in regards to humans.
Either way, Daz better work on making DS 5.0 the best it can be, because hoooo'boy.
It is interesting, because while Daz is free, a character model and its use in an Unreal distributed game are not. RL's CC application is not free, but exported content is "free for use" in an Unreal Engine distributed product.
Epic has been very specific in its court dealings with Apple over Fortnite. They never leave language vague, so as not to leave too much room for interpretation. But their marketed terminology "free for use" is very vague, because it may apply only to the use of exported content in a distributed game, separate from the cost of using the web-based application in the first place.
I think at the end of the day, if the released MetaHuman Creator has, say, two download modes, (1) as a UE4 asset, with no restrictions, and (2) as a generic asset, alerting you to restrictions (i.e. "subscription required"), and you can download model after model after model into Unreal, then we'd have definitive confirmation that it is not pay-for when it comes to UE4. But as of this point, the "free for use" terminology is not in fact a demonstrable confirmation, and just because they have a scenario with Quixel Megascans does not mean this is de facto the case for MetaHuman Creator. Epic could easily clarify by saying "No creation or download limitations, and no subscription required for Unreal, etc...", but they have not.
I do hope everything is free when it comes down to it, but they have not stated that decisively and specifically. Rather, it is being assumed, and while that assumption could turn out to be correct, it is still an assumption as of this point and could turn out to be wrong.
Man, I do feel you're being too worried about it. It's easy to see how things can go: they will link the MetaHuman account to the Unreal Engine account, so when you log in on the MetaHuman site you're also logged into Unreal. Then, when you're finished editing the character, you'll have a free option to download it in Unreal's native format. It's not a complicated process. Now, if you want to export to other formats, then you'll have to pay for it. That's the most basic way; they could come up with others, but it's not hard to make it free to edit and then use in Unreal, as they're claiming. If you're worried because you want to use it outside Unreal, then that's another story.
It's not a matter of worry, but rather a point of accuracy. For example, where specifically are you getting that you will have to pay for another format? Has Epic in fact stated this anywhere, or is this an assumption being made, maybe based on Quixel Megascans? Because unless they said "you will pay for a format other than the Unreal format", this is itself an assumption built on top of another assumption that the Unreal creator is free. It's a technical point, not a point of worry.
I'm not exactly making any statement or anything like that; I'm just giving an example of how they could do it, because I've already seen others run this kind of promotion, where you get one format for free but have to pay for other formats. It's just to show that it's possible and normal to have a setup where group A gets it for free while groups B, C, or D must pay. We don't know how they're really going to do it, but they've made it clear that you pay nothing to create and use any character as long as you're using it inside Unreal.
Also, someone mentioned animation. I wouldn't be too sure that Epic doesn't want to make its own animation tool inside Unreal; actually, they already have one, and most of their cinematic trailers are made entirely inside Unreal. They already have a native animation tool; it's just not as well developed or as strong as Maya or MoBu, but they are improving it. In the same way, they are building a tool inside Unreal for 3D modeling without needing things like Maya or Blender.
I mean, it's clear that Epic is trying to create the "ultimate everything tool", where you can do all the steps, from modeling to texturing, rigging, maybe retopology (if they don't kill the retopology process entirely), animating/posing, and rendering, to creating animations, movies, or games, all inside Unreal without having to use a single program outside it. The best fit for that would be something like MetaHuman, which is still an Unreal thing in the end.
They really are aiming big. I wouldn't be surprised if in the future they also add a full sound editor to create sound effects and music from the ground up, the same way they already have a tool for creating VFX.
Very interesting thought. As someone using a lot of audio software such as Bitwig, Live, Reason, RX and so on, I think it would be a very big step to do something similarly easy to use in the Unreal environment. DAWs are complex, and the implementation of new features in well-known DAWs such as Ableton Live often takes a lot of time. It's def possible though - I haven't been diving into it, but I think you can already do a lot of synth stuff in UE, it's just not implemented in as user-friendly a way as in modern DAWs. I think a comparatively simple solution would be to take a single powerful idea, such as the graphical representation of frequencies as seen in Photosounder or RX, and do something similar for the quick creation of SFX. I would not expect UE to be on the same user-friendly level as DAWs for the production of full tracks anytime soon, but I would be happily surprised if they managed to do it somehow...
I've been saying this for a while, actually. Daz's content store is better than anything else out there right now. Leveraging it into engines like Unreal, etc. is the way to go. That implies improving integration, including things like Live Link into Unreal.
More details...
Are there any PC webcams that can be used for this sort of motion capture? I really find it hard to believe that we have to use an $800+ Apple phone to do this when, theoretically, a very good webcam should be able to harness the power of a PC and do it for less.
You can use a used iPhone X for $300 or so with no contracts, etc. Just use it for the app and the camera. I don't know of any webcams that are depth cams which the Apple stuff requires.
https://www.extremetech.com/mobile/255771-apple-iphone-x-truedepth-camera-works
You can see the infrared beam shooting out of the phone on some of Solomon's videos.
Keep in mind that they did not use an iPhone to get the mocap data for their presentation video.
Any webcam will do for facial motion capture. Though Apple's depth camera is a nifty gadget, it's not at all necessary for capturing mocap data. The issue is with the software used to capture that data, which costs thousands of dollars and requires a lot of experience to get good results. There are of course free alternatives, but the mocap data you get from those wouldn't pass any quality test.
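To give an idea of what the free, webcam-only route looks like, here's a minimal sketch of my own (not taken from any of the packages discussed in this thread) using OpenCV and MediaPipe's Face Mesh, assuming both are pip-installed. It only pulls raw landmark positions from a webcam; turning those into clean, retargetable mocap curves is exactly the part the expensive software does well.

```python
# A rough sketch of the "free alternative" route: pull raw face landmarks
# from a webcam with OpenCV + MediaPipe Face Mesh.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,          # adds iris landmarks
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
)

cap = cv2.VideoCapture(0)           # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # 468+ normalized (x, y, z) points; indices 13/14 sit on the inner
        # lips, so their vertical gap is a crude "jaw open" signal
        print(f"mouth gap: {abs(lm[13].y - lm[14].y):.3f}")
    cv2.imshow("face", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```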
What did they use? Solomon used an iPhone and Live Link for his YouTube tute.
Oh, there's a whole bunch of them I have plugins for in my Unreal library
all bloody pricey
A quick look reveals:
Dynamixyz Live: The Dynamixyz Live Link plugin provides real-time facial animation for characters using the Unreal Live Link protocol combined with Dynamixyz's real-time facial motion capture software.
Rokoko Studio Live: With Rokoko Studio Live you can sync the motion capture data from your Smartsuit Pro, Smartgloves and Face capture (or use them separately) and stream it directly to your custom character in Unreal Engine.
Faceware Live Client: Using best-in-class, markerless facial motion capture software, Live Client for Unreal, alongside Faceware Live Server, animates and tracks facial movement from any video source onto CG characters in real time, directly inside Unreal Engine.
They are expensive! Thank you, Wendy
Most likely Dynamixyz, as it's the best commercially available software out there for facial mocap. It also has an $8k price tag.
I made a tutorial on how to get mocap data from Dynamixyz into Daz:
You get all the necessary files once you've purchased Genesis Face Controls.
Now I just need $8k for their software, as I purchased Genesis Face Controls when it hit the store. Thanks, Faux2D
Glad I could help. You also need Maya, which is only $5k.
Joking aside, there's a 30 day trial for Dynamixyz's Performer in case you want to try it out (and for Maya as well). I also attached a Documentation pdf which goes into more depth about the process of getting mocap data from Performer. There's a learning curve so there's also a time investment involved. Based on the user the results can vary a whole lot:
For the MetaHuman promo videos, I am 99% certain that Epic used Cubic Motion's Persona motion capture system: https://beforesandafters.com/2019/04/16/tech-preview-cubic-motions-persona/ This is the same system used for their Siren tech demo, and I believe they even used the same performer as with Siren. Keep in mind, this system is orders of magnitude more sophisticated than the iPhone ARKit (which should never be used in a pro production) and comes with an epic price.
Those solutions are actually cheap when compared to what was used in the MetaHuman demo. You would probably have a heart attack if I told you that price.
Cubic Motion's Persona was used. $25k for hardware, $10k setup per character, $10k/month for the software license.
Senua's the one that really impressed me, it's on par with Thanos' face from Endgame. Dynamixyz can achieve great results but it starts falling behind when dealing with dialogue, I assume that is why they kept cutting away whenever Hulk spoke.
I'm a Dynamixyz user and the technical sales rep for them for the US/Latam, so a couple of things I would like to chime in on.
Yes, the normal price for studios is $8k per year, but there is a version for indie content producers that has basically everything except batch processing (to do large amounts of tracking/retargeting at once) for $2.2k per year. Yes, still expensive if you are doing this just for fun.
I'm very curious about what you are referring to, Faux2D, in regard to it falling behind when dealing with dialogue, as this is something neither I nor my customers have experienced. Could you elaborate?
Thanks Dr. Zap, Bryan Steagall, and Faux2D for the pricing and additional info! All too rich for me. But at least some hope for hobbyists as tech continues to improve. My daydream as a little kid making my own characters really "come to life" may happen eventually as CG.
I'm talking about lip-synching specifically as in the quality of it. I took the movie Avengers Endgame as a benchmark and compared Thanos (animated using Masquerade 2.0) and Hulk (animated using Dynamixyz). Long story short, if you mute the audio in a scene you can almost lip-read what Thanos is saying but not so much with the Hulk.
In my own experience with Performer Single-View, when dealing with dialogue, I had to disable the lower-face poses containing FACS and just leave the phoneme poses. I know there are several factors that can cause this overlap between FACS and phoneme poses, like the quality of the video, the expressivity of the face itself, not using Multi-View, and the rig itself. This is why I looked into other CGI characters to see where the problem lies and discovered there's a whole lip-synching rabbit hole.
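As a rough illustration of the overlap I'm describing (the channel names below are invented for the example, not Performer's actual outputs): when FACS lower-face shapes and phoneme/viseme shapes fire on the same frame, their summed weights can push the mouth past a readable shape, so muting the FACS channels during dialogue lets the phoneme poses stay legible.

```python
# Hypothetical illustration of FACS/phoneme overlap on the lower face.
# Channel names are made up for the example, not real Performer outputs.

# One frame of retargeted weights (0..1) coming out of tracking
frame_weights = {
    "FACS_jawDrop":   0.6,
    "FACS_lipPucker": 0.4,
    "viseme_AA":      0.7,   # open-mouth "ah" phoneme pose
    "viseme_OO":      0.2,
}

def mute_lower_face_facs(weights, scale=0.0):
    """Scale down (or zero out) the FACS lower-face channels so only the
    phoneme/viseme poses shape the mouth during dialogue."""
    return {name: (w * scale if name.startswith("FACS_") else w)
            for name, w in weights.items()}

# Without muting, FACS_jawDrop and viseme_AA both open the jaw and the
# combined deformation overshoots; with FACS muted, the visemes stay readable.
print(mute_lower_face_facs(frame_weights))
```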
I am splitting hairs here a little. As far as indie creators go, Dynamixyz is the best option out there that I know of.
I have had very mixed results with Dynamixyz (single view). For one, the tracker didn't always do the best job, so I was forced to correct it manually, and since humans are inconsistent creatures, my results varied with each tracking session. In the end, the retargeting is only as successful as the rig quality. I ended up getting a Snappers rig. IMO, the best thing that will come out of MetaHumans is that developers like Faceware and Dynamixyz will be able to democratize high-end facial mocap by concentrating their efforts on a single, known rig logic and topology. Instead of training AI for diverse and disparate rigs, they can focus a product on the MetaHuman rig. Anyone who can deal with the limitations of this arrangement should be rewarded with a very responsive and accurate retargeting of their performance in real time. Both Dynamixyz and Faceware have announced they are working on such a product for MH. I am really anticipating this. A high-end face rig and realistic face painting mean nothing without an equally proficient way of animating it. So far I have been lukewarm on Dynamixyz and Faceware, so much so that I have been saving my pennies for Cubic Motion's Persona. I hope that the next generation of tools changes my mind.
@Faux2d
Ahh, now I understand! And yes, I have experienced that, but I'm not sure if it is necessarily a fault of the software itself or more an issue of the team doing the tracking/retargeting and animation polish (and the actor) not having that in mind. Some years ago, I worked with Rochester Institute of Technology's School for the Deaf, who purchased Performer, and I got a whole education in five days on some (and I stress, some) of what is involved when animating with this in mind, which most of us take for granted. Now I think I need to test this!
I build my own animation rigs in MotionBuilder (like what you've done for Daz) and lately have been experimenting with the FACS-based blendshapes, and I'm finding that I agree with you that they are not enough at times, so I'm going back and adding some of the base blendshapes to see if I get better results.
@drzap
Were you using an HMC or a static/web camera? And were you calibrating for the specific lens? I've found when consulting with customers that calibration, and inconsistencies in the way they annotate the shapes for tracking, are what cause most of their issues. Yes, we humans are very inconsistent creatures! Which is why I stress two things to customers: first, mark up the faces of the actors when you are learning, so you always hit the same landmarks, and second, always have the same team do the annotation/tracking, so that your results are as close to the same every time as possible.
I wholeheartedly agree that the end result depends on the quality of your rig. I don't consider myself a modeler/rigger, so for my own stuff I depend on Daz characters, which I then adapt to my rig in MotionBuilder. That's OK, but it is a joy when you work with a good rig (Snappers rigs are some of the best around, but I can't afford them personally).
I'm also looking forward to their solution (they haven't shared with me yet exactly what they are doing)