the 3 output modes should allow use by "client applications" which don't want to or are not able to engage in communication with the server
so you could record movements with the server then have a Blender Script read the data and apply it to the Dragon Hunter Lady ( tragic end i know i know ! )
mcjKinectServer001.exe
    data is sent to stdout
mcjKinectServer001.exe -output:stdout
    data is sent to stdout
mcjKinectServer001.exe -output:auto
    the filename is built from the current date/time, the file is written in the current working directory
mcjKinectServer001.exe -output:auto -workdir:c:\data\
    the filename is built from the current date/time, the file is written in the specified directory
mcjKinectServer001.exe -output:c:\data\data.txt
    data is written to this file ( overwriting the old content ! )
mcjKinectServer001.exe >> c:\data\data.txt
    data is sent to stdout, which the shell redirects to the specified file
mcjKinectServer001.exe -output:stdout >> c:\data\data.txt
    data is sent to stdout, which the shell redirects to the specified file
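( just to illustrate how those options fit together, here's a rough python sketch of the option handling ; the -output and -workdir flags are the ones listed above, everything else is made up and not the actual code of the windows server )
import sys, os, datetime

# rough sketch : turn the -output / -workdir options listed above into an output stream
def open_output(argv):
    output = "stdout"                  # default mode : print poses to stdout
    workdir = os.getcwd()              # default working directory
    for arg in argv[1:]:
        if arg.startswith("-output:"):
            output = arg[len("-output:"):]
        elif arg.startswith("-workdir:"):
            workdir = arg[len("-workdir:"):]
    if output == "stdout":
        return sys.stdout
    if output == "auto":               # build a filename from the current date/time
        name = datetime.datetime.now().strftime("mocap_%Y%m%d_%H%M%S.txt")
        return open(os.path.join(workdir, name), "w")
    return open(output, "w")           # explicit filename, old content is overwritten

out = open_output(sys.argv)
out.write("pose data would be written here, one line per frame\n")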
there it is ... i think
// DAZ Script : print the orientation of the currently selected node
var node = Scene.getPrimarySelection();
var o = node.getOrientation();
debug( o );
Just ordered my Kinect adapter in anticipation of this script. ^_^
since some of what i have is already usable, it shouldn't be too long before i post a first kit
note that it's 2 parts, a Windows-PC-only application/program (.exe) and a Daz Studio script (.dsa)
though if someday an Apple/Mac programmer wants to re-compile the program, it could work on Macs too
--------
( thinking out loud , figuratively speaking )
the kinect SDK data seems to stream at about 30 FPS
but for animation, often, a data-rate of 10 FPS is sufficient, and easier to work with ( read : fix )
plus, it reduces the load on the communication link ( stdout/stdin/stderr )
so i could have the server-app throttle the data at 10, 15 or 30 FPS
and/or i could have the client-daz-script capture at 10, 15 or 30 fps
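( a minimal python sketch of that throttling idea, just for illustration ; the class and names are made up, the actual server/client may do it differently )
import time

class FrameThrottle:
    # let poses through at a target rate ( e.g. 10, 15 or 30 fps ) and drop the rest
    def __init__(self, target_fps):
        self.min_interval = 1.0 / target_fps
        self.last_emit = 0.0
    def accept(self):
        now = time.time()
        if now - self.last_emit >= self.min_interval:
            self.last_emit = now
            return True        # keep this pose
        return False           # drop this pose

throttle = FrameThrottle(10)   # kinect streams ~30 poses/second, we keep roughly 10 of them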
-----
for now i use the microsoft kinect SDK 1.8,
but there's also OPENNI/NITE out there
and i think ( not sure ) it may support spin-around motion
Unfortunately, using the microsoft SDK, when i spin around, the skeleton tracking is all wrong !
Hi,
The body skeleton looks like a Kinect v1. Kinect v2 for Windows has been out for about 6 months now. Will you be making a Client/Server System for the Kinect v2 also? What you are doing is really great! A rendered picture used to be the end-result for Daz3d and Poser. I think interacting with our figures in a Kinect Client/Server hook-up will become the new end-result! I think the 'demand' for hair, clothes, accessories and figures will increase significantly as hobbyists' interests are rekindled.
It is easy to import Poser figures into Daz Studio, so if you get your stuff up to distributing a Beta Client/Server for Kinect v2, they'll be coming over from Poser in droves!
Thanks,
Jack
i only have an old kinect and am on a Windows 7 PC
i'm not 100% sure but from what i read ( example : https://www.dreamspark.com/rss/news.aspx?ID=35&FeedType=homepagefeed )
the programs i write based on the Microsoft Kinect SDK 1.8 will work in Windows 7 and 8, but only for the old Kinect and Asus sensors, not the new V2 sensors.
To develop something for the new kinects, i need Windows 8 and a Kinect V2 sensor
so i probably won't have Kinect V2 compatibility in 2015
i'll send people with V2 kinects to Breckel and Ipisoft ( there's also something for poser 2014 i think ) where they have free and non-free apps that can produce BVH files which can be applied to Daz Studio figures
but, if some programmer somewhere on Windows 8 or a Mac can write the "server" for the Kinect V2, then my Daz Studio script would be usable
what i call a server is simply an SDK 2.0 based program that initializes the kinect and "prints" out poses to the stdout/console 30 times per second
well actually if someone could also write an SDK v1.8 based server for the Mac and Old-school Kinects that would be nice too :)
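( just to make that idea concrete, here's a tiny python mock-up of what such a "server" boils down to ; the pose format used here, frame number + joint name + three rotations, is invented for the example and is not the actual mcjKinectServer format )
import sys, time

JOINTS = [ "hip", "chest", "head", "lShldr", "rShldr" ]   # made-up subset of joint names

def fake_pose(frame):
    # a real server would query the kinect SDK here ; we just output zero rotations
    return [ ( joint, 0.0, 0.0, 0.0 ) for joint in JOINTS ]

for frame in range(90):                    # about 3 seconds worth of poses
    for joint, x, y, z in fake_pose(frame):
        print("%d %s %.3f %.3f %.3f" % (frame, joint, x, y, z))
    sys.stdout.flush()                     # the client reads line by line, so flush every frame
    time.sleep(1.0 / 30.0)                 # roughly 30 poses per second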
Thanks Casual, for taking the time to respond to my comments. I also see, what appears to be, an invitation for me (or someone) to step up to the plate and do the job of building a client/server system for Kinect v2 ourselves. Thanks for the implied compliment but I am, pretty much, a Hobbyist, not a developer. I think developers like you might have a misconception of what the market is for Poser and Daz3D. Speaking for myself, we are not looking to record or play motion files but to move our own figures around on the display as a direct response to the data from the Kinect V2, e.g. Augmented Reality. The finished product would be like a virtual mirror with the ability to see oneself on a display just like you would with an optical mirror but with the option of substituting the mirror image with our poser or daz figure(s).
If I were an executive at SmithMicro or Daz, I would either hire you or, if you are already employed, send you a Kinect v2 asap.
Jack
work on the kinect kit resumes monday !
you know, eventually when the skeleton tracking kit is done
we'll get into face tracking
because so far the skeleton tracking doesn't give a very good head-neck pose
and if you look here http://msdn.microsoft.com/en-us/library/jj130970.aspx
near the bottom, we discover that the kinect SDK already includes recognition of a few expressions !
then maybe we'll look into gaze tracking
and eyelid tracking, since that seems to be something the sdk doesn't do too well
This thread has me at the edge of my seat. ;)
for the first release at least
a session could work something like this
start the daz script mcjKinectClient001
press the 'Capture' button, which launches the kinect "server" application mcjKinectServer001.exe
press the Server's Start button
walk in the Kinect's field of view
wait until the delayed-start delay ends
act
press the server's Stop button
Close the Server
the script created an animated stickman
use the script to crop the animation ( the animation shown below was 600 frames but i cropped it down to 300 )
use the script to transfer this animation to a Figure ( yes Genesis too, not just Aiko3 )
OK, I just ordered mine! :) Looking forward to this. Will it be applicable to V4/Aiko4?
i'll try to make sure it works with all generations
my upcoming tests will be done on Aiko5
note that my 'kit' of programs is written for the old Kinect sensors that were sold with the xbox360, or compatible sensors.
I develop it on Windows 7, but according to microsoft, the program will work under Windows 8
it will NOT work with the new Kinect v2 sensors that were sold with the xbox one
i don't have plans in the near future to support the new sensors, aka kinect v2 sensors
but if you buy the new ones, there's free and commercial software out there that does support the new sensors
you could use those ( breckel/ipisoft ) and get the better motion capture afforded by that sensor
they can even track up to 6 figures at once
I've got an old sensor on my old Xbox, so I think I should be OK. I don't yet have the $$$ to spend on a new Xbox One! Once I get the adapter, I may test the free Breckel software
a Genesis ( Aiko 5 ) figure for the upcoming genesis compatibility tests
( a name like Azure probably )
to get a better motion capture frame rate,
the motion capture will be done on the stickman prop,
which is just a collection of cones
then all or part of the animation can be transferred from the stickman to the figure
in this case Azura is a Genesis-Aiko5 figure
this was my first test on a Genesis figure
previous tests were done on Aiko3 and the secondLife female avatar
apparently, i don't need to change anything in the method i use to apply the bone rotations
to avoid any frame-drop due to Daz Studio activity
there will also be a mode wherein the daz scene stands still during the mocap session
anyway the Kinect Server ( PCWin application ) has a line-drawn stickman
which gives you real-time feedback during the mocap session
later maybe i'll add capture with the direct-to-daz-figure feedback
---
i don't make promises but i'll throw these out there ...
video capture during mocap, audio recording during mocap
rotoscoping in daz studio
... if we had the video captured during the mocap session
and if we could display it in daz studio as background images
it would help during the clean-up work
and it would help when adding head-face and hands animation
------
Figure 2 : added a list to select which bones' animations are transferred from the stickman to Azura
same animation
but the feet animation was fixed using my free scripts
mcjCycleFilter :
- Synth-Current to reduce the range of Bend rotations from 30-60 degrees to 0-30 degrees
- Synth-Smooth - reduce the jitter
mcjKeepOrient
- keeps the toes parallel to the ground
other tools i use to fix animations
- the old mcjAutoLimb
- mcjAutoLimb 2014 now with Pole Vectors
- mcjRepeatAction - for example to repeatedly 'floor' a figure
- mcjDecimate - reduce the number of keyframes
and so many others i often forget about !
Just got my adapter in (wow, quick shipping!) Now it is a waiting game...lol. I can Beta test now if you want someone to give it a try! :)
i'll try a release tomorrow
but in the meantime, if you don't mind the download sizes
this package ( the microsoft kinect toolkit 1.8 ) includes sample source code to help programmers like me
it also contains 2 videos ( car ads i think ) promoting the kinect
and it contains ready-to-run sample programs !
http://www.microsoft.com/en-ca/download/details.aspx?id=40276
but before you can run the Toolkit's sample programs or my upcoming "Kinect Server"
you must install this
http://www.microsoft.com/en-ca/download/details.aspx?id=40277
that one is the Kinect for Windows Runtime v1.8
i think it contains most if not all of the microsoft drivers for the kinect
-----
in the image below you can see the tons of sample programs in the Toolkit's "bin" folder
after installing the "runtime" which installed drivers etc
you plug your kinect into the adapter
plug the adapter into your PC's usb port
plug in the power adapter
your pc should detect your kinect and start installing about 8 drivers
then you can run any program based on the microsoft sdk
---
some of the /bin/ sample programs even let you 3D scan yourself and save this as a (big) .obj file
OK, downloading now. I'll get them installed and try some tests tomorrow night.
I also needed the SDK
here we can see that latency is negligible
so kinect poses are not accumulating in the communication link between the kinect server app and Daz Studio's kinect client script
( there's no traffic jam )
the poses are applied to the stickman in Daz Studio's scene almost immediately after being produced by the kinect
in Daz's viewport there was a visible Genesis (Aiko5 ) figure with clothes and hair
but that figure was not moving ( the timeline frame was not changing )
conclusion: just redrawing a figure doesn't use much cpu effort
( dropped frames may occur i'm not sure yet )
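( one way a client can keep the link jam-free, sketched in python ; this is an assumption about how it could be done, not necessarily what the daz script actually does : a reader thread always overwrites the stored pose with the newest line, so old poses never pile up )
import subprocess, threading

class LatestPoseReader:
    def __init__(self, cmd):
        self.proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
        self.latest = None
        threading.Thread(target=self._pump, daemon=True).start()
    def _pump(self):
        for line in self.proc.stdout:    # blocks until the server prints a pose line
            self.latest = line.strip()   # overwrite, never queue
    def newest(self):
        return self.latest               # whatever pose arrived last

# reader = LatestPoseReader([ "mcjKinectServer001.exe", "-output:stdout" ])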
------
in other tests, when the kinect has a hard time figuring out your pose
the frame rate drops to 15 fps
but most of the time we get a steady 30 fps
Wow, this really looks great. I will try to find a camera for myself, but I doubt I can find one here. Thanks for the info, the videos look very good.
check this out
let's say you attach a camera to the stickman's head ( or chest ) in Daz Studio's scene
and you make this camera the current Daz Studio viewcam
and you start the kinect mocap
what do you get ?
yeah we get this : https://www.youtube.com/watch?v=XluW-7mKZ_I&feature=youtu.be
second camera mocap test
https://www.youtube.com/watch?v=Fhb01dw0bvk&feature=youtu.be
--
after i fix the script so it does this properly ( notice the disjointed bones in the second test )
i'll integrate the code that transfers the stickman animation to a standard daz figure
and if all goes well that will be enough to publish a first version of mcjKinectServer/mcjKinectClient
Mounting the camera on the hand might also make for some cool camera effects.
the hands and feet are not very well tracked by the kinect ( at least when using the microsoft SDK drivers )
but parenting the camera to the foreArm would work
an over/behind-the-shoulder camera would be interesting too
well the first release will have to wait a bit more
but soon soon, yes it will be soon :)
things to do before the release
- better handling of invalid poses and interpolation between poses
- transferring poses from the stickman to a non zero-posed figure
- loading (done) / saving raw mocap data
Mathematically speaking, 180° is the same as -180°.
When the PC can't keep up and drops poses ( frames ), we can get a situation where the pose before the drop uses the rotation -179.9° and the pose after the drop uses 179.9°.
Daz Studio's simple interpolation would guesstimate the missing pose to use the angle 0° ( ( -179.9° + 179.9° ) / 2 )
I'll add a processing phase that "sees" all angles as ranging between 0° and 360°
-179.9° will be seen as 180.1° and 179.9° stays 179.9°
The missing interpolated pose will then be ( 180.1° + 179.9° ) / 2 = 180.0°
( or maybe i'll interpolate the quaternions )
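( a tiny python sketch of that remapping idea, just to illustrate the arithmetic ; the actual script may handle it differently )
def interp_wrapped(a_deg, b_deg, t=0.5):
    # remap both angles to the 0..360 range before interpolating,
    # so -179.9 and 179.9 give 180.0 instead of 0.0
    a = a_deg % 360.0                      # -179.9 becomes 180.1
    b = b_deg % 360.0                      # 179.9 stays 179.9
    v = a + (b - a) * t
    return (v + 180.0) % 360.0 - 180.0     # back to the -180..180 range daz studio uses

print(interp_wrapped(-179.9, 179.9))       # -180.0, i.e. the same pose as 180.0
as noted above, interpolating the quaternions ( SLERP ) sidesteps this kind of wrap-around issue altogether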
------
mcjSceneGraf, the free graphical animation curve viewer/editor which still works with Daz Studio 4.7,
is still here http://www.daz3d.com/forums/discussion/14275/
there's also a version for DS3
PC-Win only
---
fig 2 - Azura is wearing the free bodycon dress for Genesis https://sites.google.com/site/mcasualsdazscripts2/mcjgenesisdress
( but oops, with a 'pleat' morph which will be posted someday soon )
and this texture
Fig3 - and this texture
fig 4 - new free test subjects join the team
fig 5 - there's now an added function in the script to very carefully fill in the missing poses ( SLERP quaternion interpolation, see the sketch after this list )
which means we're possibly up for a weekend release
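( for reference, a bare-bones python version of SLERP between two quaternions ; the script presumably uses Daz Studio's own quaternion class, this is only meant to show the math )
import math

def slerp(q0, q1, t):
    # spherical linear interpolation between two unit quaternions (w, x, y, z)
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                          # flip one quaternion to take the short way around
        q1 = tuple(-c for c in q1)
        dot = -dot
    theta = math.acos(min(1.0, dot))       # angle between the two rotations
    if theta < 1e-6:                       # nearly identical : plain lerp is good enough
        return tuple(a + (b - a) * t for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

# halfway between no rotation and a 180° twist about Y gives a 90° twist about Y
print(slerp((1, 0, 0, 0), (0, 0, 1, 0), 0.5))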
non-standard zero poses
the mainstream daz and poser figures ( A3, V3, V4, Genesis ) have the same zero pose, the T-Pose
but figures like TheGirl have a zero-pose with legs spread apart
and we see meshbox's "Norm" figures have arms bent
so, in order to better support those figures, i'll add this feature today
and aim for a release tomorrow
----------
Figure2 - tada ! it works, but i had to use a "hand made" profile for the toon figure - i'll see if i can make this universal and automatic
( the white plane is a left-over from measuring the shoulder's zero-pose : 50 degrees )
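( purely to illustrate what such a "hand made" profile might contain, a rough python sketch ; the 50° shoulder value comes from the measurement mentioned above, but the bone names, the sign of the offsets and the subtraction itself are my guesses, not the script's actual data or method )
# hypothetical per-figure profile : how far each bone's zero pose sits from the standard T-pose
ZERO_POSE_OFFSETS = {
    "lShldr": { "zrot": 50.0 },    # e.g. the toon figure's left shoulder rests 50° below the T-pose
    "rShldr": { "zrot": -50.0 },
}

def compensate(bone, axis, captured_deg):
    # subtract the zero-pose offset so the captured angle lands correctly on this figure
    offset = ZERO_POSE_OFFSETS.get(bone, {}).get(axis, 0.0)
    return captured_deg - offset

# a captured 50° shoulder bend becomes 0° on a figure whose arm already rests at 50°
print(compensate("lShldr", "zrot", 50.0))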
the script now seems to work for Non-Standard zero-poses
- figures with shoulders bent ( mirye/Meshbox Norms )
- figures with legs spread ( The Girl 2 )
i doubt i'll have the initial release today, sorry