old treasure

I recently found this on my hard drive: a compilation of everything I was doing back at the Tuldok headquarters before it shut down.

It's a shame to let it go to waste, so I uploaded it here. Here ya go.

I swear to myself I'll continue these experiments one of these days, given the time, space, and equipment I need.

colored candies

OK, another geeky post: me tracking candies in Python.

[Screenshot: tracking the colored "candies"]

No, they're actually refrigerator magnets stuck to my pen tablet.

This is a Python-OpenCV test that could be used to implement orientation tracking for the NI-droidcam.
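For the curious, here's a minimal sketch of the kind of color-blob tracking this involves: threshold a hue range in HSV space and take the centroid of the biggest matching region. This is not my original test script; the hue range, webcam index, and the OpenCV 4.x-style findContours call are all assumptions.

import cv2
import numpy as np

# Hue/saturation/value range for a red-ish marker; tune to your lighting.
LOWER = np.array([0, 120, 80])
UPPER = np.array([10, 255, 255])

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.erode(mask, None, iterations=2)   # knock out speckle noise
    mask = cv2.dilate(mask, None, iterations=2)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        m = cv2.moments(c)
        if m["m00"] > 0:  # centroid of the blob
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (cx, cy), 6, (0, 255, 0), -1)

    cv2.imshow("tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()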

I recently learned Python-OpenCV from a site worth mentioning >> www.neuroforge.co.uk

So thank you, devs, for the informative site.

It's still experimental; I'm still learning a thing or two about it.

The next step is to get the data stream into Blender and perhaps use the points for a camera rig.
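One way that could work is pushing each tracked centroid over a local UDP socket and polling it from inside Blender. A rough sketch of both ends, assuming a recent Blender (bpy.app.timers needs 2.80+); the port, object name, and pixel-to-unit scaling are all made up for illustration:

# --- sender side, dropped into the OpenCV loop above ---
import socket

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_point(cx, cy, host="127.0.0.1", port=9999):
    tx.sendto(("%d,%d" % (cx, cy)).encode(), (host, port))

# --- receiver side, run inside Blender ---
import bpy
import socket

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 9999))
rx.setblocking(False)

def poll_tracker():
    try:
        data, _ = rx.recvfrom(64)
        x, y = (float(v) for v in data.decode().split(","))
        empty = bpy.data.objects["Tracker"]  # an Empty driven by the data
        empty.location.x = x / 100.0         # crude pixel-to-unit scale
        empty.location.z = -y / 100.0        # screen Y maps to world Z
    except BlockingIOError:
        pass
    return 0.02  # poll again in ~20 ms

bpy.app.timers.register(poll_tracker)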

Hope it all goes well; wish me luck on this one.

NI-droidcam v1.0

Now presenting the low-cost NI-droidcam v1.0.

tadaa!

[Photo: the assembled droidcam]

Made from PVC pipes, an old laptop cooler, a cheap China-made tablet, and accessories.

[Photo: the buttons]

It has two buttons: one for quick animation playback and another for jumping back to the start frame

(in case you make a mistake and need to restart the recording)

[Photo: the mini keyboard]

A mini keyboard for Blender shortcuts

(I'll use this for custom shortcuts: viewing different layers, setting wireframe mode, etc. See the keymap sketch after the parts list.)

[Photo: the hacked numpad]

A hacked numpad for creating more custom buttons

(probably for the next versions)

[Photo: the display]

And a low-cost China-made Android tablet for the display.

Super cheap, and fast enough to run the virtualization.
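As promised, here's a rough sketch of how the two buttons and the mini keyboard could be wired up on the Blender side, assuming the hardware just emits ordinary keypresses. F5/F6 are made-up key choices; the operator names are real Blender ones.

import bpy

wm = bpy.context.window_manager
km = wm.keyconfigs.addon.keymaps.new(name="Screen", space_type='EMPTY')

# button 1: toggle quick animation playback
km.keymap_items.new("screen.animation_play", type='F5', value='PRESS')

# button 2: jump back to the start frame to redo a take
kmi = km.keymap_items.new("screen.frame_jump", type='F6', value='PRESS')
kmi.properties.end = False  # jump to start, not end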

This basically implements the Kinect virtual-cam hack I've done in my previous posts, only this time it'll display through the tablet, allowing me to move around and see the changes being made even if I rotate 180 degrees.

Here's a video of the previous virtualization test I did with an Android phone.

I haven't actually tried this out with the droidcam yet, but I'll make another post after I do.

You can check out my previous Kinect camera virtualization tests here >> Through the Looking Glass

And don't forget to subscribe. Thanks! :)

Through the Looking Glass

Here's another cool Kinect test I've done.

I had this idea of performing the old head-tracking Wii hack done by Johnny Lee, but with a Kinect sensor.

The head-tracking Wii hack is an ingenious way of using head tracking to simulate a virtual room inside your monitor. Lee called it a VR display, or a "virtual room" display. The process involves tracking a user's head position and using the tracked data to simulate the view of a virtual room; the feel is like reaching into another world, or having 3D objects pop right out of your screen.

Here's how it looks:

Forest scene assets modelled by Ross Tec and Ramon Del Prado.

The concept behind this is actually very simple: the Kinect sensor tracks the user's head and works out its position, and the extracted data is then used to change the angle of the scene, tricking our eyes into believing there's another room inside the monitor.
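For the math-curious, the usual way to get the "window" illusion is an off-axis (asymmetric) projection driven by the head position. This is a toy sketch of that idea, not the code from my test; the screen dimensions are assumptions.

SCREEN_W = 0.52  # physical screen width in meters (assumed)
SCREEN_H = 0.32  # physical screen height in meters (assumed)

def head_to_frustum(head_x, head_y, head_z, near=0.05):
    """head_* are meters relative to the screen center, head_z > 0.

    Returns the asymmetric frustum edges at the near plane, so the
    scene renders as if the monitor were a window into a room.
    """
    left   = near * (-SCREEN_W / 2 - head_x) / head_z
    right  = near * ( SCREEN_W / 2 - head_x) / head_z
    bottom = near * (-SCREEN_H / 2 - head_y) / head_z
    top    = near * ( SCREEN_H / 2 - head_y) / head_z
    return left, right, bottom, top

# The virtual camera sits at the head position (head_x, head_y, head_z),
# looking perpendicular at the screen plane, with this frustum applied.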

Now this is actually a really cool way to view and record animations and imitate a POV shot, which led me to the idea of animating a camera in a whole new way.

So I had another concept that uses the user's hands to point the camera at certain angles. It takes away the 3D experience, but with this setup you can actually view the scene using hand movements.

I won't hold you guys back on this one; the video is pretty self-explanatory. So here's what we came up with at the office. :)

Brekel Kinect Face | Importer addon for Blender

Had some fun scripting recently; I finally got a chance to polish my Python skills. I'm doing a small Python add-on that imports Kinect facial animation data into Blender.

The facial data is captured with Brekel's face-tracking software, "Brekel Kinect Pro Face". The software tracks and records facial points and movements, and creates data that can be used to simulate facial animations.
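For anyone curious what the add-on side of this looks like, here's a stripped-down skeleton of an import operator, assuming a recent Blender; the bl_info fields, idname, and class name are placeholders, not the released script.

import bpy

bl_info = {
    "name": "Brekel Face Import (sketch)",
    "category": "Import-Export",
}

class ImportBrekelFace(bpy.types.Operator):
    """Import Brekel Kinect Pro Face data (sketch)."""
    bl_idname = "import_anim.brekel_face"
    bl_label = "Import Brekel Face Data"

    filepath: bpy.props.StringProperty(subtype='FILE_PATH')

    def execute(self, context):
        # Parsing of the exported Brekel file would go here.
        return {'FINISHED'}

    def invoke(self, context, event):
        # Pop the file browser so the user can pick the export.
        context.window_manager.fileselect_add(self)
        return {'RUNNING_MODAL'}

def register():
    bpy.utils.register_class(ImportBrekelFace)

def unregister():
    bpy.utils.unregister_class(ImportBrekelFace)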

The software exports float variables labeled "Animation Units". These units are designed to drive facial shapes/morphs created in a 3D software package; in our case, Blender shape keys.

Each of the units has a corresponding name and value; the names are pretty much self-explanatory.

Blender users can use these to determine the type of shape keys needed for their models, as well as to drive the created shapes.

As of now, the creation of facial shape drivers is done by hand; I'll be writing a script soon that can apply these units as drivers to shapes effortlessly.

An additional button in the side panel, perhaps.
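Roughly, what that button would do is something like this, assuming the imported unit values land as custom properties (e.g. ["JawLower"]) on an Empty. Every name here is hypothetical, not part of the released script.

import bpy

def add_unit_driver(mesh_obj, shapekey_name, unit_name,
                    data_obj_name="BrekelData"):
    # Hook a shape key's value to one Animation Unit via a driver.
    key_block = mesh_obj.data.shape_keys.key_blocks[shapekey_name]
    drv = key_block.driver_add("value").driver
    drv.type = 'AVERAGE'  # pass the single variable straight through

    var = drv.variables.new()
    var.name = "unit"
    var.type = 'SINGLE_PROP'
    var.targets[0].id = bpy.data.objects[data_obj_name]
    var.targets[0].data_path = '["%s"]' % unit_name

# e.g. drive the "JawLower" shape key from the unit of the same name:
# add_unit_driver(bpy.data.objects["Suzanne"], "JawLower", "JawLower")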

Apart from the animation units, I also managed to import the "head frame data"; this contains the tracked rotation and location movements of the head.

The data is applied to an "Empty" and can be used to constrain the rotation/location of the head.

In this example I used it on Blender's most beloved test monkey, "Suzanne".
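Hooking a head up to that Empty is just a couple of constraints; a minimal sketch, with the object names assumed:

import bpy

head = bpy.data.objects["Suzanne"]        # the head object to drive
target = bpy.data.objects["BrekelHead"]   # Empty carrying head-frame data

# Copy the Empty's tracked location and rotation onto the head.
for kind in ('COPY_LOCATION', 'COPY_ROTATION'):
    con = head.constraints.new(type=kind)
    con.target = target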

___________________________________________________________________________

Download:

Finally, here's the import script add-on >> brekel_addon.zip

The script is not done yet; there are still a few things to finish, and I still have to import the tracked facial points.

You can already import the "Animation Units" and the head movements with the script.

For those who don't have access to a Kinect sensor, I've included the exported test file I used for the monkey. More test files can be found on Brekel's site.

___________________________________________________________________________

Links:

For more info about the Brekel Face software, please head over to this site

>>http://www.brekel.com/

Blender can be found here

>>http://www.blender.org

Downloads for Brekel Face are available on Brekel's website; just search for it in the right panel. There are installers for 32- and 64-bit systems. As for me, I was lucky enough to acquire a license straight from Brekel for doing this script. Thanks, man, really appreciate it.

A full tutorial on the Blender/Brekel Face workflow is coming soon.

Sandman in space

"A point cloud is like sandman resting in 3D space."

Been doing Kinect tests lately for a freelance project I'm working on, and I'm so happy to say that I didn't hold back on investing in this small piece of hardware! There's lots of stuff you can do with it apart from playing games on the Xbox 360.

It was fun having a point cloud in my living room, being able to rotate the scene and look at it from different angles.

>> playback of cloud and animation sync in Blender

Here's a sample video of the produced animation.

The feet slide at the last part; that was my bad, not the software's.

I used IpiRecorder to record depth, then imported the recording into IpiMocapStudio to generate a point cloud for bone tracking. The bones are then imported into Blender to be used for animation.

>> playback of recorded depth in ipiRecorder | generated cloud in ipiMocapStudio
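The Blender end of that handoff can be as simple as the bundled BVH importer, assuming you export the tracked skeleton from IpiMocapStudio as BVH; the path and scale below are placeholders.

import bpy

# Import the tracked take; Blender ships with this BVH import operator.
bpy.ops.import_anim.bvh(filepath="/path/to/tracked_take.bvh",
                        global_scale=1.0,     # adjust to your scene scale
                        use_fps_scale=True)   # remap mocap FPS to scene FPS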

You can find the full tracking workflow in ipiMocap here >> http://vimeo.com/24073249

_____________________________________________________________________________

ipiRecorder is free, and so is Blender of course. You can find more info on these awesome programs at these sites:

Blender: http://www.blender.org

ipiRecorder: http://www.ipisoft.com

As for ipiMocap, unfortunately, it's not free.

Ipi Mocap sells for $145 for the "express edition", ranging up to $1195 for the standard edition. Kinda rough on the wallet, but definitely worth buying for those wanting to do Kinect motion capture.