Tuesday, March 22, 2011

Candescent NUI Lab - Preview

This is just a small update to show you what I've been working on. I'm building a "Kinect Lab" that should make it easier to work at different levels of abstraction (for example, raw depth data vs. ready-to-use hand information).

Currently it sits on top of the OpenNI framework, but this should be easy to replace, as the code itself uses only RGB and depth data and none of the other OpenNI features (like skeleton tracking and gesture recognition). This might come in handy when Microsoft releases their SDK, should I decide to use that instead.
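
To illustrate, an abstraction along these lines would be enough to decouple the lab from OpenNI. This is just a sketch with hypothetical names (IImageDataProvider is not an actual Candescent type):

       using System;

       // Hypothetical sketch (not the actual Candescent types): the lab
       // only needs raw RGB and depth frames from the sensor layer, so a
       // small interface like this could hide OpenNI or a future SDK.
       public interface IImageDataProvider
       {
           // Raw depth values in millimeters, one per pixel.
           event Action<short[]> DepthFrameReady;

           // Raw RGB pixel data of the camera image.
           event Action<byte[]> RgbFrameReady;
       }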

[video]

As you can see, you can add RGB and depth image streams generated from the raw data. You can also add "layers" with data of higher abstraction (currently shown in a separate area, but these could also be overlaid on the images).

Basically, you define data sources that each take some input data and produce output data in another form. These can then be plugged into each other to form a processing chain, as sketched below.
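
A minimal sketch of what such a pluggable source could look like (the generic IDataSource and ClusterSource below are my illustration, not the actual library code):

       using System;

       // Illustrative sketch, not the actual library code: every source
       // publishes its results through an event...
       public interface IDataSource<TOutput>
       {
           event Action<TOutput> NewDataAvailable;
       }

       // ...and a downstream source subscribes to an upstream one,
       // processes each frame and re-publishes its own result.
       public class ClusterSource : IDataSource<int[]>
       {
           public event Action<int[]> NewDataAvailable;

           public ClusterSource(IDataSource<short[]> depthSource)
           {
               depthSource.NewDataAvailable += depthFrame =>
               {
                   var handler = NewDataAvailable;
                   if (handler != null) handler(FindClusters(depthFrame));
               };
           }

           // Placeholder: would assign a cluster label to each pixel.
           private int[] FindClusters(short[] depthFrame)
           {
               return new int[depthFrame.Length];
           }
       }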

Here is a code sample:
       // Hook into an OpenNI depth generator (setup omitted here)
       var openNIDepthDataGenerator = ...;
       // Wraps the raw OpenNI depth buffer as a data source
       var depthDataSource = new DepthPointerDataSource(openNIDepthDataGenerator);
       // Groups the depth points into clusters (hand candidates)
       var clusterDataSource = new ClusterDataSource(depthDataSource);
       // Extracts hand information from the depth data and clusters
       var handDataSource = new HandDataSource(depthDataSource, clusterDataSource);
       // Maps the tracked hand position to the mouse cursor
       var mouseController = new MouseController(handDataSource);
       mouseController.Enabled = true;

The same with some inversion of control magic:
       var mouseController = new MouseController(Ioc.Resolve<HandDataSource>());  
       mouseController.Enabled = true;  

This is the code that initializes the cursor control with hand tracking (see my last post).
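
As an aside: the snippet above only shows the Resolve side of the container. The registration behind it might look roughly like this, where the Register method is purely an assumption about the Ioc facade, not its actual API:

       // Assumed wiring (hypothetical Register API, not the real facade)
       Ioc.Register(() => new DepthPointerDataSource(openNIDepthDataGenerator));
       Ioc.Register(() => new ClusterDataSource(Ioc.Resolve<DepthPointerDataSource>()));
       Ioc.Register(() => new HandDataSource(Ioc.Resolve<DepthPointerDataSource>(),
                                             Ioc.Resolve<ClusterDataSource>()));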
