This is a small demo of what can be done with hand and finger tracking. The depth data is processed into hand-movement and gesture events, which I send to Windows via the Win32 function SendInput (called through P/Invoke). This makes it possible to control any application, such as MSPaint:
The right hand controls the cursor location. Closing the left hand triggers a "mouse down" event; reopening it triggers a "mouse up" event. I had to split the roles between the two hands for the moment, because opening and closing a hand shifts the center of the hand cluster too much, so it is not really possible to click a small icon with the same hand. I also need to spend some time smoothing the cursor movement.