Improving NeverKinect Gestures

Ok, NeverKinect worked and it was already possible to play the first few levels, but it was not great fun at all. This was due to the inaccuracy of the data from the Kinect, which led to a flickering output and required large gestures and long movement distances. In addition, the gestures I tried (the line from elbow to hand, or from waist to head, as the normal of the plane) did not seem very comfortable. Here are some notes about the first improvements:

Kinect Alignment

When playing, it is important to be able to easily reset the tilt of the plane. To do this with NeverKinect you need to know which gesture to perform (how to stand, where to move), which is not so easy if you have no reference point. It is even harder if the Kinect camera is not well aligned and all data contains a constant offset (or rather, rotation).
To fix this, the Kinect sensor comes with a built-in accelerometer. It is not used for any kind of auto-correction by either freenect or OpenNI, but doing it yourself is not that hard with freenect:

First initialize freenect with the motor device:

[pyg language="cpp"]
#include "libfreenect.h"

// global variables for freenect context and device:
freenect_context *f_ctx;
freenect_device *f_dev;

// ...
// in your init function:
if (freenect_init(&f_ctx, NULL) < 0) {
    printf("freenect_init() failed\n");
    return 1;
}

// we only use the motor device with freenect (as OpenNI gets
// the video device for skeleton tracking)
freenect_select_subdevices(f_ctx, FREENECT_DEVICE_MOTOR);

if (freenect_open_device(f_ctx, &f_dev, 0) < 0) {
    printf("Could not open device\n");
    return 1;
}
[/pyg]

Getting gravity data is easy:

[pyg language="cpp"]
#include <math.h> // for atan2() and M_PI

// this will basically be a vector pointing to the ground
double dx, dy, dz;

// get the current tilt state
freenect_raw_tilt_state *state = 0;
freenect_update_tilt_state(f_dev);
state = freenect_get_tilt_state(f_dev);

// get the processed accelerometer values (calibrated to gravity)
freenect_get_mks_accel(state, &dx, &dy, &dz);

// calculate the rotation angle in degrees (on the z-axis, from -180 to 180, ideally 0)
double corr_angle = 90 - atan2(dy, dx) * 180 / M_PI;
[/pyg]

In most cases you won't expect gravity to change at runtime (i.e. a user moving the sensor), so this does not have to be done in every iteration of the main loop but only at the beginning and maybe once in a while, e.g. after a configuration gesture (or when a new user appears).

Reducing the Noise of Depth Data

The depth data from the Kinect is a bit noisy, and so are the joint positions calculated by OpenNI/NITE. To handle this I first tried the simple, naive way: averaging the last angle(s) with the new raw angle and taking the result as the new angle. This basically works, as positive and negative variations cancel out. However, it adds a certain delay to the controls. An even simpler trick is to remove the decimal places of the angles with a plain cast to int. 20 steps (of one degree each) to each side is still enough for smooth gameplay and removes some amount of the flickering.
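Both tricks combined fit in a few lines; the equal 50/50 weighting below is just one of the parameter choices still to be tested:

```c
// Sketch of the two simple noise-reduction tricks: blend the previous
// smoothed angle with the new raw reading (the 50/50 weighting is an
// assumption to be tuned), then truncate to whole degrees.
static int smooth_angle(double prev_angle, double raw_angle)
{
    double blended = 0.5 * prev_angle + 0.5 * raw_angle;
    return (int)blended; // drop the decimal places to suppress flickering
}
```

Shifting more weight to prev_angle smooths harder but increases the control delay; shifting it to raw_angle does the opposite.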
Both of these techniques are very simple. They handle the problem quite easily, but they are not a perfect solution. I also have not yet tested all parameters and combinations (e.g. the weighting of old and new angle in the average calculation). If it does not work well enough, maybe some more complex algorithms and filters, like the Kalman filter (linear quadratic estimation / LQE), will do it, but I'm really new to this area and don't know much (read: anything) about these topics (signal estimation, filters etc.).
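As a first step in that direction, a scalar Kalman filter is small enough to sketch here. This is not code from NeverKinect, just a minimal constant-signal model; the noise parameters q and r are assumptions that would have to be found experimentally:

```c
// Minimal scalar Kalman filter sketch (constant-signal model).
typedef struct {
    double x; // current estimate of the angle
    double p; // estimate uncertainty
    double q; // process noise: how fast the true angle may change
    double r; // measurement noise: how noisy the raw angle is
} kalman1d;

static double kalman_update(kalman1d *k, double measurement)
{
    k->p += k->q;                        // predict: uncertainty grows over time
    double gain = k->p / (k->p + k->r);  // how much to trust the new measurement
    k->x += gain * (measurement - k->x); // correct the estimate towards it
    k->p *= (1.0 - gain);                // update: uncertainty shrinks again
    return k->x;
}
```

Compared to plain averaging, the filter adapts its smoothing strength: while the estimate is uncertain it follows measurements quickly, and once it has settled it suppresses noise more aggressively.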

Working on Gestures

Thanks to the improved alignment and stabilized skeleton tracking mentioned above, gestures can be processed a bit more precisely and can thus be a bit smaller (in terms of shorter movement distances). I mainly work on the quite intuitive gesture I already used earlier, which is also shown in the first video: you move your upper body as if you were standing on the plane itself and tilting it by shifting your weight.

But as this is a list of improvements, there were some small changes here as well. Actually, I'm playing around with two variations of this basic control gesture to fix its main disadvantage: you cannot bend backwards as far as you can bend forwards or to the sides. To compensate for this, I have three first approaches:

  • non-linear value mapping:
    If you can only bend 10 degrees to the back, just multiply negative z-angles by two.
    → It may still be uncomfortable to bend to the back, especially since you have to be even more precise due to the mapping.
  • global offset on z-axis:
    Add a global offset on the z-axis, so that you have to bend a bit to the front to reset the tilt.
    → You may need to bend very far to the front to tilt the plane forward at the maximum level. This could become very annoying very fast.
  • calculating the angle from head to waist or from head to hand

    left: angle from head to waist (small)
    right: angle from head to hand (larger)
    (I’m not a designer or gfx artist, so apologies for the bad paint job)

    head-to-hand for z-axis:
    Instead of taking the depth values of the head and the waist to calculate the z-axis angle, take the values of head and one hand.
    → It is a natural position to place your hands beside your waist, so this won't make it that uncomfortable. But when bending to the front, the hand will be behind the waist (only talking about the z-position here; think of the typical posture of ski jumpers), and at the other end, when bending to the back, the hand will be in front of the waist (see picture). By doing so, we increase the angles without losing precision. Of course this means that the user must keep their arm straight all the time and move it in this special manner, but after some testing it does not feel as bad as it may sound.
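The first two ideas boil down to simple mappings of the measured z-angle. A sketch (the factor 2 comes from the example above; the offset value and its sign are assumptions that depend on the angle convention and would need tuning):

```c
// idea 1: amplify backward lean, since users cannot bend back very far
// (the factor 2.0 is an assumption to be tuned)
static double map_z_nonlinear(double z_angle)
{
    return (z_angle < 0.0) ? 2.0 * z_angle : z_angle;
}

// idea 2: shift the neutral position so the user bends slightly forward
// to reset the tilt (offset value and sign depend on the angle convention)
static double map_z_offset(double z_angle)
{
    return z_angle - 10.0;
}
```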

None of them is perfect, but the last one is the most appealing. Again, this needs some testing and parameter adjusting to find the best setting, but it looks promising.

Conclusion

To sum up, these few and mostly simple changes have already improved the gameplay a lot. There are still some minor issues to handle, questions to answer, decisions to make and best-fitting parameter values to find, but all in all it's well on the way to being finished and packed up as a patch for Neverball!

NeverKinect – Play Neverball with Kinect

I managed to port a C++ test program back to the pure C API of OpenNI and integrated it into a Neverball extension. It still needs some configuration and improvements, but basically it works. The source is online at https://bitbucket.org/schlangen/neverkinect

I'll write a bit more about the details of the code and how it works in another post, but here I already have a short video so you can see it in action. It was a bit difficult to film, but I hope it's enough to give an idea of what I'm talking about.