FUBI, the Full Body Interaction Framework, is a framework for recognizing full-body gestures and postures in real time from the data of an OpenNI-compliant depth sensor, in particular the Microsoft Kinect.
The framework distinguishes between four gesture categories:
· Static postures: Configuration of several joints, no movement.
· Gestures with linear movement: Linear movement of several joints with specific direction and speed.
· Combinations of postures and linear movements: Combinations of the first two categories with specific time constraints.
· Complex gestures: Detailed observation of one (or more) joints over a certain amount of time.
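To make the first two categories concrete, a static posture is typically declared in FUBI's recognizer XML as a relation between joints. The sketch below is illustrative only: the element and attribute names (FubiRecognizers, JointRelationRecognizer, Joints, MinValues) are written from memory of the sample files and may differ from the actual schema, so consult the sample recognizer XML shipped with FUBI for the authoritative format.

```xml
<!-- Hypothetical sketch: "right hand above the head" as a static posture,
     expressed as a joint relation. Names are illustrative, not confirmed
     schema; the SampleRecognizers XML in the FUBI distribution is the
     authoritative reference. -->
<FubiRecognizers>
  <JointRelationRecognizer name="RightHandAboveHead">
    <Joints main="rightHand" relative="head"/>
    <!-- Distances assumed to be in millimeters:
         hand at least 100 mm above the head -->
    <MinValues y="100"/>
  </JointRelationRecognizer>
</FubiRecognizers>
```

A combination recognizer (category 3) would then chain such postures and movements into states with time constraints.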
What's New in This Release:
· Added full support for the Kinect SDK including face tracking (can still be deactivated via preprocessor defines in FubiConfig.h)
· Added options to switch between sensors at runtime
· Added rendering options for face tracking
· Removed MSKinectSDKTest sample from Visual Studio solution (Still available in the samples folder under "TrackingDataInjectionTest")
· Fixed finger rendering option
· Fixed a bug in the samples causing too many gesture notifications
· Fubi coordinate system is now fully right-handed (also for orientations); only the y orientation is rotated by 180° so that orientations are zero when looking directly at the sensor (-> changes needed for orientation recognizers!)
Several new options for the recognizer XML:
· Combination recognizers can now be delayed until the user leaves the last state
· Finger counts can be calculated as the median of the last frames
· The direction of linear movements can be restricted by a maximum angle difference
· Joint relations and linear movements can be de...
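The maximum-angle-difference option for linear movements might be expressed along the following lines. This is a hypothetical sketch: the attribute names (maxAngleDifference, Direction, Speed) are placeholders and not confirmed against the actual schema, so the recognizer XML samples bundled with the release remain the authoritative reference.

```xml
<!-- Hypothetical sketch: a linear movement of the right hand to the right
     that tolerates up to 20° deviation from the target direction. All
     names here are placeholders, not confirmed FUBI schema. -->
<FubiRecognizers>
  <LinearMovementRecognizer name="RightHandMovesRight">
    <Joints main="rightHand"/>
    <!-- Unit direction vector plus the new angular tolerance in degrees -->
    <Direction x="1" y="0" z="0" maxAngleDifference="20"/>
    <!-- Minimum speed, assumed to be in mm/s -->
    <Speed min="500"/>
  </LinearMovementRecognizer>
</FubiRecognizers>
```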