=================================
FUBI CHANGELOG
=================================
Changes in 0.13.1
=================
-Hot fix for broken references to hidden recognizers
-Added option to get converted training data of a template recognizer
-Minor improvement for the Linux Code::Blocks projects
Changes in 0.13.0
=================
-Added API methods for getting the tracking data of hands tracked by a finger sensor
-XMLGenerator in WPF GUI now works with finger sensor data as well
-Support for finger sensor image streams
-Added optional event system with callback functions for gesture recognitions (see the sketch after this list)
-Renamed DTWRecognizers to the more general TemplateRecognizers; DTW is only one option for them; new options: apply resampling, use Gaussian Mixture Regression (GMR) or Mahalanobis distance
-Fixed bugs in Matrix operations resulting in incorrect orientations in some cases
-Improved Tracking data classes
-Threaded KinectSDK2Sensor so it no longer blocks the frame rate
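
A minimal, self-contained sketch of the general callback pattern such an event system follows; the class and function names below are purely illustrative and are not the actual Fubi API (check the Fubi headers for the real registration functions).

```cpp
// Illustrative only: the names below are NOT the actual Fubi event API,
// they merely show the callback registration pattern described above.
#include <functional>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

using GestureCallback = std::function<void(const std::string& gesture, unsigned int userId)>;

class GestureEventSource
{
public:
    void addListener(GestureCallback cb) { m_listeners.push_back(std::move(cb)); }

    // Would be called from the recognition update loop once a gesture fires
    void notify(const std::string& gesture, unsigned int userId)
    {
        for (const auto& cb : m_listeners)
            cb(gesture, userId);
    }

private:
    std::vector<GestureCallback> m_listeners;
};

int main()
{
    GestureEventSource events;
    events.addListener([](const std::string& g, unsigned int id) {
        std::cout << "User " << id << " performed " << g << "\n";
    });
    events.notify("waving", 1); // simulated recognition event
    return 0;
}
```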
Changes in 0.12.0
=================
-Fixed samples for switching from Kinect SDK 2 to other sensors
-Added DTWRecognizers that apply dynamic time warping to a recorded joint motion with a windowing approach (see the sketch after this list)
-Added corresponding recording functionalities
-Improved recognizer generator and added several new options, e.g. finger counts and DTW recognizers, tolerance types, multiple recognizers per combination state, ...
-Multiple fixes for the Kinect for Windows SDK 2.x integration, e.g. fixed the face tracking
-One installer for all sensor types and preselected sensor combinations
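
For reference, a bare sketch of the classic dynamic time warping distance on 1-D sequences; the actual DTWRecognizers operate on recorded joint trajectories and add a windowing approach on top, so this only illustrates the core alignment idea.

```cpp
// Classic DTW distance between two 1-D sequences (core idea only; Fubi's
// recognizers work on multi-dimensional joint trajectories with windowing).
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

double dtwDistance(const std::vector<double>& a, const std::vector<double>& b)
{
    const size_t n = a.size(), m = b.size();
    const double inf = std::numeric_limits<double>::infinity();
    // cost[i][j] = minimal accumulated cost of aligning a[0..i) with b[0..j)
    std::vector<std::vector<double>> cost(n + 1, std::vector<double>(m + 1, inf));
    cost[0][0] = 0.0;

    for (size_t i = 1; i <= n; ++i)
        for (size_t j = 1; j <= m; ++j)
        {
            const double d = std::abs(a[i - 1] - b[j - 1]);
            cost[i][j] = d + std::min({ cost[i - 1][j],        // insertion
                                        cost[i][j - 1],        // deletion
                                        cost[i - 1][j - 1] }); // match
        }
    return cost[n][m];
}
```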
Changes in 0.11.1
=================
-Hot fix for the XML generator in the Fubi GUI
Changes in 0.11.0
=================
-Visual Studio 2013 support
-Preliminary support for Kinect for Windows SDK 2.x
-First version of a recognizer training tool integrated in Fubi GUI
-Fubi GUI remembers the last configuration and improvements for the key/mouse bindings
-Hands from finger sensors now get assigned to the hands of users from the user tracking
-Support for OpenCV 3.x
-Switched the XML schema from *.dtd to *.xsd
-Installer checks for previously installed version
Changes in 0.10.0
=================
-Several improvements for the finger count calculation
-FingerCountRecognizers now have a configurable window length for median calculation
-JointRelationRecognizers can now have a third joint which can be related to the line formed by the other two joints
-Added API method for getFloorPlane() (OpenNI only)
-Several fixes and improvement for the WPF GUI
-Added option to bind key or mouse events to certain gestures in the WPF GUI
Changes in 0.9.2
================
-Fixed WPF GUI sensor switching
Changes in 0.9.1
================
-Fixed registerStreams and mirrorStreams for OpenNI2
-Various performance improvements, OpenMP support for image processing
-Several (minor) image rendering fixes
-Now using the OpenCV C++ interface consistently
-More general coordinate conversion added; only conversions from the color stream are still missing
-Support for float images; added options for changing image format in RecognizerTest
-NSIS script for Fubi Windows installer
-C# wrapper and WPF GUI now consistently use .NET v. 4.0
-Large refactoring for WPF GUI, added loading circle animation when changing sensor
Changes in 0.9.0
================
-New function: getCurrentCombinationRecognitionState() provides more information about the current status of combination recognizers (see the sketch after this list)
-New type of linear movement recognizer based on movement lengths instead of speed only
-New angular movement recognizers look at the change of joint orientations
-New sensor type: FingerSensor with the new class FubiHand, with initial support for the Leap Motion sensor
-New optional return for recognizer functions to get a correction hint
-New XML metadata format in a general name->value form for adding custom metadata to your XML
-Major update of the Doxygen documentation (see doc folder)
-Registering the depth to the color stream is now also possible with the Kinect SDK sensor
-Separated FubiMath from FubiUtils
-Now using high-performance timers for timestamps
-WPF_GUI: added a stats window for displaying more information about the user-defined combination recognizers
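
A small illustrative sketch of how such a state query could be polled each frame; getCurrentCombinationRecognitionState() is the function named above, but its exact signature is an assumption and the call is stubbed here so the sketch stays self-contained.

```cpp
// Sketch only: the signature of getCurrentCombinationRecognitionState() is an
// assumption; it is stubbed here so the polling pattern can be shown end-to-end.
#include <iostream>
#include <string>

// Stub standing in for the real Fubi call; assumed to return the index of the
// combination state the user is currently in, or -1 if the combination has not started.
int getCurrentCombinationRecognitionState(const std::string& /*recognizerName*/, unsigned int /*userId*/)
{
    return 1; // pretend the user has reached state 1
}

int main()
{
    const int state = getCurrentCombinationRecognitionState("waveCombination", 1);
    if (state >= 0)
        std::cout << "Combination in progress, current state: " << state << "\n";
    else
        std::cout << "Combination not started yet\n";
    return 0;
}
```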
Changes in 0.8.0
================
-Complete redesign of the .NET CS GUI, many new options, switched to .NET 4.0
-Refactored FubiUser class and added FubiXMLParser class
-Now optionally using the 1€ filter for easy and efficient joint data filtering, see Casiez et al. in the CHI 2012 proceedings. Filtered data is usable in all recognizers and for the tracking data rendering. The two parameters (filter strength for still joints, and how much it is reduced with movement) are also adjustable in the FUBI GUI (see the sketch after this list).
-Fix for the face joint references and their confidence values
-Some more fixes for platform/compiler independence (thanks to Christian Frisson and Fabien Grisard)
-Fix for specific combinations of basic relations/directions in XML
-New options for combination recognizers: onFail for going back one state instead of a complete restart
-Fixed Kinect SDK IR stream integration
-Better OpenCV integration with automatic version check
-All defines now have a "FUBI" prefix to avoid conflicts
-Added linear acceleration recognizers (currently only usable when manually inputting data)
-New type of joint orientation recognizer with an orientation direction and a maximum angle difference, similar to the linear movements
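
For reference, a bare-bones sketch of the 1€ filter from Casiez et al. (CHI 2012) mentioned above; parameter names follow the paper (mincutoff, beta), which may map differently onto the two values exposed in the FUBI GUI.

```cpp
// 1€ filter (Casiez et al., CHI 2012): an adaptive low-pass filter whose cutoff
// rises with the signal's speed, trading smoothing at rest against lag in motion.
#include <cmath>

class OneEuroFilter
{
public:
    OneEuroFilter(double rate, double minCutoff = 1.0, double beta = 0.007, double dCutoff = 1.0)
        : m_rate(rate), m_minCutoff(minCutoff), m_beta(beta), m_dCutoff(dCutoff) {}

    double filter(double x)
    {
        // Estimate the signal's speed and smooth it with a fixed cutoff
        const double dx = m_hasPrev ? (x - m_prevRaw) * m_rate : 0.0;
        m_prevDx = lowpass(dx, m_prevDx, alpha(m_dCutoff), m_hasPrev);
        // Speed-dependent cutoff: slow joints get strong smoothing, fast ones low lag
        const double cutoff = m_minCutoff + m_beta * std::abs(m_prevDx);
        m_prevFiltered = lowpass(x, m_prevFiltered, alpha(cutoff), m_hasPrev);
        m_prevRaw = x;
        m_hasPrev = true;
        return m_prevFiltered;
    }

private:
    double alpha(double cutoff) const
    {
        const double pi = 3.14159265358979323846;
        const double tau = 1.0 / (2.0 * pi * cutoff);
        const double te = 1.0 / m_rate; // sampling period
        return 1.0 / (1.0 + tau / te);
    }
    static double lowpass(double x, double prev, double a, bool hasPrev)
    {
        return hasPrev ? a * x + (1.0 - a) * prev : x;
    }

    double m_rate, m_minCutoff, m_beta, m_dCutoff;
    double m_prevRaw = 0.0, m_prevDx = 0.0, m_prevFiltered = 0.0;
    bool m_hasPrev = false;
};
```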
Changes in 0.7.0
================
-Enhanced the Add..Recognizer(..) functions to have the same functionality as available in XML.
-Added new face tracking rendering
-Fixed bugs in the maxAngleDiff property, the clearRecognizers() function, the OpenCV preprocessor defines, the face joint names, the updateUsers() function, the updateTrackingData() function, and some more minor ones
-Removed some remaining Windows-specific parts -> Fubi has now already been successfully compiled under Linux and Mac (CMake files or similar will probably follow in one of the next releases)
-Local positions now have the complete torso transformation (translation and rotation) removed, so they can be useful in cases where only directions relative to the body (not the world coordinate system) matter (see the sketch after this list)
-Added body measurements (also renderable) usable for joint relation recognizers as an alternative to concrete millimeters
-All printfs are now replaced by advanced logging functions that can also be deactivated according to the logging level (set in FubiConfig.h)
-The OpenNI2 sensor also approximates face positions for chin/forehead/nose/ears to make them more usable (note: they depend on the torso orientation, but not on the head orientation [as OpenNI does not track the latter at the moment], so they give a good approximation in many cases, but not all the time)
-Combination states can now contain two other types of recognizers: NotRecognizers are fulfilled if the referred recognizer is NOT recognized; AlternativeRecognizers define a group of recognizers that are only tested if the regular ones have already failed -> a way to combine recognizers with OR instead of AND
-Combinations with the isWaitingForLastStateFinish flag now return WAITING_FOR_LAST_STATE_TO_FINISH if the last state has been fulfilled but not yet finished
-Several minor additions to the XML gesture definitions: noInterruptionBeforeMinDuration for Combinations and useOnlyCorrectDirectionComponent for linear movements.
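
To make the local-positions item above concrete: removing the complete torso transformation amounts to subtracting the torso position and undoing the torso rotation. A minimal sketch with simplified stand-in types (not Fubi's actual math classes):

```cpp
// Converting a global joint position into a torso-local one: subtract the torso
// translation, then multiply by the transposed (= inverse) torso rotation.
struct Vec3 { double x, y, z; };
struct Mat3 { double m[3][3]; }; // row-major rotation matrix

static Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// For a pure rotation matrix the inverse equals the transpose, so apply R^T.
static Vec3 mulTransposed(const Mat3& r, const Vec3& v)
{
    return { r.m[0][0] * v.x + r.m[1][0] * v.y + r.m[2][0] * v.z,
             r.m[0][1] * v.x + r.m[1][1] * v.y + r.m[2][1] * v.z,
             r.m[0][2] * v.x + r.m[1][2] * v.y + r.m[2][2] * v.z };
}

Vec3 toTorsoLocal(const Vec3& globalJointPos, const Vec3& torsoPos, const Mat3& torsoRot)
{
    return mulTransposed(torsoRot, sub(globalJointPos, torsoPos));
}
```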
Changes in 0.6.0
================
- Added full support for the Kinect SDK including face tracking (can still be deactivated via preprocessor defines in FubiConfig.h)
-> Added options to switch between sensors during run time
-> Added rendering options for face tracking
-> Removed MSKinectSDKTest sample from Visual Studio solution (Still available in the samples folder under "TrackingDataInjectionTest")
- Fixed finger rendering option
- Fixed a bug in the samples causing too many gesture notifications
- Fubi coordinate system is now fully right-handed (also for orientations), only the y orientation is rotated by 180° to have the 0 orientation when looking directly at the sensor (-> changes needed for orientation recognizers!)
- Several new options for the recognizer xml:
* combination recognizers can now be delayed until the last state is quit by the user
* finger counts can be calculated by the median of the last frames
* direction of linear movements can be limited by a maximum angle difference
 * joint relations and linear movements can be defined without directly using the coordinate axes, but with more intuitive types such as "above, below, .." and "up, down, .."
Changes in 0.5.1
================
No API changes this time, but several internal ones:
-Added FubiConfig.h for configuring which OpenNI and OpenCV version to use and printing additional info for the combination recognizers
-Fubi is now tested with OpenCV 2.4.3
-Added swap r and b option and finger shapes for rendering
-Added minConfidence for all recognizers and maxAngleDiff for linear movements to the FUBI XML definition
-Fubi is now using the new OpenNI version 2.0.0 by default!
Changes in 0.5.0
================
-Replaced OpenNI data types with Fubi-specific ones to get more independent from OpenNI
-Added a realWorldToProjective() function that works even without OpenNI (see the sketch after this list)
-Removed center of mass - not used in the past anyway. If you want to get a user's position even before tracking, look at the torso joint
-Fixed orientation recognizers whose range crosses the +180/-180 boundary between the min and max value (e.g. minDegrees.x = 170, maxDegrees.x = -170, i.e. 190)
-Added more getClosestUsers functions
-Separated OpenNI completely from the rest using the FubiOpenNISensor class and a FubiISensor interface
-getImage() can now be used with a user and joint of interest, like saveImage()
-Added init without an XML file, but with Fubi::SensorOptions
-Renamed PostureCombinationRecognizers to the more appropriate CombinationRecognizer and adapted all corresponding functions and the XML schema
-Added struct around the predefined recognizers so that they are not directly in the Fubi root namespace anymore
-minConfidence now configurable per recognizer or globally via xml
-Added Unity3D integration and sample
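
For reference, an illustrative sketch of the pinhole math behind a realWorldToProjective() conversion; the resolution and field-of-view values below are assumptions for a typical depth sensor, not values taken from Fubi.

```cpp
// Illustrative pinhole projection from camera-space millimeters to depth image
// pixels; intrinsics are assumed example values, not Fubi's.
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 realWorldToProjective(const Vec3& p,
                           double resX = 640.0, double resY = 480.0,
                           double hFov = 1.0145 /*rad*/, double vFov = 0.7898 /*rad*/)
{
    const double fx = resX / (2.0 * std::tan(hFov / 2.0)); // focal length in pixels
    const double fy = resY / (2.0 * std::tan(vFov / 2.0));
    Vec3 out;
    out.x = resX / 2.0 + fx * p.x / p.z;
    out.y = resY / 2.0 - fy * p.y / p.z; // image y axis points down
    out.z = p.z;                         // depth stays in millimeters
    return out;
}
```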
Changes in 0.4.0
================
-Several API changes for the gesture recognizer calls: different names and now returning a Fubi::RecognitionResult instead of a simple bool
-new options for combination recognizers: "ignoreOnTrackingError" for specific recognizers within one state of the recognition for making them more "optional", and "maxInterruptionTime" for defining the maximum time the recognition within one state may be interrupted
-new recognizer type: finger count recognizer
-init now returns whether it was successful; calibration files are no longer supported
-Several OpenNI only functions are now named accordingly
-Added new sample for integrating the MSKinect SDK with FUBI
-FUBI WPF GUI sample: fixed threading for Fubi update calls
Changes in 0.3.0
================
-added option for drawing local joint orientations in the depth image
-added new method for finger detection using morphological opening
-split the update and getImage() functions
-added new saveImage function
-much more options for image rendering
-added local positions and render options for local/global positions/orientations
-added function for directly injecting joint positions/orientations, e.g. enables direct integration with the MS Kinect SDK
Changes in 0.2.4
==================
-added new xml attribute for joint relations/linear movements/joint orientations: visibility. Default is visible. If set to hidden the recognizer won't be accessible from the outside and can only be used in combinations.
-added support for setting the OpenNI skeleton profile.
-added local joint orientations
-updated C#-Wrapper and examples
Changes in 0.2.3
=================
- fixed bug with using elbows in xml recognizers
- a relative joint for linear movements is now optional, so movements can be measured in absolute coordinates (useful for whole body movements)
- added JointOrientationRecognizers for recognition of specific joint orientations
- new recognizers: left/right hand close to arm
Changes in 0.2.2
================
- C# Sample GUI now includes a mouse simulator:
* You can start the mouse tracking by clicking on the button "start mouse emulation" or by the activation gesture "waving" if the corresponding check box is activated
* You can switch between right and left hand control. Right hand control means you control the mouse cursor by moving the right hand in the air.
* Clicking is done via gestures. These gestures are configured in the "MouseControlRecognizers.xml". You can change them, but you have to keep the names "mouseClick" and "mouseClick1"
- FUBI API: getClosestUsers returns the users standing closest to the sensor, but still in view and tracked.
Changes in 0.2.1
================
- C#-Wrapper with sample GUI
- XML-Configuration for recognizers (You can look up the format in the FubiRecognizers.dtd)
- Counting the fingers shown by the users (beta)
Initial published version: 0.1.1
================================
- OpenNI 1.x integration
- OpenCV 2.2 optionally used
- Gesture recognizers defined in C++ code (JointRelations, LinearMovements, Combinations)
- GLUT test application
- initial XML support (JointRelations and LinearMovements)
- initial C# wrapper and sample (Debug image and recognized gestures list)