The state of the palm is defined as follows in Kinect for Windows SDK 2.0.
Quoted from Kinect.h of Kinect for Windows SDK 2.0

```cpp
enum _HandState {
  HandState_Unknown    = 0,
  HandState_NotTracked = 1,
  HandState_Open       = 2,
  HandState_Closed     = 3,
  HandState_Lasso      = 4
};

enum _TrackingConfidence {
  TrackingConfidence_Low  = 0,
  TrackingConfidence_High = 1
};
```
In NtKinect, you can get the palm state (Open, Closed, Lasso) by calling the handState() function after calling the setSkeleton() function.
Return type | Method name | Description |
---|---|---|
pair<int,int> | handState(int id = 0, bool isLeft = true) | Call after calling the setSkeleton() function to obtain the hand state. The argument id is the index into the kinect.skeleton vector; isLeft selects the left hand (true) or the right hand (false). The first element of the returned pair is the HandState value (a minimal usage sketch follows this table). |
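The following is a minimal sketch of the call pattern, not part of the original sample. The helper name showLeftHandStates is hypothetical, and the interpretation of the returned pair as (HandState, TrackingConfidence) is an assumption based on the enums quoted above; check NtKinect.h if in doubt.

```cpp
// Minimal sketch: print the left-hand state of every tracked skeleton.
// Assumes handState() returns (HandState, TrackingConfidence) as suggested
// by the enums quoted from Kinect.h above.
#include <iostream>
#include "NtKinect.h"
using namespace std;

void showLeftHandStates(NtKinect& kinect) {
  kinect.setSkeleton();                                    // must be called first
  for (int i = 0; i < (int)kinect.skeleton.size(); i++) {
    pair<int, int> st = kinect.handState(i, true);         // true = left hand
    if (st.first == HandState_Open)   cout << i << ": left hand open"   << endl;
    if (st.first == HandState_Closed) cout << i << ": left hand closed" << endl;
    if (st.first == HandState_Lasso)  cout << i << ": left hand lasso"  << endl;
  }
}
```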
Calling the kinect.setSkeleton() function stores the skeleton information in kinect.skeleton. The first argument of the handState() function is the index into the kinect.skeleton vector, so the program loops with the control variable i and accesses both the skeleton information and the palm state through i.
The position information of the left and right palms is in kinect.skeleton[i][JointType_HandLeft] and kinect.skeleton[i][JointType_HandRight], respectively. The program loops over the joints with the variable j, maps each joint's camera-space position to color-image coordinates, and displays it as a small red rectangle. When j is JointType_HandLeft or JointType_HandRight, a larger rectangle is drawn in the color representing the palm state.
main.cpp

```cpp
#include <iostream>
#include <sstream>
#include "NtKinect.h"

using namespace std;

void doJob() {
  NtKinect kinect;
  // Colors indexed by HandState value (cv::Scalar is in B,G,R order).
  cv::Scalar colors[] = {
    cv::Scalar(255, 0, 0),     // HandState_Unknown    (blue)
    cv::Scalar(0, 255, 0),     // HandState_NotTracked (green)
    cv::Scalar(255, 255, 0),   // HandState_Open       (cyan)
    cv::Scalar(255, 0, 255),   // HandState_Closed     (magenta)
    cv::Scalar(0, 255, 255),   // HandState_Lasso      (yellow)
  };
  while (1) {
    kinect.setRGB();
    kinect.setSkeleton();
    for (int i = 0; i < kinect.skeleton.size(); i++) {
      auto person = kinect.skeleton[i];
      for (int j = 0; j < person.size(); j++) {
        Joint joint = person[j];
        if (joint.TrackingState == TrackingState_NotTracked) continue;
        // Map the joint position from camera space to color-image coordinates.
        ColorSpacePoint cp;
        kinect.coordinateMapper->MapCameraPointToColorSpace(joint.Position, &cp);
        cv::rectangle(kinect.rgbImage, cv::Rect((int)cp.X - 5, (int)cp.Y - 5, 10, 10),
                      cv::Scalar(0, 0, 255), 2);
        if (j == JointType_HandLeft || j == JointType_HandRight) {
          // handState() returns a pair; its first element is the HandState.
          pair<int, int> handState = kinect.handState(i, j == JointType_HandLeft);
          cv::rectangle(kinect.rgbImage, cv::Rect((int)cp.X - 8, (int)cp.Y - 8, 16, 16),
                        colors[handState.first], 4);
        }
      }
    }
    cv::imshow("rgb", kinect.rgbImage);
    auto key = cv::waitKey(1);
    if (key == 'q') break;
  }
  cv::destroyAllWindows();
}

int main(int argc, char** argv) {
  try {
    doJob();
  } catch (exception& ex) {
    cout << ex.what() << endl;
    string s;
    cin >> s;
  }
  return 0;
}
```
Recognized joints are indicated by red squares on the RGB image. For the palms, a larger square is drawn in one of the following colors according to the palm state.
Palm state | Color | cv::Scalar value (B,G,R) |
---|---|---|
Unknown | Blue | 255,0,0 |
Not tracked | Green | 0,255,0 |
Open | Cyan | 255,255,0 |
Closed | Magenta | 255,0,255 |
Lasso | Yellow | 0,255,255 |
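If the reliability of the detection matters, the second element of the returned pair can be checked as well. The following is a hedged sketch, not part of the original sample: the helper name isRightHandGrabbed is hypothetical, and it assumes the second element holds a TrackingConfidence value as suggested by the enum quoted at the top of this page.

```cpp
// Sketch: accept the right-hand "Closed" state only when tracking confidence
// is high. Assumes handState().second is a TrackingConfidence value.
#include "NtKinect.h"

bool isRightHandGrabbed(NtKinect& kinect, int i) {
  std::pair<int, int> st = kinect.handState(i, false);   // false = right hand
  return st.second == TrackingConfidence_High
      && st.first  == HandState_Closed;
}
```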
Since the above zip file may not include the latest "NtKinect.h", download the latest version from here and replace the old one with it.