
How to Integrate Intel® Perceptual Computing SDK with Cocos2D-x
Introduction
In this article, we will explain the project we worked on as part of the Intel® Perceptual
Computing Challenge Brazil, where we managed to achieve 7th place. Our project was
Badaboom, a rhythm game set in the Dinosaur Era where the player controls a caveman
named Obo by hitting bongos at the right time. If you’re curious to see the game in action,
check out our video of Badaboom (www.youtube.com/watch?v=GJCcWb2mM28).
To begin, you’ll need to understand a bit about Cocos2D-X, an open-source game engine that is
widely used to create games for iPhone* and Android*. The good thing about Cocos2D-X is
that it is cross-platform: you can use it to create apps for Windows* Phone, Windows 8,
Win32*, Linux*, Mac*, and almost any other platform you can think of. For more information, go to
www.cocos2dx.org.
We will be using the C++ version of the SDK (Version 9302) as well as Cocos2D-X v2.2
(specifically the Win32 build with Visual Studio* 2012). Following the default pattern of
Cocos2D, we will create a wrapper that receives and processes the data from the Creative*
Interactive Gesture Camera and interprets it as “touch” for our game.
Setting up the environment
To start, you’ll need to create a simple Cocos2D project. We will not cover this subject as it is
not the focus of our article. If you need more information, you can find it on the Cocos2D wiki
(www.cocos2dx.org/wiki).
To keep it simple, execute the Python* script in the “tools” folder of Cocos2d-x to create a
new project, then open the Visual Studio project. Now we will add the Intel Perceptual Computing
SDK to the project.
To handle the SDK’s input, we will create a singleton class named CameraManager. This class
starts the camera, runs the update cycle, and adds two images to the screen that represent the
positions of the hands in the game window.
CameraManager is a singleton class that derives from UtilPipeline and includes the
“util_pipeline.h” header. Here, we need to reconfigure some of the Visual Studio project
properties. Figure 1 shows how to add the additional include directories for the Intel
Perceptual Computing SDK.
$(PCSDK_DIR)/include
$(PCSDK_DIR)/sample/common/include
$(PCSDK_DIR)/sample/common/res
Figure 1. Additional Include Directories
You must also add the following paths to the Additional Library Directories:
$(PCSDK_DIR)/lib/$(PlatformName)
$(PCSDK_DIR)/sample/common/lib/$(PlatformName)/$(PlatformToolset)
Figure 2. Additional Library Directories
Add the following dependencies to the linker’s Input section (the _d suffix indicates the debug builds of the libraries):
libpxc_d.lib
libpxcutils_d.lib
Figure 3. Additional Dependencies
Now we are ready to work on our CameraManager!
Start Coding!
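The original article does not show the full header, so here is a minimal sketch of what CameraManager.h might look like, reconstructed from the implementation below (the exact member list is our assumption):

// CameraManager.h -- minimal sketch reconstructed from the code below
#include "cocos2d.h"
#include "util_pipeline.h"
USING_NS_CC;

class CameraManager : public UtilPipeline
{
public:
    static CameraManager* getInstance(void);
    bool Start(CCNode* parent);
    void update(float dt);
    void registerActionArea(CCPoint objPos, float radius,
                            cocos2d::SEL_CallFuncO methodToCall);
private:
    CameraManager(void);
    void processGestures(PXCGesture* gesture);
    void checkActionArea(CCPoint objPos, float radius,
                         CCObject* sender, SEL_CallFuncO methodToCall);
    static CameraManager* s_Instance;
    CCNode* parent;
    CCSprite* hand1sprite;
    CCSprite* hand2sprite;
    CCPoint hand1Pos;
    CCPoint hand2Pos;
    bool hand1Close;
    bool hand2Close;
    bool hasClickedHand1;
    bool hasClickedHand2;
    CCArray* inputAreas;  // registered InputAreaObject instances
};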
First we need to make the class a singleton. In other words, the class needs to be accessible
from anywhere in the code through the same instance (singleton classes have only one instance).
For this, we use a static accessor method:
CameraManager* CameraManager::getInstance(void)
{
    if (!s_Instance)
    {
        s_Instance = new CameraManager();
    }
    return s_Instance;
}
After that, we’ll write the constructor, which starts the camera:
CameraManager::CameraManager(void)
{
    if (!this->IsImageFrame()){
        this->EnableGesture();
        if (!this->Init()){
            CCLOG("Init Failed");
        }
    }
    this->hand1sprite = NULL;
    this->hand2sprite = NULL;
    hasClickedHand1 = false;
    hasClickedHand2 = false;
    this->inputAreas = CCArray::createWithCapacity(50);
    this->inputAreas->retain();
}
Most of these statements initialize the variables that manage the sprites representing the users’
hands and track the input produced when the hands close. The next step is processing the data
that comes from the camera.
void CameraManager::processGestures(PXCGesture *gesture){
    PXCGesture::Gesture gestures[2]={0};
    gesture->QueryGestureData(0,
        PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY, 0, &gestures[0]);
    gesture->QueryGestureData(0,
        PXCGesture::GeoNode::LABEL_BODY_HAND_SECONDARY, 0, &gestures[1]);
    CCEGLView* eglView = CCEGLView::sharedOpenGLView();
    switch (gestures[0].label)
    {
        case (PXCGesture::Gesture::LABEL_POSE_THUMB_DOWN):
            CCDirector::sharedDirector()->end();
            break;
        case (PXCGesture::Gesture::LABEL_NAV_SWIPE_LEFT):
            CCDirector::sharedDirector()->popScene();
            break;
    }
}
To be clear, this method is also where you can add switch cases to handle voice commands
and to implement more gesture handlers, as in the sketch below.
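For example, a hypothetical extension that pauses the game on a thumbs-up pose could be added to the switch (LABEL_POSE_THUMB_UP follows the SDK’s gesture label naming, but check the documentation of your SDK version for the exact set):

        case (PXCGesture::Gesture::LABEL_POSE_THUMB_UP):
            // Hypothetical handler: pause the running scene.
            CCDirector::sharedDirector()->pause();
            break;

Following this, we must process this information and display it in the CCLayer (Cocos2D sprite layer).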
bool CameraManager::Start(CCNode* parent){
    this->parent = parent;
    if (this->hand1sprite!=NULL
        && this->hand1sprite->getParent()!=NULL){
        this->hand1sprite->removeFromParentAndCleanup(true);
        this->hand2sprite->removeFromParentAndCleanup(true);
    }
    this->hand1sprite = CCSprite::create("/Images/hand.png");
    this->hand1sprite->setOpacity(150);
    // Start off-screen so the sprite stays hidden until tracking begins
    this->hand1sprite->setPosition(ccp(-1000,-1000));
    this->hand1Pos = ccp(-1000,-1000);
    this->hand2sprite = CCSprite::create("/Images/hand.png");
    this->hand2sprite->setFlipX(true);
    this->hand2sprite->setOpacity(150);
    this->hand2sprite->setPosition(ccp(-1000,-1000));
    this->hand2Pos = ccp(-1000,-1000);
    parent->addChild(this->hand1sprite, 1000);
    parent->addChild(this->hand2sprite, 1000);
    this->inputAreas->removeAllObjects();
    return true;
}
This method should be called each time a new scene is placed on the screen (usually from the
onEnter callback). It automatically removes the hand sprites from their previous
parent and adds them to the new CCLayer.
Now that our hand sprites have been added to the CCLayer, we can drive their positions
by calling the following method in the update cycle of the CCLayer (which is scheduled
by the call “this->scheduleUpdate();”). The update method is as follows:
void CameraManager::update(float dt){
    if (!this->AcquireFrame(true)) return;
    PXCGesture *gesture=this->QueryGesture();
    this->processGestures(gesture);
    PXCGesture::GeoNode nodes[2][1]={0};
    gesture->QueryNodeData(0,
        PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY, 1, nodes[0]);
    gesture->QueryNodeData(0,
        PXCGesture::GeoNode::LABEL_BODY_HAND_SECONDARY, 1, nodes[1]);
    CCSize _screenSize = CCDirector::sharedDirector()->getWinSize();
    if (nodes[0][0].openness<20 && !this->hand1Close){
        this->hand1sprite->removeFromParentAndCleanup(true);
        this->hand1sprite = CCSprite::create("/Images/hand_close.png");
        this->hand1sprite->setOpacity(150);
        this->parent->addChild(hand1sprite);
        this->hand1Close = true;
    } else if (nodes[0][0].openness>30 && this->hand1Close) {
        this->hand1sprite->removeFromParentAndCleanup(true);
        this->hand1sprite = CCSprite::create("/Images/hand.png");
        this->hand1sprite->setOpacity(150);
        this->parent->addChild(hand1sprite);
        this->hand1Close = false;
    }
    if (nodes[1][0].openness<20 && !this->hand2Close){
        this->hand2sprite->removeFromParentAndCleanup(true);
        this->hand2sprite = CCSprite::create("/Images/hand_close.png");
        this->hand2sprite->setFlipX(true);
        this->hand2sprite->setOpacity(150);
        this->parent->addChild(hand2sprite);
        this->hand2Close = true;
    } else if (nodes[1][0].openness>30 && this->hand2Close) {
        this->hand2sprite->removeFromParentAndCleanup(true);
        this->hand2sprite = CCSprite::create("/Images/hand.png");
        this->hand2sprite->setFlipX(true);
        this->hand2sprite->setOpacity(150);
        this->parent->addChild(hand2sprite);
        this->hand2Close = false;
    }
    this->hand1Pos = ccp(
        _screenSize.width*1.5 - nodes[0][0].positionImage.x
            * (_screenSize.width*HAND_PRECISION/320) + 100,
        _screenSize.height*1.5 - nodes[0][0].positionImage.y
            * (_screenSize.height*HAND_PRECISION/240));
    this->hand2Pos = ccp(
        _screenSize.width*1.5 - nodes[1][0].positionImage.x
            * (_screenSize.width*HAND_PRECISION/320) - 100,
        _screenSize.height*1.5 - nodes[1][0].positionImage.y
            * (_screenSize.height*HAND_PRECISION/240));
    if (!hand1sprite->getParent() || !hand2sprite->getParent()){
        return;
    }
    this->hand1sprite->setPosition(this->hand1Pos);
    this->hand2sprite->setPosition(this->hand2Pos);
    CCObject* it = NULL;
    CCARRAY_FOREACH(this->inputAreas, it)
    {
        InputAreaObject* area = dynamic_cast<InputAreaObject*>(it);
        this->checkActionArea(area->objPos, area->radius,
            area->sender, area->method);
    }
    this->ReleaseFrame();
}
This code not only handles the positions of the sprites, it also sets a different sprite
(hand_close.png) when the camera detects that a hand is less than 20% open. In addition,
there is simple logic built around a precision factor, which makes the user input more
sensitive and makes it easier to reach the edges of the screen. We do this because the
Perceptual Camera is not very precise near the edges, and the sprite positions tend to
jump erratically as the hands approach them.
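HAND_PRECISION is that tuning constant: it over-scans the 320x240 camera image so the mapped positions can actually reach the screen borders. Its definition is not part of the original listing; a plausible sketch (the value is an assumption, tune it for your setup):

// Scale factor applied to the 320x240 camera coordinates. Values above
// 1.0 over-scan the image so the hand cursor can reach the screen edges.
#define HAND_PRECISION 1.5f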
Next, we need a way to handle the input itself (a closed hand is
considered a touch). We write a method called “checkActionArea” (called in the
update method) and register each action area.
void CameraManager::checkActionArea(CCPoint objPos, float radius,
    CCObject* sender, SEL_CallFuncO methodToCall){
    if (sender==NULL)
        sender = this->parent;
    float distanceTargetToHand = ccpDistance(this->hand1Pos, objPos);
    if (distanceTargetToHand<radius){
        if (this->hand1Close && !hasClickedHand1){
            this->parent->runAction(
                CCCallFuncO::create(this->parent, methodToCall, sender));
            hasClickedHand1 = true;
        }
    }
    if (!this->hand1Close){
        hasClickedHand1 = false;
    } //TODO: repeat for hand2
}
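The original code leaves the second hand as a TODO; mirroring the hand1 logic is straightforward. A sketch of the missing branch:

    // Mirror of the hand1 checks above, for the secondary hand.
    float distanceTargetToHand2 = ccpDistance(this->hand2Pos, objPos);
    if (distanceTargetToHand2 < radius){
        if (this->hand2Close && !hasClickedHand2){
            this->parent->runAction(CCCallFuncO::create(this->parent,
                methodToCall, sender));
            hasClickedHand2 = true;
        }
    }
    if (!this->hand2Close){
        hasClickedHand2 = false;
    }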
The registerActionArea() method handles the registration of the areas:
void CameraManager::registerActionArea(CCPoint objPos, float radius,
    cocos2d::SEL_CallFuncO methodToCall){
    InputAreaObject* newInputArea =
        new InputAreaObject(objPos, radius, methodToCall);
    this->inputAreas->addObject(newInputArea);
}
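InputAreaObject is not part of the SDK or of Cocos2D; it is a small helper container. Its definition is not shown in the original, so here is a minimal sketch reconstructed from how it is used (checkActionArea also reads a sender field, so we default it to NULL):

// Container for a registered input area (assumed layout).
// It derives from CCObject so it can be stored in a CCArray.
class InputAreaObject : public cocos2d::CCObject
{
public:
    cocos2d::CCPoint objPos;
    float radius;
    cocos2d::CCObject* sender;
    cocos2d::SEL_CallFuncO method;

    InputAreaObject(cocos2d::CCPoint pos, float r,
                    cocos2d::SEL_CallFuncO m)
        : objPos(pos), radius(r), sender(NULL), method(m) {}
};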
Now it is easy to add the Intel Perceptual Computing SDK to your Cocos2D game! Just run:
CameraManager::getInstance()->Start(this);
When entering the layer, register the objects and methods to be called:
CameraManager::getInstance()->registerActionArea(btn_exit->getPosition(),
    150, callfuncO_selector(LevelSelectionScene::backClicked));
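Putting it all together, the layer code might look like this (LevelSelectionScene, btn_exit, and backClicked come from the snippet above; the exact wiring is our assumption):

void LevelSelectionScene::onEnter()
{
    CCLayer::onEnter();
    // Attach the hand sprites to this layer and clear old input areas.
    CameraManager::getInstance()->Start(this);
    // btn_exit is assumed to have been created earlier in the class.
    CameraManager::getInstance()->registerActionArea(
        btn_exit->getPosition(), 150,
        callfuncO_selector(LevelSelectionScene::backClicked));
    // Schedule this layer's update() so the camera is polled every frame.
    this->scheduleUpdate();
}

void LevelSelectionScene::update(float dt)
{
    // Forward the tick to the manager, which acquires and processes
    // the camera frame.
    CameraManager::getInstance()->update(dt);
}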
About us!
We hope you have enjoyed our short tutorial. Feel free to contact us with any issues or questions!
Naked Monkey Games is an indie game studio located in São Paulo, Brazil, and currently part of
the Cietec Incubator. It partners with Intel on new and exciting technology projects!
Please follow us on Facebook (www.nakedmonkey.mobi) and Twitter (www.twitter.com/nakedmonkeyG).
Notices
INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS. NO LICENSE,
EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE, TO ANY INTELLECTUAL PROPERTY RIGHTS IS
GRANTED BY THIS DOCUMENT. EXCEPT AS PROVIDED IN INTEL'S TERMS AND CONDITIONS OF SALE FOR
SUCH PRODUCTS, INTEL ASSUMES NO LIABILITY WHATSOEVER AND INTEL DISCLAIMS ANY EXPRESS OR
IMPLIED WARRANTY, RELATING TO SALE AND/OR USE OF INTEL PRODUCTS INCLUDING LIABILITY OR
WARRANTIES RELATING TO FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR
INFRINGEMENT OF ANY PATENT, COPYRIGHT OR OTHER INTELLECTUAL PROPERTY RIGHT.
UNLESS OTHERWISE AGREED IN WRITING BY INTEL, THE INTEL PRODUCTS ARE NOT DESIGNED NOR
INTENDED FOR ANY APPLICATION IN WHICH THE FAILURE OF THE INTEL PRODUCT COULD CREATE A
SITUATION WHERE PERSONAL INJURY OR DEATH MAY OCCUR.
Intel may make changes to specifications and product descriptions at any time, without notice.
Designers must not rely on the absence or characteristics of any features or instructions marked
"reserved" or "undefined." Intel reserves these for future definition and shall have no responsibility
whatsoever for conflicts or incompatibilities arising from future changes to them. The information here
is subject to change without notice. Do not finalize a design with this information.
The products described in this document may contain design defects or errors known as errata which
may cause the product to deviate from published specifications. Current characterized errata are
available on request.
Contact your local Intel sales office or your distributor to obtain the latest specifications and before
placing your product order.
Copies of documents which have an order number and are referenced in this document, or other Intel
literature, may be obtained by calling 1-800-548-4725, or go to:
http://www.intel.com/design/literature.htm
Software and workloads used in performance tests may have been optimized for performance only on
Intel microprocessors. Performance tests, such as SYSmark* and MobileMark*, are measured using
specific computer systems, components, software, operations, and functions. Any change to any of
those factors may cause the results to vary. You should consult other information and performance tests
to assist you in fully evaluating your contemplated purchases, including the performance of that product
when combined with other products.
Any software source code reprinted in this document is furnished under a software license and may only
be used or copied in accordance with the terms of that license.
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Copyright © 2013 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.