
A C++ implementation of "Real-Time Style Transfer for Unlabeled Heterogeneous Human Motion" by Shihong Xia, Congyi Wang, Jinxiang Chai, and Jessica Hodgins, which appeared at SIGGRAPH 2015.

To open and compile the whole VS2012 project, you should install the CUDA 5.0 (or later) SDK on your computer. Our testing platform is 32-bit Windows. Make sure all the paths are set correctly.

The external libraries are header-only:

1. Eigen3: http://eigen.tuxfamily.org/index.php?title=Main_Page
2. glm: http://glm.g-truc.net/0.9.7/index.html

All the other utilities, such as the quaternion and matrix operations, are implemented by ourselves under the namespaces MathEngine and RenderEngine.
The test data and the training database used in the paper are in the test_motion and database folders. For the structure of the training database file, see the ReadDatabase function in the CYEngine::CMotionDatabase class.

The core part of our algorithm is in the CYEngine and CUDA folders.
Core Engine
MARCore: the core implementation of the whole algorithm
MixAlg: implementation of the MAR models
motion_utilize: helper functions, part 1
motion_utilize_ex: helper functions, part 2
MotionDatabase: builds the training data for style mapping using KNNs
Searcherxxx: KNN search
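
The KNN search performed by the Searcher classes can be pictured roughly as the following brute-force sketch; the function name, feature layout, and distance metric here are illustrative assumptions, not taken from the shipped code.

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative brute-force K-nearest-neighbour search over pose feature
// vectors; the real Searcher classes may use a different metric or indexing.
std::vector<std::size_t> KnnSearch(const std::vector<std::vector<double>>& db,
                                   const std::vector<double>& query,
                                   std::size_t k) {
    std::vector<std::pair<double, std::size_t>> dist;
    dist.reserve(db.size());
    for (std::size_t i = 0; i < db.size(); ++i) {
        double d = 0.0;
        for (std::size_t j = 0; j < query.size(); ++j) {
            const double diff = db[i][j] - query[j];
            d += diff * diff;  // squared Euclidean distance
        }
        dist.emplace_back(d, i);
    }
    k = std::min(k, dist.size());
    // Order only the k closest entries, cheaper than a full sort.
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    std::vector<std::size_t> idx(k);
    for (std::size_t i = 0; i < k; ++i) idx[i] = dist[i].second;
    return idx;
}
```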
To tweak the parameters, you can set the KNN threshold (the static variable knn_ther in MixAlg.cpp) to reduce the computation time, and set the default parameters in the DefaultOptions function in MARCore.cpp for better performance. We also allow the user to specify the labels of the input motion by providing the file label_info.txt, an n x 2 matrix whose first column is the content label and whose second column is the style label.
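
A loader for that n x 2 label file might look like the sketch below; the helper name and the assumption that rows are whitespace-separated integer pairs are ours, not taken from the project sources.

```cpp
#include <istream>
#include <sstream>
#include <utility>
#include <vector>

// Hypothetical helper, not part of the original project: reads label_info.txt
// rows of the assumed form "<content_label> <style_label>".
std::vector<std::pair<int, int>> ParseLabelInfo(std::istream& in) {
    std::vector<std::pair<int, int>> labels;
    int content = 0, style = 0;
    // First column: content label; second column: style label.
    while (in >> content >> style)
        labels.emplace_back(content, style);
    return labels;
}
```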
The program has two modes, batch mode and online mode, which can be selected via the m_batchmode variable in StAvatar.cpp.
In batch mode, the transfer is computed offline. The style target is given by the style polygon rendered at the top right: you should click two or more style vertices, but we only transfer the style given by the first style vertex.
In online mode, the transfer is computed on the fly: you can drag the style polygon rendered at the top right.
Note that the frame rate achieved in online mode is about 50-55 fps, which is much lower than the capture rate of 120 fps. Short of reducing the database size, as in the online demo in our paper, what we can do is reduce the data capture rate to about 40 fps, which leads to slower motion.

The main procedure of running the system is as follows:

1. Open MotionToolkit.exe, click the file-open button, and load a test motion from the test_motion folder. The program then loads the training database; wait a few seconds until the whole preparation is done, as shown in the cmd window.
2. You will see the first pose of the input motion at the beginning. Adjust to an optimal view and press 'v' to lock the camera view for better visualization, then select the vertices on the style polygon located at the top right.
3. When you are ready, press 's' to start the whole system. In batch mode, wait until the stylistic motion is generated, then press the space bar to visualize the results. In online mode, wait until the first pose of the stylistic motion is generated, then press the space bar to start, and drag the style polygon along the path you selected in step 2 to allow for style transitions.

We also provide the whole motion database in Matlab .mat format; see skeleton.mat for the skeleton we use, and style_motion_database.mat for the motion data, represented as quaternions with the real part in the fourth dimension.
All files copyright Shihong Xia 2015 unless otherwise noted.