Hackaday
Fresh hacks every day
Rendering a 3D environment from Kinect video
November 15, 2010 · by Mike Szczys · 76 Comments
[Oliver Kreylos] is using an Xbox Kinect to render 3D environments from real-time video. In other words, he takes the
video feed from the Kinect and runs it through some C++ software he wrote to index the pixels in a 3D space that can
be manipulated as it plays back. The image above is the result of the Kinect recording video while looking at [Oliver] from
his right side. He's moved the viewer's playback perspective to be above and in front of him. Part of his body is missing
and there is a black shadow because the camera cannot see those areas from its perspective. This is very similar to the
real-time 3D scanning we've seen in the past, but the hardware and software combination makes this a snap to
reproduce. Get the source code from his page linked at the top and don't miss his demo video after the break.
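The write-up describes what the software does but not how it does it. As a rough illustration of what "indexing the pixels in a 3D space" involves, here is a minimal C++ sketch (not [Oliver]'s actual code) that back-projects a depth frame through a pinhole camera model into a point cloud; the resolution, intrinsic values, and function name are assumptions for illustration only.

```cpp
// Minimal sketch (not [Oliver]'s code): back-project a 640x480 depth frame
// into a 3D point cloud using a pinhole camera model. The intrinsics below
// are assumed ballpark values for the Kinect depth camera, not calibrated ones.
#include <cstdint>
#include <vector>

struct Point3D { float x, y, z; };

// depth_mm: row-major 640x480 buffer of depth values in millimetres (0 = no return)
std::vector<Point3D> backProject(const std::vector<uint16_t>& depth_mm,
                                 int width = 640, int height = 480,
                                 float fx = 580.0f, float fy = 580.0f,  // focal length in pixels (assumed)
                                 float cx = 320.0f, float cy = 240.0f)  // principal point (assumed)
{
    std::vector<Point3D> cloud;
    cloud.reserve(depth_mm.size());
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            uint16_t d = depth_mm[v * width + u];
            if (d == 0) continue;          // no depth return: this pixel becomes a "shadow" hole
            float z = d * 0.001f;          // millimetres to metres
            float x = (u - cx) * z / fx;   // pinhole model: X = (u - cx) * Z / fx
            float y = (v - cy) * z / fy;
            cloud.push_back({x, y, z});
        }
    }
    return cloud;                          // render from any viewpoint, e.g. with OpenGL
}
```

Once the points are in camera space like this, moving the playback viewpoint is just a matter of rendering the same cloud with a different view matrix, which is why areas the camera never saw show up as holes rather than guesses.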
[Thanks Peter]
Filed Under: Kinect hacks, video hacks · Tagged With: 3d, c++, Kinect, render, scanning
Comments
1. Michael says:
November 15, 2010 at 12:05 pm
I wonder if you took 3-4 of these and synchronized them, putting one on each wall of a room, whether you could get a
higher-quality environment.
2. Daniel Magliola says:
November 15, 2010 at 12:10 pm
Hmmmm, this would be interesting to see with TWO Kinects; that'd fill in all the blanks, right?
3. Michael Bradley says:
November 15, 2010 at 12:10 pm
I love it!!!! Can you take 3 of these and point them at the center of the room so as to build a complete 3D image
without shadows???
4. mixadj says:
November 15, 2010 at 12:10 pm
That's awesome.
5. twistedsymphony says:
November 15, 2010 at 12:19 pm
This is probably one of the cooler Kinect hacks I've seen. I was apprehensive about the device at first, but now
I want one just to fool around with.
6. spyder_21 says:
November 15, 2010 at 12:19 pm
It's a nice start in the right direction. I might buy one soon if cool stuff like this comes out. I would not buy it
to play stupid Kinect games.
7. xeracy says:
November 15, 2010 at 12:27 pm
@Michael Bradley 3x Kinect == cheap mocap? I'm looking at getting one for some form of live production
visuals.
8. spiritplumber says:
November 15, 2010 at 12:29 pm
This is incredible. The Kinect is going to open up a lot of avenues of research.
9. Garak says:
November 15, 2010 at 12:32 pm
Really, really cool.
Will using more than one Kinect sensor work? My understanding is that it projects a grid of IR dots over its
field of view and uses them for the measurement. Will intersecting grids confuse the sensors?
10. Roon says:
November 15, 2010 at 12:41 pm
I really want to see someone do something with 4 of these; you could do so much…
11. xeracy says:
November 15, 2010 at 12:41 pm
@Garak I imagine this could be done by quickly turning the dots on/off and sampling them in a continuous
cycle.
12. turn.self.off says:
November 15, 2010 at 12:42 pm
OK, how long until someone builds an Esper machine out of all this?!
13. Michael Bradley says:
November 15, 2010 at 12:49 pm
@xeracy, I think so, and this guy did a great job. I am impressed by how much information is available around
the corners when he rotates it, i.e. the front of his face when the cam is to the side.
When he rotated, I had a flashback to The Matrix, the first scene where the girl is in the air and everything stops,
the camera rotates, and she continues. Just imagine: that was done with several still cameras all carefully positioned, etc.
With this, just freeze, rotate, and continue!!!
14. Drake says:
November 15, 2010 at 12:56 pm
Next stop: 3D PORN!
Anywho…
Would interfacing the Kinect with a Wii be HaD-worthy?
I may have a crack at it later this week if so.
15. xeracy says:
November 15, 2010 at 12:56 pm
@Michael Bradley And the kicker? IT'S ALL REAL TIME! I really wish I were skilled enough to do this on my
own.
16. macpod says:
November 15, 2010 at 12:56 pm
Wow, I did not realize the system was that precise. I thought that in situations like this the depth-stepping
increments would be closer to a foot or so, if not more.
Judging from the coffee mug and torso shots, however, it seems the distance granularity is much smaller! Now I
want one…
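As an aside, the granularity [macpod] is surprised by falls out of the triangulation geometry: the depth step grows roughly with the square of the distance. The quick estimate below uses commonly cited ballpark figures for the Kinect's baseline, focal length, and disparity quantization; they are assumptions, not official specs.

```cpp
// Back-of-the-envelope depth-step estimate for a triangulation sensor like the Kinect.
// Quantization grows with the square of distance: dz ~= z^2 * dd / (f * b).
// Baseline, focal length, and disparity step below are assumed approximations.
#include <cstdio>

int main() {
    const double f  = 580.0;   // depth-camera focal length in pixels (assumed)
    const double b  = 0.075;   // projector-to-camera baseline in metres (assumed)
    const double dd = 0.125;   // disparity quantization step in pixels (assumed, ~1/8 px)

    for (double z = 1.0; z <= 4.0; z += 1.0) {
        double dz = (z * z * dd) / (f * b);   // depth step at range z
        std::printf("at %.0f m: depth step ~ %.1f cm\n", z, dz * 100.0);
    }
    return 0;
}
```

With these numbers the step comes out to roughly 0.3 cm at 1 m, 1 cm at 2 m, and 5 cm at 4 m, i.e. centimetres rather than feet at desk distances, which matches what the coffee-mug shot suggests.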
17. IssacBinary says:
November 15, 2010 at 12:56 pm
Just use real-time Photoshop content-aware fill to fill in the gaps ;)
18. Oren Beck says:
November 15, 2010 at 1:03 pm
Hopefully, someone will write a parser environment for Kinect data, to make files that could somehow be
rendered into Skeinforge (etc.) parameter/object details.
Enlisting the commercial solid-print bureaus like, oh, Shapeways and their peers in a scheme of printed-object
credits for prize funding might kickstart the ideas.
It would be way cool to have 3D busts of my grandkids…
19. Filespace says:
November 15, 2010 at 1:05 pm
I would like to see this guy manipulate a virtual object in real time, even if it's just a ball, perhaps.
20. jc says:
November 15, 2010 at 1:35 pm
I'm wondering what it would look like if you added a mirror in the Kinect's field of vision?
21. Whatnot says:
November 15, 2010 at 1:42 pm
Sorry guys, but you likely can't do more than one at the same time; it projects an IR pattern as part of gathering
data, and that would interfere and fail with more than one in a room.
22. Sci says:
November 15, 2010 at 1:48 pm
Now someone just needs to hook this up to a cheap 3D display for live holographic video calls.
My bets are on something like the DLP projector + spinning mirror combo. Or possibly a spinning LCD, if someone
can get all the power & signal connections to it.
23. rizla says:
November 15, 2010 at 2:08 pm
In response to the "can it use more than one Kinect" question: if you changed the frequency of the IR, would you be able
to incorporate more Kinects without having them step over each other?
24. TheZ says:
November 15, 2010 at 2:24 pm
Quick thought: you could use different IR wavelengths. You would have to replace the IR LEDs and write code
to detect them.
> Sorry guys, but you likely can't do more than one at the same time; it projects an IR pattern as part of gathering
data, and that would interfere and fail with more than one in a room.
25. chbab says:
November 15, 2010 at 2:42 pm
It seems that Your Shape: Fitness Evolved did a huge amount of work in this direction for its player projection, as its silhouette is very
clean compared to what you see here. It opens the door to augmented-reality stuff with a cheap device :)
26. Removed says:
November 15, 2010 at 2:56 pm
Kind of reminds me of that software from the movie Déjà Vu.
27. rasz says:
November 15, 2010 at 3:02 pm
The IR projector in the Kinect means you can't use more than one at the same time. You can sync them like ToF
cameras.
But you could use a few more normal cameras and use the Kinect depth info to reconstruct/simulate/cheat the whole
scene.
There are algorithms that reconstruct a 3D scene from ONE video feed: http://www.avntk.com/3Dfromfmv.htm
Having a few at different angles + one with 3D data should speed things up.
28. da13ro says:
November 15, 2010 at 3:04 pm
Not sure if the Kinect requires any reference points for calibration, but instead of different wavelengths (which I
imagine would be difficult/impractical), couldn't you set up a solid-state shutter system? Block the IR of the other units,
sample data on one, and cycle. It would slow down your available refresh rate.
Sweet hack mate, very impressed.
29. rasz says:
November 15, 2010 at 3:06 pm
^^^ CAN'T sync them like ToF cameras.
I like the idea about different IR wavelengths, but I think the Kinect uses a laser instead of an LED.
I guess you could use two Kinects directly in front of each other, just making sure the IR dots don't end up at each
other's cameras; that would give you almost 90% of the 3D and texture data.
30. cornelius785 says:
November 15, 2010 at 3:08 pm
@TheZ
I don't know about different wavelengths for multiple Kinects. A lot of it depends on whether the Kinect can differentiate
between different wavelengths AND on being able to hack the firmware to do stuff appropriately. Hasn't all the
hacking been on the computer side of just controlling it and getting useful information back? I'd either go with
very narrow filters or synchronize all the Kinects together with some multiplexing. If it is possible, I'm sure someone
will figure it out.
Wasn't it in The Hitchhiker's Guide to the Galaxy that they mention the progression of user interfaces as: physical
button -> touch interface -> wave hand and hope it works? Isn't the third stage upon us now? I'm wondering how
long I have to wait before I can control my MythTV box with hand gestures in the air.
31. aarong11 says:
November 15, 2010 at 3:35 pm
Hmm, instead of replacing the IR LEDs, wouldn't it be possible to place IR filters in front of both the LED and
detector of different Kinects? That way, each Kinect should only detect the wavelength of IR light it was
emitting.
32. sarrin32 says:
November 15, 2010 at 3:46 pm
You could put on a gimp suit where each joint is a different colour (forearms, hands, thighs, etc.). The computer
could use the colour coding to identify each joint, do measurements, etc., then it could do motion capture. Add
the motion capture to a real-time or post-calculated 3D scene with digital actors. Hurray. How long till we get
Kinect-to-BVH converters?
33. Eamon says:
November 15, 2010 at 4:05 pm
The trick would be to calculate SIFT points on some frames, and use those to track objects as they move. This is
the basic mechanism behind current reconstruction techniques, whether they use one or two cameras. The depth
map would improve the fidelity of the representation, and should provide shortcuts that would let this run faster.
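For readers who want to try [Eamon]'s suggestion, the SIFT step maps onto fairly standard feature-matching code. Here is a rough OpenCV sketch (assuming an OpenCV build where SIFT ships in the features2d module; the frame filenames are hypothetical, and fusing in the depth map would come after the matching step).

```cpp
// Rough sketch of the SIFT tracking step [Eamon] describes, using OpenCV
// (assumes OpenCV >= 4.4, where SIFT is part of features2d). Matched keypoints
// between consecutive RGB frames could then be lifted to 3D with the depth map.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::Mat prev = cv::imread("frame0.png", cv::IMREAD_GRAYSCALE);  // hypothetical frames
    cv::Mat next = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);
    if (prev.empty() || next.empty()) return 1;

    cv::Ptr<cv::SIFT> sift = cv::SIFT::create();
    std::vector<cv::KeyPoint> kp0, kp1;
    cv::Mat desc0, desc1;
    sift->detectAndCompute(prev, cv::noArray(), kp0, desc0);
    sift->detectAndCompute(next, cv::noArray(), kp1, desc1);

    // Nearest-neighbour matching with a ratio test to reject ambiguous matches.
    cv::BFMatcher matcher(cv::NORM_L2);
    std::vector<std::vector<cv::DMatch>> knn;
    matcher.knnMatch(desc0, desc1, knn, 2);

    std::vector<cv::DMatch> good;
    for (const auto& m : knn)
        if (m.size() == 2 && m[0].distance < 0.75f * m[1].distance)
            good.push_back(m[0]);

    // Each good match is a 2D correspondence; looking up the Kinect depth at
    // both keypoints would turn it into a 3D-to-3D correspondence for tracking.
    std::printf("kept %zu of %zu matches\n", good.size(), knn.size());
    return 0;
}
```

This only covers the 2D feature tracking; the depth-map shortcut [Eamon] mentions is what would make the reconstruction faster than pure multi-view methods.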
34. blue carbuncle says:
November 15, 2010 at 4:30 pm
Keep up the good work everyone! Kinect is coming along nicely :)
35. Mike says:
November 15, 2010 at 4:44 pm
Outstanding work
36. jim says:
November 15, 2010 at 4:59 pm
I swear I've imagined doing this for years, and how cool the glitches and shadows in some setups would look.
37. Colecago says:
November 15, 2010 at 5:01 pm
That is pretty amazing. Picture does not do it justice, video is awesome.
38. qn4 says:
November 15, 2010 at 5:06 pm
In regards to those arguing against using multiple Kinects at once, one could consider putting something along the
lines of shutter glasses (used for many of the current 3D displays) over the IR projectors, and dropping the
(depth) frames not associated with the currently projecting Kinect. I'm sure that a bit of crafty software design
could interpolate the two 15 Hz (normally 30 Hz, IIRC) streams fairly well, too.
Better yet, if the exposure time is less than 1/60th of a second (30 Hz / 2) and the sync can be intentionally offset…
39. NatureTM says:
November 15, 2010 at 5:08 pm
After seeing this, I'm absolutely getting one. Very cool.
40. qn4 says:
November 15, 2010 at 5:12 pm
da13ro beat me to it… I really have to refresh the page sometimes before posting things. Still, this thing is full of
awesome capabilities, and it's great to see that so many skilled people are making use of it.
41. Martin says:
November 15, 2010 at 5:19 pm
Could you polarise the IR coming from two Kinects at 90 degrees to each other, then use filters on the cameras
to block the other set?
42. macw says:
November 15, 2010 at 6:24 pm
Polarization would work just fine, provided that there was enough light remaining after the fact for the camera to
work properly.
So would strobing them on and off alternately; it's a very common procedure when you have multiple sensors
operating on the same band (ultrasonic distance sensors are a notable case, since there's not much ability to
reject returns).
Filtering for wavelength might work, provided that you had physical bandpass filters on the camera. The depth
sensor is monochrome and would react basically the same to any frequency it's responsive to (different brightness,
but because it's an uncontrolled environment you can't rely on that to differentiate two sources).
I would guess that polarization would be the cheapest and quickest to implement, with bandpass filters being not
that much more complex. Time-division multiplexing would only be a good idea where you absolutely cannot
modify the Kinect hardware in any way for some reason; otherwise it's just a waste of effort.
I do really want to see what happens if you put a mirror in the path, though. I'm imagining a window in the feed
through which you can look and see the other side of your room, just as if the mirror were actually a window into
an alternate dimension :P
43. Torwag says:
November 15, 2010 at 6:34 pm
About the IR grid stuff: I would assume they modulate the IR at some frequency to avoid influence from other IR
emitters. Either that, or one has to modulate the IR oneself. After that it would be relatively easy to use
several units simultaneously by giving each of them a different modulation frequency and using electrical filters
or an FFT algorithm to isolate the individual frequencies from each other.
No need for different wavelengths, optical filters, etc.
44. TR says:
November 15, 2010 at 8:28 pm
It's unlikely that they modulate the IR output (as in pulse the laser/LED, whichever it is), because the camera would
have to be able to capture at least that fast: >120 Hz for US incandescent lights. So, a camera that captures video
at greater than, say, 300-400 FPS to adequately figure out what's noise and what's signal. I doubt they used anything
like that. I think polarization would be the best bet without opening up the Kinect. I would try it if I had
another Kinect and time. Maybe someone can try using two pairs of the free 3D theater glasses: one glasses
lens for each projector and depth camera.
45. curious says:
November 15, 2010 at 10:26 pm
Could you hook up multiple Kinects to capture the other angles of the room and have a full 3D map?
46. Michael Bradley says:
November 15, 2010 at 10:32 pm
I just looked at this guy's YouTube page. OMG, this guy is on it! I thought I was fast with code (only uControllers);
this guy rocks! He did some augmented VR, addressed the mirror question, etc.
47. qwed88 says:
November 15, 2010 at 10:40 pm
I'm not wanting live video like that, but what if you could fix a Kinect to a rotating base and scan environments in 3D!
That would be functional; I could use this.
Or software written to take slices so an object could be rotated in front of it and scanned. This would make a
relatively cheap 3D scanner.
Seems as if one could use this with a program like ZBrush to sculpt with your hands.
48. qwed88 says:
November 15, 2010 at 10:52 pm
As a follow-up to my comment: I just saw another video of his, with a CG model of a creature sitting on his desk,
moving, all in real time. He's really not that far from the ZBrush idea.
Imagine a Kinect above your monitor as you're sculpting the on-screen model with your hands.
49. EquinoXe says:
November 16, 2010 at 12:15 am
Re: 2 Kinects: in theory it should be easy.
Get 4 linear polarization filters (2 for each Kinect).
Polarize Kinect 1 @ 135° and Kinect 2 @ 45°
(place a polarizing filter on the depth cam as well as on the IR source).
Now the two Kinects can't see each other, but each can see its own beam.
50. yeah says:
November 16, 2010 at 3:14 am
@EquinoXe
Polarization is normally not sustained under diffuse reflection, so even though the outgoing light would be polarized, the
light coming back from the scene wouldn't be. I guess a shutter system as suggested above could do the trick, but
then you'd have to hope that the Kinect doesn't use any type of temporal coherence.