A Kinect Princess Leia Hologram In Realtime
Saturday, 29 January 2011

True 3D realtime holography is not only possible - it can be done with a Kinect as the input device. A team at MIT has recreated the famous 3D Princess Leia scene from the original Star Wars - but as a live video feed!

Michael Bove's group at the MIT Media Lab has managed to create real 3D holograms in real time and has transmitted a reenactment of the Princess Leia scene from the original Star Wars. What is important to notice is that, although the resolution and frame rate are low, this is live 3D - not stored, as the hologram projected by R2-D2 was in the movie.

 


It's a great stunt, but don't miss its importance - this is realtime 3D holography, which means you can view it without glasses or any other gadgets, and you can move around and see behind objects in the scene. This is more than the flat 3D you get in movies.

The idea being developed here is very simple to understand but very difficult to implement. A hologram creates a true 3D image (no glasses or any other trickery needed) by recreating the wavefront of light that the real 3D scene would have created.

When you view a hologram you really are interacting with the light field that the original object would have created. The big problem is that creating the wavefront involves taking a parallel beam of light and passing it through an interference pattern that transforms it into the desired wavefront. The standard way of doing this is to record an interference pattern onto a photographic plate. The interference pattern is obtained from the original 3D scene, so what you have is essentially a 3D camera.
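
To make the recording step concrete, here is a minimal simulation (in Python with NumPy) of the intensity pattern a plate would record for the simplest possible scene - a single glowing point interfering with a plane reference wave. The wavelength, plate size and distances are illustrative assumptions, not the parameters of any real system:

```python
import numpy as np

# Illustrative sketch: the intensity pattern a photographic plate records
# when light scattered by a single object point interferes with a plane
# reference wave. All numbers are assumptions chosen for clarity.

wavelength = 633e-9             # red laser light, in metres
k = 2 * np.pi / wavelength      # wavenumber

# Sample a 10mm x 10mm plate on a 1000 x 1000 grid
xs = np.linspace(-5e-3, 5e-3, 1000)
X, Y = np.meshgrid(xs, xs)

# One point of the scene, 20cm in front of the plate
px, py, pz = 0.0, 0.0, 0.2
r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)

object_wave = np.exp(1j * k * r) / r    # spherical wave from the point
reference_wave = 1.0                    # plane wave at normal incidence

# The plate records only intensity; the spacing of the fringes encodes
# the point's depth, which is what makes the plate a "3D camera"
pattern = np.abs(reference_wave + object_wave) ** 2
```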

Unfortunately you can't easily turn this into a broadcast system because the resolution needed to record the interference patterns is too great, as is the exposure time. 

A better solution would be to create the interference patterns by computing them and then displaying them on a screen. The problem is that you have to invent a whole new type of screen with a very high resolution and the ability to change the "phase" of the light at each point of the screen. The display being used at MIT was developed by students of Stephen Benton, a pioneer of holographic imaging who died in 2003.
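
In outline, computing such a pattern can be done by brute force: superpose a spherical wavefront for every point in the scene and keep the phase of the result to drive the phase-modulating screen. The sketch below shows this textbook point-source method; it is an illustration of the principle, not the MIT group's actual, much more optimized, algorithm:

```python
import numpy as np

# A minimal sketch of computing a hologram rather than photographing one:
# sum a spherical wavefront for every scene point, then keep just the
# phase so a phase-modulating screen can reproduce the wavefront.
# This is the textbook point-source method, not MIT's algorithm.

def compute_phase_hologram(points, brightness, xs, ys, wavelength=633e-9):
    """points: (N, 3) scene points in metres; brightness: (N,) weights."""
    k = 2 * np.pi / wavelength
    X, Y = np.meshgrid(xs, ys)
    field = np.zeros(X.shape, dtype=complex)
    for (px, py, pz), b in zip(points, brightness):
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += b * np.exp(1j * k * r) / r   # superpose each point's wave
    return np.angle(field)   # phase pattern, one value per screen pixel
```

Even this naive version shows why the computation is so expensive - the cost grows as the number of scene points times the number of screen pixels, and a holographic screen needs vastly more pixels than an ordinary display.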

In many ways it is this display which is the most important part of the system. The team are working on something better, simpler and hopefully cheaper.

Where does the Kinect fit in?

The Kinect simply acts as a cheap, off-the-shelf 3D camera. It works out the 3D location of each pixel and, using this position and color information, the computers can work out the hologram - in real time. The Kinect data is fed to a laptop, which sends it to a PC with three GPU-based graphics cards that compute the interference patterns needed to create the wavefront. At the moment the computation only manages 15 frames per second, but with more work the team expects to get up to standard frame rates.
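
The first stage of that pipeline is easy to sketch: a Kinect depth frame is just a 640x480 grid of distances, and a standard pinhole back-projection converts it into the scene points the hologram computation needs. The intrinsics below are rough published figures for the original Kinect's depth camera, used purely for illustration:

```python
import numpy as np

# Sketch of the first pipeline stage: back-projecting a Kinect depth
# frame into the 3D points the hologram computation consumes. The
# intrinsics are rough figures for the original Kinect's depth camera,
# not calibrated values.

FX = FY = 585.0         # approximate focal length, in pixels
CX, CY = 320.0, 240.0   # principal point of the 640x480 depth image

def depth_to_points(depth_m):
    """depth_m: (480, 640) array of depths in metres, 0 where no reading."""
    v, u = np.indices(depth_m.shape)   # pixel row/column coordinates
    z = depth_m
    x = (u - CX) * z / FX              # pinhole camera back-projection
    y = (v - CY) * z / FY
    valid = z > 0                      # discard pixels with no depth reading
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)
```

The resulting points (with the matching color pixels) are what the GPUs then turn into interference patterns, frame after frame.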

More Information

How about holographic TV?

Michael Bove

MIT Media Lab

Getting Started with PC Kinect

 
