So, you know what the big news from this week is: the CNN holograms. “Help me Wolf Blitzer, you’re my only hope.” In class last night we talked a little about this, and how hard it would be to get one of these things to play with. Well, I contend, not as hard as it seems. Yes, the original system probably cost a bundle, in part because it is in HD. But couldn’t we make a cheap version for Ustream?
First step: you would need to get those 35 cameras. Let’s be cheap and cover, say, 300 degrees with 15 cameras. My guess is that the hop wouldn’t be too bad at 20 degrees between cameras. Remember, this is for web video, so let’s buy those $13 webcams (free shipping!), for a total of $195. We’d need a mega-USB hub to handle all these cameras; chances are a 13-port hub will work, if we can rely on a couple of built-in ports as well, for another $35 or so.
From here, it’s all about sets and software. You need to set up a “green room” with green screen all (or most) of the way around, except for pinholes for the webcams. I’m thinking something a little larger than a phone booth. You’d need to light that thing appropriately, which could be a little difficult, so for now we’ll just deal with a few lights on the top and the bottom, enough to make sure there’s no shadow on the background. This booth sits at the remote site where the “hollowgrammed” actor is located. The computer there needs a broadband connection, but nothing special, and a little program that will switch cameras according to a request from over the web.
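That camera-switching program could start as something very small. Here’s a minimal sketch of the selection logic, assuming the 15-camera, 300-degree arc described above (the function name and parameters are my own, hypothetical choices):

```python
def camera_for_angle(angle_deg, n_cameras=15, coverage_deg=300):
    """Map a requested viewing angle (0..coverage_deg) to a camera index.

    With 15 cameras over 300 degrees, each camera covers a 20-degree slice.
    """
    step = coverage_deg / n_cameras  # 20 degrees per camera
    # Clamp the request to the covered arc so we always return a valid camera.
    angle = max(0.0, min(angle_deg, coverage_deg - 1e-9))
    return int(angle // step)
```

Wrap that in any tiny web server (one URL parameter for the angle) and the booth computer knows which webcam to stream.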
The next part is the studio set-up. You could either do a green tube in a regular studio, or, for ease, do the whole thing as a virtual studio. Let’s go the difficult route. The computer doing the processing needs to be able to composite a video stream to Ustream… but folks are already doing this using stuff like CamTwist (free!). The computer on that end needs to know where the camera is, and where it is pointed. There are a few ways you could hack this out of, say, a Wii remote and an old mouse.
Once the studio computer knows where the camera is, and where it is pointed, it can (a) send a signal to the remote system to indicate which camera to use, (b) distort the image (scale and perspective) to simulate the viewing angle, (c) place this image into its location on the virtual, or real, set in the studio, and (d) transmit this video/audio stream to Ustream.
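Step (c) is really just chroma keying: wherever the remote frame is green, show the studio set instead. A rough sketch of that keying-and-compositing step, in NumPy, assuming RGB frames and a simple “green dominates” rule (the threshold and function name are my own assumptions, not CNN’s method):

```python
import numpy as np

def key_and_composite(frame, background, green_thresh=1.3):
    """Composite a green-screened subject over a virtual-set background.

    Pixels where the green channel exceeds green_thresh times both the red
    and blue channels are treated as green screen and replaced with the
    corresponding background pixels. Both arrays are HxWx3 RGB.
    """
    f = frame.astype(np.float32)
    is_green = (f[..., 1] > green_thresh * f[..., 0]) & \
               (f[..., 1] > green_thresh * f[..., 2])
    out = frame.copy()
    out[is_green] = background[is_green]
    return out
```

A real version would feather the mask edges and handle green spill, but this is the core of placing the remote actor “into” the set.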
Of course, it might not be as smooth as CNN’s system. And, to reduce potential lag, it would probably be smart for the local hologram booth to be transmitting more than a single image, say one from each camera on either side, to anticipate a quick-moving camera or an angle at the edge of a slice. But I think this could definitely be done.
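That prefetching idea is one line of logic: stream the active camera plus its immediate neighbors, clamped at the ends of the arc (again, a hypothetical sketch, my naming):

```python
def cameras_to_stream(active, n_cameras=15):
    """Active camera plus its immediate neighbors, to hide switch latency.

    If the studio camera swings quickly, the next frame it needs is already
    in flight from an adjacent booth camera.
    """
    return [i for i in (active - 1, active, active + 1)
            if 0 <= i < n_cameras]
```

Three low-res streams instead of one roughly triples the booth’s upload bandwidth, which is the trade-off for a smoother swing.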
So who wants to build it as their master’s project :).