Virtual Vaudeville is a prototype of the "Live Performance Simulation System," a fully generalizable system for simulating live performance events from any historical period. The project was funded principally by the NSF, with additional grants from the University of Georgia Research Foundation and the NEH. Seven universities were involved in the project, and I worked as the digital content and technical director directly under the principal investigator, Dr. David Saltz, at the University of Georgia.
 
The goal of the project was to create a real-time simulation of turn-of-the-century American vaudeville using state-of-the-art game engine technology. Users could view the performance from any vantage point, interact with audience members, react to performance events, and consult extensive contextual notes on nearly every aspect of the production. This real-time simulation would allow users to experience historical events in a way never before possible.

I established the methods for the project (800 animated spectators, plus the acts and theater, in the Gamebryo game engine); modeled and textured the Union Square Theater; created the Sandow mesh and some audience meshes; performed motion capture for, rigged, and animated Frank Bush; created his cloth-simulated coat and fake beard; created shaders; and animated cameras. Near the end of the grant we determined that we would not be able to complete the full real-time version of the project within the grant time frame, so we set about making pre-rendered and limited-scope real-time products, which I also programmed.
Products
Union Square Theater Fly Thru (requires Shockwave)
Explore the Union Square Theater as it looked in the late 19th century. The theater no longer exists, so we painstakingly recreated it from historical etchings, descriptions, floor plans, and photographs of other theaters from the period. Take a 3-D tour through the New York theater where some of the most famous performers of vaudeville appeared.
Unfortunately, the Performance Viewer app uses RTSP and the movies are no longer hosted, so this project is not available for viewing as intended. The app allowed a user to switch between any of eight perspectives at any time and read the extensive hypermedia notes to gain a richer understanding of the performance in its historical context. The movies below provide a sense of the content I produced for the project.
This is an edited clip of the "fly through" video that ran the full length of the Frank Bush act.
This is the "edited view" video for the full Frank Bush act
Comparison of real-time assets and our modified pre-rendered assets with hair and cloth simulation.
Documentation
Process
A video that explains our process of using motion capture for Virtual Vaudeville.
We employed a variety of setups and several actors in capturing the Frank Bush act, the Sandow act, and our "intelligent agent" audience members. Here are a few snapshots of that process.
 
We did marker-based facial capture and used the data to drive blendshapes through Filmbox (now called MotionBuilder). I then edited the animation and added additional expressions. I also set up driven textures for adding wrinkles.
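The general idea of driving blendshapes from marker data can be sketched in a few lines: each marker's displacement from its rest pose is normalized against a calibration range and clamped into a 0–1 blendshape weight. This is only an illustrative sketch with hypothetical names (marker names, calibration values, and function names are assumptions, not the actual MotionBuilder pipeline we used):

```python
def marker_to_weight(displacement, rest, full):
    """Normalize a marker displacement into a 0-1 blendshape weight."""
    if full == rest:
        return 0.0
    w = (displacement - rest) / (full - rest)
    return max(0.0, min(1.0, w))  # clamp into the valid weight range

def drive_blendshapes(frame_markers, calibration):
    """Convert one frame of marker displacements into blendshape weights.

    frame_markers: {marker_name: displacement this frame}
    calibration:   {marker_name: (rest_displacement, full_displacement)}
    """
    return {
        name: marker_to_weight(d, *calibration[name])
        for name, d in frame_markers.items()
        if name in calibration
    }

# Hypothetical frame: brow marker halfway raised, jaw marker fully open.
weights = drive_blendshapes(
    {"brow_L": 0.6, "jaw": 1.2},
    {"brow_L": (0.0, 1.0), "jaw": (0.2, 1.2)},
)
```

In practice the mapping was done inside Filmbox/MotionBuilder rather than in code like this, and the raw weights were then hand-edited per frame, as described above.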
Some examples of our reference video (actor George Contini) matched up with the driven and edited Frank Bush facial animation.
Academic Research
Published: