Remember playing in a sandbox as a little kid, and imagining your Teenage Mutant Ninja Turtles being trapped in a lava stream from an apparently active volcano? Well now your dream can be a reality (minus the potentially hazardous lava).
There have been a number of magical sandbox solutions over the years. (Shown above is the UC Davis Augmented Reality Sandbox, discussed below.)
This concept originates way back in 2002 with Hiroshi Ishii’s group’s SandScape/Illuminating Clay project series (work done by Ben Piper, Yao Wang, et al.). In this original incarnation (see video below), which was wayyyy pre-Kinect, SandScape used a hacked Minolta Vivid-900 laser scanner. This massive scanner cost $40,000, and for that low, low price you got 320×240 depth values at a blazing-fast scan time of 1.2 seconds per frame. This was, at the time, referred to as “near real-time.”
Despite the archaic equipment of the time, Ishii’s group demonstrated an impressive projection-mapped sandbox for interactive visualization, planning, and simulation. They showcased landscape analysis functions that color-code the elevation, slope, shadows, orientation, and water flow of the topographic shape formed by the sand.
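To give a flavor of that kind of landscape analysis, here is a hedged sketch (my own illustration, not Ishii et al.'s code): given a scanned height map, compute slope from local gradients and color-code elevation with a simple color ramp.

```python
# Sketch of SandScape-style landscape analysis on a scanned height map.
# The height data here is random noise standing in for a real 320x240 scan.
import numpy as np

heights = np.random.default_rng(0).random((240, 320))  # stand-in depth scan

# Slope: magnitude of the local gradient of the height field.
dy, dx = np.gradient(heights)
slope = np.hypot(dx, dy)

# Elevation color-coding: normalize heights and map them onto a
# simple blue -> green -> brown ramp (an arbitrary choice of palette).
def elevation_color(h):
    t = (h - h.min()) / (h.max() - h.min() + 1e-9)  # normalize to [0, 1]
    r = np.clip(2.0 * t - 0.5, 0.0, 1.0)
    g = np.clip(1.5 * t, 0.0, 1.0)
    b = np.clip(1.0 - 2.0 * t, 0.0, 1.0)
    return np.stack([r, g, b], axis=-1)             # H x W x 3 RGB image

rgb = elevation_color(heights)
```

In the real systems, images like `rgb` are warped and projected back down onto the physical sand, so the coloring tracks the shape you sculpt.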
Disney's Magical Sandbox
Disney was one of the pioneers of projection mapping, so it is no surprise that they’ve built a kick-ass projection mapped sandbox. Originally premiering at D23 in 2009, the Disney sandbox featured a story of burying turtle eggs in the sand, then watching them hatch and swim out to a gloriously projection mapped sea. In fact, as a result of this project, Disney patented a method for guiding users to create a specified shape in a sandbox.
UC Davis Open-Source Sandbox
This brings us to the open-source Augmented Reality Sandbox of Oliver Kreylo’s group at UC Davis (the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences, http://www.keckcaves.org). Originally built for an NSF-funded project on informal science education, these AR sandboxes were set up as hands-on exhibits in science museums.
The Augmented Reality Sandbox uses a Microsoft Kinect sensor to scan the sandbox in real-time, fully realizing the original vision of Ishii’s group. The sandbox features a water-flow simulation based on the Saint-Venant shallow water equations, a depth-integrated version of the Navier-Stokes equations governing fluid flow; the simulation runs in the background using a set of GLSL shaders. Sounds sciency, right?
The sandbox hardware was built by project specialist Peter Gold of the UC Davis Department of Geology. The driving software is based on the Vrui VR development toolkit and the Kinect 3D video processing framework, and is available for download under the GNU General Public License.
There is even a company selling magical sandboxes, iSandBox, out of Russia. For only ~$3500 you can have your own Kinect-enabled, projection-mapped playland. The website is in Russian, so don’t ask me anything else about it… Check out their video though…