Projects
Surface depth hallucination offers a simple, fast way to acquire albedo and depth for textured surfaces that exhibit mostly Lambertian reflectance. We obtain depth estimates entirely in image space, and from a single view, so there are no complications arising from registering the texture with the recovered depth.
The user simply takes two photos of a textured surface from an identical position, parallel to the surface: one under diffuse lighting conditions, as might be encountered on a cloudy day or in shadow, and the other with a flash (strobe). From these two images, together with a flash calibration image, we estimate an albedo map. We also estimate a shading image, primarily from the diffusely lit capture. We develop a model relating depth to shading, specifically tailored for textured surfaces with relatively little overall depth change. By applying this relationship over multiple scales to our shading image, we arrive at a per-pixel height field. Combining this height field with our albedo map gives a surface model that may be lit under any novel lighting condition and viewed from any direction. Provided we have a suitable exemplar model, our method can also work from a diffusely lit image alone by histogram matching it with the albedo and shading images of the exemplar model, further simplifying our data capture process.
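As a rough illustration of the pipeline described above, the sketch below separates a flash/no-flash pair into an albedo map and a shading image, then accumulates multi-scale shading detail into a height field. It is not our published implementation: the file names, the simple flash-minus-ambient albedo estimate, and the particular scales and weights are placeholders chosen for illustration only.

    import numpy as np
    import imageio.v2 as imageio
    from scipy.ndimage import gaussian_filter

    eps = 1e-6

    # Hypothetical input file names: a diffusely lit photo, a flash photo taken
    # from the same position, and a flash calibration image of a uniform surface.
    diffuse = imageio.imread("diffuse.png").astype(np.float64) / 255.0
    flash = imageio.imread("flash.png").astype(np.float64) / 255.0
    calib = imageio.imread("flash_calibration.png").astype(np.float64) / 255.0

    # Rough albedo estimate: the flash-only contribution, normalised by the
    # calibration image to remove the spatial falloff of the flash.
    flash_only = np.clip(flash - diffuse, 0.0, None)
    albedo = np.clip(flash_only / (calib + eps), 0.0, 1.0)

    # Shading image: the diffusely lit photo with the albedo divided out
    # (luminance only, since the height field is a single channel).
    def luminance(img):
        return img @ np.array([0.2126, 0.7152, 0.0722])

    shading = luminance(diffuse) / (luminance(albedo) + eps)

    # Multi-scale shading-to-depth: accumulate band-pass shading detail over
    # several scales, assuming darker shading corresponds to deeper points and
    # coarser scales contribute larger depth changes (scales/weights are made up).
    height = np.zeros_like(shading)
    for sigma, weight in [(2, 0.1), (8, 0.3), (32, 0.6)]:
        band = shading - gaussian_filter(shading, sigma)
        height += weight * band

    # Normalise to a per-pixel height field in [0, 1].
    height = (height - height.min()) / (height.max() - height.min() + eps)

    imageio.imwrite("albedo.png", (albedo * 255).astype(np.uint8))
    imageio.imwrite("height.png", (height * 255).astype(np.uint8))

The resulting albedo and height images correspond to the two components of the surface models offered for download below.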
We validated our approach through experimental studies and found that users believed our recovered surfaces to be plausible. Further, users found it difficult to reliably identify our synthetically relit images as fakes. Details of our method, and the results of our validation, were published at SIGGRAPH 2008 (PDF).
Videos
You can download these and other videos illustrating our method here:
Movie showing a relit scene with the viewpoint rotated.
.mov (54 MB file)
Movie showing the depth recovered for a rock wall using our method.
.mov (11.9 MB file)
Movie showing a recovered surface depicting Mayan Glyphs, rotated to show the shadows cast by our hallucinated depth map.
.mov (38.9 MB file)
Movie showing the Mayan Glyphs surface with uniform specularity added to the material model.
.mov (10.4 MB file)
Siggraph 2008 video.
.mov (137.8 MB)
Example Surfaces
Here we show a selection of relit images illustrating the wide variety of surfaces we used to test our method. Scaled versions of the full set of experimental stimuli used in our validation studies are available here. If you would like access to our high-resolution images, please contact me via email.
Brick
Brick with Leaves
Brick Walk
Doormat
Dry Stone Wall
Head Stone 1
Rock Wall
Wood Chips
Sample Models
A selection of our models are available here as gzipped tar archives. Each archive contains two files: a surface mesh in .obj format and a high-resolution albedo map in .tiff format. These models are provided free to use for any purpose, but we would appreciate an acknowledgment. A minimal loading sketch follows the list below.
Brick leaves model.
.tar.gz (48.6 MB)
Rock Wall model.
.tar.gz (27.5 MB)
Headstone model.
.tar.gz (25.5 MB)
Mayan Glyphs model.
.tar.gz (40.5 MB)
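For readers who want to inspect a model programmatically, the following sketch unpacks one of the archives and loads its mesh and albedo map. The archive and file names are placeholders (check the contents of whichever archive you download), and the trimesh and Pillow libraries are just one convenient choice for reading .obj and .tiff files.

    import tarfile
    import trimesh            # third-party: pip install trimesh
    from PIL import Image

    # Placeholder archive and file names; substitute those of the downloaded model.
    with tarfile.open("rock_wall_model.tar.gz", "r:gz") as tar:
        tar.extractall("rock_wall_model")

    mesh = trimesh.load("rock_wall_model/rock_wall.obj", force="mesh")   # surface mesh
    albedo = Image.open("rock_wall_model/rock_wall_albedo.tiff")         # albedo map

    print(len(mesh.vertices), "vertices,", len(mesh.faces), "faces,",
          albedo.size, "albedo resolution")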
News Stories
This work has been widely reported in the media. The following links are just a few of the stories that have appeared on the web: [New Scientist...] [Youtube video...] [Slashdot...] [Pressetext...]
Acknowledgments
Surface depth hallucination was developed as part of the Daedalus project, and funded by the UK-EPSRC under grant EP/D069734/1 (May 2006 - April 2009). We gratefully acknowledge the support of Kevin Cain and the Mayaskies project, for providing access to Chichén Itzá. We also thank Timo Kunkel for converting our SIGGRAPH video to a flash video.