Soon, depth sensors will be in your iPhone. In fact, in any mobile device you have: phone, tablet, laptop.
I’ve been hoping this would come true for years, but now it is an undeniable reality. A multitude of companies are talking about putting depth sensors into mobile phones and wearables, with products like the Structure Sensor, iSense, and Meta. And Apple just bought PrimeSense, makers of the Kinect depth sensor, for $345 million.
So what does a future with depth sensors in your iPhone look like?
So, we can safely say that our iPhones will eventually have depth sensors. But what will we do with them? Well, the things we usually do with our phones: take pictures and play games.
In Arto, we explore the future of photography with depth sensors. We use two depth sensors: one to capture 3D information about the world, and another to capture 3D hand gestures. This means you can “reach into your photograph” to edit it.
You can position virtual light sources. Instead of carrying around a light kit, photographers can just wave their hand around: move your hand left and the virtual light moves left, move your hand up and the light moves up, and so on.
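To make the idea concrete, here is a minimal sketch (in Python/NumPy) of how hand motion could be scaled into light motion. The ranges and the calibration origin are made-up illustrative values, not what Arto actually uses.

```python
# Sketch: scale tracked hand motion into virtual-light motion.
# The gesture-facing depth sensor is assumed to report a fingertip
# position in metres; the ranges below are made up for illustration.
import numpy as np

GESTURE_RANGE = np.array([0.3, 0.3, 0.3])  # metres of hand travel we listen to
LIGHT_RANGE   = np.array([2.0, 2.0, 2.0])  # metres of virtual-light travel

def hand_to_light_offset(fingertip_m, hand_origin_m):
    """Map hand displacement (from a calibration origin) to a light displacement."""
    t = (fingertip_m - hand_origin_m) / GESTURE_RANGE   # roughly -1..1
    return np.clip(t, -1.0, 1.0) * LIGHT_RANGE          # metres, in camera space
```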
We use a very simple lighting model, but we envision a future of photography with more sophisticated lights (like area light sources).
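For the curious, a “very simple lighting model” could look roughly like this: back-project the depth map into a point cloud, estimate normals from neighbouring points, and multiply the photo by a Lambertian term for a single virtual point light. The intrinsics and sign conventions below are illustrative assumptions, not Arto’s actual calibration.

```python
# Sketch: relight an RGB-D photo (rgb in 0..1, depth in metres) with a
# single virtual point light, using a plain Lambertian model.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Turn a depth map into an HxWx3 point cloud in camera space."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack([x, y, depth])

def normals_from_points(points):
    """Estimate per-pixel surface normals from neighbouring 3D points."""
    dx = np.gradient(points, axis=1)
    dy = np.gradient(points, axis=0)
    n = np.cross(dx, dy)
    return n / (np.linalg.norm(n, axis=2, keepdims=True) + 1e-8)

def relight(rgb, depth, light_pos, fx, fy, cx, cy):
    """Multiply the photo by a Lambertian term for a point light at light_pos."""
    points = backproject(depth, fx, fy, cx, cy)
    normals = normals_from_points(points)
    to_light = light_pos - points
    to_light /= np.linalg.norm(to_light, axis=2, keepdims=True) + 1e-8
    lambert = np.clip(np.sum(normals * to_light, axis=2), 0.0, 1.0)
    return np.clip(rgb * lambert[..., None], 0.0, 1.0)
```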
You can insert virtual objects into your photos. Like Justin Bieber, of course. And you can insert Bieber at his exact height of 5’7″. Because the photo has depth, you can reach around Justin and your hand correctly occludes him (to give him a hug).
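The occlusion is just a per-pixel depth test. A minimal sketch, assuming the virtual object arrives pre-rendered with its own colour, alpha, and depth buffers (from any off-screen renderer):

```python
# Sketch: composite a rendered virtual object into an RGB-D photo with
# correct occlusion. All images are float arrays; depths are in metres.
import numpy as np

def composite(photo_rgb, photo_depth, obj_rgb, obj_alpha, obj_depth):
    """Per-pixel depth test: the object only shows where it is nearer
    than the real scene, so your hand can reach 'around' it."""
    visible = (obj_alpha > 0) & (obj_depth < photo_depth)
    a = (obj_alpha * visible)[..., None]
    return photo_rgb * (1 - a) + obj_rgb * a
```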
You can edit the lens blur (depth-of-field) of your photo. By moving your hand backwards and forwards in depth, you change the depth of the focal plane and the aperture. This is all done in simulation, using the depth map to “fake” depth-of-field.
You know how your iPhone pics never seem to look as good as a professional photographer’s? A lot of that is due to the lack of lens blur. The tiny optics in your iPhone keep nearly everything in focus, but with a depth sensor we can fake a shallow depth-of-field. The end result: photos with buttery lens blur.
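A crude version of the simulation: compute a per-pixel circle-of-confusion from the depth map, then blend between progressively blurred copies of the photo. The formula and blur levels below are illustrative assumptions; real lens blur (and a careful implementation) needs extra work at occlusion edges.

```python
# Sketch: fake a shallow depth-of-field from a depth map.
# rgb is a float HxWx3 image in 0..1, depth is in metres.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_dof(rgb, depth, focal_depth, aperture, max_sigma=8.0):
    # Circle of confusion grows with distance from the focal plane.
    coc = np.clip(aperture * np.abs(depth - focal_depth) / np.maximum(depth, 1e-3), 0.0, 1.0)
    sigmas = np.linspace(0.0, max_sigma, 6)
    # Precompute blurred copies of the photo at each blur level.
    stack = [rgb] + [np.dstack([gaussian_filter(rgb[..., c], s) for c in range(3)])
                     for s in sigmas[1:]]
    stack = np.stack(stack)                         # (levels, H, W, 3)
    # Per pixel, interpolate between the two nearest blur levels.
    level = coc * (len(sigmas) - 1)
    lo = np.floor(level).astype(int)
    hi = np.minimum(lo + 1, len(sigmas) - 1)
    t = (level - lo)[..., None]
    rows, cols = np.indices(depth.shape)
    return stack[lo, rows, cols] * (1 - t) + stack[hi, rows, cols] * t
```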
With Instagram you can apply filters to images.
With Arto, you can select the foreground of the image simply by moving your hand through space, then apply Instagram-y filters to the foreground only, making it “pop out” of the image.
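A sketch of how that selection could work: threshold the depth map at the depth your hand indicates, feather the mask so the cut isn’t harsh, and blend in the filtered pixels only where the mask is on. The warm-tint “filter” here is a stand-in, not one of Arto’s filters.

```python
# Sketch: apply a filter only to the foreground, selected by a depth
# threshold swept with your hand. rgb is a float HxWx3 image in 0..1.
import numpy as np
from scipy.ndimage import gaussian_filter

def foreground_filter(rgb, depth, threshold_m, feather_px=5.0):
    mask = (depth < threshold_m).astype(float)   # 1 = foreground
    mask = gaussian_filter(mask, feather_px)     # soften the edge of the cut
    warm = np.clip(rgb * np.array([1.15, 1.05, 0.85]), 0.0, 1.0)  # toy filter
    return rgb * (1 - mask[..., None]) + warm * mask[..., None]
```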
Finally, you can capture photos of fast-moving subjects: pets, wild animals, children, sporting events, and so on. You simply place a “3D trigger” into the scene; if anything enters the trigger volume, your camera takes the photograph.
This means you can catch fast-moving objects at just the right moment (like these falling objects).
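A sketch of the trigger loop, treating the trigger as an image rectangle plus a depth range; `take_photo()` is a hypothetical callback, not part of any real camera API:

```python
# Sketch: a "3D trigger" as a rectangle in the image plus a depth range.
# Check each incoming depth frame; if enough pixels inside the rectangle
# fall within the depth range, fire the shutter.
import numpy as np

def trigger_fired(depth, x0, y0, x1, y1, near_m, far_m, min_pixels=50):
    roi = depth[y0:y1, x0:x1]
    hits = np.count_nonzero((roi > near_m) & (roi < far_m))
    return hits >= min_pixels

# Typical use (pseudocode): poll every depth frame and fire on entry.
# if trigger_fired(latest_depth, 200, 150, 260, 210, 0.8, 1.2):
#     take_photo()
```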
You may have noticed that the prototype is quite large (hence the DSLR and tripod). This project was actually done back in 2012, before things like the Structure Sensor (PrimeSense Capri) existed. It was also before things like Leap Motion, so we had to build our own finger-tracking library from scratch.
We are looking into updating the technology to try the interactions in a truly mobile form factor.
Whatcha think?
See the full paper here: Arto.pdf