Ranged image cameras are a recent technology that captures distance information as well as light. As software support develops, these cameras may come to have a big impact on the way photos are taken and manipulated for graphic design compositions.
A single 2D photograph that includes distance information becomes 2.5D. This is not full 3D, because the camera has only captured light and distance from a single perspective. Even so, future photo manipulation tools can take advantage of the depth information. Depth will help software determine more naturally which objects a scene contains, so pixel masking will become easier: every adjacent pixel at a similar depth is probably part of the same object. Combining existing 2D edge detection with the distance information will make automatic object selection faster and much more accurate, so objects can be moved, removed or replaced far more easily.
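As a rough sketch of the idea, the selection step could be as simple as growing a mask outward from a clicked pixel, accepting neighbours whose depth is close to the pixel they were reached from. Everything here is assumed for illustration: the depth map is a 2D NumPy array of distances, and the `tol` threshold is an arbitrary similarity cut-off.

```python
import numpy as np
from collections import deque

def depth_mask(depth, seed, tol=0.05):
    """Grow a selection mask from a seed pixel, adding 4-connected
    neighbours whose depth is within `tol` of the pixel they were
    reached from. `depth` is a 2D array; `seed` is (row, col)."""
    h, w = depth.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(depth[nr, nc] - depth[r, c]) < tol:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask
```

A real tool would combine this depth test with conventional edge detection so that two objects touching at the same distance are still separated.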
Depth information will also allow stretching in the Z dimension to create interesting perspective effects. By manipulating the depth values directly and recasting the perspective back into 2D space, rooms could appear longer, a car on a road could seem to accelerate, and a scene could be flattened like an old cartoon.
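One way to picture this is a simple pinhole re-projection: back-project each pixel using its recorded depth, scale the Z axis, and project again. The sketch below assumes a centred pinhole camera and a focal length in pixels, neither of which a real tool would hard-code.

```python
import numpy as np

def stretch_depth(depth, factor, focal=800.0):
    """Return the new pixel coordinates of each pixel after the scene's
    Z axis is stretched by `factor`, under a simple pinhole model."""
    h, w = depth.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    rows, cols = np.indices((h, w))
    # Back-project pixels to camera-space X, Y using the original depth.
    x = (cols - cx) * depth / focal
    y = (rows - cy) * depth / focal
    z = depth * factor          # stretch (factor > 1) or flatten (factor < 1)
    # Re-project with the modified depth; X and Y stay fixed in space,
    # so stretched objects drift toward the image centre, flattened ones away.
    new_cols = x * focal / z + cx
    new_rows = y * focal / z + cy
    return new_rows, new_cols
```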
The most interesting possibilities come when multiple ranged image cameras are used. A simple rig of two (or more) cameras starts to give true 3D capabilities: the rotation of objects, the angle of the viewport or the position of the camera itself could all be adjusted after the shot. With a little conversion, such data could be fed into 3D software or 3D painting programs.
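The conversion itself is not exotic. As a hedged sketch, each ranged image can be back-projected into a point cloud and transformed by its camera's pose; the clouds from the rig then simply concatenate. The pinhole model, focal length and 4x4 camera-to-world pose matrices are all assumptions for the example.

```python
import numpy as np

def to_point_cloud(depth, focal, pose):
    """Back-project one ranged image into world-space points.
    `pose` is a 4x4 camera-to-world matrix; `focal` is in pixels."""
    h, w = depth.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    rows, cols = np.indices((h, w))
    x = (cols - cx) * depth / focal
    y = (rows - cy) * depth / focal
    pts = np.stack([x, y, depth, np.ones_like(depth)], axis=-1).reshape(-1, 4)
    return (pts @ pose.T)[:, :3]

# Two cameras in a rig merge into one cloud a 3D package could import:
# cloud = np.vstack([to_point_cloud(d0, f, pose0),
#                    to_point_cloud(d1, f, pose1)])
```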
If objects are moved in 3D space then the lighting will look wrong. Fortunately, with both pixels and ranges available, it may be possible to guess the light sources and perhaps even automatically adjust the pixels as they are rotated to fix the lighting. Or the designer could leave the lighting incorrect on purpose, relying on the near undetectable incongruity of "wrong" lighting to have an attracting effect on the viewer.
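Guessing a light source from depth plus pixels is plausible because depth gives surface orientation. A very rough sketch, assuming a single distant light and a Lambertian surface (both simplifications I am introducing, not something the cameras guarantee), is to fit the light direction that best explains the observed intensities given per-pixel normals derived from the depth map.

```python
import numpy as np

def estimate_light_direction(normals, intensity):
    """Fit a single dominant light direction assuming intensity ~ n . l.
    `normals` is an (N, 3) array of unit surface normals (e.g. from depth
    gradients); `intensity` is an (N,) array of pixel brightnesses."""
    l, *_ = np.linalg.lstsq(normals, intensity, rcond=None)
    norm = np.linalg.norm(l)
    return l / norm if norm > 0 else l
```

With an estimated direction in hand, a rotated object's pixels could be re-shaded to roughly match the rest of the scene.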
Technologies like QuickTimeVR would move in a far more life-like manner if ranged image information were taken into account. Presently, zooming in QuickTimeVR tends to stretch objects that are near to the camera but off centre from the zoom. Using distance information would let QuickTimeVR handle these objects in a more natural way.
The possibilities for prosumer-level ranged imaging are enormous and could have a big impact on the way we edit images in the future.
(Read more articles in the Future Design Software series)