
When we tickle our artistic sides, playing with a camera lens's focus and depth of field can be a form of aesthetic expression. But for more practical uses -- say, filming a multi-layered scene like a concert, where subjects sit at many different depths -- it would be advantageous to capture the entire scene in perfect focus.

A researcher in Toronto claims he's created an omni-focus camera that does exactly that.

The Omni-Focus Video Camera is based on a novel distance-mapping principle that allows the camera to capture both near- and far-field images in real-time, high-resolution focus. It does so by employing an array of color video cameras, all focused at different distances.

The Divergence-ratio Axi-vision Camera -- the key component in the new video rig -- then maps the distance to each pixel in the scene, while software stitches a composite image together from whichever camera has each pixel in sharpest focus, creating omni-focused video in real time.
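The team's actual software isn't public, so the stitching step is anyone's guess, but the basic idea -- for every pixel, pick the camera whose focal distance best matches that pixel's measured depth -- can be sketched roughly like this (the function and variable names below are our own illustrative placeholders, not the researchers' code):

```python
import numpy as np

def composite_omnifocus(frames, focal_distances, depth_map):
    """Rough sketch of depth-keyed focus compositing.

    frames          -- list of H x W x 3 images, one per camera, each camera
                       focused at a different distance
    focal_distances -- the distance (same units as depth_map) at which each
                       of those cameras is focused
    depth_map       -- H x W array of per-pixel distances, the kind of data
                       the distance-mapping camera in the rig provides
    """
    frames = np.stack(frames)            # shape (N, H, W, 3)
    focal = np.asarray(focal_distances)  # shape (N,)

    # For every pixel, find the camera whose focal distance is closest to
    # that pixel's measured depth -- i.e. the camera holding it in sharpest focus.
    diff = np.abs(depth_map[None, :, :] - focal[:, None, None])  # (N, H, W)
    best = np.argmin(diff, axis=0)       # (H, W) index of the best-focused camera

    # Assemble the composite by pulling each pixel from its best frame.
    rows, cols = np.indices(best.shape)
    return frames[best, rows, cols]


if __name__ == "__main__":
    # Toy example: two 2x2 "cameras" focused at 1 m and 5 m, and a depth
    # map whose left column is near and right column is far.
    near = np.full((2, 2, 3), 10, dtype=np.uint8)   # frame from the near-focused camera
    far = np.full((2, 2, 3), 200, dtype=np.uint8)   # frame from the far-focused camera
    depth = np.array([[0.8, 6.0],
                      [1.2, 4.5]])
    print(composite_omnifocus([near, far], [1.0, 5.0], depth))
```

A real rig would also have to register the frames against one another to account for parallax between the lenses before compositing; this sketch assumes perfectly aligned images.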

The ability to bring an entire scene into focus regardless of its depth could vastly impact the commercial film industry and A/V hobbyists alike, but the inventors also see it making waves in defense, security, and medicine. Security cameras could capture large swaths of real estate with unprecedented clarity, and doctors using laparoscopes for less-invasive surgeries could better see what's going on inside a patient without constantly adjusting their optics.

But we're still a bit fuzzy on the details behind the tech; physically speaking, two different cameras cannot capture identical images because they cannot occupy the exact same space at the exact same time, so each lens sees the scene from a slightly different angle. As such, it seems like compensating for that parallax during compositing would slightly distort or dull the images, especially as the software tries to do all of this in real time.

That being said, if the Omni-Focus Video Camera works as well as the University of Toronto team says it does, we want one.

Read more at PopSci.com.