Polarized 3D

By exploiting the polarization of light, researchers from MIT have been able to boost the resolution of 3D imaging devices up to 1,000 times. The researchers are calling the new system Polarized 3D, which they describe in a paper presented at the International Conference on Computer Vision.

For virtual reality (VR) and 3D printing, this technology will allow 3D objects to be scanned at far higher resolution, making imaged objects appear more lifelike in VR. The same fundamental idea applies to 3D printing, which relies on scans in order to produce items.

In an MIT News piece, Achuta Kadambi, PhD student in the MIT Media Lab and one of the system’s developers, explains the origins of the breakthrough, saying that “Today, they can miniaturize 3-D cameras to fit on cellphones, but they make compromises to the 3-D sensing, leading to very coarse recovery of geometry. That’s a natural application for polarization, because you can still use a low-quality sensor, and adding a polarizing filter gives you something that’s better than many machine-shop laser scanners.”

The nature of polarized light means that any measurement based on it offers two equally plausible hypotheses about the orientation of the reflecting surface, and deciding which makes better sense geometrically is a time-consuming computation. To overcome this, the researchers' experimental setup pairs a Microsoft Kinect, which gauges depth using reflection time, with a polarizing photographic lens placed in front of the Kinect's camera; the Kinect's coarse depth estimates break the tie between the two hypotheses, and a graphics processing unit keeps the computation fast.
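The disambiguation idea can be sketched in a few lines. This is a toy illustration, not the researchers' actual algorithm: it assumes polarization yields a surface-normal azimuth that is ambiguous by 180 degrees, and picks whichever candidate lies closer to the azimuth implied by the coarse depth map. The function and parameter names are hypothetical.

```python
import math

def disambiguate_azimuth(pol_azimuth, depth_azimuth):
    # Polarization measurements yield two equally plausible
    # surface-normal azimuths: phi and phi + pi (radians).
    # Choose the candidate closer, on the circle, to the azimuth
    # implied by the coarse depth estimate (e.g. from a Kinect).
    def ang_dist(a, b):
        d = abs(a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)

    candidates = (pol_azimuth % (2 * math.pi),
                  (pol_azimuth + math.pi) % (2 * math.pi))
    return min(candidates, key=lambda c: ang_dist(c, depth_azimuth))
```

Run per pixel over the whole image, this turns a coarse depth map plus ambiguous polarization cues into a consistent set of fine-grained surface normals.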

3D Printing and Self-Driving Cars

The new technique would allow manufacturers to build cellphones with high-quality 3D cameras, and even to use photos taken with those cameras to create sculptures using a 3D printer.

The technology could also be applied to autonomous vehicles. While self-driving cars are highly reliable under normal conditions, where the surroundings are well lit, their vision algorithms struggle to process rain, snow, or fog, because the air particles present during these conditions scatter light unpredictably and make it much harder for the system to interpret its surroundings.

The researchers showed that in milder conditions that still cause conventional vision algorithms to go haywire, their Polarized 3D system is able to interpret the scattered light waves. “Mitigating scattering in controlled scenes is a small step, but that’s something that I think will be a cool open problem,” says Kadambi.
