Snap Inc. on Wednesday announced the launch of Lens Studio 3.2, which will allow augmented reality creators and developers to build LiDAR-powered Lenses for the new iPhone 12 Pro.

Lens Studio is Snap’s AR creation platform. The new update will allow creators and developers to build Lenses for Snapchat that leverage the LiDAR Scanner on the iPhone 12 Pro and iPhone 12 Pro Max.

Apple on Tuesday launched its new range of iPhones, which includes the iPhone 12 Pro and iPhone 12 Pro Max. The Pro models are equipped with a LiDAR Scanner, which enables a more immersive AR experience.

“It allows Snapchat’s camera to see a metric-scale mesh of the scene, understanding the geometry and meaning of surfaces and objects,” explained an official release.

“This new level of scene understanding allows Lenses to interact realistically with the surrounding world,” it added.
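Snap has not published the internals of these Lenses, but the capability it describes maps onto ARKit’s LiDAR scene reconstruction, which any third-party developer can access. A minimal Swift sketch, assuming a LiDAR-equipped device and ARKit 3.5 or later (the class and method names here are illustrative, not Snap’s code):

```swift
import ARKit

// Illustrative only: request the LiDAR-based scene mesh and observe the
// metric-scale geometry ARKit reconstructs as the device scans the room.
final class SceneMeshController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Per-face classification labels surfaces (floor, wall, table, ...),
        // which is one way a Lens could attach meaning to the scene.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }
        session.delegate = self
        session.run(config)
    }

    // Mesh anchors arrive and refine continuously as more of the scene is mapped.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            // geometry.vertices and geometry.faces describe real surfaces in metres,
            // letting virtual content rest on, collide with and be hidden by them.
            print("Mesh anchor updated with \(geometry.faces.count) faces")
        }
    }
}
```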

Developers will be able to render thousands of AR objects in real time by leveraging Apple’s A14 Bionic chip and ARKit framework.
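Snap’s rendering pipeline itself is not public. As a rough illustration of what placing many lightweight objects against the LiDAR mesh looks like with Apple’s public frameworks, here is a hedged RealityKit sketch; the entity count, sizes and layout are arbitrary assumptions:

```swift
import ARKit
import RealityKit

// Rough sketch: spawn a grid of small entities and let RealityKit use the
// LiDAR-derived scene mesh for occlusion, so real objects hide virtual ones.
func populate(arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    let anchor = AnchorEntity(plane: .horizontal)
    let template = ModelEntity(mesh: .generateSphere(radius: 0.02),
                               materials: [SimpleMaterial(color: .cyan, isMetallic: false)])

    // Clone one template rather than rebuilding each mesh; cheap enough to
    // scale to large counts on recent hardware.
    for x in 0..<20 {
        for z in 0..<20 {
            let ball = template.clone(recursive: false)
            ball.position = [Float(x) * 0.05, 0, Float(z) * 0.05]
            anchor.addChild(ball)
        }
    }
    arView.scene.addAnchor(anchor)
}
```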

“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”

Developers can create and preview Lenses through a new interactive preview mode in Lens Studio 3.2. They can also test these LiDAR-powered Lenses in Snapchat on Apple’s latest iPad Pro, which also carries a LiDAR Scanner.
