Snap Inc on Wednesday announced the launch of Lens Studio 3.2, which allows augmented reality creators and developers to build LiDAR-powered Lenses for the new iPhone 12 Pro.
Lens Studio is Snap’s AR creation platform. The new update will allow creators and developers to build lenses for Snapchat that leverage the LiDAR Scanner on iPhone 12 Pro and iPhone 12 Pro Max.
Apple on Tuesday launched its new range of iPhones which include the iPhone 12 Pro and iPhone 12 Pro Max. The Pro models are equipped with LiDAR scanners which enable a more immersive AR experience.
“It allows Snapchat’s camera to see a metric scale mesh of the scene, understanding the geometry and meaning of surfaces and objects,” explained an official release.
“This new level of scene understanding allows Lenses to interact realistically with the surrounding world,” it added.
Developers will be able to render thousands of AR objects in real time, leveraging Apple’s A14 Bionic chip and ARKit framework.
“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”
Developers can create and preview lenses through a new interactive preview mode in Lens Studio 3.2. They can also test out these LiDAR-powered Lenses on Snapchat on Apple’s latest iPad Pro.