The more our Camera understands the spaces around us, the more creativity our AR creator community can unlock, positively impacting the world we all share.
Over the years, we’ve invested in technological building blocks to make the Snapchat Camera smarter and understand the world around it, from recognizing and tracking buildings, to real-time mapping and geometric and semantic scene understanding.
For example, in 2019, we empowered creators and developers to build AR anchored to a handful of iconic structures through a Lens Studio feature called Landmarkers. Over time, we expanded our collection of Landmarker templates to more than 30 sites, including the Gateway of India and Egypt’s Great Sphinx of Giza.
For our team, creating a new Landmarker template required collecting large amounts of imagery of each location, which took considerable time and resources. And for every Landmarker we added, there were thousands more that our AR community wanted to bring to life.
So we launched a new feature called Custom Landmarkers in Lens Studio, which lets AR creators and developers anchor Lenses to local places they care about – from statues to storefronts – to tell richer stories about their communities through AR. Now, any creator with a LiDAR-enabled phone can create their own Landmarker in a matter of minutes – taking a big step forward in democratizing AR.
Using a mobile device with a LiDAR Scanner, Lens Creators can unlock the “Custom Landmarker Creator” Lens to build a 3D representation of the area they’d like to turn into a Custom Landmarker. Once the scan is complete, the Lens generates a unique ID. Finally, in Lens Studio, creators can access the resulting 3D mesh by entering that unique ID code, which imports the mesh into Lens Studio so they can build an AR experience on top of it. Lens Creators have already used the new feature to build Custom Landmarkers, like Luke Hurd’s AR experience anchored to a monument of jazz legend Charlie Parker in Kansas City!
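To make the scan-to-editor handoff concrete, here is a toy sketch of how a short unique ID could map a captured mesh to cloud storage and back. Every name here is a hypothetical illustration for this post, not Snap’s actual API or storage design:

```python
# Hypothetical sketch of an ID-based scan handoff (not Snap's actual API).
import hashlib
import secrets

SCAN_STORE = {}  # scan_id -> serialized mesh bytes (stand-in for cloud storage)

def register_scan(mesh_bytes: bytes) -> str:
    """Store a captured mesh and return a short code a creator can type in."""
    # Salt with random bytes so repeated scans of the same area get distinct IDs.
    scan_id = hashlib.sha256(mesh_bytes + secrets.token_bytes(8)).hexdigest()[:8].upper()
    SCAN_STORE[scan_id] = mesh_bytes
    return scan_id

def fetch_scan(scan_id: str) -> bytes:
    """Resolve the code back to the mesh for import into the editor."""
    return SCAN_STORE[scan_id]
```

The short-code indirection is what lets the capture happen on a phone while the authoring happens later on a desktop, without the creator ever handling the mesh file directly.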
What happens behind the scenes when someone uses our Creator Lens to bring a new Custom Landmarker to life? We actually create not one, but two 3D representations of the world. First, we create a model to correctly identify and track the target Landmarker when Snapchatters unlock those experiences. We also capture the 3D structure of the Landmarker, and make it available for the Lens Creator so they can use it in their Lens (e.g. for occlusion or object placement).
For the first representation, we created our own mapping and tracking systems based on natural feature tracking. The mapping stage builds upon SLAM-based AR capabilities (e.g. ARKit and ARCore) to produce models capable of detecting and tracking the target’s visual features with high accuracy.
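Conceptually, natural feature tracking matches features detected in the live camera frame against features stored in the prebuilt model, and the matches feed a pose solver that localizes the camera relative to the landmark. A toy sketch of the matching step, using NumPy to brute-force-match binary feature descriptors by Hamming distance (illustrative only, not Snap’s production tracker):

```python
# Toy brute-force matcher for binary feature descriptors (illustrative only).
import numpy as np

def hamming_match(map_desc: np.ndarray, frame_desc: np.ndarray) -> np.ndarray:
    """For each live-frame descriptor, return the index of the closest
    map descriptor by Hamming distance.

    Both inputs are (count, 32) uint8 arrays: 256-bit descriptors packed
    into bytes, as ORB-style binary descriptors typically are.
    """
    # XOR every (frame, map) descriptor pair, then count differing bits.
    xor = map_desc[None, :, :] ^ frame_desc[:, None, :]      # (N, M, 32)
    dist = np.unpackbits(xor, axis=2).sum(axis=2)            # (N, M) bit counts
    return dist.argmin(axis=1)                               # best map index per frame descriptor
```

In a real system these matches would then go through outlier rejection and a perspective-n-point solve to recover the camera pose against the landmark.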
As for the second representation (the 3D structure), one option would have been to use photogrammetry to generate those structures. However, a much simpler alternative presented itself once we started to see phones with built-in LiDAR sensors come to market. LiDAR sensors emit invisible light into the world and understand structure based on how long it takes for the light to bounce back. We immediately added support for LiDAR-based 3D scanning with the World Mesh feature (which was featured in Apple’s iPhone 12 Pro keynote!), and then proceeded to combine the resulting mesh with our models to create our first prototypes for Custom Landmarkers. Meanwhile, Lens Creators and developers are still able to use photogrammetry (e.g. using Polycam or Apple’s ObjectCapture) or any other methods to obtain a more accurate geometry of the location and use that to supplement the LiDAR-based one we provide.
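The time-of-flight principle behind LiDAR reduces to a one-line formula: distance is the speed of light times the round-trip time, halved. A minimal sketch (a textbook illustration, not Snap’s sensor pipeline):

```python
# Time-of-flight distance estimation, the principle behind LiDAR ranging.
C = 299_792_458.0  # speed of light in m/s

def lidar_distance(round_trip_s: float) -> float:
    """Distance to a surface given a light pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return C * round_trip_s / 2.0

# A pulse returning after ~33.4 nanoseconds corresponds to roughly 5 meters.
print(lidar_distance(33.356e-9))
```

Because the timescales are nanoseconds at room scale, depth accuracy depends on very precise pulse timing; a dense grid of such measurements is what the World Mesh feature fuses into the 3D mesh creators receive.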
The next challenge was developing an accessible user experience around the Landmarker scanning process for our vast Lens Creator community. We streamlined a complex process into simple steps, replacing text directions with imagery and animations where possible, and providing fun, easy-to-understand ways to test a scan before finalizing it. We pulled ideas from everyone on the team, and experimented with several iterations of the user experience until we settled on a prototype that achieved strong results in internal team testing.
What really helped us turn this into one of the best-in-class scanning tools was our focus on multiple rounds of user testing. From our external testing process, we learned that we could improve how we explain the second phase of the scanning process with some creative 3D visualizations, which ultimately became the Perspective Capture process. After the second round of user testing confirmed a high level of satisfaction with the entire scanning process, we were ready to release the Lens!
Custom Landmarkers first became available to members of the Snap Lens Network, who receive early access to our most advanced AR tools before they’re released publicly. Lens Creators in the Snap Lens Network, including Qreal, Luke Hurd, Pradeepa Anandhi, and BLNK, have already built Custom Landmarkers that are entertaining, educational, and that help businesses grow. And now that Custom Landmarkers are available to everyone in our Lens Creator community, you can check out the Custom Landmarker guide if you want to try this out yourself.
We’ve only just scratched the surface of linking physical locations to AR experiences through the Snap AR platform. Next, we’re exploring how we can empower a wider set of Lens Creators to build with Custom Landmarkers by enabling creation on phones without LiDAR Scanners through our World Mesh capabilities. We’re also working on bringing Custom Landmarker capabilities to our next-generation Spectacles. Ultimately, we're building towards a future where people will be able to view and interact with contextually relevant AR wherever they go.