Point and shoot? No, point and interact. Snapchat can now help with your homework. The app’s camera is becoming the foundation of an augmented reality developer platform known as “Scan.” Snap today announced partnerships with Photomath to add the ability to solve math problems, and Giphy for detecting objects, which then spawn related GIFs onscreen. Scan will roll out to all Snapchat users soon, and developers interested in joining the platform can contact Snap.
Previously, Snapchat’s camera could identify songs with Shazam and recognize objects so you could buy them on Amazon. But now instead of just offering a few scattered tools, Snapchat is crystallizing its plan to let you reveal hidden information about the world around you.
“Our camera lets the natural light from our world penetrate the darkness of the internet . . . as we use the internet more and more in our daily lives, we need a way to make it a bit more human,” said Snap CEO Evan Spiegel at the company’s first-ever press event, the Snap Partner Summit. There the company also announced an ad network, Stories embedded in other apps and a real-time multiplayer games platform.
Others like Blippar have tried to build AR utility platforms, but they lacked the community and daily use needed to be top of mind when people want to scan something. Snapchat has both, as Spiegel revealed today: “In the United States, Snapchat now reaches nearly 75 percent of all 13 to 34-year-olds, and we reach 90 percent of 13 to 24-year-olds. In fact, we reach more 13 to 24-year-olds than Facebook or Instagram in the United States, the U.K., France, Canada and Australia.”
The comparison data comes from Facebook’s ad manager estimates, which aren’t always totally accurate. Still, the stats demonstrate that amongst the audience likely to explore the world via augmented reality, Snapchat is huge. Even if Facebook wanted to build this behavior, it can’t, because the Facebook Camera isn’t the heart of its social network.
When users tap and hold on the Snapchat camera, they’ll start to Scan their surroundings. Answers to math equations will magically appear. Point it at a $10 bill and Hamilton will come alive and sing a song from the musical. Scan a slice of pizza and a dancing Giphy pizza slice appears. Users will also see the new Snapchat AR Bar with dedicated buttons to Scan, create a Lens or explore the 400,000 AR Lenses created by Snapchat’s community. Indeed, 75 percent of Snap’s 186 million daily users play with Lenses each day, adding up to 15 billion total plays to date. Scan was built off the acquisition of a startup called Scan.me, which until now has powered Snap’s QR Snapcodes that let people add friends or unlock Lenses.
Outside of utility, Snapchat is also adding a slew of new creative AR features to keep that audience entertained and loyal. For example, it’s launching Landmarkers, which uses point cloud data from user-submitted Our Stories of major landmarks to power animated AR transformations of famous places. Now the Eiffel Tower, Buckingham Palace, LA’s Chinese Theater, DC’s Capitol Building and NYC’s Flatiron Building can spew rainbows, shoot lightning and more.
For developers and Lens creators using Snap’s Lens Studio tools, Snap is launching new Creator Profiles where they can show off all the Lenses they’ve contributed. They’ll all have access to new AR templates for hand, body and pet effects that take care of all the hardcore computer science. Creators just add their graphical assets like a mustache for dogs, fireballs that shoot out of people’s hands or rainbows that appear over someone when they hold their arms out.
Snap will even surface relevant community Lenses in the Lens Carousel based on what its Scans pick up. One place it falls short, though, is that there are no direct monetization opportunities for independent Lens creators, beyond Snap occasionally connecting the best AR artists with brands for paid Lens development deals. Snapchat admits it will need to create better incentives long-term.
At a big press briefing yesterday, the company’s top execs explained that growth isn’t Snapchat’s success metric anymore. That’s convenient, considering the launch of Instagram Stories cut Snap’s growth from 17 percent per quarter to actually losing users, with its user count only stabilizing this quarter. Instead, Spiegel says, deepening user engagement, and thereby the ad revenue users generate, is Snap’s path forward.
The more Snap gets users playing with augmented reality filters and the better development tools it provides, the more brands and devs will pay to promote their Lenses in the Lens Carousel or through video ads where users swipe up to try a Lens.
But that engagement is also critical to beating Facebook and Instagram to the next phase of AR. Instagram Stories might have 500 million daily users, but they’re mostly applying AR to their faces, not using it to interact with the world. Snapchat needs as many fun AR entertainment experiences like Landmarkers as possible to normalize AR exploration, which will unlock the potential of the Scan platform. That could one day fuel affiliate fees from AR commerce sales and other revenue streams.
Plus, Snapchat says Lenses are coded to be compatible not just with iOS and Android, but with future AR hardware platforms. To build the biggest repository of AR experiences, Snapchat needs help; as I wrote two years ago, Snap’s anti-developer attitude was an augmented liability. Now it’s finally building the tools and platform to harness a legion of developers to fill the physical world with imaginary wonder. “If we can show the right Lens in the right moment, we can inspire a whole new world of creativity,” concludes Snap co-founder Bobby Murphy.
Shiv has over eight years of experience working on the Internet of Things and is an avid drone user.