Snap's increasing focus on utility AR
"Augmented Reality grounds us with a view of the physical world, and brings all that's available in the digital world to us..."
This is how Bobby Murphy, co-founder of Snap, defined augmented reality (AR) a few months before the tech world turned its focus to the metaverse following Meta's announcement. Snap, Snapchat's parent company, has been quietly expanding its AR tools and experiences, making it easier for users to enhance their physical world with digital add-ons rather than create a virtual one from scratch.
At LensFest 2021, a festival organised by Snap to inspire and drive innovation in AR, the company announced that more than 2.5 million Lenses have been created since 2017 by 250,000 creators from 200 countries and territories around the world. Lenses, which work like image filters that one can see and interact with in real time on a smartphone or smartglasses, have been used for everything from taking funny pictures with virtual monkeys and shopping, to making the Eiffel Tower throw up rainbows.
With more than 75% of Snapchat users engaging with some form of augmented reality, the company has been steadily building a number of AR-related products as part of the 'Snap AR' ecosystem. The launch of Lens Studio in 2017 made it possible for millions of users and creators around the world to design, develop and use their own increasingly creative Lenses. Since then, Lens Studio has been constantly updated to reflect changes in AR technology, adding tools and functions that enhance the experience of Snapchat users.
Lenses for Utility and Education
The latest updates to Lens Studio have been the most fascinating: while most of the AR Lenses currently available on Snapchat are personal and informal, a growing number have real-world use or are educational in nature.
This concept is not a new one. I have previously written about how IKEA uses augmented reality to improve its customers' furniture-buying experience, and many other companies are adopting AR to deliver a better product to their customers. Snapchat, however, is in a better position than companies like IKEA, because its AR tools cater to a far wider range of uses than a single, specific task.
Snapchat's launch of 'Scan' in 2019 is the clearest proof of the company's growing interest in rolling out AR for everyday use. The 'Scan' feature contains what are called 'Utility Lenses', which essentially turn Snapchat into a visual search engine. It allows the user to identify dog breeds, plants, cars and sounds in real time, as well as to look up details on bottles of wine and the health facts of different products.
With the latest update of Lens Studio and some coding knowledge, one can now also create utilitarian Snapchat Lenses by connecting them to APIs (application programming interfaces), producing what look like visual mini-apps. Currently available APIs allow users to get real-time language translation, check weather data, follow the stock market, and track the prices of several cryptocurrencies.
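To give a flavour of the pattern such a utility Lens follows, here is a minimal sketch in plain TypeScript rather than Lens Studio's own scripting environment: call an external API, extract the fields you need, and format them into a short caption to render over the camera view. The endpoint and response shape below are hypothetical, not part of any Snap product.

```typescript
// Minimal sketch of the "fetch data, render a short overlay" pattern a
// utility Lens follows. Plain TypeScript, not Lens Studio's scripting API;
// the endpoint and response shape are hypothetical.

interface WeatherResponse {
  city: string;
  temperatureC: number;
  condition: string; // e.g. "Partly cloudy"
}

// Hypothetical weather endpoint; a real Lens would go through the API
// integration Snap provides rather than calling a URL directly.
const WEATHER_ENDPOINT = "https://api.example-weather.com/current";

async function fetchWeather(lat: number, lon: number): Promise<WeatherResponse> {
  const response = await fetch(`${WEATHER_ENDPOINT}?lat=${lat}&lon=${lon}`);
  if (!response.ok) {
    throw new Error(`Weather request failed: ${response.status}`);
  }
  return (await response.json()) as WeatherResponse;
}

// Turn the API response into the short caption an AR overlay would display.
function formatOverlay(weather: WeatherResponse): string {
  return `${weather.city} · ${weather.temperatureC}°C · ${weather.condition}`;
}

// Example usage: fetch the weather for a given GPS fix and log the caption
// that a Lens would draw on top of the camera view.
fetchWeather(48.8584, 2.2945)
  .then((weather) => console.log(formatOverlay(weather)))
  .catch((err) => console.error(err));
```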
Combining these with other available AR tools, such as 'Location Triggers', which create GPS-responsive Lenses that react to changes in one's location, lets users build AR experiences that enhance everyday activities. An example is the tweet below, where one user combined weather data with the 'Location Trigger' effect to create a Lens that shows the name and weather conditions of a city, together with the estimated time of arrival, while travelling on a moving train.
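The core idea behind a location trigger boils down to comparing the device's GPS fix against a target coordinate and firing the effect once the distance drops below a threshold. The sketch below illustrates that idea in plain TypeScript, with a hypothetical list of trigger points; it is not Snap's implementation.

```typescript
// Sketch of a location-triggered effect: compare the device's GPS fix against
// a set of target points and activate the matching effect when the user is
// close enough. Hypothetical trigger points, not Snap's implementation.

interface LocationTrigger {
  name: string;        // the city or landmark the Lens reacts to
  lat: number;
  lon: number;
  radiusMetres: number;
}

// Great-circle distance between two GPS coordinates (haversine formula).
function distanceMetres(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Return the first trigger whose radius contains the current position.
function activeTrigger(lat: number, lon: number, triggers: LocationTrigger[]): LocationTrigger | undefined {
  return triggers.find((t) => distanceMetres(lat, lon, t.lat, t.lon) <= t.radiusMetres);
}

// Example: a hypothetical trigger zone around the Eiffel Tower.
const triggers: LocationTrigger[] = [
  { name: "Eiffel Tower", lat: 48.8584, lon: 2.2945, radiusMetres: 300 },
];

const hit = activeTrigger(48.859, 2.295, triggers);
console.log(hit ? `Activate Lens for ${hit.name}` : "No location trigger in range");
```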
These tools have also been applied to Lenses that are more educational in nature, such as the Landmarker Lens, which allows users to transform landmark buildings in several cities around the world in real time. Snap has joined forces with organisations around the world to highlight the capabilities of this Lens, collaborating with the Great Barrier Reef Foundation in Australia, where AR was used to raise awareness of its mission to plant a million corals in 2021, and with the London Design Museum, where AR helped turn the museum itself into a living exhibition.
The AR ecosystem is also being pushed forward by hardware upgrades in mobile phones and smartglasses. Following the release of Apple's iPhone 12 Pro in 2020, Snap was among the first to put the smartphone's integrated LiDAR sensor to use, through LiDAR-powered Lenses on Snapchat. LiDAR determines the distance between a sensor and an object by firing a pulse of light at the object and measuring how long the light takes to bounce back. This makes an AR Lens look more authentic, because distances between the user and the objects in the field of view can be calculated more accurately.
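The underlying calculation is simple time-of-flight arithmetic: distance is half the round-trip time multiplied by the speed of light. The snippet below is a back-of-the-envelope illustration of that relationship, not how Snapchat or the iPhone's LiDAR pipeline is actually implemented.

```typescript
// Back-of-the-envelope illustration of LiDAR time-of-flight ranging:
// distance = (speed of light × round-trip time) / 2.
// This is the physics behind the sensor, not Snap's or Apple's pipeline.

const SPEED_OF_LIGHT = 299_792_458; // metres per second

// Convert a measured round-trip time (in nanoseconds) into a distance.
function timeOfFlightToDistance(roundTripNanoseconds: number): number {
  const roundTripSeconds = roundTripNanoseconds * 1e-9;
  return (SPEED_OF_LIGHT * roundTripSeconds) / 2; // halved: light travels out and back
}

// Example: a pulse returning after ~20 ns corresponds to an object ~3 m away,
// roughly the scale at which room-sized AR scenes are measured.
console.log(timeOfFlightToDistance(20).toFixed(2), "metres"); // ≈ 3.00 metres
```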
Many of these Lenses have also been made available on Spectacles, Snap's own smartglasses. Snap has taken a cautious approach with smartglasses, holding back from releasing the product for mass consumption. The three versions of Spectacles released so far have mostly been used by tech journalists and Extended Reality experts, who in turn give feedback to Snap on their experience. From this, we have been able to see how smartglasses and their experimental features are pushing the boundaries of what we currently know about AR.
Something I am also really looking forward to is the use of AR wearables to change the way we learn about and interact with history. This is not strictly a utilitarian use of AR, but the possibility of a quasi-time machine that overlays whatever period you want to learn about onto what you are currently seeing is too exciting to miss out on. An example of this was the experiment by Lauren Cason and 'Hashtag Our Stories', which overlaid original images of the Boston Marathon from the Library of Congress onto the marathon's route to create a real-time history lesson.
The future of 'everyday' AR
The above is a small part of what is possible with Snap AR's augmented reality ecosystem. There are many more tools, Lenses and effects that one can play with and combine to create AR for everyday use. In the coming years, augmented reality is predicted to grow substantially as part of the broader Extended Reality (XR) industry, which is expected to reach $300 billion by 2024, up from $30.7 billion in 2021. Snap won't be missing out on this growth, thanks to its increasing emphasis on AR and the market's growing acceptance of the technology.
Snap's own AR innovation lab, called 'GHOST', is propelling this interest by offering funding and expertise to developers who create Lenses and leverage Snap's AR tools. Snap is investing around $4.5 million in partners looking to innovate in AR through tools such as early-access versions of Lens Studio, and those partners in turn gain access to the millions of Snapchat users around the world. Utility and Education are two of the six tracks under which individuals and groups can apply for funding, with Snap seeking developers who tackle "day-to-day challenges" and "accelerate learning around the world".