Snapchat is upgrading its visual search features and putting them at the center of its app. The app is now rolling out changes it announced back in May during its Partner Summit event. The updates include more prominent placement of the “scan” feature — now located directly under the camera’s shutter button — and new capabilities that will suggest lenses and music based on your surroundings.
Snap has been experimenting with visual search, called “scan,” since 2019. The feature allows Snapchat users to identify plants and music, solve math problems, and scan food and wine labels with the in-app camera. But up until now, much of this functionality was easily overlooked, as it required a few extra taps to access. With the update now rolling out, “scan” is visible whenever the camera is open.
Snapchat’s also adding a few new features it previewed earlier this year, like the ability to shop for outfits by pointing the camera at articles of clothing. Another addition is Camera Shortcuts, which will suggest a combination of augmented reality lenses and music based on your surroundings. For example, pointing the camera at your pet may suggest AR lenses meant to work with dogs and music to go with your clip.
Though Snap has been working on its “scan” capabilities for some time, the fact that it’s now making the feature much more prominent underscores how big a priority it is for the company. Snap has also integrated scanning abilities into its latest AR Spectacles, which can similarly suggest lenses based on what’s around you (unlike previous versions of Spectacles, the newest ones aren’t for sale just yet). Visual search also helps Snap compete for creative talent with rivals like TikTok and Instagram (which also happens to be working on its own visual search feature). The company told The Verge that it’s working on adding camera shortcuts to Spotlight to make it easier for people to riff on other users’ clips.