5 Future Scenarios for Google Lens

When David Pierce covered the debut of Google Lens for WIRED in 2017, he pegged the tool as “a long-term bet for the company and a platform for lots of use cases.” In the years that followed, as Google Lens outlasted other experimental forays, senior writer Lauren Goode wrote about the long journey toward perfecting visual search queries.

Five years in, Google Lens remains great for identifying weird plants and helping students power through their algebra homework. The longer it sticks around, the more it seems to encompass. Ever use reverse image search? That’s powered by Google Lens now. Was Lens software inside the augmented-reality glasses prototyped at Google I/O 2022? Unconfirmed, but quite possible.

At the company’s May presentation, Google CEO Sundar Pichai made the case for a world filled with augmented reality accessed without a smartphone. He said, “These AR capabilities are already useful on phones, and the magic will really come alive when you can use them in the real world without the technology getting in the way.”

Wanting to better understand what’s in store for Google Lens, I visited the company’s San Francisco office and sat down with Lou Wang, a director of product management at Google who has worked on Lens for years.

When asked how he balances creating quality features for the present with building toward the future, Wang sounds convinced that phones and desktop computers will keep dominating for now. “I’m personally really excited about glasses, but it’s going to take a while for that to scale up,” he says. “So, our focus is very much on smartphones, with the understanding that some of the things we’re talking about, like scene exploration, become more powerful when you don’t have to pull out your phone.”

After this conversation, I took a vacation to Yosemite and contemplated what’s in store as AR applications are layered over real-world spaces, even the places we visit to disconnect and experience the natural world. What follows are five future situations involving Google Lens. The predictions are illustrative, not all-encompassing.

Hiking Trails Guided by AR

It’s 2030, and you’re low-key surprised by how well the electric car auto-navigates the twisting mountain roads as you arrive at what’s left of Yosemite. The motor of the car’s air purification system whizzes incessantly on this smoky morning. As you look out the window, Google Lens draws outlines of cliff faces visible on clearer days.

After getting to camp and setting up a tent, you travel south toward the Mariposa Grove Trail. Around the hike’s halfway point, you notice a trail placard that reads, “Activate AR for Accurate Historical Recreations.” All right. A pulsating arrow floating in midair points you toward the fallen Wawona Tunnel Tree. As you turn the corner, all that exists is a small charred log, with a towering 3D model of the mammoth plant opaquely overlaid. You stand at a distance and watch pretend horses transport families through the tunnel. On the hike back to the car, you wonder why the men used to dress like that.

SEO Strategies Focused on Unique Images

You’re in Yosemite for a night to relax, sure. You’re also here to take a few striking, scenic product photos for a budding ecommerce store that sells custom climbing gear. As more people use product photos and short videos to shop with Lens and Lens-powered store search tools, this has become one of your favorite methods of search engine optimization. You unpack lighting gear for the shoot as the sun starts to set. The photos you capture will be recreated by rival businesses that use artificial intelligence programs in pursuit of the top shopping result for items like carabiners.

You think about your photographic touch as a human capturing that special essence. You wonder whether you’re foolish.

Late-Night Snacks Chosen by Algorithms

It’s well into the night as you finish the photoshoot. None of the campsite snacks you packed into the bear box—a relic from ecologically diverse times—looks tasty. You ask the car to go pick up snacks at the only nearby gas station open all night. It’s a 45-minute drive, one way. You doze off under the stars by the campfire pit filled with concrete.

A soft ringing noise coming from your glasses lets you know the car has arrived. A video feed sent by the car flashes in front of your face via Google Lens. As you home in on the different ice cream flavors offered, the AR software notices your gaze and circles the top three flavors based on previous late-night snacking purchases. Tonight, you want something new and say aloud at the campsite, “Most popular flavor near me.” A digital blue jay descends from the top of your altered plane of view and perches on a vanilla ice cream mixed with animal crackers. Almost too excited, you whisper, “Purchase. Confirm.” A small arm extends from the car and places the treat into a compact freezer.
