Posted on October 22, 2020 at 4:00 pm


Google Lens Can Now Recognise 15 Billion Things

Google Lens, which can search what you see using your camera, can now recognise 15 billion things, up from one billion just two years ago, helping users identify plants, animals and landmarks, among other things.


The search engine giant announced new ways users can use Google Lens and augmented reality (AR) while learning and shopping, as part of its SearchOn event on Thursday.

If you are learning a new language, Lens can translate more than 100 languages, such as Spanish and Arabic, and you can tap to hear words and sentences pronounced out loud.

From the search bar in the Google app on Android and iOS, you can use Lens to get help on a homework problem.

With step-by-step guides and videos, you can learn and understand the foundational concepts needed to solve math, chemistry, biology and physics problems.

Another area where the camera can be helpful is shopping, especially when what you are looking for is hard to describe in words.

With Lens, you can already search for a product by taking a photo or screenshot.

Lens uses Style Engine technology, which combines the world’s largest database of products with millions of style images.

It then pattern-matches to understand concepts like “ruffle sleeves” or “vintage denim” and how they pair with different apparel.

Google also announced that those in the market for a new car will soon be able to search for it on Google and see an AR model of it right in front of them.

“You can easily check out what the car looks like in different colours, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway,” said Aparna Chennapragada, a Google vice president.

“We’re experimenting with this feature in the US and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon,” she added.

For more updates from the world of technology, stay tuned to UrbanAsian.
