Google Lens was one of the main highlights of Google’s developer conference keynote last year. The tech giant promised to make online search as effortless as pointing a smartphone camera at whatever you want to know more about. Since then, Google has released only a handful of announcements about it; in fact, Google Lens was largely pushed aside in the latter part of 2018 in favor of other Google updates.
The good news is that Google Lens’s beta period is finally over.
A Closer Look
Artificial intelligence is woven into practically everything Google makes, and that is strikingly obvious with Google Lens. The system behind the feature fuses Google’s most advanced work in computer vision with the natural language processing that powers Google Search. Google Lens is essentially the company’s attempt to see, understand, and augment the real world.
The Lens lives in the camera viewfinder of Google-powered software such as Google Assistant, and inside the native camera app of top-tier Android phones. Google Lens can recognize virtually anything a human can: people, animals, objects, environments, street signs, restaurant menus, screens, books, and so on. From there, Google draws on the extensive data behind Google Search to surface actionable information, such as purchase links for products or the Wikipedia entry for a well-known public figure.
The main objective of Google Lens is to give users context about their surroundings and the objects within them. Beyond that, Google Lens can perform a variety of remarkable, time-saving, and even potentially life-changing actions (a rough code sketch of the underlying recognition capabilities follows this list), like:
- Real-time search – When you pan your camera around, Google Lens can identify practically everything it sees. The Lens will inform you about the flowers and plants nearby, the breed of the dog playing with a toddler in your periphery, and the book the stranger near you is reading.
- Smart Text lookup – Whenever you highlight text using Google Lens, you can look up its meaning via Google Assistant. This makes it easy to define unfamiliar words or learn how to use them in a sentence.
- Copy-paste – The Smart Text Selection feature of Google Lens enables you to highlight real-world text, and then copy it to use directly on your smartphone. For example, you can point your phone at a Wi-Fi password, and then copy-paste the code straight into the Wi-Fi login page in your phone’s browser.
- Clothing/Decor recognition – If you’re out and about and see a piece of clothing or decor that you like, you can easily identify what brand it is using Google Lens. Furthermore, the Lens will provide you with similar articles of clothing or furnishings complete with customer reviews and shopping choices.
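Google does not expose Lens itself as a developer API, but a rough sense of how these features work can be sketched with Google’s public Cloud Vision API, which offers the same family of label and text detection models. The snippet below is a minimal illustration, not the actual Lens implementation; it assumes a Google Cloud project with the Vision API enabled, application default credentials, and a hypothetical sample image named storefront.jpg.

```python
# A minimal sketch of Lens-style recognition using the public Cloud Vision API.
# This is NOT the Google Lens API itself -- Lens is a consumer feature -- but
# it relies on the same family of vision models.
from google.cloud import vision


def describe_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Label detection: roughly the "what am I looking at?" part of real-time search.
    labels = client.label_detection(image=image).label_annotations
    print("Objects/scenes:", [label.description for label in labels[:5]])

    # Text detection: the basis for smart text selection and copy-paste,
    # e.g. lifting a Wi-Fi password off a printed card.
    texts = client.text_detection(image=image).text_annotations
    if texts:
        print("Detected text:", texts[0].description)


describe_photo("storefront.jpg")  # hypothetical sample image
```

In this sketch, label detection stands in for the “what is this?” side of real-time search, while text detection underpins features like smart text lookup and copy-paste.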
Pixel 2 and Pixel 2 XL owners already have Google Lens on their phones, and the feature is set to go live for them very soon. In the coming weeks, it will also be added to the camera apps of Android smartphones from Sony, Asus, LG, Xiaomi, OnePlus, and more.