Google Assistant’s “Search Screen” feature, powered by Lens
Google has officially begun rolling out a new Google Assistant feature called Search Screen, powered by Google Lens. It is already quite useful: Google Lens lets you search with images, translate text, find similar products, and more. Now you can use Google Lens directly from Google Assistant.
The underlying functionality has existed for years, but the company is now replacing the "What is on my screen?" command with Search Screen. Although Google Lens was accessible through shortcuts, it was not a convenient option until now, and integrating it directly into Google Assistant should encourage more people to use it. Where users previously had to say "What is on my screen?", the direct integration makes the feature much easier and more streamlined.
Google Lens-Powered Google Assistant Feature Called Search Screen Lets Users Explore Their Screen’s Contents with Ease
Because the feature is built directly into Google Assistant, no voice command is needed, which makes it much easier to use. This direct integration is handy in many situations, since users can now launch Google Assistant with a long press of the power button and tap the option from there.
After launching Google Assistant, you will find the Search Screen option in the Google Assistant dialog bar, so you no longer need to capture or share a screenshot to Google Lens manually. When you use Search Screen, analysis may take a moment, but if it detects an article on your screen, you will also get a Read option, which will likely use the recently launched Google Reading Mode.
Google announced this feature in February, and the rollout has now officially begun. The Lens shortcut is starting to appear on Pixel devices, which should reduce reliance on "What is on My Screen." For now, "What is on My Screen" has not been removed, because the new shortcut does not yet trigger everywhere, making it unreliable on its own; eventually, though, it should appear across the board. Google also recently made some minor visual adjustments to Google Assistant, which will be refined further with this update.
With this feature, you can use Google Lens in various ways: analyzing what is on your current screen, translating, copying text, shopping, searching Google, and finding places. For now, it is exclusively available on Google Pixel phones running Google App v14.3.1 Beta, and it is expected to roll out to other OEM devices globally in the coming weeks.