Google Unveils Four Innovative AI Products But Has Yet To Answer Key Questions About The Future Of Responsible AI

Search engine giant Google has just rolled out four AI products, most of them aimed at mobile phones.

But while the launch generated plenty of hype, the company failed to answer some of the leading questions on people's minds about how AI can be used responsibly for a better future.

These include questions about how accurate AI-generated answers really are, and how Google plans to compensate the sources whose content its models use for training.

The first of the new features is Circle to Search, launching on the Pixel 8, Pixel 8 Pro, and Samsung Galaxy S24 series. When users spot something interesting while scrolling, they can trigger a Google search simply by circling it with a finger or highlighting it.


The results pop up at the bottom of the screen (Google's demo image shows a search for a "thrift flip"). They look much like classic Google Search results, with a long list of clickable links to choose from, but users who have opted into the SGE pilot will see AI-produced summaries instead.

The mere fact that the leading search engine is using SGE in its marketing images is a clear hint of how serious it is about building large language models into a host of its features. For a feature like this, success depends on whether a quick AI-generated summary is genuinely more appealing than scrolling through a list of links on a tiny screen.

The second feature rolling out today is called multi-search, and it's a major upgrade to Google Lens. Starting today, Lens users can take a photo of an object with their phone, or upload an existing image, and ask complex questions about it, much like they would with ChatGPT's vision feature.

The company gave the example of someone spotting an unfamiliar game at a yard sale. They simply snap a photo and ask how the game works, and an AI-generated explanation appears within seconds.
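Google hasn't published the code behind this Lens feature, but the photo-plus-question pattern it describes can be sketched with the company's publicly available Gemini API via the google-generativeai Python SDK. The model name, photo path, and prompt below are illustrative assumptions, not the actual Lens implementation.

# Illustrative sketch only: not Google Lens multi-search itself, just the same
# "photo plus question" pattern using the public Gemini API.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key, assumed

# Gemini's vision-capable model accepts an image and a text question together.
model = genai.GenerativeModel("gemini-pro-vision")

photo = Image.open("yard_sale_game.jpg")  # hypothetical photo of the game
question = "What game is this and how is it played?"

response = model.generate_content([photo, question])
print(response.text)  # a short AI-generated explanation, as described above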

The more effort Google puts into these impressive AI features, the harder it becomes to ignore the bigger questions that have troubled critics for so long. Yet the company appears to be dodging those questions and keeping users in the dark, despite how pressing the matter is.
When asked how accurate these AI-generated responses really are, a company representative said the firm strives to get things right, while acknowledging that some errors can still slip through the cracks.

The models comb through large amounts of data and survey the range of opinions they find there. After identifying a consensus, the model distills that information into just a few sentences. When there is no clear consensus on a given question, it becomes hard for the model to pick the right answer.

Still, the search giant maintains that its SGE results are meaningful, even if that assessment is only made internally. According to one spokesperson on a call, the feedback so far has been largely positive.

Clearly, the wave of new AI features keeps coming, and it shows no sign of slowing down. The third update released by the Android maker today comes from a partnership with South Korean tech giant Samsung.

Galaxy S24 users can now search with Gemini, the company's most capable AI model, while using Samsung's own apps and services on their phones. Features such as Notes, Voice Recorder, and the keyboard will use Gemini Pro to produce quick summaries: record a lecture with Voice Recorder, for instance, and you'll get a summary of its key points.
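Samsung's on-device integration isn't publicly documented, but the summarize-a-recording pattern it describes can be sketched with the same public Gemini API. The transcript placeholder and prompt below are made-up assumptions for illustration only.

# Illustrative sketch only: not Samsung's Voice Recorder integration, just the
# same summarize-a-transcript pattern via the public Gemini API.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key, assumed

model = genai.GenerativeModel("gemini-pro")  # text-only Gemini Pro model

transcript = "full lecture transcript from the voice recording goes here"
prompt = "Summarize the key points of this lecture in a few bullet points:\n" + transcript

response = model.generate_content(prompt)
print(response.text)  # concise summary of the recording's key lessons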

Last but not least, there's an AI announcement for Android Auto. Drivers will see summaries of long texts and busy group chats so they can stay in touch while on the road, along with suggested replies and other actions they can take without touching their devices.

For now, Google isn't answering questions about whether it plans to compensate the sources its models draw data from. It also declined to share figures on the traffic generated by its AI-based summaries.
