Google Search’s Video AI Lets Us Be Stupid

Now you can get answers to all the dumb questions you’re afraid to ask another person or find difficult to put into words for a traditional Google Search.

This week’s Google I/O keynote was a two-hour commercial for all the ways AI will extend into and permeate the company’s biggest software and applications. There were demos showing how existing features will be enhanced by Gemini, Google’s generative AI chatbot. But one of the more impressive examples was how it could enable Search to answer your questions about something while you’re shooting video of it.

This is the AI future my shame-fearing self wants when I don’t know the name of a seemingly obvious auto part or whether I should get a rash checked by a doctor.

On the other hand, I can’t ignore that its usefulness is amplified by how sharply the quality of Google Search has declined over the past few years. The company has practically invented a Band-Aid for a problem that keeps getting worse.

Onstage at Google I/O, Google Search VP of Product Rose Yao walked viewers through how it works. She used Google Lens to troubleshoot a record player, recording video as she asked aloud, “Why doesn’t this stay in place?”

Without naming the offending part (the tonearm, which carries the needle across the record), Yao forced Lens to use context clues and offer answers. Search’s AI gave a summary of what it thought the problem was (the arm’s balance), suggested a fix, identified the make and model of the turntable, and highlighted the sources of its information so she could search for additional answers.

Read more: Google is upping its AI game with Project Astra, AI Overviews and Gemini updates

Google’s demo of searching with video. (Google/Screenshot by CNET)

Yao explained that this process was made possible by a series of AI queries strung together into a seamless procedure. Natural language processing analyzed her spoken request, then the video was broken down frame by frame and fed into Gemini’s context window to identify the turntable and track the movement of the offending part. Search then scoured online forums, articles, and videos to find the best match for Yao’s video query (in this case, an article from audio equipment maker Audio-Technica).
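To make that sequence concrete, here’s a minimal sketch of the kind of chained pipeline Yao described. Google hasn’t published how its system is actually built, so every function name below (transcribe_audio, sample_frames, multimodal_answer, web_search) is a hypothetical stub invented for illustration; the sketch only shows the order of the steps, not a real API.

```python
# Hypothetical sketch of a chained video-question pipeline; not Google's code.
# All helpers are stubs so the example runs on its own.

def transcribe_audio(video_path: str) -> str:
    """Step 1: natural language processing turns the spoken request into text (stub)."""
    return "Why doesn't this stay in place?"

def sample_frames(video_path: str, every_nth: int = 10) -> list[str]:
    """Step 2: break the video into frames that fit a model's context window (stub)."""
    return [f"frame_{i}" for i in range(0, 100, every_nth)]

def multimodal_answer(question: str, frames: list[str]) -> dict:
    """Step 3: a multimodal model identifies the object and the likely problem (stub)."""
    return {"object": "Audio-Technica turntable", "issue": "tonearm balance"}

def web_search(query: str) -> list[str]:
    """Step 4: find forums, articles and videos that back up the diagnosis (stub)."""
    return [f"result for: {query}"]

def answer_video_question(video_path: str) -> dict:
    """Chain the steps together, the way the demo described them."""
    question = transcribe_audio(video_path)
    frames = sample_frames(video_path)
    diagnosis = multimodal_answer(question, frames)
    sources = web_search(f"{diagnosis['object']} {diagnosis['issue']} fix")
    return {"question": question, **diagnosis, "sources": sources}

if __name__ == "__main__":
    print(answer_video_question("turntable_clip.mp4"))
```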

Right now, you can do all of these things separately and arrive at more or less the same answer… eventually. You can point Google Lens at something and have it identify an object. You can also carefully phrase your problem and hope that someone else has asked something similar on Quora, Reddit, or elsewhere. You can even try searching for the brand of your turntable by trial and error to find the exact model so you can refine your search.

But assuming Gemini-powered Google Lens works as demonstrated, you’d be able to pull answers from the internet as quickly as we saw onstage at Google I/O, something you simply can’t do today. Perhaps more importantly, you’d get plenty of help asking sensitive, and possibly uncomfortable, questions.

Consider the possibilities. “What part of the car is that?” you might ask. “How often should I change these?” you might say, pointing at your bed sheets. “What’s the best way to clean this?” you might ask from your car as you point at the food stain on your shirt. “How do I turn this into a pitcher of margaritas?” you might ask a little too confidently, pointing at a counter covered in ingredients. Or, pointing at a part of your body that looks alarming, “Should I get this checked out?”

Rose Yao receives results on her phone screen from her Google Lens video and spoken question. (Screenshot by James Martin/CNET)

Google Lens, Search, and the AI tools behind them aren’t a substitute for expert knowledge or a medical opinion, so don’t treat them as a replacement for professional advice. But they can help you get over that painful first hurdle of figuring out what to search for. In the turntable example above, I needed to describe in text which part had the problem, so I searched for “anatomy of a turntable” to visually identify the part while writing this article.

Experienced internet searchers can take it from there. But Google Lens can cut down the friction of refining searches when you’re trying to fix a specific problem, which gets even harder when the problem is rare and the results are sparse. If it’s difficult to put the problem into search terms, and that frustration is compounded by embarrassment, you may abandon the search altogether.

So the Google Lens process, assuming it works widely enough for people to use it on things in real life, seems like a great tool for many of the simple questions you might otherwise have left unanswered. Heck, for those with severe anxiety, asking the faceless Google Lens for help instead of another human being could be a lifesaver.

And if Google Lens lets me ask which part of my engine is the oil cap without having to suffer the judgment of the mechanic I’ve been going to for years, so much the better.

Of course, these answers are only useful if they’re accurate. A Google I/O promotional video shared with the audience had another example of using Google Lens to get answers, this time about a faulty film camera. As The Verge noted, the responses provided by Search’s AI included opening the back plate, which would have exposed the film to daylight and destroyed the undeveloped roll.

If the company’s AI can’t avoid making harmful suggestions, it shouldn’t be recklessly summarizing online sources of information. Then again, maybe the reason I’m so intrigued by AI-summarized search results is that it’s getting harder and harder to find useful information online.

AI, Google’s search patch

The new and useful features of Google Lens are a reminder that information is harder to find on the internet these days, period. Search results are front-loaded with ads that look exactly like legitimate links, and after numerous algorithm tweaks over the years that mix up which results appear first, the overall quality of highlighted sites in the results looks much worse than in the past.

Amid these algorithm changes that reshape how sites get search traffic, the search ecosystem is suffering as sites turn to SEO tricks to rank their pages above competitors’ (full disclosure: CNET uses some SEO tricks). I’ve heard many friends sadly say that they add “Reddit” to every Google search for a better chance of getting their question answered.

In this reality, as manual searches yield less useful results every year, using AI to automatically parse the nonsense seems like the better choice. But for the search ecosystem, this seems like a temporary solution that is harmful in the long run. If enough people rely on AI to search for them, the sites dependent on that traffic will starve — and there will be no online answers for Google to send its AI to retrieve.

Editor’s note: CNET used an AI engine to help create several dozen stories, which were labeled accordingly. The note you’re reading is attached to articles that substantively cover the topic of AI but were created entirely by our expert editors and writers. For more, see our AI policy.


