Google expands Search Generative Experience testing to include users who didn't opt in to see the experimental feature.
If you're in the US, you might see a new shaded section at the top of your Google Search results with a summary answering your query, along with links for more information. That section, generated by Google's generative AI technology, used to appear only if you'd opted into the Search Generative Experience (SGE) on the Search Labs platform. Now, according to Search Engine Land, Google has started adding the experience to a "subset of queries, on a small percentage of search traffic in the US." That's why you could be getting Google's experimental AI-generated section even if you haven't switched it on.
Google Search was so good when it came out, the complete polar opposite of the cluttered and bloated Yahoo Search. I haven't really used it for years now, though, because the search results became worse and worse, especially when that rounded-edge theme came along.
It's useless for finding new info; I mostly use it to search for things I already know or have seen but don't want to bother tracking down via URLs or bookmarks.
Even then, I have to scroll to the middle of the page to get to the actual results below all the sponsored crap.
No clutter meant faster loading times, and that was important at the time. Nowadays you can just type the search query into the address bar, but that wasn't available back then. Initially you didn't even have one of those extra toolbars with a little search box, so loading the search page was the only way. If you do something like 50 searches a day, those seconds spent waiting for the page to load really begin to add up.
Isn't it the training of the models that's the most energy intensive? Generating some text in answer to a question is probably not super intensive. Caveat: I know nothing.
Yes, training is the most expensive part, but inference still costs an additional trillion or so floating-point operations per generated token of output. That's not nothing computationally.
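For a rough sense of where that "trillion or so" figure comes from: a common rule of thumb is that a dense transformer does on the order of 2 × N FLOPs of forward-pass work per generated token, where N is the parameter count. The model sizes below are illustrative assumptions, not figures for any specific product.

```python
# Back-of-envelope estimate of inference cost per generated token.
# Rule of thumb: forward pass of a dense transformer ~= 2 * N FLOPs
# per token, where N is the number of parameters (an approximation
# that ignores attention and other overheads).

def flops_per_token(params: float) -> float:
    """Approximate forward-pass FLOPs per generated token."""
    return 2 * params

# Hypothetical model sizes, purely for illustration:
for name, params in [("7B model", 7e9), ("70B model", 70e9), ("500B model", 500e9)]:
    tflops = flops_per_token(params) / 1e12
    print(f"{name}: ~{tflops:.2f} trillion FLOPs per token")
```

By this estimate a model in the hundreds of billions of parameters lands right around a trillion FLOPs per token, which is where the figure above comes from; smaller models are proportionally cheaper.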
oh that's that same shit that bing does that ends up filling the top quarter of my search results page with useless chatGPT garbage that doesn't help my search query
(both my employer and my school have forced edge+bing as the standard browser and it makes me want to die)
As someone in IT I get an employer enforcing Edge (I don't do that, but I understand why an IT department might), but why would anyone enforce a specific search engine? That seems bonkers to me.
Well, it's the system default, and while you can change it during each session or manually browse to Google/DDG if you want, it will always reset the next time you log in. I'm incredibly lazy and 99% of the time will smash my quick search into the omnibar and end up stuck with it until I eventually get mad enough at Bing to keep a tab open with Google.
Normal search results are already littered with useless AI-generated, SEO-optimized crap. It's gotten to the point where sometimes it's quicker to learn the knowledge you seek the old-fashioned way: by reading books.
I just assume that these people never have any problems that they have to solve personally. Otherwise they would be frustrated by the inability to find necessary information. They are either rich or children or both.
I wish that was the case but sadly most of them are basically Bing or Google frontends or belong to entities that I trust even less. As far as I can tell there are very few independent crawls out there.
This may actually be a net improvement to the Google Search experience, since the engine is borderline unusable without uBlock Origin. But it also feels weird that Google would make an AI-generated summary the focal point and not the entire rows of sponsored ads that litter all search results.
How did the big tech industry get this terminally stupid?
Almost every time I ask a direct question, the two AI answers directly contradict each other. Yesterday I asked if vinegar cuts grease. I received explanations both for why it's an excellent grease cutter and for why it isn't, because it's an acid.
I think this will be a major issue with AI. Just because it was trained on a huge wealth of knowledge doesn't mean that it was trained on correct knowledge.
Just because it was trained on a huge wealth of knowledge doesn't mean that it was trained on correct knowledge.
Which makes its correct answers and its confidently wrong answers look equally plausible. One needs to apply real intelligence to determine which to trust, making the AI tool mostly useless.
I don't see any reason that being trained on writing informed by correct knowledge would cause it to be correct frequently, unless you're expecting it to just lift sentences verbatim from the training data.
It should be illegal to force people to use generative AI for things it isn't needed for.
Seeing Microsoft's plans to add AI to Windows was the last straw that made me switch to Linux.
Ecosia added an AI chat which I think runs on the same thing as Copilot. I don't see the point, though, and would like to be able to hide the AI chat option.
Technically, if you submit a query to a search engine, you do so because you want an answer to a question in the best way possible without having to do too much digging.
So does it matter if it uses AI to help you? I say it's a great feature.
I'm searching to get specific information, and good information. I've seen LLMs make shit up and be wrong enough times for me not to trust them. I'd rather turn that feature off.
No. A lot of times I'm looking to compare many answers. I'll give you an example.
If I want to look for interesting barbecue rubs that I haven't tried before I'll query a search engine. Historically (not so much recently) Google has been better at searching through forums than a direct forum search. So I can check many different sources for the ratios people are using and make my decision.
Google's half-baked AI is really terrible right now. It has a memory of about two answers, barely understands context, and hallucinates more often than either Copilot or ChatGPT.
Now I'm looking for a coffee rub and it's giving me injection advice (happened when I tested Gemini), it gets barbecue styles mixed up, doesn't follow dietary restrictions that are explicitly stated, and will give you recipes for the wrong cut and type of meat.
It's not ready, and anyone trusting it for an answer to a question is going to have a bad time. If you have to verify it by checking a bunch of links anyway, then it's not just worthless; it's making search take longer and eating up screen real estate.
We're in the technology sub. People here are old enough to know how to Google (old forums, preferably Reddit, as Lemmy is absent), they don't know how to use an AI effectively (just look at how they're trying to justify that). Don't worry about the downvotes and their nonsense responses. Those are the same people who microwave their water instead of using an electric kettle.
Lol. The generated result is incomplete and slower than the rest of the search. I usually scroll past it because it's not done generating; if it does generate fast enough, it's usually too vague or broad.