Ask HN: Are AI language models making search engines unusable for you too?

Are you sure your browser doesn’t have some sort of weird extension installed that is overwriting the results? I have sometimes seen malware installed on people’s computers that hijacks results.

It’s very subjective, but I feel this too. It’s as if, in adding ML, search has sacrificed precision.

For example, adding ‘-‘ no longer seems to work for excluding terms. Yesterday I searched for “digital agency” and then “digital agency -marketing”. The latter had 3x more results than the former.

Google can be great when I don’t quite know the right term for what I’m looking for, but often when I need something very specific I get nowhere.

The - symbol does still work; I think you need to place it outside of the quotes, though.

I just got 29m results for

“digital agency”

And 14m results for “digital agency” -marketing

I feel that. When I want the answer to something, I add “reddit” to the end of the question or query. I don’t use reddit a lot, but I have found that you usually get much better, non-paid answers there. So much of today’s internet is just advertising in disguise.

Usually when it comes to questions or comparisons with products, at least with reddit it’s more likely to be akin to a customer review than some random article where who knows who paid the author to tell you X is better than Y.

I do this too and also sometimes substitute “reddit” with “forum” if I’m researching DIY projects.

PageRank is based on connectivity, so the gibberish content needs to be linked to by other highly ranked content to become an issue.
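The connectivity point can be seen in a minimal sketch of PageRank’s power iteration (the graph, damping factor, and page names here are illustrative assumptions, not how Google actually runs it): a page’s rank comes from the ranks of the pages linking to it, so gibberish pages with no reputable inbound links stay low no matter how many exist.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline rank...
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # ...and passes the rest of its rank to the pages it links to.
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += damping * share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical mini-web: "spam" links out to a popular page, but nothing
# links back to it, so it never accumulates rank.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "spam": ["home"],
}
ranks = pagerank(graph)
```

In this toy graph, `ranks["spam"]` ends up at the bare baseline while the mutually linked pages accumulate rank, which is the commenter’s point: scraped gibberish only becomes a ranking problem once trusted pages start linking to it.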

> When I query things in my own domain of knowledge, many of the responses to the questions are outright wrong

That’s the Fake Internet. Information pollution will keep flooding the web, and that will give rise to “islands of trust” with highly curated/moderated content.

Classic search engines like Google will become irrelevant. People will be able to “build” their own search engines, not just their queries. Actually, they won’t need queries: search will be predicted and automated based on user actions. Full trust.

I’m developing one of those islands with collective curation.

I call it the imaginary web. I’m making an imaginary web browser and a blockchain to enhance the trust. The imaginary/fake web is useful. If you want to live in a curated web, go ahead, but web3 is the way to embrace the full imaginary+real internet.

I still see web3 as one of those islands, maybe a continent, but will be walled by the trust method you describe. There is no need to filter the imaginary Internet if an AI can build a whole network on request. So basically the Fake Internet will stay there, polluted, as a relic, until the cost of maintaining it becomes larger than whatever can be exploited from it.

This has been going on since long before the newer, fancy language models. And it’s still mostly simple shit, like crappy content scraped off of other sites (like all the product comparison sites).
