A judge ruled that Google’s ubiquitous search engine has been illegally exploiting its dominance to squash competition and stifle innovation.
"WASHINGTON (AP) — A judge on Monday ruled that Google’s ubiquitous search engine has been illegally exploiting its dominance to squash competition and stifle innovation in a seismic decision that could shake up the internet and hobble one of the world’s best-known companies..."
It can proxy anything you want it to. There are a lot of SearXNG instances out there with different setups. You could proxy only Google, or every search engine that exists. Up to you. Ideally, SearXNG would operate independently with its own index and ranking algorithm, but so far this is the most open-source, self-hostable option available.
That's great and all, but it doesn't change the market one bit. Nearly all "alternative" search engines are Bing proxies already. When Bing went down a month or so ago, many of those "alternatives" went down too, even the ones that supposedly had their own indices. I know this because the alternative I was using went down with it.
SearXNG just serves as a proxy in front of a proxy.
In fact it improves search results: when you have multiple search engines turned on, SearXNG does some sorting and filtering across them and manages to strip out a lot of the SEO crap and ads from the results.
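To make the idea concrete, here's a minimal sketch of how a metasearch engine might merge ranked lists from several upstream engines: dedupe by URL and boost results that multiple engines rank highly. The function name and scoring formula are illustrative assumptions, not SearXNG's actual code.

```python
# Hypothetical metasearch merge: not SearXNG's real implementation,
# just the general idea of combining several engines' ranked lists.

def merge_results(result_lists):
    """result_lists: {engine_name: [url, ...]} in ranked order.
    Returns URLs sorted by a combined score."""
    scores = {}
    for engine, urls in result_lists.items():
        for position, url in enumerate(urls, start=1):
            # A result ranked near the top by several engines
            # accumulates a larger combined score.
            scores[url] = scores.get(url, 0.0) + 1.0 / position
    return sorted(scores, key=scores.get, reverse=True)

merged = merge_results({
    "engine_a": ["https://a.example", "https://b.example"],
    "engine_b": ["https://b.example", "https://c.example"],
})
print(merged)  # b.example appears in both lists, so it ranks first
```

A side effect of this cross-engine agreement scoring is that pages which only one engine surfaces (often SEO spam targeting that engine's quirks) sink toward the bottom.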
No, I mean the tech behind it, not the concept. A BitTorrent client is able to find a file to download from a bunch of other people, and not only the whole file but individual pieces of it. It's a distributed search.
This works because it's the same file, just distributed. But in the case of web search, every node would need to hold the entire index of the web. If not, how would the client decide whose index is better and which page ranking fits the query best? I really don't see how this would work.
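The distinction above can be sketched in a few lines. This is a toy illustration (not real BitTorrent code, which uses a Kademlia DHT): a torrent is located by the exact hash of its metadata, so every node agrees on the key, whereas a search query has no canonical key and its answer depends on relevance ranking across an index no single node holds.

```python
import hashlib

# Toy stand-in for a DHT: key -> list of peers.
# In real BitTorrent the table is spread across many nodes,
# but lookup is still by one exact, globally agreed-upon key.
dht = {}

def info_hash(torrent_metadata: bytes) -> str:
    """The key everyone derives identically from the same metadata."""
    return hashlib.sha1(torrent_metadata).hexdigest()

meta = b"example torrent metadata"
dht[info_hash(meta)] = ["peer1:6881", "peer2:6881"]

# Exact-key lookup: deterministic, needs no ranking or judgment.
print(dht[info_hash(meta)])

# A web search query, by contrast, has no single canonical key:
# near-identical queries hash to unrelated values, so there is no
# one place in the table to look, and ranking the matches would
# require comparing scores across the whole index.
print(info_hash(b"best pizza") == info_hash(b"pizza best"))  # False
```

That's the crux of the objection: the DHT answers "who has the bytes for key X?", while search must answer "which of billions of pages best matches this text?", and the second question can't be reduced to an exact-key lookup.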