Been trying to get away from DuckDuckGo because of their support of “AI” stuff, but come on, one of these isn’t even in English, and that’s what I have my language set to.
DDG’s backend is really just Bing, though …
They give different results; it seems they treat the data differently, which is interesting. Startpage gives different results than Google, DuckDuckGo gives different results than Bing, but in the bigger picture, does a bigger sample give the statistically optimal result (closer to the source truth on the most probable good result)? Big question mark, huge statistics theory.
(i’m drunk)
Bing in a sandboxed session yielded results identical to DDG in my experiments.
In SearXNG they don’t give the same results: Bing brings some completely different stuff on both a self-hosted instance and a public instance (it just gives random garbage). It seems proxying does something to Bing’s internal algorithm, since Google is also different from Startpage on SearXNG.
Edit: a screenshot showing it.
Given the same methodology (search algorithm), a larger sample size gives a truer picture of what that methodology favors. If the methodology isn’t neutral, or doesn’t bias in the same way as your desired outcome, then a smaller sample size analyzed by a different methodology may provide better results.
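A quick simulation of that point (all the numbers here are made up for illustration): with a biased methodology, a larger sample converges more tightly, but it converges to the methodology’s bias, not to the underlying truth.

```python
import random

random.seed(0)

TRUTH = 10.0  # hypothetical "source truth" value we'd like to estimate
BIAS = 2.0    # systematic bias of a non-neutral methodology

def biased_sample(n):
    """Draw n noisy observations filtered through the biased methodology."""
    return [TRUTH + BIAS + random.gauss(0, 5) for _ in range(n)]

def estimate(samples):
    return sum(samples) / len(samples)

small = estimate(biased_sample(10))
large = estimate(biased_sample(100_000))

# The large sample has far less noise, but it settles on TRUTH + BIAS,
# not TRUTH -- more data sharpens the bias rather than cancelling it.
print(f"small sample estimate: {small:.2f}")
print(f"large sample estimate: {large:.2f}")
```

So more results from the same engine tell you more precisely what that engine favors; they don’t get you closer to some neutral “truth” unless the engine itself is unbiased.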