- cross-posted to:
- technology@slrpnk.net
I guess we all kinda knew that, but it’s always nice to have a study backing your opinions.
A lot of people don't remember the pre-Google days anymore.
Other search engines worked, but Google gave better results.
Now that every website is gaming SEO and the top half of the results are ads that paid to be first…
Google isn’t that much better. I went to DuckDuckGo recently. The only thing Google does better is local results. But that’s because Google always knows where I am and where I’ve been.
There’s no longer a reason to use Google as a search engine, except habit.
Pretty much the same with Chrome.
The main thing that got me switching to Google back then wasn’t the better results, but their promise not to collect or use our data.
That all changed after 9/11, but by then Google had grown so huge it was hard to avoid them.
Even so, I still went back to Webcrawler and the others quite a lot and never really consistently used one search engine faithfully.
DDG uses Bing as the search API, and I don’t see any evidence that it doesn’t use SEO as well.
Just to be clear; “SEO” or “Search Engine Optimization” is a technique marketers use to craft web pages in a way that tricks search engine crawlers into considering them more relevant. It is not something search engines themselves do, and in many cases they actively fight against it.
So, it’s not whether or not DuckDuckGo uses SEO, it’s whether or not they’re susceptible to it.
To add to that, Google is the big one.
So everyone tries to get around Google's SEO prevention measures. The little guys just have to do literally anything different.
Coincidentally, I happen to have been reading up on SEO in more depth this week. Specifically, the official SEO docs by Google:
https://developers.google.com/search/docs/fundamentals/seo-starter-guide
To be clear, SEO isn’t about tricking search engines per se. First and foremost it’s about optimizing a given website so that the crawling and indexing of the website’s content is working well.
It’s just that various websites have tried various “tricks” over time to mislead the crawling, indexing and ultimately the search engine ranking, just so their website comes up higher and more often than it should based on its content’s quality and relevancy.
Tricks like keyword stuffing, hidden text and links, cloaking, and buying links.
The docs linked above (that link is just one part of a much larger set of docs) even mention many of those "tricks" and explicitly advise against them, as they will cause websites to be penalized in their ranking.
Well, at least that's what the docs say. In the end it's an "arms race" between search engines and the websites that use such tricks.
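For contrast, the "working well" side that those docs describe is mostly mundane markup hygiene rather than anything clever. A minimal sketch of what that looks like (the page, title, and URL here are made up for illustration):

```html
<!-- Hypothetical page head: the kind of honest, descriptive markup
     the starter guide recommends, as opposed to the tricks above. -->
<head>
  <!-- A unique, descriptive title and meta description per page -->
  <title>Sourdough Starter Guide – Example Bakery</title>
  <meta name="description"
        content="A step-by-step guide to making and maintaining a sourdough starter at home.">
  <!-- Point crawlers at the preferred URL for this content,
       so duplicate URLs don't split the page's ranking -->
  <link rel="canonical" href="https://www.example.com/guides/sourdough-starter">
</head>
```

Nothing in there tricks the crawler; it just makes the page easier to index and summarize correctly.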
I remember pre-Google. There were a few human curated sites back then (like DMoz and Yahoo). I’m thinking that might be a way to combat spam and AI sites. As a side bonus, maybe it will help de-Google the planet.
I’m looking for a Wikipedia-but-for-the-web, where human curators find real web content for me. I found Curlie.org, and tried to sign up for it, but never got a response back on my sign-ups. Still I’m hopeful for something like that.
Yahoo was DMOZ (its directory used DMOZ data).
DMOZ had 100k volunteers curating the content at some point, and had a whole complex process to prevent abuse and so on. It will be hard to get going again.
But yeah, who would’ve thought that a mere decade after being discontinued it would become relevant again.
I had to roll back to Google from DDG because the latter seems to refuse to understand that I want to find exact words with "".
And DDG isn’t perfect either, I need to add Reddit as well more than I’d like to.
Google ignores “” too these days.
Really? I haven’t noticed that…