Tuesday, January 4, 2022

Search Engines and SEO Spam

Michael Seibel (Hacker News):

A recent small medical issue has highlighted how much someone needs to disrupt Google Search. Google is no longer producing high quality search results in a significant number of important categories.

Health, product reviews, and recipes are three categories I searched today where top results featured clickbait sites riddled with crappy ads. I’m sure there are many more.

I’ve long been in the camp saying that Google’s search was way ahead of the competition. I’m not sure what’s happened, but in the last few months I’ve noticed a huge decline in the quality of its results. I now regularly repeat my searches with DuckDuckGo to make sure I’m not missing something. Sometimes the problem is SEO spam, where the page I want isn’t on the first page or two of results, but perhaps if I clicked Next enough times I would eventually see it. Other times, I’m searching for something rare, Google only finds a handful of matches, and it appears that the page in question is not even indexed.

frenchyatwork:

I think one of the fundamental things that made search work well 1-2 decades ago was that web sites would link to each other, and that those links could vaguely correlate with reputation. There were link spammers, but there was actually some decent organic content as well.

What’s happened since then is that almost all the normal “people linking to things they like” has gone behind walled gardens (chiefly Facebook), and the vast majority of what remains on the open web are SEO spammers.

ijidak:

Because, years ago, linking to lower-reputation sites would drain your page rank. So everyone worried about SEO became afraid to link to anything except: 1) their own website, or 2) high-reputation sites like NYTimes, etc.
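Both comments are describing the premise behind PageRank: a page’s score is, roughly, the probability that a random surfer following links ends up there, so incoming links from reputable pages confer reputation. Below is a minimal sketch of the classic power iteration on a toy graph; the site names, link structure, and damping factor are all made up for illustration.

    # Minimal PageRank power iteration on a toy link graph.
    # Site names, links, and the damping factor are illustrative only.
    links = {
        "blog": ["nytimes", "recipes"],
        "nytimes": ["blog"],
        "recipes": ["spam"],
        "spam": ["spam2"],
        "spam2": ["spam"],
    }
    pages = list(links)
    damping = 0.85
    rank = {p: 1 / len(pages) for p in pages}

    for _ in range(50):
        # Each page keeps a small baseline score and distributes the
        # rest of its rank evenly across its outlinks.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank

    print(sorted(rank.items(), key=lambda kv: -kv[1]))

Two things worth noting. First, the toy graph reproduces the failure mode the commenters describe: the spam pages’ closed link cycle soaks up rank that organic pages fed into it. Second, in this classic formulation, outlinks merely split the rank you pass along; they don’t reduce your own score. The “draining” ijidak describes came, as I understand it, from later spam-penalty heuristics layered on top, which is what made site owners afraid to link out.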



I personally noticed a solid decline around 2018, when they started tweaking their deboosting algorithms. It seems to have had awful knock-on effects, because results went from excellent to borderline unusable, as you note.

It appears they have done to independent websites what they have done to email: unless you’re on one of the big platforms (Medium), they just drop your content.

Beatrix Willius

The problems with search on health topics started in 2019. Google boosted official health sites like WebMD and made other websites like https://chriskresser.com invisible. Unfortunately, WebMD has wafer-thin information (sugar is good for your heart!) while https://chriskresser.com is very good. I thought that at some point Google and the pharma mafia got into bed.

AFAIK Bing and the other search engines don't have that problem.

Or perhaps that site is ranked below WebMD because it's just a huge advertisement for questionable books? But it is funny that this comment appears below an article that points out that the "vast majority of what remains on the open web are SEO spammers." And thus, this blog post has become part of that exact problem.

There is a search engine called Marginalia that looks for non-fancy websites.

It's not great for finding facts, but it's a fascinating experiment.

It's interesting to hear this because, just in the last couple of months, I've noticed that the search results from DuckDuckGo have become a lot worse too, at least when I'm searching for technical or programming-related topics. Searches containing a few key terms that really ought to find the official doc page for an API instead turn up a bunch of crappy results, like semi-related forum posts or other people's unofficial doc pages from ten years ago, while Google (or Startpage, as I prefer) finds what I'm looking for on the first try. Overall, though, it does seem like search has declined from where it was five to ten years ago.

If only Google had a product that millions of people were using every day, one that would allow Google to find and index sites that a human had found valuable enough to subscribe to. The product would be able to rank readers by reliability and bring the value of those sites to search.

One of Google's dumbest mistakes was killing Google Reader.

Using Reader as a signal for ranking pages would only have caused spammers to mass-subscribe to garbage websites' RSS feeds. If this were such an easy problem to solve, Google would have solved it long ago.

@Plume, it's not that they would look at the number of subscribers; they would look at the people who are actually reading articles (time spent reading the article) and acting on the article (sharing, starring, clicking on links).

They already have a good idea of which users they can trust and which they can't.
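To make that concrete, a hypothetical version of this signal might weight each reader's engagement (read time, stars, shares) by a per-user trust score before aggregating per site. Everything below — the names, weights, caps, and trust values — is invented for illustration; nothing here reflects how Google or Reader actually worked.

    # Hypothetical sketch: trust-weighted engagement scoring per site.
    # All names, weights, and trust values are invented for illustration.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Engagement:
        user: str
        site: str
        seconds_read: float
        starred: bool
        shared: bool

    # Per-user trust, however it might be estimated (account age, history, ...).
    user_trust = {"alice": 0.9, "bob": 0.7, "spambot": 0.05}

    def site_scores(events):
        scores = defaultdict(float)
        for e in events:
            engagement = min(e.seconds_read / 60, 5.0)  # cap read-time credit
            engagement += 2.0 * e.starred + 3.0 * e.shared
            scores[e.site] += user_trust.get(e.user, 0.1) * engagement
        return dict(scores)

    events = [
        Engagement("alice", "chriskresser.com", 240, True, False),
        Engagement("spambot", "seo-spam.example", 600, True, True),
    ]
    print(site_scores(events))

Which is exactly where the rebuttal below bites: the spambot only scores low because its trust value is low, and that trust value is itself just another signal to game.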

My point is that if you give search engine spammers a direct way of influencing search results, they will just generate that signal, regardless of what it is. If it is subscribers, then they will subscribe. If it is reading articles, then they will have scripts that click on articles and scroll through them. If it is sharing articles, then they will have bots clicking on the "share article" button.

If Google only looks at trusted users, then they will buy accounts from trusted users, and use them to get the system to trust their own accounts.

It's annoying that Google's current ranking algorithm is so opaque, but at the same time, it's also unavoidable.

