How to Avoid SEO Misinformation via @martinibuster
A lot of good information about SEO is out there, but there is also quite a bit of bad information. It doesn't help when Google's search results amplify the bad information.
For example, Google’s John Mueller recently debunked the SEO myth of LSI Keywords:
John Mueller recently tweeted: "There's no such thing as LSI keywords — anyone who's telling you otherwise is mistaken, sorry."
But Google subverts his message by ranking SEO misinformation at the top of the SERPs.
If you search for LSI Keywords on Google, the number one ranked web page asserts that LSI Keywords matter for SEO and the next two search results are LSI Keyword generators.
John Mueller and Google’s search engineers may scratch their heads about where SEO myths come from. As you see above, many times it is Google that is amplifying and reinforcing those SEO myths.
How is a search marketer to know what SEO information is correct when Google's search results reinforce SEO misinformation?
Discern Between Opinion and Fact-Based Insight
It's important to verify whether the writer is citing and linking to an authoritative source. Something like a Googler statement, a patent, or a research paper helps elevate an opinion into a fact-based insight.
Everything else is just opinion, and opinions don't matter if there is no basis to support them. That something "sounds reasonable" is not enough.
Just because Google ranks something at the top of the search results does not make it true, either.
Googler Statements Must Be in Context
Some people have agendas, and they tend to cite Googler statements out of context in order to push them.
The typical agenda consists of sowing fear and uncertainty for the purpose of creating more business. It's not uncommon for search marketers to say that Googlers contradict themselves.
I find that Googlers are remarkably consistent, especially John Mueller. What is inconsistent is how some people interpret what he says.
Google's John Mueller lamented in a podcast that two-thirds of what he is quoted as saying is misquoted or quoted out of context.
An Example of Fact-Based Insight
If your rankings dropped, it could be because the algorithm decided that another page is more relevant to the search query and to users. We know this because Google has published official guidance on its updates.
Among the many insights in Google's official guidance about the updates is this:
“…the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.”
That’s an official statement that one of the reasons a site may lose search positions is because another site was “under-rewarded.”
Now here is a reason that has not been confirmed: the content is not factually correct.
Nobody has discussed the algorithms surrounding this, and no Googler has confirmed that the algorithm is fact-checking. The SEO community simply has a feeling that fact-checking is going on.
Is there any basis for the idea that Google is fact-checking? Yes, there is.
Nobody else (as far as I know) has discussed the following research paper. The Google research paper is called Relevant Document Discovery for Fact-Checking Articles.
That research paper proposes a way to fact-check articles by discovering documents that can verify their factual claims.
So the claim that Google might be fact-checking health-related sites has some basis to it. We don't know for certain, but the existence of this research paper (and others) elevates the opinion to a possibility.
There is at least evidence that fact-checking is something Google has been researching.
The next best evidence is a statement from Google confirming that they are doing something.
Fact-check What You Read
In an article about what was said at a Webmaster Hangout, always watch the cited video clip yourself. By watching it you can determine for yourself if the article you read was correct or if it was omitting something in order to push an agenda.
Correlation Studies are Not Reliable
Articles featuring correlation data attract a lot of attention. Data obtained from studying millions of search results will reveal patterns.
But those patterns are largely meaningless, because correlation is not causation.
For example, if the data shows that XX percent of the top three rankings are published on WordPress, does that mean publishing on WordPress helps rankings? No, it does not.
Meaningless correlations happen all the time; they are the norm. Assigning meaning where no meaning has been proven is a mistake.
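To make the correlation trap concrete, here is a minimal sketch with entirely hypothetical numbers: two series that merely both grew over time will correlate almost perfectly, even though neither causes the other.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, computed from the standard formula."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical, invented data: share of top-three results on WordPress
# per year, and some unrelated metric that also happened to rise.
wordpress_share = [0.20, 0.25, 0.31, 0.38, 0.44, 0.51]
avg_quality_score = [55, 58, 63, 67, 72, 78]

r = pearson(wordpress_share, avg_quality_score)
print(f"correlation: {r:.2f}")  # close to 1.0, yet neither series causes the other
```

The near-perfect correlation here comes entirely from the shared upward trend over time, which is exactly the kind of pattern a large-scale SERP study will surface without it implying any ranking factor.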
A correlation study of SERPs, which typically mix multiple search intents together, will not reveal useful information about today's AI and machine-learning algorithms.
Articles based on correlation are, in my opinion, great clickbait but generally have no usefulness for understanding ranking factors.
Correlation-based SEO articles consistently reach the wrong conclusion about what caused an effect. Two things are worth keeping separate:
1. Data is concrete and irrefutable.
2. Interpretation of the data is fluid and refutable.
If there is research, a patent, or a Googler statement showing that something has been studied, then a test conclusion based on it has a higher probability of being correct.
I've been working in SEO for almost 20 years. I have seen all kinds of crackpot hypotheses and reasonable-sounding ideas floated to explain things. But they were just ideas with no basis in fact.
They were essentially guesses, and guessing is a poor basis for creating a business strategy.
Proof via citation (a research paper, a patent, or a Googler statement) shows that an idea is at least possible, if not factual.
Nobody can say with certainty that X caused Y because what happens between the X and the Y happens inside Google’s so-called black box in which nobody can see what is happening.
And what happens in Google’s black box stays in Google’s black box.