Marketing Misinformation: Solving the Fact Gap

You can’t solve a problem until you understand it.  So before we start fixing things, we need to understand where the spread of digital marketing information has gone wrong.

Misinformation (Accidentally / On Purpose)

Most marketing bloggers have good intentions.  They don’t start their day intending to lie, cheat, steal, mislead or confuse their readers.  But bloggers feel the need to perform: write the next post, comment on a few blogs, respond to their own commenters, get a few social media conversations going, and so on.  Writers refine their processes and try to be more productive, but somewhere along the line speed and belief become more important than accurate, well-researched information.

Where do bloggers cut corners? 

Primarily, the mistake is relying on other people’s thoughts, guesses and research.  They also find (and then re-post) outdated information because it’s easier than checking the facts.

What’s the most agreed-upon piece of SEO advice in 2014?  Possibly “don’t use meta keywords: Google has said they don’t use them, and they may act as a negative ranking signal for catching spam.”  That’s pretty clear-cut (if you trust Google).

But maybe I get information from online blogs like First Click’s blog.  Let me quote:

11. Make use of meta description and meta keywords

Make use of tags, categories, meta description and meta keywords. This is especially helpful on your regular blog posts as it will help them rank quickly.

Wait, WHAT?  What’s the publish date on that post?

Lack of Data

I’ll put it out there very simply: I’ve read blogs that say “disavow doesn’t work.”

Bullshit.

This assumption some SEOs make is categorically untrue in my direct and extensive experience.  I have personally completed disavow recovery files for hundreds of clients.  Not 2.  Not 10.  Hundreds.  Here’s the truth: disavow works, and 95% of SEOs don’t understand what it does.  The disavow tool works if you use it correctly.
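Part of “using it correctly” is simply getting the file right.  For reference, the tool takes a plain text file in Google’s documented format: one URL or `domain:` entry per line, with `#` marking comments.  A minimal example (the domains here are placeholders, not real sites):

```
# Links we could not get removed after outreach
domain:spammy-directory.example
domain:paid-links.example
http://another-site.example/page-with-bad-link.html
```

A `domain:` line disavows every link from that domain; a bare URL disavows only that page’s links.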

So why do so many people say it doesn’t work? Lack of data and research, a few bad anecdotes and not understanding the tool. No Google Penguin refresh for over a year? That didn’t help either.

The more data you have, the more you can theorise about that data.  However, you also need to watch out for another danger:

Invalid Analysis & Poorly Constructed Theories

One SEO disavowed “every link” to his site and was surprised at the results.  A couple of months after submitting the file, his rankings dropped and never recovered.

Really?

TRULY?

What is wrong with this “test”?

In our testing, Google seems to start actively considering disavow files either:

  1. At the time of a reconsideration request or
  2. 6-8 weeks after submission (edit: it’s actually as the links get cached.)

Unfortunately, an algorithm update happened a couple of months after Cyrus submitted his disavow file, so the prevailing theory became that the disavow kicked in at a major algo change.  Over one or two tests you don’t learn anything – there’s no baseline.  What are we comparing to – previous rankings?  But over 100 tests, you find out: “oh, it kicks in about 6 weeks later; 8 weeks is well within the bounds.”  You can speed this up dramatically if you read between the lines of the information I’ve given above, too.
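The “over 100 tests” point is worth making concrete: once you log enough recoveries, you can estimate the typical lag between submission and effect instead of guessing from one anecdote.  A minimal sketch, with invented dates standing in for real client records:

```python
from datetime import date
from statistics import median

# Hypothetical log of disavow tests: (file submitted, rankings moved).
# Real data would come from your own client records.
tests = [
    (date(2014, 1, 6), date(2014, 2, 20)),
    (date(2014, 2, 3), date(2014, 3, 18)),
    (date(2014, 3, 10), date(2014, 5, 1)),
    (date(2014, 4, 7), date(2014, 5, 26)),
]

# Lag in weeks between submission and observed effect.
lags = [(effect - submitted).days / 7 for submitted, effect in tests]

print(f"median lag: {median(lags):.1f} weeks")
print(f"range: {min(lags):.1f}-{max(lags):.1f} weeks")
```

With four data points this is still an anecdote; with a hundred, the median and the spread start to mean something.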

This type of marketing “research” just becomes the next myth we read on hundreds of blogs.  This disavow test has 186 backlinks on Ahrefs, 193 Tweets, well over 300 G+ shares and 50+ likes.  You know that many SEOs have since told their clients, boss, and other blog readers that “disavow doesn’t work” or “one guy disavowed everything and his rankings tanked!”

For the record, I love almost all of what Cyrus writes. He is an industry hero of mine, but we need to point out when things aren’t right, and this disavow “test” is Swiss cheese: full of holes.

So who do you trust?

First, trust yourself.  Do follow-up research. Test everything you can. Someone says red performs better, someone says blue.  Test both, use the winner, try to improve on that result with further testing.
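Testing “red vs. blue” yourself doesn’t require much tooling.  A minimal sketch of a two-proportion comparison – the traffic numbers below are invented, and for rigorous work you’d use a proper statistics library:

```python
from math import sqrt

# Hypothetical split-test results: (visitors, conversions).
red = (1000, 52)
blue = (1000, 38)

p_red = red[1] / red[0]
p_blue = blue[1] / blue[0]

# Two-proportion z-score: how many standard errors apart the rates are.
p_pool = (red[1] + blue[1]) / (red[0] + blue[0])
se = sqrt(p_pool * (1 - p_pool) * (1 / red[0] + 1 / blue[0]))
z = (p_red - p_blue) / se

print(f"red: {p_red:.1%}, blue: {p_blue:.1%}, z = {z:.2f}")
# |z| > 1.96 is roughly significant at the 95% level; below that, keep testing.
```

In this made-up example red looks better, but the z-score falls short of 1.96 – which is exactly why you keep testing instead of declaring a winner off one small sample.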

Trust experience. When you start reading a new blog, check on other posts.  Do they make sense? Is it all well-written or does it seem rushed and thrown together?  Do you think the writer has actually run and tested these theories, or does it sound like every other blog post, rehashed in a few new sentences?

Finally, trust cumulative research over single data points. Trust experience and success.  Remember that we’re all in business to succeed and some people will lie just to “win.”  Others have reasons to lead you astray (Google included).  Don’t let corporate greed derail your marketing.  Let’s get on with fixing our industry.
