It’s been a while since Google released any major algorithm update that totally rocked the content marketing world. But I remember that day in 2013 when Google unleashed its fury and no longer let us see the search terms people were using to get to our site, upending our SEO strategy. And it came right in the middle of an AdWords audit I was conducting to make our campaigns more effective.
I’ve always been a die-hard content evangelist with the mindset that you should always write for people and not machines, but I was lost.
My coworkers and I sat there in the office for a good long while with our mouths open in front of our computer screens, looking at that fateful “Not Provided” message. We knew we were writing for people and not machines, but our content decisions were definitely driven by what Google told us would perform well.
Eventually, we got over it and started working again, even though we felt lost and unsure of whether what we were doing would actually work, since it was no longer truly data-driven.
But honestly, our content writing was better because of it; now we could let go of those nagging thoughts of “but will this be good for our SEO strategy?” and instead focus on content that was 100 percent geared toward readers and their needs.
With each and every update Google puts out, its search algorithm leans further toward behavior-based indicators from readers that signal engagement and content quality, and further away from the backend metadata of SEO.
I know that I, for one, am glad that I’m no longer seeing Suite101 articles in my search results, and I can only guess that other searchers feel the same way.
Google is way more efficient at delivering me what I want, and I love them all the more for it.
And in fall 2015, I saw Rand Fishkin give his presentation, SEO in a Two Algorithm World, and the direction of SEO finally made sense to me.
Yes, picking out a keyword to optimize your content around, making sure it’s mentioned in the page title, and inserting it throughout the text is still important.
You do, after all, still have to tell the search engines what the heck your page is about so they’ll know what kind of searches it should rank for.
After all, if I’m writing an article about weight loss and nothing remotely similar to “weight loss” or “losing weight” or “shedding fat” appears in the article or its metadata, the search engine will probably just think I’m talking about either food or exercise. Which is close, but not really.
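To make that concrete, here’s a purely illustrative sanity check (not a real SEO tool, and the function name and sample text are my own invention): does a page actually mention its target keyword phrase in both the title and the body?

```python
# Illustrative sanity check, not an SEO tool: confirm the target keyword
# phrase shows up in both the page title and the body text.

def mentions_keyword(title: str, body: str, keyword: str) -> bool:
    """Case-insensitive check that the keyword appears in title and body."""
    keyword = keyword.lower()
    return keyword in title.lower() and keyword in body.lower()

# A hypothetical weight-loss article that passes the check:
print(mentions_keyword(
    title="10 Weight Loss Myths, Busted",
    body="Losing weight starts with understanding weight loss basics.",
    keyword="weight loss",
))  # True: the phrase appears in both places
```

If this returns False, the search engine has little to go on when deciding which searches the page should rank for.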
This piece of SEO still comes first in the foundation of content creation, but as you’ll see when we discuss the second algorithm, it’s less important than it has ever been. It gets you into the ballpark, so to speak, but it doesn’t necessarily get you on base, let alone score a home run.
In fact, beyond picking a keyword phrase and working it into your text a few times, you may not want to spend any more time on this part at all.
The second algorithm is all about how a reader interacts with your content after they click through.
If your search result gets lots of clicks because you’ve got a clever headline but people close your tab after three seconds because they see it’s not for them, the search engines will take notice and your ranking will start to drop.
But if you’re ranked on page two and everyone who clicks through spends five minutes reading and then stays on your site because of all the other awesome, relevant content you’re providing, your rankings will start to rise.
It makes sense, of course, but because smart SEOs have traditionally been able to “hack” the system, it demands a completely new way of thinking about content production.
“Google’s last three years of advancements erased a decade of old-school SEO practices,” says Rand on slide 10 of his presentation, meaning this “hacking” is becoming impossible.
He points out that now and moving forward, search engines will be ranking less for publisher-controlled inputs like backlinks, keywords, anchor text, and page loading time, and will rank more for searcher behavior outputs like click-through rates, bounce rates, and on-page engagement.
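As a toy illustration of that shift, a behavior-weighted score might combine searcher outputs like click-through rate, dwell time, and bounce rate. The signals and weights below are entirely hypothetical, not Google’s actual formula:

```python
# Toy model: score pages by searcher-behavior outputs rather than
# publisher-controlled inputs. Weights and signals are hypothetical.

def engagement_score(ctr: float, avg_dwell_seconds: float,
                     bounce_rate: float) -> float:
    """Blend behavior signals into one score; all inputs are 0..1 except dwell."""
    dwell = min(avg_dwell_seconds / 300, 1.0)  # cap credit at five minutes
    return 0.4 * ctr + 0.4 * dwell + 0.2 * (1 - bounce_rate)

# A clickbait result: lots of clicks, but readers leave after three seconds.
clickbait = engagement_score(ctr=0.30, avg_dwell_seconds=3, bounce_rate=0.95)

# A page-two result: fewer clicks, but five-minute reads and a low bounce rate.
deep_read = engagement_score(ctr=0.10, avg_dwell_seconds=300, bounce_rate=0.20)

print(clickbait < deep_read)  # True: the engaged page wins despite fewer clicks
```

Under this (invented) weighting, the clever-headline page scores about 0.13 while the deeply read page scores 0.60, which is exactly the dynamic described above: engagement outputs outrank clickable inputs.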
Back in the day (and admittedly, even when I was first getting started as a writer online), content creators and publishers were mostly concerned with keyword-driven work and how keywords could edge them above the competition in the SERPs.
Now, though, we see a lot of companies trying to edge out their competition not through more keyword-based articles, but through what a lot of content marketers call “skyscraper posts.”
These posts are long. Epically long. They aim to answer any and all questions a searcher might have about a topic, drawing higher click-through rates, more time on page (because there’s SO much to read), and lower bounce rates because the content is high quality.
It’ll be interesting to see if Google does anything to discredit marketers trying to “hack” the system with these skyscraper posts, but if those posts genuinely give searchers the help they’re looking for, then why wouldn’t this be a winning strategy?