I got my first taste of SEO in 1999, when I started learning to build websites. Granted, SEO in 1999 didn’t consist of much more than keyword stuffing and raw link quantity. Things have progressed just a bit since then.
After all these years, I’ve learned that there are two kinds of SEOs: those who chase Google, trying to figure out what just happened, and those who stay 10 steps ahead of Google by studying what they’ve done in the past and making educated guesses as to where they’re heading. I’ve done both over the years, and I can tell you without hesitation that life as an SEO is far more enjoyable when you’re not constantly chasing the Google monster.
So how exactly do you stay 10 steps ahead of some of the smartest engineers on the planet?
With practice…and a handful of tricks.
Trick #1 – Know Your History
Before you can figure out what Google’s going to do next, you need to understand what they’ve done in the past. Aside from extensive experience, there are three key sources I turn to for that kind of information:
- SEOmoz’s Google Algorithm History – SEOmoz keeps a running history of known algorithm updates, and as far as I know they’re the only site that has anything like this. Their list runs back to 2000, so it’s about as comprehensive as it gets.
- SEO by the Sea – SEO by the Sea is Bill Slawski’s blog, and it focuses primarily on analyzing Google’s patent filings. This one is kind of a blend of past, present and future, but it’s a great way to better understand the way Google engineers approach things and some of the things they’ve at least considered trying.
- Google’s Inside Search Blog – Last but not least is Google’s Inside Search blog. Now, you need to take anything that comes directly from Google with a fistful of salt, but it will still give you some insight into the changes Google is making. I personally suspect some disinformation gets mixed in, but at least some of it is accurate.
Know thy enemy!
Trick #2 – Understand Google’s Main Motivation
Roughly 96% of Google’s revenue comes from advertising, and that revenue depends entirely on Google providing search results good enough to keep existing users coming back and new users coming in. On top of that, as a publicly traded company Google owes a fiduciary duty to act in the best interest of its shareholders.
With that in mind, understand that every decision Google makes serves one goal: making more money. From buying companies and releasing new products to changing the search algorithm, the motivation is the same.
A reduction in spam means better search results, and better search results mean more people searching on Google, which means more ad views/clicks, and thus more money.
You get the picture.
On the flip side, an update like Penguin that bounces truly relevant sites out of the top spots for using something that has been a legitimate, tried-and-true practice for over a decade forces many of those domain owners to spend more money on Google AdWords to make up for the loss in organic traffic. It should come as no surprise that Google has seen some killer increases in ad revenue post-Penguin!
If an SEO technique is being used to get low quality sites ranking well on Google, you can bet your bottom dollar that Google will figure out what it is, and how to remove its effectiveness algorithmically, a la Panda and Penguin.
Which brings us to…
Trick #3 – What’s Working Now That Really Shouldn’t Be?
Private blog networks, exact match domains with exact match anchors, buying and 301 redirecting niche domains…hell, even articles, directories and press releases. If SEOs have used a technique for ages, and you catch yourself thinking “Man, I can’t believe this still works! Stupid Google,” you can bet your bottom dollar that Google’s working on a fix.
Why don’t they just fix it already? Simple. Fear of false positives.
Algorithms aren’t perfect, and false positives can be a PR nightmare, causing immense harm to Google’s reputation. For that reason, any change to the algorithm has to be tested to the Nth degree to make sure false positives are vanishingly rare. That’s why so many changes simply eliminate the value of certain tactics instead of penalizing sites for using them: Google would rather a tactic quietly stop working than accidentally penalize an innocent site and have it blow up in the news.
When you know what Google has done in the past and what they’re patenting, what motivates them, and most importantly what SEO tactics are working that really shouldn’t be, you’ve got everything you need to stay way the hell ahead of Google’s black and white army.
Thanks to this “stay ahead of Google” mentality I started making major adjustments to the way I approached link building and anchor text in the Fall of 2010, a year and a half before Penguin, because I could see that it was only a matter of time before Google devalued such a highly abused signal.
And that brings us to the golden question: What will Google change next?
While I think you’d be far better off going through the process of figuring it out for yourself, I’ll give you a couple of freebies. First, I think links coming from sites that let you easily build your own links will be completely devalued (WordPress.com, Tumblr, Blogger, WetPaint, etc.). Yes, many of those have likely seen a partial devaluation already, but I think a bigger hit is coming.
Second, I think Google may reach the point where they only pass value from one link per root domain (perhaps only counting the first link they discover to your site on that domain). You know, kind of like how they only pass anchor text for the first instance of a link on a page: the first link found on a domain passes the juice.
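To make the idea concrete, here’s a minimal sketch of what a “first link per root domain” rule might look like if you modeled it yourself. This is purely illustrative — the function names, the crude root-domain extraction, and the rule itself are my assumptions about a hypothetical future change, not Google’s actual implementation.

```python
# Hypothetical sketch: if only the first link discovered from each root
# domain counted, a backlink profile would collapse like this.
from urllib.parse import urlparse

def root_domain(url):
    """Crude root-domain extraction (ignores multi-part TLDs like .co.uk)."""
    host = urlparse(url).netloc.lower()
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def dedupe_links_per_domain(links):
    """Keep only the first link seen from each root domain, in crawl order."""
    seen = set()
    kept = []
    for link in links:
        domain = root_domain(link)
        if domain not in seen:
            seen.add(domain)
            kept.append(link)
    return kept

links = [
    "http://blog.example.com/post-1",   # first link from example.com: counts
    "http://www.example.com/post-2",    # same root domain: ignored
    "http://other-site.net/review",     # first link from a new domain: counts
]
print(dedupe_links_per_domain(links))
# -> ['http://blog.example.com/post-1', 'http://other-site.net/review']
```

Under a rule like this, a hundred sitewide footer links from one domain would be worth exactly as much as a single link from that domain — which is why such a change would gut a lot of link-building tactics overnight.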
Third…sorry, two freebies is all I’m giving out today. Teach a man to fish and all that.
Everything Google does, from building self-driving cars to supplying ultra high-speed Internet to Kansas City, is done to increase revenue. Google knows what they’re good at (ads) and they work like mad to increase their ad clicks and impressions through a wide variety of initiatives.
For example, the US has roughly 190 million registered drivers, and the average driver commutes a little over 100 hours per year, plus or minus. If all cars were self driving, that would be an additional 19 billion hours per year (just in America!) that could be spent online during transit. Seeing as Google commands over 80% of actual search engine searches in the US…Cha-ching!!!
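The back-of-the-envelope math above checks out; here it is spelled out (the inputs are the article’s rough estimates, not verified statistics):

```python
# Back-of-the-envelope check of the self-driving-car estimate.
drivers = 190_000_000     # registered US drivers (article's rough figure)
hours_per_year = 100      # average annual commute hours (article's rough figure)

total_hours = drivers * hours_per_year
print(f"{total_hours:,} potential online hours per year")
# -> 19,000,000,000 potential online hours per year
```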
Ultra high speed internet in Kansas City? I think this is a classic (albeit somewhat elaborate) Google A/B test. I’d bet my bottom dollar that Google will monitor usage to see if a vast increase in Internet speed will lead to additional time spent online and thus additional ad views/ad revenue. At the same time, Google as an ISP would have all kinds of additional demographic and Internet usage data that would allow them to more effectively target ads to their Internet subscribers.
There’s a hell of a lot they could glean from an experiment like this. A lot of people are speculating about whether Google will provide free Internet to everyone, and what that would cost…they’re chasing a red herring. Google doesn’t want to be an ISP…they just want the data. If vastly faster connection speeds prove beneficial, they’ll move to another city or two, then hope that fear of major competition forces other ISPs to boost their own speeds, doing Google a huge favor in the process.
But I digress.
Learn to think like Google, and you’ll be on your way to truly mastering SEO. Remember, it’s the SEOs who stay 10 steps ahead that make the big bucks!