The truth about SEO
SEO stands for Search Engine Optimization. It’s the practice of getting your pages and sites to the top of Google and the other major engines in order to win a greater share of users and, ultimately, to drive people through your acquisition and conversion funnel for commercial purposes.
To internet publishing insiders, the very concept of SEO reeks of Google hacking and tricksterism. But what is SEO? How did it come to be what it is today? Where is it going? Why do marketers love it and use it? Why is it so often reviled, and so rarely loved, by the search engine operators of the world? And how can you make sure you’re doing good SEO that the search engines, and the real people at the other end of the search, will actually love and promote for you?
In a word, accountability.
You see, back in the early days of search engine development, people would search freely for all kinds of things, just as they do now, only to find that many searches returned very thin results. And then the keyword marketers came along. They (we) figured out very early in the game how to determine what people were searching for, how competitive those search terms were, and how to rank well for them.
Ranking well was easy, because a) there were not so many SEOs gaming the system, and b) it was all about the number of links pointing from servers around the web to a target website, pushing it up the rankings in an unnatural way.
It was easy, because there was no accountability. Anyone could be anyone, and the more accounts an SEO set up, the merrier. You could buy thousands or even millions of links, follows, friends, likes, shares, and just about any other kind of semi-trackable and wholly valuable “link juice” from an army of websites willing to sell automatic links, real-human links, and more, for less than a pack of cigarettes.
Also, tons of content management systems, or CMSes, were maturing to the point where nearly anyone online could set up a site, or even a network of sites, easily and quickly. Quality be damned; the volume game was still on. Blogger and WordPress became the leading causes and sources of low- and medium-quality webspam.
People who built a single site with high-quality information were screwed, left to dwell at the bottom of page 549 of the search results, unless they were lucky enough to have tons of real followers who, through organic word of mouth and social media marketing, were happy to share their content.
Meanwhile, the ranks of people calling themselves SEOs were growing rapidly, with high salaries for doing easy, low-quality work. For those of us focused on quality, it was at times annoying and frustrating; at others, downright infuriating, and often a business-killer.
That meant, of course, that many great sites simply died on the vine, and not for lack of trying. They were just too ethical for their own short-term needs. In the long run, though, those who stuck to playing by the rules would get a better chance at success, as long as they survived until the inevitable search engine evolutions that would change SEO forever.
And there was little to nothing that Google or other engines could do about it. Their algorithms were still just stupidly counting link volume and link velocity as the main ingredients for determining what made a site rank higher. This lack of accountability was terrible.
By allowing the system to be gamed for so long, the engines taught many regular users that most of the links they actually wanted were probably not on page 1 of their search results. In fact, it was not surprising to see nothing but commercial links on page 1, above even the Wikipedia links, for important or popular subject searches.
So the big search engines decided to get smarter. Doing so would not be impossible, as long as they could eliminate the gamers, or at least reduce their impact. One team, the Google web spam and web quality team, was led by a guy who communicates very clearly what Google looks for. His name is Matt Cutts, and he is famous in the SEO industry.
Matt is the lead Googler when it comes to figuring out who is trying to game the network, and who is an honest and accountable person or organization trying to legitimately post interesting information for their target audiences. Matt frequently posts blog articles and videos, as do many of his colleagues, about what makes content good, and what makes it shady or outright spammy.
Matt’s team was integral in helping Google improve its search algorithms, discounting the spammy links and lowering the rankings of, or even removing, tons of websites from the index that feeds the search engine results pages, aka SERPs. But they did not roll out this quality assurance protocol all at once. Instead, they let it trickle across the network, almost one category at a time, so that by the time most marketers noticed it, it had already done its job and would only become harder to game in the future.
Meanwhile, SEOs were busy trying to figure out how to game search by using social media. Given the social networks’ popularity and constant connection to people’s smartphones, this looked like another gold rush, and dominating the relatively virgin mobile search market seemed a tasty proposition for smart marketers. Thanks to the iPhone, and then Android devices, hundreds of millions of smartphones appeared in people’s hands, and people were searching from their handsets as much as, or more than, they were from their desktop computers and laptops.
So the SEOs were partly distracted by social and mobile while changes were happening behind the scenes in California.
And then the little engines that could, got smarter.
As I have suggested, from 1994 to 2009, SEOs had a very easy time ranking just about any website for just about any keyword. All we had to do was pump up the link count and the diversity of those links across as many servers, subnets, IP numbers and C-classes as possible. It was like shooting fish in a barrel.
There was even great, cheap software that did this semi-automatically, if not fully automated. For less than the cost of dinner, you could own a range of keywords for any number of sites you wanted to push, and you could do it quickly, without too much competition. People on sites like WarriorForum and DigitalPoint Forum could semi-anonymously exchange information and trade links, or buy/sell links in massive numbers, and for very little money.
It seemed as if the heyday would last and create a second Internet marketing bubble. But those who held that view, and were clogging the Interwebz with garbage, were very narrow-minded, and were causing their brands more long-term damage than they imagined.
Around 2010, search engines started getting smart. Very smart. The algorithms were running hundreds, even thousands, of calculations for every search a user made. The results they pulled reflected not just the number of backlinks pointing to a target site; the engines also became quite refined at pulling contextual results, matching the actual search terms against the anchor text of a link. They also started weighing the words near the anchor text, or link.
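To make the idea concrete, here is a toy sketch of that contextual signal. This is not Google’s actual algorithm; the function name, the weights, and the simple word-overlap scoring are all my own illustrative assumptions. It only shows why an exact-match anchor in relevant surrounding text started to matter more than a generic “click here” link.

```python
# Toy illustration (NOT Google's real ranking math): score a backlink's
# relevance to a query by matching the query's terms against the link's
# anchor text and the words surrounding the link on the page.

def link_relevance(query, anchor_text, surrounding_text,
                   anchor_weight=2.0, context_weight=1.0):
    """Return a toy relevance score; the weights are arbitrary assumptions."""
    query_terms = set(query.lower().split())
    anchor_hits = len(query_terms & set(anchor_text.lower().split()))
    context_hits = len(query_terms & set(surrounding_text.lower().split()))
    # Anchor-text matches count more than nearby-word matches.
    return anchor_weight * anchor_hits + context_weight * context_hits

# An exact-match anchor outscores a generic "click here" link,
# even when the surrounding words are similar.
exact = link_relevance("best running shoes",
                       "best running shoes",
                       "our review of lightweight trainers")
generic = link_relevance("best running shoes",
                         "click here",
                         "our review of the best lightweight running shoes")
print(exact, generic)  # → 6.0 3.0
```

Even this crude version captures the shift: once the words in and around a link counted, buying thousands of irrelevant links stopped being enough.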
Come 2011, Google was not just the leading search engine on Earth; it was also quietly building itself up as one of the leading networks of verified users. Other massive networks, like Facebook, were also growing at a tremendous pace, with hundreds of millions of users signed up and at least partially verified. Yes, you could still create fake accounts (and people still do), but various steps made it harder to create multiple accounts on any single device.
Simultaneously, the searches on social sites were being logged and fed into those sites’ own search engines. Twitter and Facebook results were then also mostly taken out of Google’s search results, which led to a tumultuous period of pages being found and lost very quickly within a Google search, and people started to rely more on searches run directly on their favorite social networks.
Then Google launched Google+, and it seemed like a godsend for SEOs looking to beat the system once again. They created tons of accounts as fast as they could, but they had to use names that seemed real to Google, and they had to give information that was a little more verifiable than ever before. Soon it seemed as if the only people really using Google+ much were:
1) Google employees, and others who have a vested interest in seeing Google+ succeed.
2) Tech snobs, or the so-called technorati, aka elites, who just wanted to show the world that they were cooler than Facebook and that they, as early adopters, would use Google’s all-powerful social network to remain on the leading edge.
3) SEOs and SEMs who market brands for various products and services, using the power of the internet.
Meanwhile, the big change people were noticing was that Google+ was ripe for hacks. It was possible to rank on page 1 on the same day an article was published, if Google+ was tricked into believing the article had quickly received a number of +1s. Given the still partly unverified Google accounts, it was easy to have bots, or cheap labor in India and across Asia, Latin America and even Eastern Europe, provide “real” people-generated profiles, which could then be used to pump up the link juice of any target site. So even though Google’s web quality team was fighting the good fight, trying to deliver the best results for searches, Google was also allowing itself to become one of the leading causes of webspam. Give that a big +1 for irony.
And then came the Panda
Near the end of 2011, Google quietly started rolling out its first famous Panda update. It took full effect by the end of January 2012, and hit hundreds of thousands, if not millions, of websites.
Panda weighed not just the traditional ranking signals but also the freshness of content. Panda was also smart at making connections, dragging into daylight many link networks and shady link juice practices, which helped Google cull millions of bad SEO pages from its index. Of course, this also hit many legit pages, perhaps the unintended victims of bad SEO from the old-school days of simple link-building techniques.
As SEOs took measures to defend themselves against the damage caused, or about to be caused, by Panda, Google had more surprises in store. Just when everyone in the industry thought they had applied the changes necessary to minimize the webspam published under their brand names, the GOOG went off and set up a rolling update strategy. This meant that changes would take effect on an ongoing basis, and that they would not all be announced.
Now SEOs were faced with updating their sites and pages every 90 days (at most) or having those sites and pages fall out of Google entirely.
Soon, Panda 2 and others in its wake would take hold and affect even more sites. So many changes came, in fact, that even leading search engine publications and analysts stopped trying to keep track or name every single change.
Then followed the Penguin
Penguin quickly followed Panda as the biggest change to the Google algorithm in years. It counted not only freshness; it also counted social signals much more heavily.
Penguin caused more alarms to go off than any previous update, even though, in theory, Google had long since discounted social search results. This major change meant that although links on social networks should not have accounted for much link juice, actual links, likes, shares, re-pins, and follows by real people were powerful indicators of a site’s real value, at least as far as the Googlebot was concerned.
Now SEOs had to figure out how to publish all kinds of great content, in quick, regular succession, and get it shared socially, or ignore social media at their own peril.
The Penguin update, in line with Google’s earlier dictates about rolling updates, quickly saw Penguin 2.0 and further updates streaming out so fast that even people in the know stopped tracking every minute change.
The search engines were back on top … for a while. And it seemed they were gaining ground, wiping out zillions of pages of garbage. Yet plenty of junk remained, and much of it was still gaming the system. This was especially true in sectors which Google ignores, mostly because they don’t like them. And the acronym of these sectors is, ironically, PPC. And as we all know, PPC, or Pay-Per-Click advertising is the bread and butter of Google and other big engines.
Of course, the PPC I’m talking about here stands for something else entirely, although it is also the leading source of money spent in PPC ads. The PPC I’m talking about is Pills, Porn and Casino. Those three sectors are topics that Googlers would prefer just did not exist. And yet, they are by far three of the most popular search categories since the dawn of civilization, let alone the search engines.
So too came the Hummingbird
Come mid-2013, Google announced that its latest major overhaul was called Hummingbird. And by overhaul, we mean a complete reboot of the Google machine.
You see, Hummingbird effectively replaced the entire Google search engine, while leaving in place the many calculations that Google makes for every search query it receives. And it made real-time, social and location-based search results stand out much more, especially for the majority of regular users, who are plugged in to their Google accounts in one way or another at just about any time they’re googling.
So now Hummingbird delivered more search results that seemed to be relevant uniquely to the searcher, and it did so faster than ever before.
At this point, SEOs had to look at their strategies and techniques much more closely than ever before. The notion that you could build a strong brand with grey hat and black hat SEO was dying quickly. White Hat SEO, the bane of lazy SEOs everywhere, was once again seen as the best-of-breed method of getting the best possible SEO results.
Of course, White Hat SEO is harder, more time-consuming work than the tricks and hacks of the lesser forms of SEO. Still, the merits of long-term value building with White Hat SEO made the case clearer than ever. Just as you would not succeed long-term by tricking people into eating at your restaurant when you serve them garbage, you will no longer succeed at building top-ranked sites and pages by tricking users into visiting your site, or by putting up poor-quality links to it from as many sites as possible.
Nowadays, in 2014, it is more beneficial to your SEO work, and to your page’s rank and targeted traffic, to have one link from a great website than 1,000 links from 1,000 low-quality sites. And while it will take you much more time to get that great link, the fact is, it’s worth it.
Don’t just believe what I write. Always test my claims for yourself.
If you’re not yet convinced, go ahead and try it on just one page, or one new site. Track that new content’s performance against the pages and sites you maintain with the old techniques. If you do it right, you will see your better content, with its higher-quality links, climb faster and hold its rankings longer than all of the lower-quality pages you make in the next 30 to 90 days.
While you’re chewing on this mind candy, watch this great video, which goes in depth on the history and sidenotes of the SEO industry from 1994 until early 2014.
Post Lesson Note
My unofficial social contract with you: If you like what you learned in this article and care to help me in my challenge to test Google’s claims about publishing high quality content, then please share this page on your social network profiles and anywhere that you think it might help other people.
Previous Lesson: Web Marketing Lesson 2: Branding 101: Build a Brand to Remember
Next Lesson: Web Marketing Lesson 3: SEO Do’s and Don’ts: the current best-practices SEO guide in one page.