What is SEO?
If you’ve ever pondered the question, “What is SEO?”, you’re not alone. In an industry full to overflowing with acronyms, the term ‘SEO’ is searched around 18,000 times per month in Australia alone!
But what is it?
Simply speaking, SEO stands for Search Engine Optimisation. But when has anything worthwhile been simple…
In today’s era of information technology, search engines such as Google, Bing, Yahoo and Ask have become remarkably good at understanding the intent and context of each search query entered. If you type ‘SEO Sydney’ into Google (for example), it’s pretty obvious you’re a business owner looking for an SEO company in Sydney. But what if you type ‘How do I get my small business website on the first page of Google?’ from a Sydney IP address? You haven’t typed the keyword ‘SEO’ or the keyword ‘Sydney’, yet you’re looking for the EXACT SAME THING.
Same Intent Different Words
Confused? Well, don’t be, because Google isn’t. Today keyword ambiguity is no longer a problem, because Google now has the ability – thanks to an algorithm called Hummingbird – to work out the ‘intent’ driving your question. This, combined with detecting the device and location your search was made from, allows Google to list results that answer the ‘intent’ as much as the actual ‘question’ you’ve asked, and to geo-target the results to suit the location where the search was made. So whether you’ve typed ‘SEO Sydney’ or ‘How do I get my small business website on the first page of Google’, you’ll get the correct results to fulfil your search. Listed (in the ‘opinion’ of Google) from the most relevant / best to the least relevant / worst.
The Evolution of Search
Performing search this way sounds like common sense, doesn’t it? But search hasn’t always been conducted like this. Far from it, in fact. There was a time in the mid-to-late 90s when search results were based almost entirely on which keywords a website focused on, and how many instances of those keywords were found within the website itself. If you wanted to rank well for ‘online marketing’ (for example), you would repeat the words ‘online marketing’ throughout your content (and sometimes hide them in the HTML code itself) as many times as possible, paying little or no heed to the context the words were actually placed in.
Keyword Density
Back then it was all about keyword density, with some hitherto unknowable mathematical formula supposedly working out how many keywords were required to rank your site well. Have ‘online marketing’ twenty times on the page and you might not appear on page one of the search engine, because you didn’t mention your desired keywords enough. Mention ‘online marketing’ thirty-five times and you might not appear because you’d mentioned them too many times and were guilty of ‘keyword stuffing’. The $64,000 question back then was how to work out the correct on-page keyword density percentage in order to rank well. Get that right and the burgeoning online world was your oyster. Get it wrong and your website would be all but invisible.
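For illustration only, here is a minimal sketch of how that ‘keyword density’ number was typically calculated – the percentage of a page’s words taken up by the target phrase. The real formulas the old search engines used were never published, so this is just the common industry approximation (the sample page text is made up):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough keyword density: words belonging to the keyword phrase
    as a percentage of the total word count. Purely illustrative of
    how 90s-era SEOs measured it; no search engine ever published
    the real formula."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    if not words:
        return 0.0
    # Count non-overlapping occurrences of the keyword phrase.
    hits = sum(
        1 for i in range(len(words) - len(phrase) + 1)
        if words[i:i + len(phrase)] == phrase
    )
    return 100.0 * hits * len(phrase) / len(words)

page = "We're online marketing experts and our online marketing team loves online marketing."
print(round(keyword_density(page, "online marketing"), 1))  # ~46.2% - textbook keyword stuffing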
What’s the Magical Number?
Of course, nobody knew what the magical number of keyword iterations in any particular industry vertical was (although so-called ‘SEO experts’ all claimed they did). But whatever the ideal keyword density percentage was, you could be sure the number was different for Yahoo, WebCrawler, AltaVista, Excite, InfoSeek, etc., because they all used different proprietary methodologies to determine which websites ranked and which didn’t. Some, like Yahoo for example, even used people to sort the wheat from the chaff, rather than letting algorithms do the heavy lifting. Back then you had to email your website’s URL to the lads at Yahoo so they could manually check out your site and work out whether it was worth adding to the Yahoo Directory!
More Keywords Does not Equate to Better SEO
Consequently, ye-olde webmasters tended to err on the side of ‘more is better’ when deciding how many keywords each page needed to have. Which naturally made the user’s experience of any website they visited unpleasant in the extreme. For while a website is not expected to read with the poetic eloquence of Shakespeare, the rhythmic precision of Mamet or the semantic exactitude of Pinter…
‘We’re on-line marketing experts in Sydney and our Sydney online marketing experts know more about online marketing than all other Sydney online marketing experts combined!’
…didn’t tend to read like an excerpt from the Reader’s Digest ‘Book of the Month’ club either (to put it mildly). Not that that stopped nefarious nineties SEO companies from building mini-empires on such dross, mind you! Back then, crappy content was ubiquitous.
The Rise and Rise of Google Search
It wasn’t long before the companies behind online search came to the conclusion that they needed a better ranking signal to hang their hat on than keywords alone. Enter Google (or BackRub, as it was originally known) in 1997/98. With Yahoo (the largest search engine company at the time) struggling to evolve from a clunky, manually submitted and categorised search directory into a completely automated, algorithmically driven search engine, and all the other search engines running what we would now class as ‘primitive’ search algorithms to categorise their results, Google burst onto the scene and changed search forever when co-founder Larry Page came up with an algorithm called ‘PageRank’ (or PR for short).
PageRank and the (Historical) Importance of Links
The logic behind PageRank was pretty simple (even if the maths driving it wasn’t). Google wanted to direct searchers to the pages with the best and most relevant content, but was unable to read a web page like a human being in order to work it out. Consequently it couldn’t tell if the content on a particular site was great, average, or just plain rubbish. But – surmised the PageRank algorithm – if someone linked to a page on a site from another website, then in a way the site doing the linking was saying (via a tacit ‘vote of approval’) that the content on the page being linked to was good. Why, after all, would someone link to content that was rubbish? They wouldn’t! Therefore the more links a page on a website had pointing at it, the better the content on that page must be. And by extension, the more quality links your domain accumulated, the higher its PageRank score. This score was a mark out of ten, with zero meaning little or no links were pointing at your site, and ten meaning you had gazillions of links pointing at it.
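To make the idea concrete, here’s a toy sketch of the simplified, originally published form of the PageRank calculation, using the standard 0.85 damping factor. Google’s production algorithm moved well beyond this long ago, so treat it as an illustration of the ‘links as votes’ principle, not the real thing:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: 'links' maps each page to the pages it links to.
    Each page shares its score equally among the pages it links to,
    so a link from a highly-scored page passes on more value than a
    link from an obscure one."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three pages: A and B both link to C, and C links back to A.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
# C ends up with the highest score, because the most links point at it.
```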
All Links Are Not Created Equal
While the number of links pointing at a website after PageRank’s introduction was important, it wasn’t just about who had more links (although that helped), as not all links are created equal. A big part of how a site ranked in Google was influenced by who was actually linking to you in the first place.
In simplified terms it broke down like this: if your website had one link from a super-powerful page like the Google Webmaster Guidelines (for example), Google would value that link far more than a hundred links from a bunch of random blogs or micro-sites. The reason being that many thousands of sites already link to the Google Webmaster Guidelines page, giving it a high PageRank score. But if a bunch of run-of-the-mill websites nobody’s ever heard of link to you, the PageRank coming down the pipe via those links will be low. The rule of thumb being: the more Google trusts the sites linking to you, the more Google trusts your site.
SEO – What You Need to Know to Rank Your Website in Google
The most important element of any online marketing strategy is of course, search engine optimisation. Industry statistics tell us that 92% of people don’t go past the first page of their Google search, but they also tell us that 8.5 out of every 10 people (or 9 out of every 10, depending on which survey you believe) click on the organic search results rather than the Pay Per Click, or paid search results. And if these numbers are true (and they are), then it ain’t rocket-science to join the dots to work out where the lion’s share of your online marketing budget needs to go.
Search engine optimisation, or SEO as it is commonly known, refers to the marketing discipline that focuses on increasing a website’s ranking and visibility in organic (unpaid) search results. Unfortunately many Sydney small business owners seem to believe that SEO involves stuffing their content with keywords, paying for or exchanging links, duplicating content to create the illusion of having a big website, spamming other websites, or a hundred-and-one other outdated search strategies. But while these techniques may have worked in the past, they’re much more likely to get your website penalised if you employ them today. So understanding the difference between white hat SEO (the good stuff) and black hat SEO (the bad stuff), and devising a strategy comprised of ethical, search engine-approved tactics, is of the utmost importance to getting more traffic to your website today.
Search Engine Processes
To better illustrate what SEO is all about, let’s break down exactly how search engines work.
Search Engines – What They Do and Why
Search engines have two major functions:
- crawling all the content available on the world wide web (be it words, pictures, video or audio).
- indexing all the web pages where said content lives, to provide users with a list of results, ranked from the most to the least relevant / trusted.
Crawler
A crawler, also called a spider or automated robot (or ‘bot), reaches billions of interconnected websites on the Internet through the link structure of the world wide web. Without links, crawlers wouldn’t be able to access web pages and retrieve or discover content.
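A real crawler like Googlebot is vastly more sophisticated, but the core loop – fetch a page, pull out its links, queue anything new – can be sketched in a few lines of Python. This is a toy illustration only (the seed URL is a placeholder, and a real crawler would also respect robots.txt, throttle itself, and parse HTML properly):

```python
import re
import urllib.request
from collections import deque
from urllib.parse import urljoin

def crawl(seed_url, max_pages=10):
    """Toy crawler: follows href links breadth-first from a seed URL
    and returns the pages it fetched. Illustrative only."""
    seen, queue, pages = {seed_url}, deque([seed_url]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # unreachable or non-HTML page: skip it
        pages[url] = html
        # Discover new pages purely through the links on this one.
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages

# Placeholder seed URL - swap in any site you're allowed to crawl.
print(list(crawl("https://example.com", max_pages=3)))
```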
Indexer
Every new web page discovered by the crawler is then passed on to the indexer, which deciphers the code and stores the information in massive databases, also known as indexes. Once deciphered, the indexer essentially tries to make sense of it all, analysing the pages retrieved by the crawler and assigning or ‘weighting’ relevance values for the content and information found.
The index of a search engine is intended to optimise the speed and performance of finding information on the web that is relevant to a user’s query. The index lets query engines (the third search engine process) recall and retrieve the most relevant website pages at a much faster speed.
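Under the hood, the workhorse data structure here is the inverted index: a map from each word to the pages that contain it. A minimal sketch follows (the page URLs and text are made-up examples; real indexes also store word positions, frequencies and relevance weights):

```python
from collections import defaultdict

def build_index(pages):
    """Toy inverted index: maps each word to the set of page URLs
    that contain it, so lookups by word are instant."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

pages = {
    "/seo-sydney": "seo services for sydney small business",
    "/web-design": "web design and development in sydney",
}
index = build_index(pages)
print(index["sydney"])  # both pages contain 'sydney'
print(index["seo"])     # only the first page contains 'seo'
```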
The task of storing untold billions of web pages that can be accessed in the blink of an eye is monumental, thus search engine companies such as Google, Bing, Yahoo and Ask operate huge data centres all around the world. These data centres use thousands upon thousands of inter-connected servers to process the information they retrieve, resulting in search engines displaying search results near-instantaneously.
Query Engine
The query engine is the last of the three major search engine processes, and involves retrieving information from the index, applying various ranking factors to each, and presenting a list of results to the user, displayed from the most to the least relevant / trusted. The query engine attempts to understand the intent and context of a search query, and supply the best results. Factors like device type, user location, search history, and query intent are all taken into account by the query engine in order to ensure that information shown to the user is of a high quality, factually correct, and is as relevant as possible.
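As a rough sketch of that third step: look up each query word in the index, tally a score per matching page, and return the pages best-first. The scoring here is a simple count of matched terms, whereas a real query engine blends hundreds of signals of the kind described above (the tiny hand-built index is illustrative only):

```python
def search(query, index):
    """Toy query engine: scores each page by how many of the query's
    words it contains, then returns pages best-first."""
    scores = {}
    for word in query.lower().split():
        for url in index.get(word, set()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

# Tiny hand-built inverted index: word -> pages containing it.
index = {
    "seo":    {"/seo-sydney"},
    "sydney": {"/seo-sydney", "/web-design"},
    "design": {"/web-design"},
}
print(search("seo sydney", index))  # ['/seo-sydney', '/web-design']
```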
Determining Relevance and Popularity
What really makes search engines stand out of course is not how they work, but how they assign values or ‘weight’ to every single piece of content stored in their massive indexes. In the early days of SEO, search engines would simply look for pages with the right words. Today, search engines implement machine learning algorithms that use thousands of factors to determine the relevance and popularity of a web page. These are known within the SEO industry as, ‘ranking factors’.
Google, for example, has over two hundred ranking factors and over ten thousand sub-ranking factors, which are all designed to improve the user experience and provide the most accurate and relevant search results for every query that is performed.
The Three Layers of a Holistic SEO Strategy
With a better understanding of how search engines work, we can now start talking about the three layers that make up a holistic SEO strategy, and how they align with search engine processes. This will allow you to better put all the pieces of the SEO puzzle together.
Let’s talk about the three layers in the order Google reviews them:
Technology
This refers to the technical elements of a website that allow search engines to crawl and index its pages. Such elements include the HTML code, XML sitemaps, etc. The platform a website is built on can also greatly influence how SEO-friendly it is. For example, websites built on the ASP.NET framework are notoriously problematic where SEO is concerned, and sites built with Flash (there are still some out there!) are all but invisible to most search engines, whereas websites built on the WordPress platform tend to be much easier to crawl and index. But again, even WordPress sites can be (and often are) built incorrectly, and can cripple your SEO efforts because of the way the back-end code is constructed.
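As one concrete example of those technical elements, an XML sitemap is simply a machine-readable list of your pages that helps crawlers find everything. The sketch below generates a minimal one in Python (the URLs are placeholders, and real sitemaps often include extra tags such as <lastmod>):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Generate a minimal XML sitemap listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs - substitute your own site's pages.
print(build_sitemap([
    "https://www.example.com.au/",
    "https://www.example.com.au/services/",
]))
```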
It’s worth noting that if a website can’t be properly crawled and indexed, it will rank much lower in your search engine of choice. Sydney small business owners are far too often ‘penny wise and pound foolish’ when it comes to getting their website built. And as in all things, you get what you pay for. So it is imperative to take the time to ensure that all technical elements of your website build are in place. Otherwise trying to rank the website will be like trying to run underwater. Sure, you’re expending a lot of energy; but due to the environment around you, your forward motion will be severely limited.
Relevance
Once your content is crawled, the indexer needs to understand what that content is all about. This is where basic, old-school SEO 101 comes in: optimised tagging, a robust internal linking structure, a well-planned information architecture (IA), well-written on-topic content, and so on.
But if this all sounds too much like techno-babble, and you – as a small business owner – just want to focus your energies on something tangible, then my advice is to focus your efforts on adding brilliantly written, on-topic content to your site, as this is the single most important SEO ranking factor today. However, don’t make the mistake of writing your web pages yourself just because you ‘know your product’ or because you wrote a few papers while at uni. SEO copywriting is an art form best left to the trained professionals, because there’s a bifurcation required when writing for the web. On the one hand you need to write content that Google will love. And on the other, you need to write content that humans will respond to. There’s no better way to lift the relevance of your website than by adding an endless supply of brilliantly written, on-topic content that actually adds something to the discussion of the topic (rather than just regurgitating what others have said, or – worse! – doing a blatant sales pitch and disguising it as ‘on-topic content’).
Authority
Okay, so let’s imagine you’ve had your small business website built correctly. Google can crawl it, index it, and make sense of your wonderfully erudite (and relevant) content. But now what? Well, I’m glad you asked. Because while all of the above is exceptionally important, it’s only half the SEO battle (well, maybe 70% of the battle if you want to get pedantic!). To be seen as trustworthy by the search engines, you need other websites to tacitly endorse your content. This endorsement comes in the form of backlinks from other websites to your home page and to the various internal pages within your site. The more relevant and trustworthy the links pointing at your website, the better the search engines will assume your content (on the page each link points to) is.
On-Page SEO v Off-Page SEO
On-Page SEO
As its name suggests, on-page SEO refers to optimisation techniques that can be applied directly to your website, and over which you have total control. In addition to ensuring that all of your technical elements are correctly in place, there are a few other aspects that can help improve your on-page SEO. These include optimised meta tags and title tags, clean URL structures, correctly formatted content, internal and external linking, your website’s information architecture, and more.
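If you want a quick sanity check on a couple of those on-page basics, a short script can pull out a page’s title tag and meta description so you can eyeball their lengths. This is a toy sketch using Python’s built-in HTML parser; the sample HTML is made up, and the commonly quoted limits of roughly 60 and 160 characters are industry rules of thumb, not official Google numbers:

```python
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Toy on-page audit: captures the <title> text and the meta
    description so their lengths can be checked against common
    rule-of-thumb limits (~60 and ~160 characters)."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        if tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """<html><head><title>SEO Sydney | Small Business SEO</title>
<meta name="description" content="Plain-English SEO help for Sydney small businesses.">
</head><body>...</body></html>"""

checker = OnPageCheck()
checker.feed(html)
print(len(checker.title), len(checker.meta_description))
```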
Off-Page SEO
Off-page SEO refers to the tactics you employ away from your website to boost your website’s trustworthiness (sometimes referred to as your site’s ‘Domain Authority’) in the eyes of the search engines. Common off-page techniques include link-building, social media optimisation (SMO), email marketing, local citations, press releases, etc.
Achieving the right combination of both on-page and off-page SEO will have a positive impact on your site’s ranking, visibility, and conversion rate.
Search Algorithms – What You Need to Know
There are many search engines, and they all have thousands of algorithms underpinning how they work. But what exactly are algorithms? And which ones should you, as a small business owner, care about? Firstly, let’s just cut to the chase and say that the only search engine anyone (in the western world) needs to focus on, from a marketing perspective, is Google. That isn’t an anti-Microsoft bias, or a pro Google bias on my part. That’s just me demonstrating my predilection for cold hard facts:
Search Engine Market Share
Eight out of every ten people in the western world use Google to search for their information online. You know it, I know it, and your customers know it. If you’re searching online in China, Google is a bit-player and you’d likely be heading over to Baidu, Qihoo 360 or Sogou. If you’re searching online in Russia, then Yandex is likely to be your go-to search engine of choice. But for most English-speaking countries (and certainly for all your potential customers in Australia), Google is far and away the dominant search engine people use to find what your small business does, and where you do it. Yahoo, for all its historical dominance, is a spent force with just 5.55% of the market (and is actually powered by a Bing back-end), and Bing gets a paltry 7.15% of the market (with most of that due to the fact that Bing is the default search engine installed with Windows 10).
So for the purposes of this article, let’s limit our discussion to ‘What You Need to Know About Google’s Algorithms’. Because, as mentioned, there are multiple search engines, each with thousands of algorithms powering them, so it’s easy to get lost in techno-babble and obfuscation when having a nerd-fuelled discussion about them all. But if we focus on Google alone, and weigh its algorithms by their direct relevance to the small business owner trying to eke out a living by gaining traffic to their website, we can narrow the list down to the algorithms small businesses need to focus on.
What’s in a Name?
Google rarely, if ever, does anything that helps small businesses in Sydney (or anywhere else, for that matter) work out which algorithms they need to care about and which they can safely ignore. But in this case the REALLY IMPORTANT ALGORITHMS are easy to spot, because they’ve been given cute animal names:
- Panda.
- Penguin.
- Hummingbird.
- Pigeon.
Elsewhere on the SEO North Sydney website there are more detailed articles on what these algorithms do (click on the above bullet points to be taken to the in-depth articles on each topic). But for the busy small business owner wanting just the facts, an overview follows:
Google’s Panda Algorithm Explained
An algorithm designed by (and named after) Google machine-learning engineer Navneet Panda that can, for all intents and purposes, read a page of content on the web like a human being. Panda can tell great content from good, and good content from bad. Originally released in February 2011, Panda knows if your writing is university level, high school level, middle school level, or Donald Trump level. Panda was designed to combat the plethora of low-quality websites that were ranking in Google purely on the back of having lots of links pointing at them, rather than having brilliantly written original content that answered the user’s question. After numerous updates, Panda was absorbed into Google’s real-time core search algorithm in 2015.
A website with a low Panda score will be penalised in the SERPs (Search Engine Results Pages). Panda takes into account the design of your website, its UX (or User Experience), and so on, but it is 90% concerned with the words on the page (screen). Panda wants to know what written content is on your website, how well it is written, how well it is SEO’d, and how much knowledge is expressed both vertically and horizontally on the topics being discussed. ‘Content is King’ is a truism in Google search that I’m sure you, as a small business owner, know already. But it is imperative you realise that in the post-Panda world, not all content is created equal.
Google’s Penguin Algorithm Explained
An algorithm unfortunately not created by a guy called Mumble Penguin. It was, however, created by a team of faceless Google engineers to combat dodgy link-building practices, over-optimisation, keyword stuffing and the various other nefarious activities websites used to employ to gain rankings in contravention of Google’s Best Practice Webmaster Guidelines. Having gone through several iterations since it was originally launched, Penguin was officially absorbed into Google’s real-time core search algorithm in October 2016.
Remember how spammy backlinks were common practice in the early days of Google? Well, that doesn’t work anymore. No longer is it about having more links than the next guy, because too many of the wrong types of links (now known as ‘toxic links’) can actually get your website penalised by Penguin. Penguin penalties come in two forms: algorithmic and manual. The former can seriously damage your rankings on specific keywords and specific pages of your website, and is an absolute nightmare to fix. The latter, however, is devastating. So the next time you think about hiring some cheap Sydney SEO company, or think outsourcing your SEO to India (don’t do it!) is the way forward, imagine how your small business would survive if your website all but VANISHED FROM GOOGLE for six to nine months due to a Penguin penalty. Because while you go through the arduous, painful and time-consuming disavow process for the better part of a year, in a vain attempt to convince Google that, while you were caught with your hand in the cookie jar, you promise to be good from now on…your competition is enjoying being on page one of Google and banking all the money your company should have been making.
And all for what? So you can save a few dollars on your monthly SEO bill? As that old saying goes: ‘If you think it’s expensive to hire an expert, wait until you find out how much it costs to hire an amateur.’
Dodgy back-linking violates Google’s Webmaster Guidelines. So don’t try to cheat the system by buying or manufacturing links. Create brilliantly written, engaging content, and earn the links you get. That way the links you get will be the kind that Google loves.
Google’s Hummingbird Algorithm Explained
An algorithm released in September 2013 that has been likened to the Google Caffeine update of 2010 (which provided ‘50 percent fresher results’ and was in essence a turbo-charger strapped onto the Google search engine, making indexing the web twice as fast). Hummingbird was originally created to ‘read between the lines’ of Google searches, to better make sense of the actual question being asked rather than just targeting the keywords being used. It also focused on giving better and more accurate ‘long tail’ results (a long tail search is one that uses several keywords, not just one or two). Because, despite what you as a small business owner might think, 70% of all search is long tail, rather than the short tail, or ‘trophy searches’, that most small business websites try to target.
Hummingbird doesn’t come with a penalty; that’s not what it’s about. Hummingbird is about increasing the relevancy and quality of search results by better understanding the user’s intent. With Hummingbird, Google is better able to judge what a user is actually asking for, rather than taking the question at face value or relying solely on keywords. For example, a small business owner in Sydney might type ‘SEO Sydney’ into Google, which is one keyword followed by another. But they might also type ‘How do I get my small business website on the first page of Google?’ – a search that doesn’t contain the keywords ‘SEO’ or ‘Sydney’. With Hummingbird, Google knows that both searches (if they’re performed from an IP address in Sydney) are looking for the exact same thing. This is intelligent search at its best.
Google’s Pigeon Algorithm Explained
An algorithm originally released in 2014, Pigeon targets local search, with a focus on increasing both the relevancy and accuracy of local search queries. Pigeon integrates with Google Maps, utilising many hundreds of organic ranking signals (including Google’s Knowledge Graph) to show quality sites optimised for (and operating in) specific ‘local’ geographic areas. So if you want your small business website to show in the Google Maps ‘3 Pack’, you’re going to need to learn to love Pigeons! Note: this algorithm wasn’t actually named ‘Pigeon’ by Google; it was named by Search Engine Land (a US-based SEO thought leader), because Google had released a new algorithm sans name (an all too common occurrence) and because ‘pigeons tend to fly back home’. Much like Hummingbird, Pigeon doesn’t penalise websites; what it does do is increase the rankings of local listings in local search, making it an extremely powerful tool for small businesses to harness. From a small business or SME’s perspective, Pigeon means better rankings, more website traffic, and a lot more local customers.
Are There Any Other Algorithms I Should Worry About?
Yes. Thousands of the buggers. From EMD to Mobile Friendly, to Payday Loans to RankBrain to the Quality Update and on and on it goes. Where it stops, nobody knows!
Google makes five hundred to six hundred algorithm updates a year. That’s roughly 1.6 per day, so good luck keeping up with the changing rules of search if you’re not doing it full-time. Google is constantly refining the way it does search and constantly changing the rules. What worked yesterday may not work tomorrow. And worse, what worked yesterday may get your website penalised today.
So where does that leave the small business owner? In good hands, that’s where. Because while search isn’t rocket science, it IS a science. And given that online is where most of your business comes from (or at least, is supposed to come from!), what you as a small business owner need to understand is not which Google algorithm does what, but which SEO expert to trust to help you safely navigate the choppy waters of online marketing.
And that’s where SEO North Sydney comes in. With two decades’ experience specialising in getting small businesses and SMEs on the first page of Google, SEO North Sydney have the knowledge and expertise to get the Google first-page results your business needs.