Like many other disciplines of marketing, SEO is a highly competitive field that involves a multitude of technicalities and nuances, and a body of knowledge that is constantly changing. By definition, search engine optimization involves understanding the mechanisms by which search engines rank websites and pages. While search engines like Google provide guidelines on how to better optimize your content for them, the exact algorithms are not known to the public, which is why so much about SEO is shrouded in mystery.
That being said, there is tons of information (and, of course, misinformation) derived from empirical evidence, historical data, and firsthand information from search engines’ own documentation that we can use to improve our understanding of how search engines work.
In this article, we try to sift the well-established facts from the noise and shed some light on the most fundamental concepts that will put you on the right path to better SEO performance.
For convenience, we focus on Google, as it’s the most prominent search engine in most parts of the world, but these concepts apply to other search engines such as Bing, Yahoo, and DuckDuckGo; the basic ranking principles don’t differ all that much.
How Google Search Works
Let’s briefly go over the basics of how Google Search works; to optimize for anything, we should first have a thorough understanding of what it is and how it functions.
The overall purpose of a search engine is to find and provide the most relevant and reliable information in response to users’ queries. To be able to do this, Google first needs to catalog as much of the information present on the web as possible. Using an automated program called a web crawler (Googlebot), Google regularly explores the web, following links from one site to another (among other discovery methods) to find new or updated websites and pages to add to its index.
Google then analyzes each page, trying to understand what the page is about and what information it contains. The result of this analysis is stored in a database, ready for access. When a searcher makes a query using keywords or other input methods like voice (which, for now, are ultimately converted to text), Google sifts through the hundreds of billions of webpages in its index to find and present the highest-quality information, or sources of information, in terms of usefulness, relevance, and reliability.
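To make this crawl-discover-analyze loop more concrete, here is a minimal sketch in Python of how a generic crawler might work. It is only a toy illustration of the concept (it relies on the third-party requests and BeautifulSoup libraries), and Googlebot’s real pipeline, with its politeness rules, robots.txt handling, page rendering, and deduplication, is vastly more sophisticated.

```python
from collections import deque
from urllib.parse import urljoin, urldefrag

import requests
from bs4 import BeautifulSoup


def crawl(seed_url, max_pages=50):
    """Toy crawl-and-index loop: fetch a page, record its text, follow its links."""
    index = {}                  # url -> extracted text (a stand-in for the search index)
    queue = deque([seed_url])   # frontier of URLs discovered but not yet fetched
    seen = {seed_url}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue            # skip unreachable pages
        if resp.status_code != 200 or "text/html" not in resp.headers.get("Content-Type", ""):
            continue

        soup = BeautifulSoup(resp.text, "html.parser")
        index[url] = soup.get_text(" ", strip=True)   # the "analysis" step, vastly simplified

        # Discovery step: follow links on this page to find new pages to fetch.
        for a in soup.find_all("a", href=True):
            link = urldefrag(urljoin(url, a["href"]))[0]
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return index


pages = crawl("https://example.com", max_pages=10)
print(f"Indexed {len(pages)} pages")
```

The essential idea survives the simplification: pages are discovered through links, fetched, analyzed, and stored so they can be retrieved later in response to queries.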
Usefulness, Relevance, and Reliability
To keep its users satisfied, a search engine needs metrics and criteria in place to cut through the noise and assess and prioritize the most suitable information out of the bewildering ocean of content available on the web.
This is where SEO comes into play as companies, organizations, and entities of any kind partake in a tight race against each other to move up the priority list for specific queries in order to generate traffic to their website.
There are many signals and factors that Google takes into account in its algorithms when searching for answers to queries. However, from an SEO point of view, they can be roughly divided into three categories:
- Relevance of the content
- Quality of the content gauged by the source’s level of expertise
- Technical aspects of the site and how they meet the search engines’ requirements
In the SEO universe, these are the subjects of the three major components of SEO, which are respectively called on-page SEO, off-page SEO, and technical SEO.
On-page SEO
On-page SEO covers a broad range of SEO practices applied to optimize web pages to deliver maximum usability and the best possible experience to users. On-page SEO centers around content, its quality, and how well it satisfies searchers’ intent. It also involves streamlining page content so that Google can crawl the pages of your website and analyze their content more easily. Since usability is a subjective concept, it may mean different things to different audiences. This is where we get into the territory of SEO strategy.
SEO Strategy
SEO is, at its core, an extension of marketing and as such it should be geared toward marketing and business goals. What is it that you are trying to achieve? Do you want to increase your market share? Do you want to grow your online sales or are you primarily focused on building brand awareness?
You need to define your target audience. Who are they? What interests them and attracts their attention? What are they looking for? Answering these questions helps you understand the search demand you are addressing, and it should precede any attempt at adopting an SEO strategy or building content.
You need to empathize with your audience and see things from their perspective, research the potential keywords they use the most, prioritize them based on the competitive landscape and keyword difficulty, and then structure your content strategy around them. This is how you can give your target users what they expect and satisfy searchers’ intents.
Long-tail vs Fat-head Keywords
A big part of an effective SEO strategy is prioritizing which keywords to target. If you are a small or new company, chances are you are not going to rank for fat-head keywords, i.e., short, popular phrases, as there is fierce competition for them among industry giants with much higher authority.
For example, if you run an interior design firm, you are much less likely to rank for “interior design” than for “Hamptons style interior design”. Fat-head keywords have much higher monthly demand, higher competition, and broader intent, whereas long-tail keywords are more associated with niche markets and have less monthly demand and competition but higher conversion rates. This is a spectrum, and every keyword list should contain healthy proportions of keywords from both ends and the range in between.
There are also other important metrics associated with keywords that can potentially impact SEO strategy. Aside from monthly search volume and keyword difficulty, keywords also have different organic click-through rates and varying business value, meaning they can be more or less focused on conversion and commerce.
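One way to reason about these trade-offs is to fold the metrics into a rough priority score and rank your keyword list by it. The sketch below is purely illustrative: the formula, the weights, and the example numbers are assumptions made up for demonstration, not output from any real keyword tool.

```python
from dataclasses import dataclass


@dataclass
class Keyword:
    phrase: str
    monthly_volume: int    # estimated monthly searches
    difficulty: float      # 0-100, higher = harder to rank for
    organic_ctr: float     # 0-1, expected share of clicks going to organic results
    business_value: float  # 0-1, subjective closeness to conversion


def priority_score(kw: Keyword) -> float:
    """Illustrative priority formula: expected organic clicks, scaled by business value.
    Difficulty is penalized quadratically so very competitive terms drop sharply
    (an arbitrary modeling choice, not an industry standard)."""
    expected_clicks = kw.monthly_volume * kw.organic_ctr
    return expected_clicks * kw.business_value * (1 - kw.difficulty / 100) ** 2


keywords = [
    Keyword("interior design", 90_000, 85, 0.55, 0.3),                 # fat-head: huge volume, brutal competition
    Keyword("hamptons style interior design", 1_900, 30, 0.70, 0.8),   # long-tail: niche, easier, higher intent
]

for kw in sorted(keywords, key=priority_score, reverse=True):
    print(f"{kw.phrase}: {priority_score(kw):,.0f}")
```

In practice, you would pull volume, difficulty, and click-through estimates from your keyword research tool of choice and tune the formula to your own business goals.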
Off-page SEO
We have talked about how Google ranks pages based on the relevance of their content to user queries. But relevance in and of itself is not enough. What if a piece of content addresses your question but gives you the wrong information?
Search engines need a way to distinguish valid sources in specific fields and rank them by measuring their level of expertise. As a solution, Google adopted a quantitative measurement method similar to what is used to evaluate scientific publications in the research literature.
In bibliometrics, citations are used as performance indicators. The underlying assumption is that the quality of a research paper is positively associated with the number of citations it gets from other articles and researchers.
In a similar fashion, Google developed a now-famous algorithm called PageRank (named after Larry Page, a co-founder of Google) that counts the number and quality of links to a page to roughly estimate its importance. Again, the assumption is that websites and pages of higher quality receive more links than others. In PageRank, not all links are equal: links from important sites carry more weight than links from less reliable sites. They have more “link juice,” which means they pass on more credibility to the linked site.
After calculating the number and quality of the inbound links a page receives, PageRank assigns the page a score, which Google historically reported to the public on a 0-10 scale. The higher a page’s score, the more likely it is to rank well and appear on the search engine results page (SERP). In this system, historical performance matters: new websites with no backlinks and low authority have to fight an uphill battle to outrank competitors who have earned years’ worth of backlinks and authority.
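For the curious, the originally published formulation of PageRank is simple enough to sketch in a few lines of Python. This is the textbook version only; Google’s production ranking layers many more signals and refinements on top of it, and the example graph below is made up for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a tiny link graph.
    `links` maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1 / n for p in pages}   # start with equal rank for every page

    for _ in range(iterations):
        new_ranks = {}
        for page in pages:
            # A page's rank is a small baseline share plus contributions from every page
            # linking to it, each contributor splitting its own rank across its outbound links.
            inbound = sum(
                ranks[src] / len(outs)
                for src, outs in links.items()
                if outs and page in outs
            )
            new_ranks[page] = (1 - damping) / n + damping * inbound
        ranks = new_ranks
    return ranks


# Page C receives links from both A and B, so it ends up with the highest rank.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```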
Although off-page SEO involves a broader range of activities performed outside the boundaries of a website to improve its ranking on SERPs, much of it revolves around building backlinks and influencing how other sources talk about a website in the online space.
MOZ DA and Other Ranking Metrics
Over the years, Google gradually stopped updating and publicly reporting the PageRank of webpages, and while PageRank is still used in Google’s internal calculations, that information is no longer available to us. This created demand for other SEO tools and metrics that help compare one website’s authority to another’s. Companies like MOZ have tried to emulate PageRank by using methodologies similar to those developed by Google.
MOZ’s ranking system is centered around Domain Authority (DA), a scoring system that assigns website domains a number on a scale of 0 to 100 after evaluating a plethora of factors derived from their inbound link profiles. Domain Authority is used to predict how well a website’s pages may perform in Google rankings compared to the pages of other websites. MOZ also measures Page Authority (PA), a metric similar to Domain Authority but for a single page.
DA and PA are comparative rather than absolute metrics. This means they are most useful for comparing the ranking potential of one website or page against that of others.
Other popular SEO companies like SEMrush and Ahrefs also provide similar proxy metrics under slightly different names. It is crucial to note, however, that these are all statistical metrics used to analyze and track SEO performance; they are not ranking factors or signals that Google uses. For instance, DA can be used as a KPI to track link-building performance over time, or you can use PA and DA to estimate the value of a link from a given page to your site.
Technical SEO
Technical SEO refers to the process of improving a website’s ranking on search engine results pages by optimizing its technical aspects and ensuring they meet the search engines’ requirements. The goal of technical SEO is to make it easier for search engines to crawl and analyze a website’s content.
Technical SEO involves the security of the communication protocol (HTTPS), HTML code and meta tags, mobile-friendliness, website and URL structure, server response time, redirects, and much more.
It is crucial to include technical SEO in your overall SEO strategy as almost all of the off-page and on-page SEO endeavors rely on a healthy and well-optimized website. Technical issues can hinder SEO performance in a variety of ways.
For instance, slow page load speed causes dissatisfaction among users and increases the bounce rate and the occurrence of so-called pogo-sticking behavior. This is why Google encourages faster load times and penalizes slow websites in its rankings. Slow-loading pages might be a design problem, but they may also stem from hosting issues.
Another example is mobile optimization. More than 60 percent of Google’s search traffic originates from mobile devices, and Google has explicitly stated that it indexes the mobile versions of websites first. This means that a website with poor mobile optimization may lose traffic due to content disparities between its mobile and desktop versions, or other issues that hamper Google’s crawlers.
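As a rough illustration, a short script can surface a few of these technical signals for a single URL: whether it resolves over HTTPS, how long its redirect chain is, the server response time, and whether a mobile viewport meta tag is present. The sketch below (using the requests and BeautifulSoup libraries) is a simplified diagnostic, not a substitute for tools such as Google Search Console or Lighthouse; note that resp.elapsed reflects server response time rather than full page load time in a browser.

```python
import requests
from bs4 import BeautifulSoup


def technical_snapshot(url):
    """Rough diagnostic of a few technical signals: HTTPS, redirects,
    server response time, and a mobile viewport meta tag."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    soup = BeautifulSoup(resp.text, "html.parser")

    return {
        "final_url": resp.url,
        "uses_https": resp.url.startswith("https://"),
        "redirect_chain": [r.status_code for r in resp.history],  # e.g. [301] if the URL redirected once
        "response_time_s": resp.elapsed.total_seconds(),          # server response time for the final request
        "has_viewport_meta": soup.find("meta", attrs={"name": "viewport"}) is not None,
        "status_code": resp.status_code,
    }


print(technical_snapshot("https://example.com"))
```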
Search engine optimization is a vast and ever-evolving field full of subtleties that each require in-depth study and understanding. Because of constant changes and updates to search engines’ algorithms, things that work today may not be effective tomorrow. It is important to remember that SEO does not yield results overnight; rather, it’s a mid- to long-term investment that, if done properly, can be invaluable for a modern business.