SEO 101

Search Engine Optimization, or SEO, is the collection of techniques by which a website attempts to increase its traffic by gaining high placement in the results of internet searches.  SEO is a key component of any internet marketing plan.

Search engines should return results that are most meaningful to their users.  To do this, they employ various algorithms.  SEO is intended to improve the likelihood that a site is found by the search engine and that it has a high degree of relevance to the search.

Some sites try to ‘game’ the system to fool search engines into returning their link even when it does not meet the legitimate intent of the search engine’s algorithm.  This is known as ‘black hat’ SEO and, if discovered by the search engine company, could result in the site’s being removed from search results entirely.  This article is focused on legitimate, or ‘white hat’, SEO, and will examine SEO in the Google ecosystem.

SEO is ‘white hat’ as long as it conforms to the search engines’ guidelines and involves no deception.  Legitimate SEO ensures that the content a search engine indexes and ranks is the same content a user will see.

SEO also tries to rank pages that are useful to the end user, offering relevant content, links, and CTAs (calls to action) that guide the user through the website.

On-Page SEO

On-page SEO refers to the measures a site owner takes, within the site itself and its code, to improve the site’s ranking.  Examples of these measures and techniques are given below.

Meta Tags:  An HTML tag is a syntactic element that normally controls the formatting and display of the website.  A meta tag is a special element that provides information about the web page, such as the page’s author, how often the page is updated, its keywords, and what the page is about. Google uses some of this information when building its indices; the description, in particular, can appear as the snippet shown under your link in search results.
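
As a minimal illustration (the site name, description, and author below are invented), the head of an HTML page might carry meta tags like these:

    <head>
      <title>Example Bakery | Fresh Sourdough Daily</title>
      <!-- The description often appears as the snippet under your link in search results -->
      <meta name="description" content="Family-owned bakery offering fresh sourdough, pastries, and custom cakes.">
      <!-- Author and keywords are informational; Google gives them little or no ranking weight -->
      <meta name="author" content="Jane Baker">
      <meta name="keywords" content="bakery, sourdough, pastries">
    </head>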

Page Content: This is the information, articles, etc. that a website contains and for which the user has searched. It must be original, not copied. Each article should be at least 300 words long, and it should be related to its keywords. In the world of SEO, content is king.

Outbound Links: These are the links that point a user to an external site. Without external links, a website becomes a dead end and its value to the user is reduced. However, outbound links should only point to quality sites with related and relevant information. Links to ‘spammy’ or low-quality websites reduce traffic. You should limit the outbound links on your site if possible, and those outbound links you do have should point to authority sites in your field.

Internal Links: Internal links, or inter-links, are those that point the user elsewhere within the same site. They help the user navigate the site, they establish the internal architecture of the site, and they spread ranking authority across its pages. Internal links also allow search engines to find all the pages on the site – and all of its content. Hidden or buried links adversely affect a website’s ranking.

Site Map: A site map tells search engines about your site and where to find all of your content. An XML site map is the format search engines prefer, providing an easy-to-read list of links for their agents to index, while an HTML site map helps human visitors; a website should have both. A good site map gives search engine agents a pathway to follow, with every page on the site listed.
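
A minimal XML site map using the standard sitemaps.org schema might look like this (the URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2019-11-01</lastmod>       <!-- when the page last changed -->
        <changefreq>weekly</changefreq>     <!-- a hint to crawlers, not a command -->
        <priority>1.0</priority>            <!-- relative importance within this site -->
      </url>
      <url>
        <loc>https://www.example.com/services</loc>
        <lastmod>2019-10-15</lastmod>
      </url>
    </urlset>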

Robots File: This is a text file that tells search engine agents, or robots, how to crawl and index pages on a website. If it is improperly set up, you might block search engines from being able to see your site at all.
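
A simple robots.txt, placed at the site root, might read as follows (the paths are examples only):

    # Let all well-behaved crawlers in, but keep them out of admin pages
    User-agent: *
    Disallow: /admin/

    # Caution: "Disallow: /" on its own would block the entire site

    # Tell crawlers where the XML site map lives
    Sitemap: https://www.example.com/sitemap.xml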

Off-Page SEO

Off-page SEO is the process of building links on external sites back to a site’s web pages.  Three link-building strategies are:

Varied Anchor Text: Anchor text is the visible text a linking page uses to describe a site’s content – normally a text hyperlink. If the anchor text repeats the site’s keywords in exactly the same way over and over, a Google spam filter will flag it.  The anchor text has to be varied.
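
For example, three pages linking to the same article might vary their anchors like this instead of repeating one keyword phrase (the URL and wording are invented):

    <!-- Exact-match anchor, used sparingly -->
    <a href="https://www.example.com/seo-guide">SEO guide</a>

    <!-- Natural-language, partial-match anchor -->
    <a href="https://www.example.com/seo-guide">a helpful introduction to search optimization</a>

    <!-- Branded anchor -->
    <a href="https://www.example.com/seo-guide">the Example.com beginner's guide</a>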

No Follow vs Do Follow: Do-follow links are just what the term says: links that can be counted and used to increase a site’s Google rank.  However, ‘black hat’ optimizers have abused the feature by adding spurious links throughout the net.  No-follow links are those with a nofollow attribute that effectively cuts the link for ranking purposes.  Google suggests this attribute be used for paid links, in comments, in forums, and on “untrusted” content.
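
In HTML the difference is a single rel attribute; a link without it is treated as do-follow by default (the URL is a placeholder):

    <!-- Do-follow (the default): passes ranking credit to the target -->
    <a href="https://www.example.com/">Example</a>

    <!-- No-follow: tells Google not to count this link -->
    <a href="https://www.example.com/" rel="nofollow">Example</a>

Google has since introduced the related rel="sponsored" and rel="ugc" values for paid links and user-generated content, respectively.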

Niche Link Building: Niche websites are those targeted at niche segments of larger markets.  Niche Link Building is connecting a website to other websites that are relevant to the content and theme, that are reliable, and that are established.

Tracking SEO

Among the metrics used to gauge SEO success, two are considered here:

PageRank: This is a Google algorithm that assigns a number to rank websites. From Google:

“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.”

It is not the only measurement Google uses, but it is the most visible. The algorithm is mathematically complex; it will yield a value between 0 and 10, with 0 being the worst and 10 the best. Values greater than 6 are difficult to obtain. A 10 would only go to something like Facebook.
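
For reference, the simplified formula from the original PageRank paper is:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1 through Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, typically set around 0.85. The public 0-10 score is generally believed to be a roughly logarithmic compression of this underlying value.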

Long-Tail Keywords: A long-tail keyword is very specific to a particular internet search. It may have much lower search volume, with less traffic generated, but there is also less competition for a page optimized for it – it is easier to rank. For example, a search for ‘exercise class’ will return many responses, while a search for ‘exercise class for senior women’ will return fewer, more specific responses.

Google Updates

Google has released a major ranking algorithm update roughly every year since 2010, and it has continued to release regular changes to these updates even years after their initial release:

  • 2010 – Caffeine
  • 2011 – Panda
  • 2012 – Penguin
  • 2012 – EMD
  • 2013 – Hummingbird
  • 2014 – Payday
  • 2014 – Page Layout Algorithm
  • 2014 – Pigeon
  • 2015 – Mobile
  • 2015 – RankBrain
  • 2016 – Possum
  • 2017 – Fred
  • 2017 – Core Updates
  • 2017 – Intrusive Interstitials Update
  • 2018 – Medic
  • 2019 – BERT

Google makes hundreds of smaller changes yearly. There have been issues with every release, but anyone involved in SEO must take changes in Google’s approach seriously and respond to them quickly.

Caffeine

The Caffeine update was a complete rebuild of the search index; Google essentially retooled the way it built and maintained the index.  The main goals of Caffeine were to increase the speed at which Google returned search results, to increase the size of the index so it could keep track of more sites, and to create a “smarter” algorithm for returning better results.

Penguin

The Penguin release was targeted at reducing the ranking of websites that violate Google’s Webmaster Guidelines.  Penguin affected 3.1% of English searches.  Google provided a feedback form for those who wished to report a ‘spammy’ site that was still highly ranked, and for those who believed their own site had been unfairly penalized.

Mobile

Google is rolling out a mobile-first index quaintly referred to as Mobilegeddon 2017 (following the Mobilegeddons of 2015 and 2016). The name “Mobilegeddon” dates from Google’s 2015 algorithm change, in which Google began giving preferential search placement to mobile-friendly sites. As was the case with the first Mobilegeddon, your site’s effectiveness and search results will suffer unless you are prepared.

Intrusive Interstitials Update

On August 23, 2016, Google announced an upcoming change that would target intrusive interstitials and pop-ups that hurt the search experience on mobile devices. As promised, this update rolled out January 10, 2017. The impact of this update on rankings was minimal.

Page Layout Algorithm

Google’s Matt Cutts announced a refresh of the page layout algorithm. No changes to the algorithm were mentioned – it appeared Google simply reran the algorithm and updated its index.

Medic

The Google Medic update seemed to disproportionately affect medical websites, as well as other websites tied to potentially life-altering decisions (finance, law, education). Although not explicitly confirmed, Google representatives have hinted that the update implemented some of the E-A-T (expertise, authoritativeness, trustworthiness) signals from the Quality Rater Guidelines document.

RankBrain

RankBrain is a component of Google’s core algorithm that uses machine learning (the ability of machines to teach themselves from data inputs) to determine the most relevant results to search engine queries. Pre-RankBrain, Google utilized its basic algorithm to determine which results to show for a given query. Post-RankBrain, it is believed that the query now goes through an interpretation model that can apply possible factors like the location of the searcher, personalization, and the words of the query to determine the searcher’s true intent. By discerning this true intent, Google can deliver more relevant results.

Fred

What is Google Fred? Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization: an overload of ads, low-value content, and little added user benefit. This does not mean all sites hit by the Google Fred update are dummy sites created for ad revenue, but (as Barry Schwartz noted in his observations of Google Fred) the majority of websites affected were content sites carrying a large number of ads that seem to have been created to generate revenue rather than to solve a user’s problem.

Panda

The intent of the Panda update was to lower the rank of low-quality sites, to rank quality sites higher, and specifically to down-rank sites that provided a poor user experience.  Testers rated thousands of websites on their quality across a number of factors, and the results were fed into an artificial intelligence engine so that it could ‘learn’ quality.  One of the initial issues was that some plagiarists were ranking better than the content originators; this was addressed, however.  The major changes in Panda were that 1) an entire site, rather than specific pages, could be affected in the rankings, and 2) an over-optimization penalty was enacted.

Hummingbird

Unlike Panda and Penguin, which were modifications to Google’s existing search algorithm, Hummingbird was an entirely new search algorithm.  It is designed for greater precision and attempts to interpret the user’s intent rather than matching individual search terms.  This ‘semantic’ search approach means that SEO must be even more attuned to the user’s intent.  Many experts believe Hummingbird has little impact on ‘white hats’ while improving the search user’s experience.  Hummingbird has been in use since August 2013.

EMD

Google’s Exact Match Domain (or EMD) algorithm update focused on ridding the SERPs of spammy or low-quality exact match domains.

BERT

This Google algorithm update uses natural language processing technology to better understand search queries, interpret text, and identify entities and the relationships between them. We’ve seen the Panda, Hummingbird, and RankBrain updates move away from keywords, and the BERT update is the culmination of this effort – it allows Google to understand much more nuance in both queries and search results.

Core Updates

Since 2017, Google has referred to its bigger updates as Google core updates. With this has come even less transparency about what those updates are and which parts of search they are intended to improve. SEOs often track post-update ranking shifts and try to figure out what exactly changed, but there is rarely a conclusive answer. It is likely that Google core updates are improvements on previous Google updates, or perhaps bundles of smaller updates tied together.

Payday

Google’s Payday Loan update 3.0, which largely was focused on targeting spammy queries, also included better protection against negative SEO attacks.

Pigeon

Pigeon is a Google search engine update that affects local search results. The algorithm change focuses on providing more accurate, relevant results for local searches. Initially rolled out in the US in late July 2014, the effects of Pigeon have since been noticed in other countries’ search results. So what is Pigeon and how does it affect search results?

What we can surmise is that, like Hummingbird, Pigeon is a core change in how the Google algorithms present local search results. While there do not appear to be penalties associated with the update, some local results have shifted. Any site that targets a local market – big or small – should take note, however, as search visibility may be affected; in some cases a business was dropped from the results entirely.

Possum

Initial release date: September 1, 2016

“Possum” is the name given to an unconfirmed but documented update that appeared to most significantly impact Google’s local pack and local finder results. Because the update was never officially confirmed by Google, local SEOs have been left to hypothesize about the potential update’s purpose and concrete effects.

Google Updates by Year

2010

This update mainly targets sites with thin content and is later confirmed by Matt Cutts. Webmasters see a drop in long-tail keyword traffic.

The New York Times calls out the e-commerce site DecorMyEyes, finding that its high rankings were driven by a large number of negative reviews. Google changes its algorithm to take negative versus positive sentiment into consideration.

2011

Google releases the Panda update, which targets content farms, ad spam, and other quality issues. This update affects up to 12% of search results, according to Google.

Google encrypts its search queries for privacy reasons. This does not affect rankings, but it becomes extremely hard for search marketers to track organic keywords because many now show up as “(not provided).”

2012

Google continues to make Panda updates throughout the year to refine the algorithm.

Alongside this, they take a step toward semantic search. Google rolls out its “Knowledge Graph,” an update to its search engine that provides instant answers to users without their having to click through to a website.

2013

Google expands its Knowledge Graph and begins using semantic markup like schema.org to determine additional search results, such as in-depth articles.

Google Hummingbird is released – a core algorithm update that affects the Knowledge Graph and semantic search.

2014

An algorithm change (the Payday Loan update) is implemented to deal with spammy queries.

Google makes a change to the way local results are displayed through its Pigeon update. Pigeon integrates local results with the rest of the search results without fully assimilating them into the core algorithm.

2015

Google releases its mobile update, telling us that mobile rankings will differ based on mobile optimization. This update heavily penalized unoptimized websites in rankings and sometimes cut their search traffic in half.

Another quality update is believed to have been released early in the year, but Google never makes an announcement.

2016

Google Panda is assimilated into the core algorithm, making it a permanent part of the search engine.

The core algorithm continues to evolve so Google can serve its users quality content. Expect more major updates in the coming years as Google continues to refine its search engine.

2017

Google makes an update to its algorithm to impact sites that are ad-heavy with low-quality content focused on revenue generation. This “Fred” update is another sign that Google is making good on its word to reward sites that provide valuable content that helps users rather than just earning a quick internet buck.
