Search engine optimization, or SEO, is a collection of techniques whereby websites attempt to increase traffic by gaining higher placement in a search engine's results. SEO is a key component of any internet marketing plan.
Search engines should return results that are most meaningful to their users. To do this, they employ various algorithms to rank the pages most relevant to the search query. SEO is intended to improve the likelihood that a site is found by the search engine and that it has a high degree of relevance to the search.
Some sites try to ‘game’ the system to fool search engines into returning their web page even if it may not meet the legitimate intent of the search engine’s algorithm. This is known as ‘black hat’ SEO and, if discovered by the search engine, could result in the site being removed from search results entirely. This article is focused on legitimate, or ‘white hat’, SEO, and will examine SEO in the Google ecosystem.
SEO is ‘white hat’ as long as it meets the search engines’ guidelines and avoids deception. Legitimate SEO intends for the content a search engine indexes and ranks to be the same content that a user will see.
SEO also aims to rank pages that are useful to the end user, offering relevant content, links and CTAs (calls to action) that guide the user through the website.
On-page SEO refers to the measures a site owner takes to improve the site’s ranking through factors under their own control, such as the site’s content and coding. Examples of these measures and techniques are given below.
Meta Tags: An HTML tag is a syntactic element that normally controls the structure and display of a web page. A meta tag is a special element that provides information about the web page, including the page’s author, how often the page is updated, the page keywords, and what the page is about. Google uses this information to build its indices.
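As a quick sketch (the site name and description here are invented placeholders), these tags live in the HTML head of the page:

```html
<head>
  <!-- The page title, typically shown as the headline of a search result -->
  <title>Senior Fitness Classes | ExampleGym</title>

  <!-- A short summary that search engines may display beneath the title -->
  <meta name="description" content="Low-impact exercise classes for senior women, taught by certified instructors.">

  <!-- Tells search engine robots to index this page and follow its links -->
  <meta name="robots" content="index, follow">
</head>
```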
Page Content: This is the information, articles, etc. that a website contains and for which the user has searched. It must be original, not copied. Each article should be at least 300 words long and related to its keywords. In the world of SEO, content is king.
Outbound Links: These are the links that point a user to an external site. Without external links, a website becomes a dead end and its value to the user is reduced. However, outbound links should only point to quality sites with related and relevant information. Links to ‘spammy’ or low-quality websites reduce traffic. You should limit the outbound links on your site if possible, and those outbound links you do have should point to authority sites in your field.
Internal Links: Inter-links, or internal links, are those that point the user elsewhere in the same site. They help the user navigate the site, they establish the internal architecture of the site, and they spread ranking value across the site’s pages. Internal links allow search engines to find all the pages on the site – and all of its content. Hidden or buried links adversely affect a website’s ranking.
Site Map: A site map tells search engines about your site and where to find all of your content. An XML site map is the format search engines prefer, providing an easy-to-read list of links that their agents can index; an HTML site map serves the same purpose for human visitors. A website should have both. A good site map lists every page on the site, giving search engine agents a pathway to follow.
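As an illustration (the URLs and dates are placeholders), an XML site map following the sitemaps.org protocol is just a list of page entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry for every page you want search engines to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/services/seo</loc>
    <lastmod>2018-05-15</lastmod>
  </url>
</urlset>
```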
Robots File: This is a text file that tells search engine agents, or robots, how to crawl through and index pages on a website. If it is improperly set up, you might inadvertently block search engines from seeing your site at all.
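Here is a minimal sketch of a robots.txt file (the blocked path and site map URL are placeholders); note how a single careless Disallow rule could hide the entire site:

```text
# Apply these rules to every search engine robot
User-agent: *

# Keep robots out of the admin area; everything else stays crawlable
# (a bare "Disallow: /" here would block the whole site)
Disallow: /admin/

# Point robots to the XML site map
Sitemap: https://www.example.com/sitemap.xml
```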
Off-page SEO is a process of building links on external sites back to a site’s web pages. Three link-building strategies are:
Varied Anchor Text: Anchor text is the actual text a linking page uses to talk about a site’s content. It is normally a text hyperlink. If the anchor text exactly repeats the site’s keywords in the same way over and over, a Google spam filter will find it. The anchor text has to be varied.
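For example (the URL and phrasing are invented), three different pages might all link to the same article with naturally varied anchors:

```html
<!-- Three inbound links to the same page, each using different anchor text -->
<a href="https://www.example.com/senior-fitness">exercise classes for senior women</a>
<a href="https://www.example.com/senior-fitness">this low-impact fitness program</a>
<a href="https://www.example.com/senior-fitness">a helpful guide to senior workouts</a>
```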
No Follow vs Do Follow: Do-follow links are just what the term says: links that can be counted and used to increase a site’s Google rank. However, ‘black hat’ optimizers have abused the feature by adding spurious links throughout the net. No-follow links are those with a nofollow attribute that effectively cuts the link from ranking calculations. Google suggests this attribute be used for paid links, in comments, in forums, and for “untrusted” content.
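The difference comes down to a single rel attribute, as in this sketch (the URL is a placeholder):

```html
<!-- A normal “do-follow” link: it passes ranking credit to the target -->
<a href="https://www.example.com/">a quality resource</a>

<!-- A no-follow link: the rel attribute tells Google not to count it -->
<a href="https://www.example.com/" rel="nofollow">sponsored link</a>
```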
Niche Link Building: Niche websites are those targeted at niche segments of larger markets. Niche Link Building is connecting a website to other websites that are relevant to the content and theme, that are reliable, and that are established.
Among the metrics which are used to gauge SEO success, two are considered here:
Domain Authority: Once upon a time in the SEO world we used a measurement called PageRank. This was Google’s way to measure a website’s ranking relevance via links. From Google:
“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.”
Though Google no longer displays PageRank, it still uses links as an indicator of a website’s trust. It is not the only measurement Google uses, but it is the most visible. The algorithm is mathematically complex, so knowing all the factors it takes to rank a web page is near impossible, but rest assured that building links and great content are two primary ones.
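For a flavor of the math, the simplified formula from the original PageRank paper (Google’s production algorithm has long since grown beyond it) is:

$$PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)$$

Here $T_1 \ldots T_n$ are the pages linking to page $A$, $C(T)$ is the number of outbound links on page $T$, and $d$ is a damping factor, typically around 0.85. In plain terms, a page’s rank is built from small shares of the rank of every page linking to it, which is why both the number and the quality of inbound links matter.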
Long Tailed Keywords: A long-tailed keyword is very specific to a particular internet search. It may have much lower search volumes, with less traffic generated, but there may be less competition for a page optimized for it – it is easier to rank. For example, a search for ‘exercise class’ will return many responses, while a search for ‘exercise class for senior women’ will result in fewer and more specific responses.
Each year, Google makes hundreds of changes to the search ranking algorithm. In 2018, they reported an incredible 3,234 updates — an average of almost 9 per day, and more than 8 times the number of updates made in 2009. While most of these changes are minor, Google occasionally rolls out a major algorithmic update (such as Panda and Penguin) that affects search results in significant ways. This is just a small list of major updates.
Not all websites see impacts from every release, but anyone involved in SEO should approach algorithm updates with a degree of caution and seriousness.
The Caffeine update was a complete rebuilding of the search index; Google essentially retooled the way it built and maintained the index. The main goals of Caffeine were to increase the speed at which Google returned search results, to increase the size of the index so Google could keep track of more sites, and to create a “smarter” algorithm for returning better results.
The intent of the Panda update was to lower the rank of low-quality sites, to rank quality sites higher, and specifically to down-rank sites that provided a poor user experience. Testers rated thousands of websites on their quality across a number of factors, and the results were fed into an artificial intelligence engine so that it could ‘learn’ quality. One of the initial issues was that plagiarists sometimes ranked better than the content originators; this was addressed, however. The major changes in Panda were that 1) an entire site, rather than specific pages, could be affected in the rankings, and 2) an over-optimization penalty was enacted.
The Penguin release was targeted at reducing the ranking of websites that violate Google’s Webmaster Guidelines. Penguin affected 3.1% of English searches. Google provided a feedback form for those who wished to report a ‘spammy’ site that was still highly ranked, and for those who believed their site had been unfairly penalized.
Hummingbird is the newest search algorithm from Google. Unlike Panda and Penguin, which were modifications to the existing search algorithm, Hummingbird is new. It is designed for greater precision and attempts to interpret the user’s intent rather than individual search terms. This ‘semantic’ search approach means that SEO must be even more aware of the user’s intent. Many experts believe Hummingbird should have little impact on ‘white hats’ while improving the search engine user’s experience. Hummingbird is still new, however, having been used only since August 2013.
Pigeon is a Google search engine update that affects local search results. The algorithm change focuses on providing more accurate, relevant results for local searches. Initially rolled out in late July in the US, the effects of Pigeon have recently been noticed in other countries’ search results. So what is Pigeon, and how does it affect search results? What we are able to surmise is that, like Hummingbird, Pigeon is a core change in how the Google algorithms present local search results. While there do not appear to be penalties associated with the update, some local results may have shifted. Any site that targets a local market, big or small, should take note, however, as search visibility may be affected; there have been cases where a business was dropped from the results.
Google is rolling out a mobile-first index quaintly referred to as Mobilegeddon 2017 (following the Mobilegeddons of 2015 and 2016). The name “Mobilegeddon” refers to Google’s 2015 algorithm change, in which Google began giving preferential treatment in search results to mobile-friendly sites. As was the case with the first Mobilegeddon, your site’s effectiveness and search results will be affected unless you are prepared.
RankBrain is a component of Google’s core algorithm that uses machine learning (the ability of machines to teach themselves from data inputs) to determine the most relevant results to search engine queries. Pre-RankBrain, Google utilized its basic algorithm to determine which results to show for a given query. Post-RankBrain, it is believed that the query now goes through an interpretation model that can apply possible factors like the location of the searcher, personalization, and the words of the query to determine the searcher’s true intent. By discerning this true intent, Google can deliver more relevant results.
“Possum” is the name given to an unconfirmed but documented update that appeared to most significantly impact Google’s local pack and local finder results. Because the update was never officially confirmed by Google, local SEOs have been left to hypothesize about the potential update’s purpose and concrete effects.
What is Google Fred? Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization: an overload of ads, low-value content, and little added user benefit. This does not mean all sites hit by the Google Fred update are dummy sites created for ad revenue, but (as Barry Schwartz noted in his observations of Google Fred) the majority of websites affected were content sites carrying a large number of ads that seem to have been created for the purpose of generating revenue rather than solving a user’s problem.
There are several triggers that search engines use to evaluate and determine the ranking of a website. Taking a look at these factors will tell our experts exactly what search engines are seeing and what needs to be done to improve your current rankings in the SERPs (search engine results pages).
Link building is a fundamental part of search engine optimization. Having inbound links to your website tells search engines that your content is relevant and liked, much like an endorsement. Understanding the type and quality of those links can help your rankings.
The content of your website should be engaging enough to get visitors interested in sticking around and taking a look at the rest of your site. Your main content pages should be written in a way that encourages first-time visitors to convert or share.