SEO Best Practice Guidelines

External SEO

External SEO is concerned primarily with how your website is linked to and referred to by other sites.

Backlink Profile

Your backlink profile plays a huge role in helping the search engine to understand the quality and trustworthiness of your site.

If a link is from a high quality website, it's likely to be seen as a vote of trust; likewise, if the link or the linking website is of low quality, it can have a negative effect. With that in mind, it's important to make sure, wherever possible, that your links are of as high a quality as possible.

Links from external websites always play an important role in determining how valuable one website is in comparison to another. Where possible (for example, after a site migration), it's recommended that any external links are updated to reference the new destination URLs directly. This saves the search engine from having to follow a redirect before reaching its final destination, and so mitigates the negative effects of additional load time, algorithmic damping, etc.

NAP (aka Name, Address and Phone Number)

Having a consistent NAP across all of your online profiles and, if appropriate, directory listings, can act as an indicator of high quality to Google and other search engines.

When creating any online profile for your website, make sure you reference the same name, address and telephone number that are listed on your website and your other profiles.

Read more information here

Onpage SEO

Onpage recommendations focus on areas which can often be altered or amended using functions of the CMS. It's recommended that research be carried out as part of an SEO strategy prior to any significant changes.

Meta Data

Often (though not always) the title tag and meta description are used as the link and excerpt within the snippet in Google's search results. They are also frequently used by social networks to automatically generate a snippet when a user shares your web page. It's important to make sure that your title tags and meta descriptions are optimised not only for search engines but for your users as well.

Page Title / Title Tag

When determining the relevancy of a page to a user's search query, Google and other search engines will take your site's title tags into account. While title tags are only one of many ranking factors, they are weighted heavily within Google's algorithm and are vital for building the page's authority. It's therefore important to make sure they are optimised to incorporate the terms your website is targeting.

The page title shouldn't exceed 512 pixels' worth of space; there are handy online tools that calculate this for you - see the Useful SEO Resources section at the end.

This is an example of a well formed page title:

Example Page Title | Page Category | Example Site Name
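
In the HTML source, that title sits within the page's head; a minimal sketch using the placeholder title above:

  <head>
    <!-- placeholder title from the example above -->
    <title>Example Page Title | Page Category | Example Site Name</title>
  </head>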

Meta Description

Meta descriptions are no longer one of Google's positive ranking factors; however, as mentioned previously, they are often used within the search result snippet to give searchers a brief description of the page. Typically, these shouldn't exceed 155 characters.

It can be an indicator of poor quality if every page on the website uses the same meta description, so we advise that you use a custom one for each page or none at all. Google will automatically select relevant text to display if a meta description isn't present.
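
As a sketch, a meta description is declared in the page's head like this (the content text here is purely illustrative):

  <!-- illustrative content only; write a unique description per page -->
  <meta name="description" content="A short, unique summary of the page in 155 characters or fewer.">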

Meta Keywords

Google has stated that the meta keywords tag no longer provides any value within organic listings for websites of this nature. That said, Bing has been known to use the meta keywords tag as a negative ranking factor: if a large number of keywords are stuffed into the tag, the page can be interpreted as spam.

Page Content

The content of the page is one of the biggest factors within Google's (and many other search engines') ranking algorithms. The search engine will use it when determining how relevant a page is to a user's search query. You should aim to include the core keyword for the page within its content, but be careful not to overuse it.

There is no scientific rule of thumb as to how many times a keyword should be used within a page. Referring back to Google's basic principles you should "Make pages primarily for users, not for search engines". With this in mind it's usually worth asking yourself whether the content would make sense if read aloud.

Generally you'll find that different sections of your website naturally target different keywords. Keep this in mind and make sure that what you're targeting is relevant to the content of the page.

Duplicate Content

There are instances when more than one URL can be used to access the same page; for example, many sites can be accessed with or without the 'www' in their URL (www.example.com vs example.com). This can be perceived as duplicate content, so it's important to make sure that pages can only be accessed through one URL. To do this, a redirect should be used to send both the search engine and the user to the canonical (correct) version of the page.
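
As an illustration, on an Apache server with mod_rewrite enabled, a non-www to www redirect could be sketched like this (example.com is a placeholder domain):

  # Apache mod_rewrite sketch; example.com is a placeholder domain
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The [R=301,L] flag issues the permanent redirect discussed under Technical SEO below.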

Technical SEO

Technical SEO refers to changes which need to be made by the developer or by more advanced functions of the CMS.

Robots.txt

The robots.txt file contains a series of directives that you wish search engine robots/spiders to adhere to. It can be used to tell the search engine to ignore pages or directories. An incorrectly configured robots.txt file can stop whole sections of a website - or, in severe cases, the whole website - from being returned within the search results.
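
A minimal robots.txt sketch; the /admin/ directory is a hypothetical example of a section you might want crawlers to ignore:

  # applies to all crawlers; /admin/ is a hypothetical directory
  User-agent: *
  Disallow: /admin/
  Sitemap: http://www.example.com/sitemap.xml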

See more about the robots.txt file here

Sitemap.xml

A sitemap.xml file provides the search engine with a defined list of pages currently being hosted on your website. There are also other parameters which can be set that indicate to the search engine how you would like it to be crawled. If marked up correctly, the XML sitemap can also help us to highlight to search engines which pages of the website are of the highest importance.
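
A single-entry sketch of a sitemap.xml, with placeholder values throughout; the optional changefreq and priority elements are the kind of parameters referred to above:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- placeholder entry; changefreq and priority are optional hints -->
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2014-09-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
  </urlset>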

Search engines will typically look for an XML Sitemap before crawling a website. In many cases this is handled automatically by the site's content management system. It should also be submitted to Google and Bing via their respective online Webmaster Tools.

Redirects & Header Responses

Whenever a user visits your website, the server hosting it sends a header response code to the user's browser. This informs the browser how to interact with the webpage, and it can also influence how the search engine interprets the page.

These will often go unnoticed by the general user.
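
If you want to inspect these yourself, one common approach is to request just the headers with curl; the URL is a placeholder and the output shown is illustrative:

  # -I asks for the response headers only; the URL is a placeholder
  curl -I http://www.example.com/

  HTTP/1.1 200 OK
  Content-Type: text/html; charset=UTF-8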

Status: 200 OK

The "200 OK" header response is the most common of what we're covering. This informs the browser that all is working as it should be. However this should only be returned when the webpage is present and rendering.

Redirect: 301 Permanent

"301 Permanent" redirects indicate to search engine spiders that a page has permanently been moved to a new location. This means that the destination of the redirect will replace the source in search results, and that as much existing value as possible (PageRank, authority, backlinks, etc.) will be attributed to the destination.

When retiring old content, we can redirect users to new but relevant content. This should be done using a "301 Permanent" redirect. The redirect should be seamless for the user; however, be careful not to redirect them to an irrelevant page, as it may cause confusion.
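
Assuming Apache with mod_alias, a single permanent redirect could be declared as in this sketch (both paths are placeholders):

  # Apache mod_alias sketch; both paths are placeholders
  Redirect 301 /old-page/ http://www.example.com/new-page/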

Redirect: 302 Temporary

As with a "301 Redirect" the "302 Temporary" redirect will send the user to a new relevant page if the old page no longer exists. However, it's very important to note that none of the old page's authority and relevancy will be transferred. Therefore this type of redirect should only be used in special circumstances, i.e. when the redirect is actually temporary.

Canonical Link Element

The canonical link element can be used to highlight to the search engine which page should be seen as the dominant version. This is useful when the same content is accessible via multiple URLs.
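
The element is placed in the page's head; a sketch with a placeholder URL:

  <!-- placeholder URL; should point at the dominant version of the page -->
  <link rel="canonical" href="http://www.example.com/example-page/">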

Error Handling

When a user requests a page that is not present on the website (perhaps because someone linked to it using an incorrect URL, or simply misspelled it), they should be presented with an error page and the server should return a 404 header response. The 404 header response instructs the search engine that the page is not valid and should not be indexed (or, if it has already been indexed, that it should be removed). It also helps to ensure that these errors can be located easily using tools such as Google/Bing Webmaster Tools.

If a page cannot be found but still returns a "200 OK" header response, it can be deemed a "Soft 404" (an error page which is functioning incorrectly) - this is an issue which needs to be avoided.
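
On Apache, for instance, a custom error page that still returns a genuine 404 response can be configured as in this sketch (/404.html is a placeholder path):

  # Apache sketch; /404.html is a placeholder for your error page
  ErrorDocument 404 /404.html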

The Basics of SEO (Updated Sept 2014)

Below you’ll find a brief overview of how a search engine works, along with some best practice guidelines on how to update your website.

How a Search Engine Works

A search engine discovers web pages primarily by following links, which can be present on other websites, or through manual submission.

When it discovers a new page, it keeps track of it in its index and processes it through its algorithm(s). It's the search engine's algorithm(s) that enable it to determine how relevant a page is to a searcher's query.

The algorithms take into account a huge range of factors; these include what we're broadly defining as:

  • Onpage SEO factors
  • Technical SEO factors
  • External SEO factors

Google's Algorithm Updates

Every so often the search engine will update its algorithm; this can cause the rankings of your website to fluctuate.

You can read more about how Google works here.

You may have heard of Google's Panda and Penguin algorithms; these are code names referring to different sets of significant upgrades to, or sections of, the algorithm.

Spam & Google's Quality Guidelines

The main goal of the search engine is to present the most relevant results it can for the user's search query. This means that if, according to its algorithm, your page is deemed more relevant to a given term than a competing website's, it will appear higher in the search results.

Google wants to cut down on the amount of 'spam' within the search results - sometimes referred to as low quality websites or low quality content. The core of SEO is to improve the quality of your content through various means and to make sure that your website is not seen as spam, but instead as high quality and useful for the end user.

Google provides a series of quality guidelines; adhering to them can help your site become not only more SEO friendly but more user friendly too.

These are Google's core principles, and they should influence any SEO decisions made for the site:

  • Make pages primarily for users, not for search engines.
  • Don't deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Source: https://support.google.com/webmasters/answer/35769?hl=en#3

SEO Tools

These are the tools we commonly use to monitor your website and make sure it's well optimised. Access to all of these tools can be shared.

Webmaster Tools

Google and Bing both offer Webmaster Tools; these online tools provide us with the ability to monitor how the search engine sees the site. They will notify us of any major changes to the website that the search engine is seeing - this can include server errors, a significant increase in "404 Not Found" header responses, or a site being hacked, for instance.

Google Analytics

There are many applications that allow you to monitor your website's traffic; however, Google Analytics is both comprehensive and free. Even if it isn't going to be used extensively, it's worthwhile installing, as it'll allow us to see any issues that may occur with the website. It also allows us to monitor how your users are using the site and to see which areas are performing well.

Useful SEO Resources
