Ultimate On-Page SEO Guide

Over the last 9 years, this SEO guide has gone through many variations. We first built it for internal training purposes, then we decided to share it with clients to help them better understand what we were doing. After years of updating and tweaking it, we’ve decided to just share it with everyone. Enjoy!

When looking to build a website, determining which CMS (content management system) you are going to use is a critical first step. Sure, you could have someone code your site from scratch, but why waste the effort when you don’t have to (not to mention that’s probably a terrible idea, for lots of reasons)?

There are literally hundreds of CMS options, and it can be hard to determine which one is the best. Unfortunately, not all content management systems are created equal, and some can actually do more harm than good (as an SEO rule of thumb, avoid anything .NET/ASP, ColdFusion, and anything made by Adobe).

As an SEO company, one of the problems we see most often is clients who’ve built their websites on a CMS that does not support SEO best practices, thus making it difficult (if not impossible) to rank for their target keywords. Without further ado, here is a list of features you should insist on in an SEO friendly CMS:

11 Must Have Features

  1. Static, customizable URLs – You should be able to define both the page name and the directory structure for the entire website, page-by-page (not database driven, unless it can be manually overridden if necessary). Keyword rich and search engine friendly URLs are an SEO must have. If this feature is missing it should be a deal killer.
  2. Support for Custom URL Redirects (301s, 302s, etc.) – At some point you will change a page name, move it, or change its position in your site hierarchy. In order to keep the trust and value of any inbound links to that page, and to avoid creating a poor user experience, you MUST properly redirect it when you move it using a 301 permanent redirect. Your CMS should allow you to do this one page at a time if needed, or for blocks of pages. You should also be able to directly edit your .htaccess file, if needed.
  3. Customizable Title tags and Meta tags on Every Page – The Title tag and Meta description tags are important SEO elements, and should be unique for each page. They should always be carefully written and never duplicated from page to page. Meta keywords tags have no value, and should be left blank. These tags should not be database driven, but written manually and thoughtfully for each page.
  4. Custom Image File names and Image ALT Tags for Each Image – Search engines look at image file names and ALT tags for keyword usage to help define the topic of a page and the site as a whole. You need to be able to choose the image file name, and you should have the ability to define an ALT tag and if needed an image Title tag.
  5. Support for Rel=Canonical Tags and Meta Robots Tags on Each Page – These elements help to prevent duplicate content penalties, especially with eCommerce websites. You should be able to add these tags on a page-by-page basis as needed.
  6. The Ability to Directly Edit the HTML on ANY Page – This is important for customizing content, link anchor text, NoFollow tags, heading tags and other HTML elements.
  7. Automatically Generated and Updated Sitemap.xml File – Search engines use this file to find and index all of the pages on your site. It can be a real pain to maintain manually, so the automatic feature is extremely handy. If you have videos on your site, you should also have a separate video sitemap.
  8. Support for Blog Integration – Every website needs a blog, since that is the best way to ensure ongoing content growth (which search engines love). Your CMS should support having a blog in a sub-folder.
  9. Clean, Lean Coding – Your CMS should follow HTML best practices, producing clean, current code that validates with the W3C and loads quickly. Valid, fast loading code is appealing to both visitors and search engines. Avoid code filled with useless or excessive JavaScript, unnecessary non-breaking spaces, and antiquated table-based layouts instead of clean CSS. If possible, find a CMS that supports HTML5/CSS3. For images, use compression, and for layout elements consider using CSS sprites. Make sure to set far-future cache expiry headers so returning visitors can load pages from their browser cache.
  10. Rich Snippet Support – Schema.org and RDFa markup, for example; structured data markup for reviews, products, apps, locations and services is also a good idea if relevant.
  11. Mobile Support – This could be responsive design, where the site auto-adjusts to any screen size or device type, or it could be a dedicated mobile version of a site. Whatever option you go with, at this point in time not having a mobile friendly version of your site is a major negative factor. At this point, support for AMP (accelerated mobile pages) would also be advisable.
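To make several of these head-section features concrete, here is a sketch of the tags discussed above (the URLs and content values are placeholders, not recommendations for any specific page):

```html
<head>
  <title>Keyword Phrase: Call to Action Using Keyword Phrase</title>
  <meta name="description" content="Keyword Phrase used in a good click-inducing call to action." />
  <!-- Point search engines at the preferred version of this page -->
  <link rel="canonical" href="http://www.domain.com/page-name/" />
  <!-- Per-page indexing control, only where needed -->
  <meta name="robots" content="noindex, follow" />
  <!-- Mobile support: let the page scale to the device -->
  <meta name="viewport" content="width=device-width, initial-scale=1" />
</head>
```

A good CMS lets you set every one of these values page-by-page, without touching a template file.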

5 Potentially Nice to Have Features

  1. Social Sharing – If you want people to share what they read on Facebook or Twitter or any other social site, you need the ability to integrate buttons to make that sharing easy. You might also want things like open graph tags, Twitter cards, Facebook insights, etc.
  2. Social Comments – If you want to expand the reach of your content, using something like Facebook Comments or Disqus to power your commenting system can be super useful. That said, Disqus in particular creates a crap ton of links on the pages they are on, all pointing off your domain. They are NoFollow links, but that doesn’t mean much (NoFollow links still burn off some link juice on the page). Weigh the pros and cons of social commenting carefully.
  3. Custom 404 Page – Hopefully your site will never have any 404 errors, but if it does it’s nice to have something show up other than “404 error – page not found”. Create a custom 404 page that lists your most popular pages and perhaps a search box. Keep them on the site! Consider making your 404 page funny, like these: http://fab404.com/
  4. Automatic Link Management – Whenever you change the name of a page, any links to that page break. While implementing a 301 redirect for the changed page will fix this, you would ideally want to go through the site and change all links to reflect the new page name. Doing this manually sucks, so find a CMS that supports automatic link updating if possible.
  5. User Generated Content – Make it possible for visitors/customers to rate products or services, leave reviews, etc. For some industries this is a must have, not a nice to have.
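On an Apache server, pointing visitors at a custom 404 page is a one-line .htaccess directive (the path here is a placeholder for wherever your custom page actually lives):

```apache
ErrorDocument 404 /custom-404.html
```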

WordPress (.org) is, by far, the most SEO friendly CMS available (with some minor modifications), and it’s a good fit for the vast majority of website needs. If you need eCommerce capabilities, WordPress offers those too. From Cart66 and WooCommerce to custom Magento installations, there are plenty of good options on this front.

This isn’t a comprehensive list of features by any means, but any CMS with these elements will be about as SEO friendly as it can get. Beyond this list, look for additional features as dictated by your company needs.

Believe it or not, server speed, server up-time, server location, IP address and type of hosting are all elements that can impact your rankings, some directly and some indirectly. Since Google wants its users to have an awesome experience on the sites that Google delivers in search results, Google tries to analyze sites from a UX (user experience) perspective.

A site that loads slowly, that is down often, or that shares IP addresses and/or hosting servers with less than savory sites (shared hosting) can negatively impact that user experience. Beyond that, being on a shared server can pose major security risks for your website, depending on how well your hosting provider has things configured.

To avoid any such problems, it is recommended that you host your website on a dedicated server that has the capacity to handle your site features and traffic (with capacity to spare) to avoid any downtime. Having secondary and tertiary servers as redundancies is also highly recommended (cloud platforms can help with this). Though certainly not required, you may also find some benefit in using a server located in the country that matches your TLD (e.g. USA for .com and .net, UK for .co.uk, etc.).

It is also highly recommended that you have a dedicated Class C IP address for your website, and that you don’t share that IP address with ANY other websites. This avoids any potential cross-linking devaluation issues, as well as preventing bad neighborhood issues. A bad neighborhood occurs when a number of sites on a shared IP address are identified by Google as spammy, thus hindering the rank-ability of any other site on the same IP address.

Make sure to schedule regular backups of your website and all databases at non-peak traffic times (2am is usually a great time). Also, make sure to regularly test the page load speed on your site. If you’re running WordPress, the best option available is probably WP Engine.

On April 9th, 2010 (over 7 years ago, come on already!), page load speed officially became a part of the Google search ranking algorithm. In addition, Google has made multiple updates in the last few years regarding mobile usability and mobile page load speed. For usability reasons, best practices dictate that a web page should load within 1-2 seconds on a typical connection. However, according to Google a load time of 1.4 seconds seems to be the threshold between a fast page and a slow page. That means, ideally, that every page on your website should load in 1.4 seconds or less (to the Load Event; when the page is technically complete), to receive the maximum SEO and usability benefit for fast loading pages.

Google gathers page load time data through actual user experience data collected with the Google search toolbar, Google Chrome, Google Analytics, and may also be combining that with data collected as Google crawls a website and from other sources as well. As such, page load speed in terms of the ranking algorithm is likely being measured using the total load time for a page, exactly as a user would experience it, and not just user perceived load time (visual load time). Though Google has claimed that only TTFB (time to first byte) is factored, we don’t buy that, because it isn’t a UX focused metric.

One of the best resources for tips and tricks for reducing page load time is http://developer.yahoo.com/performance/rules.html, and one of the best tools for testing your site is http://tools.pingdom.com/fpt/ (unless you have Google Analytics configured, in which case that is the best possible source of page load speed data).

There are a few key things you can do to reduce your page load time to reach Google’s recommended threshold of 1.4 seconds or less. The three most impactful would be leveraging browser caching, using CSS sprites for images where possible, and reducing file sizes as much as possible for images that can’t be sprited (different file types, removing unnecessary color channels, etc.). You might also see benefits from using a content delivery network (CDN) for your images.
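As a sketch of leveraging browser caching on an Apache server, a mod_expires block like the following tells returning visitors’ browsers to reuse cached copies instead of re-downloading (the exact lifetimes are illustrative; tune them to how often your assets actually change):

```apache
<IfModule mod_expires.c>
ExpiresActive On
# Images rarely change once published
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
# CSS and JavaScript change more often
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```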

We would also recommend reducing the total number of CSS and JavaScript files by combining them into fewer files, and minimizing the file sizes by using compression and code minification where feasible. Tools like WPMU Hummingbird can help a ton with this.

W3 Total Cache is another excellent WordPress plug-in that can help with page load speed issues, and a simple CDN can be set-up via Amazon AWS for very little money. You can learn how to do this here.

Mobile usability signals fall roughly into two buckets, UI (User Interface) and UX (User Experience). The first, UI, means having a mobile friendly site (preferably a responsive site), with appropriately sized tap targets, a well thought out mobile navigation, auto-scaling images, and an easily readable font, font size, and color palette. Google has a great tool here to help evaluate this.

UX means having a fast loading mobile site, not showing full page mobile pop-ups, minimizing ads (especially ads with video or audio), and anything else that could cause a user to feel like they’re getting a bad experience on your site. Google has another great tool here to measure some aspects of this.

This is worth calling out again: DO NOT show shitty pop-ups on mobile devices – Google is demoting sites for this.

Mobile page speed is of particular note, because it’s the single biggest factor in whether someone stays on your site or leaves when visiting from a mobile device. One relatively easy way to boost your mobile speed is to implement AMP pages, which Google is showing a lot of favoritism towards lately.

The W3C markup validator service can be found at this location http://validator.w3.org/

Because there are so many programming languages, and so many ways to accomplish any one thing in each language, search engines rely on a set of rules when reading the content of a website. Code that adheres to these rules helps to minimize errors when a search engine parses the code and separates it from the content of a page.

Search engines such as Google have openly stated that W3C standards are what they suggest for making your code easy for them to understand. We typically only test the home page of the website, because many issues can be easily fixed across the entire website using just its page templates.

At the VERY least, make sure you heavily test your site for cross-browser compatibility. Make sure you support the browsers and versions that reflect ~95% of your target demographic.

Picture your website as a series of pyramids, 5-10 sitting side by side. It should look something like this:

[Image: six pyramids sitting side by side]

The tip of each pyramid represents a top level page, such as Home Page, About Us, Products, Services, etc. Each of these top level pages should be optimized for a root keyword with significant search volume. The middle and base of the pyramids represent sub-pages, pages under the top-level pages that are topically related. These should be optimized for variations of the top-level or root keyword.

For a website about, say, Bicycle Repair, you might optimize the pages as follows:

  • Home Page – “Bicycle Repair City”
  • About Us – “Bicycle Repair Shop in City”
  • Services – “City Bicycle Repair Services”

The root keyword, the main topic of the entire site, would be Bicycle Repair [City]. As such, the home page is optimized for Bicycle Repair, and all other main pages for variations of that keyword.

Then, you go one more level down. Since Services is optimized for Bicycle Repair Services, pages under that page should be optimized for things like Cheap Bicycle Repair Services, San Francisco Bicycle Repair Services, Bicycle Repair Services in Utah, etc. The higher up a page is within the site hierarchy, the more search volume the keyword assigned to that page should have. The lower the page is in the hierarchy, the lower the search volume of the keyword phrase.

This is called creating keyword silos, and it is a critical component of SEO and user friendly site design. Not only does this make your site topically clear to search engines, but it makes site navigation super simple for users as well. For optimal search engine and user usability, don’t go more than 3-4 levels deep (i.e. Top Level Page, Sub-Page, Sub-Sub-Page). A user or search engine should be able to get to any page on your site in 4 clicks or less. This is called a flat architecture.

Your site’s navigational elements will follow this exact same structure. Speaking of navigational elements, make sure that all site navigation consists of HTML and CSS only. Don’t ever use JavaScript or Flash for your navigation, as that can cause both search engine indexing problems and user experience problems. The details are complicated, so just don’t.
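A crawlable navigation is just a styled HTML list. A minimal sketch, using placeholder domain and page names:

```html
<nav>
  <ul class="main-nav">
    <li><a href="http://www.domain.com/">Home</a></li>
    <li><a href="http://www.domain.com/bicycle-repair-services/">Bicycle Repair Services</a></li>
    <li><a href="http://www.domain.com/about-us/">About Us</a></li>
  </ul>
</nav>
```

All the styling, dropdowns included, belongs in CSS; every link stays plain HTML that any crawler can follow.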

Never underestimate the value of long-tail traffic. More and more queries each year are long-tail, and by anticipating questions people might type into search engines and creating matching content, you can get far ahead of your competitors. You can also mine data from your analytics organic keyword data, your Google and Bing Webmaster tools search query data, and other sources like Google Trends, Google’s Keyword Tool, SpyFu, Quora, and Yahoo Answers.

A site’s URL structure is extremely important to both users and search engines. Poor URL structure can not only hurt rankings, but can prevent pages from being indexed and lower the click-through-rate (CTR) in the SERPs (Search Engine Results Pages). Though they have dialed back some on that value, Google is still biased towards exact keyword match non-hyphenated domains, so get one if you can.

It is extremely important that URLs be readable, user friendly, and that they contain the keyword of the page. A URL should never be longer than 256 characters (though ideally under 100 characters), and should contain no query parameters, strange number sequences, spaces or symbols. If a number sequence is needed for eCommerce reasons (like a product ID), append the number to the end of the search engine friendly URL, NOT the beginning (e.g. https://www.awesomesite.com/killer-keyword-url-1234/).

If relevant, a geo-qualifier (such as Seattle WA) should also be included in the URL.

A proper URL will consist of lowercase words separated by dashes/hyphens only (no underscores, since Google combines words separated by underscores, and no uppercase letters).
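As a sketch, these formatting rules are easy to enforce in code when generating page slugs. This is an illustrative helper, not part of any particular CMS:

```python
import re

def slugify(page_name):
    """Build a search engine friendly slug: lowercase words
    separated by hyphens, with no underscores, spaces or symbols."""
    slug = page_name.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse anything else into a hyphen
    return slug.strip("-")
```

For example, slugify("Seattle Bicycle Repair!") returns "seattle-bicycle-repair".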

The URL structure should never go more than 3 sub-directories deep, just as the site navigation should never go more than 3 directories deep. The more important the page, the higher up in the directory structure that page should be. The URL structure should function as a bread crumb, telling visitors exactly where they are within the site.

Your site should, at this point, be HTTPS. Google will be throwing a warning in Chrome for HTTP sites, and there are plenty of good reasons at this point to be using HTTPS, so put that in place.

At this point, DO NOT use file-type endings (.html, .php, .asp, etc.) These are unnecessary, add length to your URLs, and show that your site is dated.

An ideal URL structure looks something like this (a hypothetical example):

https://www.awesomesite.com/bicycle-repair/seattle-bicycle-repair/

This URL structure works well for both SEO and search engine indexing. It is short, descriptive, keyword rich, and contains no query parameters. The use of a category can help with usability, but it is not always necessary for SEO purposes. In fact, it can be beneficial to have all pages in the root directory for a smaller site.

For your blog, DO NOT use date-based URL structures (/2017/08/), as they tell both Google and users how old your content is, which can have a negative impact on your rankings for a variety of reasons.

It is very important to cross link within the pages and content of one’s site using keyword rich anchor text and/or from within keyword rich content that matches the linked-to page. Pages of similar topic should cross link to each other using relevant keywords in the anchor text or surrounding body text. There should ideally be 5-10 top-level pages on your site that effectively cross-link to each other and that have more inbound links than any other pages on your site. These pages will become your sitelinks in Google’s search results.

When linking internally, keep in mind that all internal links should use absolute URLs (e.g. http://www.domain.com/page-name/), not relative URLs (e.g. /page-name/).

In addition to linking from within the text of a page, keyword rich anchor text should be used in the main navigation elements. Where space prevents the use of the keyword for the page being linked to in the navigation, it is important to include the title element in the navigation anchor tag, as follows:

<a href="http://www.domain.com/" title="Home of the awesome keyword">Home</a>

The same goes for links outside of your site. When you get a link on a blog, forum or press release, some of those links should include the keyword of the page being linked to in the anchor text. At present, the ratio of anchor text from inbound links that is considered safe is 15-30% (though this number isn’t exact, and can vary wildly from vertical to vertical, so be careful).

Also, the keyword used in the anchor text going to a page should be topically consistent. One should not use “Keyword Topic A” in the anchor text for one link, and then “Keyword Topic B” in the anchor text of another link to the same page (unless Keyword Topic B is a very close variation of Keyword Topic A). A page should have one core topic, so be consistent.

While Google is more than capable of crawling and executing basic JavaScript, they don’t always get things right. To be on the safe side, all links that you want crawled should be in plain old HTML (this includes navigation elements). If you’re considering using something like AJAX on your site, be aware of the risks and how to minimize them. Using pushState tends to be the best option, but pre-rendering is another (though no longer Google’s preferred one).

Because Google and other search engines crawl the web link-to-link, broken links can cause SEO problems for a website. When Google is crawling a site and hits a broken link, the crawler immediately leaves the site. If Google encounters too many broken links on a site it may deem that site a poor user experience, which can cause a reduced crawl rate/depth, and both indexing and ranking problems.

Unfortunately, broken links can also happen due to someone outside of your site linking in incorrectly. While these types of broken links can’t be avoided, they can be easily fixed with a 301 redirect.

To avoid both user and search engine problems, you should routinely check Google Webmaster Tools and Bing Webmaster Tools for crawl errors, and run a tool like XENU Link Sleuth or Screaming Frog on your site to make sure there are no crawlable broken links.

If broken links are found, you need to implement a 301 redirect per the guidelines in the URL Redirect section. You can also use your Google or Bing Webmaster Tools account to check for broken links that have been found on your site.

In addition to broken links (404s), you also need to watch out for a variety of other error types. 403s are fairly common, and mean Google is hitting a page you don’t allow them to hit…in some cases, this is fine, in others, it may not be. Monitor Google Search Console regularly to make sure you know what’s going on.

Last but not least, server errors (5XX errors) can be a big deal, because they tell Google your server is overloaded, which Google interprets as “whoops, we need to crawl this site way less”. You don’t want Google reducing your crawl budget, so make sure you aren’t throwing server errors.
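As an illustrative sketch (not a replacement for Search Console or Screaming Frog), the status-code buckets above can be checked with a few lines of Python. The fetching part requires network access; the classification logic is pure:

```python
import urllib.request
import urllib.error

def classify_status(code):
    """Bucket an HTTP status code the way this guide treats them."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 307, 308):
        return "redirect"
    if code == 403:
        return "forbidden"      # sometimes fine, sometimes not
    if code == 404:
        return "broken"         # fix with a 301 redirect
    if 500 <= code < 600:
        return "server error"   # can reduce your crawl budget
    return "other"

def check_url(url):
    """Fetch a URL and return its status bucket (network required)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as err:
        return classify_status(err.code)
```

Run check_url over every internal link on your site, and anything that comes back "broken" or "server error" needs attention.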

Unless a redirect is truly temporary, such as for a time sensitive promotion, 302 redirects should never be used. 302 redirects don’t pass any link value, and are essentially a dead end for SEO. In almost every scenario where a redirect is needed, a 301 (permanent) redirect should be used.

Any page that changes URLs or is deleted needs a 301 redirect to tell search engines and users that the page has moved/is gone. There should never be more than one URL path to a page. You can learn more about redirects here: http://moz.com/learn/seo/redirection

On an Apache server, redirects will be configured via the mod_rewrite module and your .htaccess file. Making these sorts of changes is a very technical task, and can break your website if done incorrectly. Unless you’re very technical, it’s best to leave this one to your web developer.

That said, here are a couple of common issues, and the correct code to use in your .htaccess file to fix them:

## Always include this at the start of your htaccess file ##
Options +FollowSymlinks
RewriteEngine On


## Redirect HTTPS URLs to HTTP URLs ##
RewriteCond %{HTTPS} on
RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

## Redirect HTTP URLs to HTTPS URLs ##
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

## Redirect Non-WWW to WWW ##
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

## Rewrite all MiXed Case URLs to lower case and 301 redirect ##
## Note: RewriteMap only works in the main server config, not in .htaccess ##
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]

## Redirect index.htm, index.html or index.php to the trailing slash recursively for the entire site ##
RewriteCond %{THE_REQUEST} /index\.html? [NC]
RewriteRule ^(.*/)?index\.html?$ /$1 [R=301,L]
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]

## Ensure all URLs have a trailing slash ##
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1/ [L,R=301]

## Remove Spammy Query Strings ##
<IfModule mod_rewrite.c>
RewriteCond %{QUERY_STRING} enter|query|strings|here [NC]
RewriteRule (.*) http://www.%{HTTP_HOST}/$1? [R=301,L]
</IfModule>

## End of htaccess file ##

Obviously, if any of these don’t apply to your site, don’t use them. Test these carefully before you roll them live on a site! (I take no responsibility for broken sites, you’ve been warned).

If your site has HTTPS pages for a good reason, you definitely don’t want to go redirecting them to HTTP pages! For the spammy query strings (say, someone pointed a ?pill query string at your site), just replace the fake text with the real query strings, separated by pipes if there are multiple query strings.

Moz has a fantastic tutorial here. It’s thorough enough that we see no need to craft something similar here! Suffice it to say that you should use your robots.txt file to block pages that you don’t want indexed. You can see our current robots.txt file here, for reference.
In a nutshell, make sure your sitemap.xml file lives directly in your root directory (www.domain.com/sitemap.xml), and that it contains a properly formatted list of every page on your site that you want Google and other search engines to list in their indices. If a page wouldn’t make a good search engine entry point, it’s probably best not to list it. You can learn the nuts and bolts of creating a sitemap here, and specialized sitemaps here.
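A minimal sitemap.xml looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2017-08-01</lastmod>
  </url>
  <url>
    <loc>http://www.domain.com/page-name/</loc>
  </url>
</urlset>
```

A CMS that generates and updates this automatically (see the must-have features list above) will save you a lot of manual maintenance.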
After the URL, the second most important place on a page to have the keyword is the Title tag. A proper Title tag will:

  • Aim for 55-65 characters in length (never more than 70 characters, and never shorter than 15 characters).*
  • Be unique to and descriptive of that page (never use identical or mostly identical title tags on multiple pages on the same site).
  • Use the keyword of that page twice if space permits (once at the start, followed by a separator such as a colon and a space, and then once again in a call to action). If the character limit prevents the use of the keyword twice, use it once in a good call to action, with the keyword as close to the beginning of the title tag as possible.
  • If relevant, include a geo-qualifier (such as Seattle WA).
  • Typically not be used for branding purposes, unless you have a major brand whose name would increase SERP click-through-rates. If you must include your brand name, use it at the end of the Title tag, not at the beginning.

* – Character length is an approximation; Google is actually using a pixel width limit, not a character limit. Title tags appear in 13.5pt Arial font by default (18px), with searched for keywords bolded, and Google has a pixel width limit of 600 pixels for Titles.

You can see if a Title will truncate by doing the following: simply use Excel, set column width to 600px, set columns to wrap text, and font to Arial 13.5pt. Type in your Title, and bold the main keyword. If the line breaks, your Title tag will truncate. (You can also use this tool to check: Title Length Tool).

65 characters is now considered the safe upper limit, as this Title character limit will avoid truncation 95% of the time.
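As a rough sketch, the character guidelines above can be checked programmatically, remembering that Google’s real limit is ~600 pixels of rendered width, so character counts are only an approximation:

```python
def title_length_ok(title, min_chars=15, max_chars=65):
    """Check a Title tag against this guide's character guidelines.
    65 is the safe upper limit; 70 is the hard ceiling."""
    return min_chars <= len(title) <= max_chars
```

A check like this is handy in a CMS publishing hook or an audit script, flagging titles that are likely to truncate before they go live.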

Proper title tag structure is as follows:

<title>Keyword Phrase: Call to Action Using Keyword Phrase</title>

A colon and a space are used as the separator because it uses the least amount of characters. You can also use – or | (a dash or a pipe).

The Title tag is the first description of the page that search engine users will read, and it is extremely important to both users and search engines that it contain the keyword they are searching for. This will not only help to improve rankings, but can significantly improve the click-through-rate on search engine results pages (SERPs).

Meta description tags are used as the description searchers will see in the search engine results pages (SERPs). Having the keyword used properly in the Meta description tag is not a part of the ranking algorithm, but will increase the likelihood that users will click on the link to the page. It should adhere to the following guidelines:

  • Be unique and relevant to that page, be written as ad text, and contain a call to action.
  • Be no more than 200 characters in length (aim for 180 to be safe), including spaces and punctuation (and no less than 50 characters).
  • Contain 1-2 complete sentences, correct grammar and punctuation.
  • Use the keyword once or twice (once per sentence, as close to the start of each sentence as possible).
  • Include a geo-qualifier (City and/or State), such as “Seattle WA”, only if relevant.

A proper Meta description tag would be:

<meta name="description" content="Keyword Phrase used in a question? Keyword Phrase used in a good click-inducing call to action." />

To further encourage the search engines to use the description you provide, add this tag to every page:

<meta name="robots" content="noodp, noydir" />

This tells search engines not to use snippets from DMOZ or the Yahoo Directory. Some people think this tag is no longer needed. We disagree, and have seen some pretty wonky descriptions come from not blocking this.

Meta keywords tags no longer have any SEO value on Google, Yahoo or Bing. The current best practice is to leave them off of your site entirely. If your CMS forces you to have the tag present, either leave it blank or make sure that there are no more than 1-2 keywords in the tag, and that those keywords are used at least twice each on the page in question.

While these tags have no SEO value, misusing them can still have a negative impact on your rank-ability. Keyword stuffed meta keywords tags are still a negative signal, and could negatively impact your ability to rank organically.

Search engines weight text for SEO value based on text size and position on the page. Heading tags are supposed to be larger than the other text on the page, and should typically be the first thing in the content section of the page; thus the added benefit of having the keyword in the heading tags.

Every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page, and should appear prominently.

The keyword of a page needs to be used in the H1 tag, and in at least half of the total heading tags on a page. There should never be more than one H1 on a page. H1 tags should never wrap images or logos, only text.

From a usability perspective, paragraphs should never be longer than 5 lines of text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in the form of an H tag (H2 or H3). Testing has shown that when users are faced with a large block of unbroken text, most either skim over the text or skip it altogether.

We recommend no more than 1 heading tag per 150 words on the page. It is VERY important that the keyword of a page be used in the H1 tag, as close to the beginning of the H1 as possible. Ideally, if a page has at least 300 words of content, there should be at least one additional H tag on each page that contains the keyword, for added SEO value.
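Putting the heading rules together, a page body might be structured like this (keyword and copy are placeholders):

```html
<h1>Keyword Phrase That Defines the Page</h1>
<p>Opening paragraph, using the keyword naturally...</p>
<p>Second short paragraph...</p>
<h2>Sub-Heading Using a Keyword Variation</h2>
<p>Next two to three short paragraphs...</p>
```

One H1 at the top, keyword near the front, and an H2 or H3 every few paragraphs to break up the text.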

For the benefit of search engines, code compliance, and visually impaired users, every image MUST have an ALT tag. The ALT tag should accurately describe the image, and should contain a keyword relevant to that page only if the keyword is relevant to the image as well. The ALT tag should be the same for every instance of that image on the entire site.

<img src="keyword-rich-image-name.jpg" alt="Describe Image" />

Image file names should be made up of words separated by dashes, should be descriptive to both users and search engines, and should accurately describe the image. If relevant, the file name should also use a keyword relevant to the page/domain.

All links pass a bit of trust, PageRank and anchor text. This 3rd party endorsement system is at the heart of the Google search algorithm. It measures how different pages link together, and weights those links based on traffic, popularity, relevance, age, size, content, and hundreds of other components.

When pages that Google deems relevant link to other pages, some of that trust and authority flows through that link to the site being linked to. A “followed” link is essentially endorsing the page being linked to. Anchor text also passes through links.

Enter the rel="nofollow" tag. Google introduced this tag to help prevent ranking manipulation through comment spam. While the use of this tag used to prevent ANY trust or anchor text from being passed on, Google has recently made some changes.

Now, when rel="nofollow" is used in an anchor tag (link), Google will usually just pass less trust to the page being linked to (~50% less, from what we have heard; testing has repeatedly shown that it does pass some). Using this tag is like saying, "this page is nice, but we don’t want to endorse it." Of course, Google can ignore you and pass whatever they choose…NoFollow is little better than a suggestion these days, especially when it comes to NoFollow’d social links (Twitter, Facebook, etc.).

With that in mind, from an SEO perspective, you would probably only choose to use the rel="nofollow" tag in the following instances:

  1. On any user created link within blog or forum comments, author profiles, etc. If it’s likely to be spammed, use NoFollow.
  2. On any internal link that links to a page with no SEO value (e.g. login pages, RSS feed pages, etc.) – If the page has no searcher value, or if it’s a page you don’t really want showing up in search engines as an entry point, feel free to use the rel="nofollow" tag on links to that page.
  3. On any affiliate links or links you are otherwise compensated for placing on your site. Google has penalized many sites for “link selling”, and it’s a stiff penalty, so don’t mess around on this front.

Remember, NoFollow may still pass link juice to the linked-to page, but usually less of it.

A proper NoFollow tag would be used like this:

<a href="https://www.domain.com" rel="nofollow">Page Name</a>

Keep in mind though, it is perfectly normal to link out to other sites on occasion, and linking out to at least some other sites with followed links is a part of appearing "normal" in Google’s eyes. Don’t link out to external domains from every page, but definitely do so with followed links from at least a few pages, particularly from blog posts where relevant.

Last but not least, using a NoFollow tag doesn’t pass additional page value through to other links on the page; rather, NoFollow consumes that link’s portion of the page value without passing it anywhere, essentially burning off a portion of a page’s link authority. In general, it’s better to simply not include a link (if you can avoid it) rather than using lots of NoFollow links on a page.

One of the most important on-site SEO elements is text content. Search engines want to see roughly 300-400 words or more of unique text on each page (there’s some debate on the minimum, but this seems to be a safe number). That said, minimums are relative…check to see what content ranks on page 1 for the terms you want to rank for, and if the average content length is 1,000 words, you have your relative minimum.

We recommend having roughly 400-600 words of unique text per page, but 1000-3000+ word blog posts and epic guides rank amazingly well, so plan accordingly.

This content on a page should contain the exact target keyword about 3-4 times, and perhaps a few more times on long pages. Having unique, keyword rich, topic focused text on a page can help to improve search engine rankings significantly.

It is also highly beneficial to use variations of the keyword. For example, for “chocolate pudding”, you would want to use “chocolate” and “pudding” somewhere on the same page. It is also highly valuable to use related keywords. In the case of “chocolate pudding”, you might also want to use “creamy”, “Jell-O”, “dessert” and “tasty”. This technique is often referred to as LDA (Latent Dirichlet Allocation) or LSI (Latent Semantic Indexing), and helps to establish excellent topical relevancy, significantly increasing your chance of ranking.

If it fits, consider using the keyword at least once in a <strong> tag and/or an <em> tag on every page.

That said, at the end of the day make sure you are writing content for your human users, and not just for search engines. If it’s not worthy of commenting on or sharing, it’s probably not worth writing. You can use tools like Answer the Public and SEMrush to find what sorts of questions and search phrases people are using.

To rank and stay ranking in a competitive space and to appeal to Google’s QDF algorithm (query deserves freshness), regular content growth is very important. We recommend that every site have a blog, and that they write in that blog at least once per week (once per day if possible, but only if you can write one high quality post per day). At the end of the day, it’s better to put out one truly epic post per month than one shitty post per day.

It is also beneficial to update the content of existing pages from time to time. Google can tell how often a site is updated, and takes that into account in rankings. The more often helpful changes are made to a site, and the more often unique content is added, the greater the value of that site to users in the eyes of the search engines.

In addition to content growth, search engines and users love to see content diversity. Text is great, but images, videos, polls, PDFs, and other interactive resources have both user experience and SEO benefits. Videos and images, in addition to helping with SEO, can also drive additional traffic to your site via image search and video search results.

Duplicate content is viewed as a big negative by every major search engine, Google in particular. It can not only hurt rankings, but can prevent a page from ranking, and sometimes result in de-indexing of an entire domain. Search engines want to see unique content on a site, and in their search results. A search results page with 10 different websites that contain the same content would be a poor results page indeed.

There are two main types of duplicate content:

  1. Duplicate content within your own domain (such as is often caused by CMS issues, like WordPress tag and author pages). This is a very common problem with e-commerce websites.
  2. Cross-domain duplicate content, where your site is hosting content that is identical to content used on other websites. This is also a common problem with e-commerce websites, as well as article sites, news sites, and less than scrupulous scraper sites.

Duplicate content is generally defined as any page that is 30% or more the same as content elsewhere on the web. Duplicate content is found using N-grams, which look for identical sequences of 10 words or more, excluding common usages and stop words. Too many blocks of identical content on a page = duplicate.
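As a rough illustration of the N-gram idea, here is a simplified sketch in Python. This is not Google’s actual algorithm, and it ignores the stop-word and common-usage exclusions mentioned above; it simply measures how many 10-word sequences (shingles) two texts share:

```python
def shingles(text, n=10):
    """Return the set of n-word sequences (shingles) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(text_a, text_b, n=10):
    """Fraction of text_a's n-word shingles that also appear in text_b."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a:
        return 0.0
    return len(a & b) / len(a)
```

A high overlap ratio between two pages suggests they would look like duplicates to a shingle-based detector.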

If content must be duplicated within your own site, for whatever reason, you have three options:

  1. make use of the rel="canonical" tag on duplicate pages
  2. use the <meta name="robots" content="noindex, follow" /> tag on duplicate pages
  3. block crawling of duplicate pages in the robots.txt file
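The first two options are single lines of markup placed in the duplicate page’s head section (the URLs here are placeholders):

```html
<!-- Option 1: point search engines at the preferred version of the page -->
<link rel="canonical" href="https://www.example.com/original-page/" />

<!-- Option 2: keep this page out of the index while still letting link value flow -->
<meta name="robots" content="noindex, follow" />
```

Option 3 is handled in your robots.txt file, with a line like `Disallow: /duplicate-section/` under the relevant `User-agent` block.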

When it comes to cross-domain duplicate content, you can use the above options, but the best practice is to simply not have cross-domain duplicate content anywhere on your site. It is better to be safe than sorry in this regard.

Because Google is putting more emphasis on user experience, page load times are also now a factor in search algorithms. You should ensure that every page on the site will load in 1-2 seconds on a typical DSL connection. When a user exits the site via the back button, it hurts rankings, and few things elicit the use of the back button more than a slow loading page. With this in mind, make sure the file sizes of any images or videos in your content are as small as possible.

Using Flash or images in place of text is considered a big negative. While search engines like a variety of media, it must not take the place of text. Flash looks good, but it is not SEO friendly. In addition, requiring that users have Flash to view a website has been shown to increase load times and bounce rates, and decrease usability. It also means that your site would appear broken to someone using an Apple mobile device such as the iPhone or iPad. If you need Flash-like visual elements, there are alternatives.

However, if you absolutely must use Flash, there is an SEO friendly technique known as sIFR (Scalable Inman Flash Replacement). With sIFR, the underlying HTML text is always present in the page and is rendered via Flash for users who have it installed. This way, the Flash version is presented to users who have Flash, while the text remains readable by search engine spiders, as well as by users who don’t have Flash installed.

Avoid all Black Hat SEO techniques. When trying to rank for a competitive keyword, you may be tempted to try some less than kosher SEO tactics…DON’T! Black Hat SEO is a very big negative, and if discovered could result in your site being removed completely from Google’s search index.

This means:

  • No keyword stuffing. This means NONE, not in any way. Don’t hide keywords in DIVs, strange made up Meta tags, or anywhere else on the page or in the code. It is not worth the risk of being de-indexed.
  • No disabling of the back button, and no pop-ups/pop-unders stopping you from exiting a site
  • No sneaky redirects
  • No pulling large blocks of content or entire sites with iFrames (no SEO value)
  • No Meta refreshes (unless the refresh time is 0)
  • No hidden text (unless you’re using a user accessible collapsing DIV, which is fine)
  • No hidden links
  • No text of the same or very similar color as the background
  • No displaying different content based on user agent (except for URL names, which is OK)

Though advertising can be a great way to make money, it seriously detracts from both the user experience and the SEO-ability of a website as a whole, as well as its individual pages. Ad calls slow down page load times, often create a poor user experience, and in some instances can trigger ranking penalties tied to the Google Panda updates.

You should carefully weigh the pros and cons of using ads, based on the purpose and goals of your website. For some sites, the tradeoff will be worth it.

Here is a visual example of a perfectly optimized page (courtesy of Moz):

[Image: Perfectly SEO Optimized Page]

Optimal Content Sample:

A sample page optimized for the keyword “Seattle SEO”, ideally written, is as follows:

<h1>Seattle SEO</h1>

<p>If you are a business in Seattle with an online presence, then your business could likely benefit from professional Seattle SEO services. What is SEO you say? Quite simply, it is the art of optimizing a website or webpage to be more search engine and user friendly, ideally resulting in improved search engine rankings.</p>

<p>The vast majority of internet users make use of search engines to find what they are looking for. Regardless of whether you own an online business or a brick-and-mortar operation, a website is a must-have to maximize your revenue potential. Unfortunately, just any old website isn’t going to cut it. To get the most benefit from your online presence, making effective use of <a href="https://www.vudumarketing.com/">Search Engine Optimization</a> is a necessity.</p>

<p>So, your business is in Seattle, and you want a local company to build you a website and/or do some SEO work…now what? How do you find such a company?</p>

<h2>Finding the Right Seattle SEO Firm</h2>

<p>With dozens of companies offering SEO services in the Seattle area, it can be a real challenge to find the right one for your business. Of course everyone will claim to be the best, but how do you really know if what they are selling is what you should be buying? To that end, we offer the following advice:</p>

<ul><li><strong>Know What Questions to Ask</strong> – It is important to always negotiate from a position of strength. Learn the basics of <a href="https://www.vudumarketing.com/">SEO</a>, such as the value of keyword rich URLs and title tags, before you go searching. Even if you just sound like you know what you are talking about, you are much more likely to effectively weed out imposters.</li>

<li><strong>Ask for Examples of Past SEO Work</strong> – From testimonials to actual optimized sites and rankings, ask to see some work they’ve done. Make a note of the examples, and contact those companies if possible. Ask them if they had a positive experience with the SEO company in question, and if they would recommend them.</li>

<li><strong>Determine How They Measure Success</strong> – If their only measure of success is improved rankings, they are probably not the right company for you. SEO, like any form of marketing, is about making more money. To that end, tracking traffic and conversions in addition to rankings is the ideal scenario. If they don’t measure rankings plus some additional metrics, you should probably look elsewhere for a better company.</li></ul>

<h2>Learn More About Seattle SEO</h2>

<p>While there are many other tips that could be offered, the three tips above are all that you should need to find a high quality SEO company, one that can add real value to your bottom line. <a href="https://www.vudumarketing.com/contact-us/">Learn More About SEO</a> today!</p>

In summary of the above content:

  • It’s about 400 words in length, not counting code, putting it in the ideal 400-600 word range.
  • The keyword is used in the H1 tag, and in at least 50% of the other H tags, and the page is effectively broken up by sub-headings (H2 or H3 Tags). There is approximately one H tag per 150 words of content.
  • The keyword is used 5 times throughout the page, and also appears in a strong tag (bold) and in a list item.
  • Variations of the keyword and other related keywords are used throughout the text, helping the page to conform to the rules of LDA (Latent Dirichlet Allocation).
  • The text of the page contains cross-links to other related pages using keyword rich anchor text relevant to the pages being linked to. Every page should cross-link to 1 or 2 other relevant pages on your site.

The page closes with a call to action, which is a critical element of content writing. All content should provide real benefit to the reader, and should encourage the reader to take some sort of further action to secure a conversion.

If you’ve done a search on Google recently and seen a search result with an image next to it, or star ratings, a video next to the SERP, or any one of a dozen other advanced search results, you’ve seen the impact of Schema, RDFa, etc. You can learn more from Google here, and on Schema.org.

One of the most common implementations out there has to do with rel="author" and rel="publisher". This is where the image of the person or business who wrote an article or page shows up next to their search result. These are a great way to increase your traffic, and are very simple to set up. There’s a great slide deck here that walks you through this particular implementation.

If your site provides products, services, recipes, apps, music, videos, or any of dozens of other options, the chances are good that you could use Schema to upgrade your search results!

These page-level code markups allow your search results to stand out from the crowd, and can significantly increase your organic click-through rates. I’ve seen instances where the #3 or #4 search result actually had a higher CTR than the #1 ranked result, simply because of Schema markup.

While the scope of what can be marked up this way is quite broad, it’s actually fairly easy to implement. Here’s a handy guide from Built Visible to get you started. Once you have your rich snippets code implemented, you can test the page to make sure it was done correctly with this handy Google tool.
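As a sketch, a product page hoping for star ratings in its search result might include a JSON-LD block like this (all names and values here are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/images/example-widget.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "89"
  }
}
</script>
```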

If you’re a local business, local search traffic is becoming more important than ever. More and more people are searching using localized search phrases, or at least have local intent while searching. If you want to rank well for these sorts of local searches, there are a few key things you should do/have in place:

  • Create and/or Claim Your Local Profiles – You can certainly do this manually on Google, Bing, Yelp and other sites, but the easiest way is to use a tool like Yext or WhiteSpark or Moz Local to quickly and easily do this across all of the most common sites at once. You can also use a service like Loganix to build out citations (off-site mentions of your NAP).
  • Claim Your Social Profiles – With Local SEO, citations are key (any place online that has your name, address and phone number exactly as it appears in your local listings). The key with this is to make sure all of your online business listings use the exact same format for your name, address, phone number and website URL (and we mean EXACT…1-800-798-2430 is different from (800) 798-2430…be EXACT). One of the quickest and easiest ways to get a bunch of citations (and to prevent brand squatters) is to claim all of your profiles via a service like KnowEm.
  • Create a KML File – A KML file can help to make sure your business shows up accurately on maps.
  • List Your Business Info in Schema – Using Schema to identify your business name, location, business hours and other key elements can not only help with local rankings, but can also help with getting rich snippets in your search results.
  • Show a Map and a Route Planner – Embedding a map and a route planner into your site makes a ton of sense from a usability perspective. If you want local visitors, do this.
  • Use Your NAP (Name, Address, Phone) and Local Keywords Throughout Your Site – Throughout your site, in the footer, on your contact page, and in Titles, Metas and page content as relevant. It should be crystal clear to Google and visitors that you service a particular area.
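To illustrate the Schema bullet above, a minimal LocalBusiness markup block might look like this (the business details are placeholders; whatever you use must match your citations EXACTLY):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Seattle SEO Co.",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Seattle",
    "addressRegion": "WA",
    "postalCode": "98101"
  },
  "telephone": "+1-800-798-2430",
  "url": "https://www.example.com/",
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```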

If you’re using WordPress, you can do some of this very simply with the Yoast Local SEO plugin.

While not strictly an on-site SEO element, social media is now an important part of the ranking algorithms of Google and Bing, and social circles play a big role on purchasing decisions, so facilitating social on your site is a smart thing to do.

First and foremost, make sure you have created social profiles for your company, and tied those back to your website. Every page on your site should prominently include links to your Facebook page and Twitter profile, at the very least. If you have a YouTube channel, a Google+ profile, a Pinterest or Instagram account, or a company LinkedIn profile, definitely link to those as well.

Next, make it easy to share your content, and by easy, we mean so easy your Grandma could do it. There are a number of ways to integrate these share functions into your site.

At the very least, make sure it is easy for someone to like/share your content on Facebook, Twitter, Google+ and LinkedIn. Having these elements in place will help you to effectively leverage the SEO value of your current and future social media efforts. For traffic and branding purposes, you may also want to make it easy for content to be shared on Digg and Reddit, and easy to be bookmarked on StumbleUpon and Delicious.

Last but not least, make sure you make use of RSS feeds. Any page that updates regularly, like your blog, should have an RSS feed. The easier you make it for people to follow, digest and share your content, the better.
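A feed auto-discovery tag in your site’s head section makes your RSS feed easy for browsers and feed readers to find (the title and feed URL here are placeholders):

```html
<link rel="alternate" type="application/rss+xml" title="Example Blog Feed" href="https://www.example.com/blog/feed/" />
```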

Of course, keep in mind that this isn’t an exhaustive SEO guide by any means; it’s focused entirely on on-site SEO elements. If you’re looking for more comprehensive SEO training guides, check out The Beginners Guide to SEO from Moz, and The Advanced Guide to SEO from QuickSprout.

If you have any questions, please contact us!
