So you told your client that you are going to audit their site for search engines. And because you’re young and dynamic and your client is not, they trust you to magically pull traffic from the web to their website. You sold them on the idea, the benefits, the miraculous wonders of search engine optimization. Now they are like everyone in digital: we all understand how important SEO is, yet almost no one knows exactly what to do to make it work.
During my MBA we had a very long class on SEO, our project was to do an audit for a website using everything we had been taught and more. Fortunately for me, I was working with this dude, who happened to be interning at an agency that specializes in SEO. Together with another one of our classmates, we put together an epic SEO audit that garnered a perfect 20/20 from our professor, herself an SEO expert with a long list of esteemed clients. How’s that for credibility?
In this incredibly long post, I’m going to outline each and every step for creating the ultimate, most impressive, and most effective SEO audit that you can. Considering how fast everything changes in the world of SEO, it might only be relevant for a period of six months, maybe a year max, so please share this widely among your SEO friends, or hoard it away all to yourself so you can blow your next SEO audit out of the water!
What’s the site about?
The key to successfully harnessing the power of SEO is to ask yourself the not-so-obvious question, what is this site about? It’s a semantic exercise in marketing that will help you narrow down the focus so that it can be sharp enough to shave away some traffic from search engines to your landing pages.
Identify the landing pages
Landing pages are the storefronts of a website, the place where people are going to enter. Landing pages are also the first impression, the place to shape and model a visitor’s opinion of you and the quality of the service or product on offer. Depending on the range and scope of the overall website, you can identify 4 to 5 landing pages to focus on.
Once you have the landing pages, you need to ask yourself what each of those landing pages represents. What is the mission? Why would someone come to this page in particular? Make a list of the answers to these questions and hone it down so that you have one answer for each landing page.
Now that you have the objective of each landing page, it’s time to start looking at keywords. What would someone search for to get to that page? Make a list of all the keywords you would search to get to that particular landing page. Then, play a game of linguistic adaptation. To find the best keyword, you should grab a thesaurus and start writing out every combination of words. Once you have this list of keywords and alternate versions, start running them through Google’s Keyword Planner.
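That thesaurus step is easy to automate. Here’s a minimal sketch that expands interchangeable words for each slot of a phrase into every candidate keyword; the synonym groups and the shoe-shop example are hypothetical:

```python
from itertools import product

def keyword_variants(synonym_groups):
    """Expand groups of synonyms into every keyword combination.

    synonym_groups is a list of lists; each inner list holds
    interchangeable words for one slot of the keyword phrase.
    """
    return [" ".join(combo) for combo in product(*synonym_groups)]

# Hypothetical example for a landing page selling handmade shoes:
groups = [["handmade", "artisanal", "custom"], ["shoes", "footwear"]]
print(keyword_variants(groups))
# 3 x 2 = 6 candidate keywords to run through Keyword Planner
```

Paste the output into Keyword Planner in one batch rather than typing each variant by hand.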
The best keyword is the one that has a relatively high search volume but fewer total results. This means that people are searching with this keyword often enough, but not everyone in the space has caught on yet. Keyword Planner will also give you a competition rating; this part is easy: the lower the competition, the better.
Search engines like when things are clear, and in many ways that’s what SEO is all about, presenting the content of your website in the clearest way possible so that search engines can instantly connect people searching for that content to you. This is why you need to focus on one keyword (which can be one or two or three words). Once you have this down, you’ve managed to narrow the entire focus of the landing page into the most potent keyword.
Now repeat the process for the other landing pages. Make your list, expand it out, run it through the Keyword Planner and pick the best one. In the end, you will have one keyword for each of the landing pages. It might seem like a tiny amount of product for a huge amount of work, and if you’re working for a client, it’s a good idea to keep the spreadsheet you’re using to show them just how much work went into it.
But it is by far the most important part because most of the things that you will do to optimize a landing page and a site will boil down to these keywords. The difference between having great keywords and poorly-chosen generic keywords could be the difference between hundreds of visits or thousands of visits. It’s not uncommon for SEO experts to sort through thousands of keywords before settling on the most effective.
Check out the competition
Next, take your keywords and search for them on Google (after clearing your cookies, or using a private window, to avoid personalized results). Which competitors appear? Make a list of them. Cross-check that list with your other keywords. How are they faring when you search for the keyword that you want? There are tools that can help you with this part, like SEMrush, which allows you to follow your competitors’ sites and their ranking against certain terms and keywords. A healthy benchmark will give you a reference against which to judge your progress.
Checklist for an SEO Audit
In no particular order, these are the questions and steps to complete to deliver the best SEO audit your client (and the world) has ever seen.
What is the Page Authority for a landing page?
The Page Authority can easily be checked by installing the MozBar extension for Chrome. The MozBar will show you a score out of 100 for how strong the Page Authority and the Domain Authority are. Authority is essentially the credibility of a site or page, based largely on the quantity and quality of the links pointing to it, along with its traffic from search engines and its reputation. Sites like Google are at 100; smaller blogs can be down in the 10s. Generally speaking, anything over 50 is pretty good, and the MozBar will serve as a measuring tool for the progress of your SEO development. If the Authority keeps going up over time, you’re on your way to success!
Is the site responsive?
Google ranks responsive sites higher than non-responsive sites, since about half of Google’s searches happen on mobile devices. It’s not enough to check whether the page formats correctly on your phone; Google has a Mobile-Friendly Test tool to check for you. Just pop in the domain name and it will tell you. If the site is responsive, you’re good; if it’s not, you’re going to need to talk to the developers about how to make it accessible on mobile devices.
Does the site have a FAQ?
Believe it or not, voice search is becoming more and more commonplace. And how do people search with their voice? They use questions. Therefore, websites that feature a FAQ section stand to benefit since a question that someone asks verbally to Google could be one of the frequently asked questions on your site. A match made in heaven!
If you don’t have a FAQ, think of some of the obvious questions that people would ask that your site provides the answer to and start building that out.
Check the robots.txt

Each site should have a robots.txt file, for example “http://www.yoursite.com/robots.txt.” This is the file that Google checks first, to see which parts of the site the author wants made visible. If the author has a “disallow” rule in place, that means they don’t want the site, or a part of it, to be indexed. It is very important to add the link to the sitemap at the bottom of the robots.txt file so that the crawler can go directly to the site architecture.
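If you want to check this at scale, a small parser helps. This is a sketch, assuming the file follows the usual field: value format; the sample file is hypothetical:

```python
def parse_robots(text):
    """Pull Disallow rules and Sitemap links out of a robots.txt body."""
    disallows, sitemaps = [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "disallow" and value:
            disallows.append(value)
        elif field == "sitemap":
            sitemaps.append(value)
    return disallows, sitemaps

sample = """User-agent: *
Disallow: /admin/
Sitemap: http://www.yoursite.com/sitemap.xml"""
print(parse_robots(sample))
```

If the sitemaps list comes back empty, that’s your cue to add the Sitemap line.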
Check the Sitemap.xml
Every site should have a sitemap that indexes all of the individual URLs of a domain. A properly functioning sitemap is automatically updated when changes are made to a site, like when a new page is added or an old page is deleted. Check the sitemap to see how many links are in 404 (they don’t exist). If there are a lot of 404 errors, the sitemap is out of date and needs to be updated so that search engines consider it healthy. You can use Sitemap Inspector to see which pages are in 404. In the best case, none of the URLs in your sitemap lead to a 404.
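You can also pull the URLs out of a sitemap yourself and feed them to whatever link checker you prefer. A minimal sketch, assuming a standard sitemap.xml; the sample document is hypothetical:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.yoursite.com/</loc></url>
  <url><loc>http://www.yoursite.com/old-page</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```

From there, request each URL with an HTTP client and record anything that comes back as a 404.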
Eliminate the 301s
A 301 is a page redirection: it’s what happens when you click on a link to a page and the URL changes again before bringing you to the page you wanted. You still find the information you were looking for, but you had to go through a redirect to get there. Search engines don’t like this, since it is an extra step and takes more time; plus, it’s confusing to send someone to a page only to immediately send them on to a different one. You can use redirect measurement tools to see how many of your pages are in redirect. The only way to get rid of the problem is to rewrite the URLs. It’s OK to use redirects from time to time, as a temporary fix, but it’s best practice to get rid of them.
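To see how bad the problem is, map out the redirect chains your crawler reports. The redirect map below is a hypothetical stand-in for real crawl results:

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a crawl's redirect map and report the hops taken.

    redirects maps a URL to the Location it 301s to (a simplified
    stand-in for what a redirect checker would record).
    """
    chain = [url]
    while chain[-1] in redirects and len(chain) <= max_hops:
        chain.append(redirects[chain[-1]])
    return chain

# Hypothetical crawl results: two hops before the real page.
crawl = {
    "/old": "/interim",
    "/interim": "/final",
}
print(redirect_chain("/old", crawl))
```

Any chain longer than two entries means a visitor (and a crawler) is bouncing through at least one extra hop before reaching the content.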
Check the download speed
Page load time matters to both visitors and search engines: slow pages get abandoned, and crawlers spend less time on them. Run the site through Google’s PageSpeed Insights tool, which grades each page and lists concrete fixes, like compressing images or caching static files. If the scores are low, pass the recommendations along to the developers.
Unify the site architecture
A well-organized site should look like a tree, with the trunk spreading out into big branches and then off to the smaller twigs. Twigs don’t reach across an entire tree to join with another branch, and entire branches can’t grow off of little twigs. Enough tree talk. You get it.
Look at the menus, is there a lot of overlapping content? Are there multiple ways to get to a page on your site? If so, the crawler that is going to index your site and pages can get confused, since it isn’t sure which content supports which pages.
It’s not enough to eliminate links and make the navigation easier. Your URLs have to reflect the architecture of the site in the clearest possible way. All URLs that are subpages of a landing page should look like yoursite.com/landingpage/subpage. Contrary to the popular belief that the URL needs to spell out the keywords for a search engine to index it properly, it’s actually the architecture that matters more, since the search engine can put together the context of a page from the pages it already knows. Start going through the URLs and see if any of them don’t follow the logical architecture. If they don’t, create a guide and standardize all URLs.
Standardize URL Capitalization
Did you know that capital letters and small letters are not automatically treated the same way in a URL? While the browser will still get to the right content, it could create a redirect where the URL is changed before the content is delivered. When robots crawl the site for indexing, this can create duplicate content, which search engines view negatively (they think you are creating extra pages to trick them; they’re smart).
Go through and make sure none of the URLs contain capital letters.
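A quick script can flag the offenders for you. A sketch, assuming you already have the full list of URLs exported; the example URLs are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_url(url):
    """Rewrite the host and path of a URL in lowercase."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

def flag_mixed_case(urls):
    """Return the URLs that would change after lowercasing."""
    return [u for u in urls if lowercase_url(u) != u]

urls = ["http://www.yoursite.com/About-Us", "http://www.yoursite.com/contact"]
print(flag_mixed_case(urls))  # ['http://www.yoursite.com/About-Us']
```

Each flagged URL is a candidate for a rewrite plus a single 301 from the old mixed-case form.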
Hide your testing environment
Sometimes, even when the robots.txt is correctly applied, pages that you don’t want to show up, like test pages or pages in development, will get indexed. These pages normally live on a subdomain like testenv.yoursite.com and are not meant to be visible to anyone except your developers. If a search engine crawls those pages, it might see that they either have no content or load poorly, which is why they are in the test phase, and that could have a negative impact on your indexing. If those pages are indexed in Google, get rid of them, and check into how adjusting your testing environment or robots.txt can keep future test pages invisible until you want them online.
Verify your CMS
CMS, or content management systems, are platforms like Drupal or WordPress that many websites are built on. They make it easy to create and add content, as well as build websites with predetermined characteristics and a full-feature back end. When you create a post or page, the CMS will often generate the URL for you, and this can be troublesome. Sometimes extra bits get inserted into the URL, like /node/ or /taxonomy/ when working with Drupal.
In effect, this could create hundreds or thousands of extra pages that could get indexed. Change the shortlink for these pages to keep the crawler from scanning unnecessary pages.
HTTPS vs. HTTP
Everyone loves security these days, and nothing says security more than https, right? For websites that don’t deal with sensitive information, HTTPS is not a requirement, but it is a good practice, especially with large numbers of people using extensions like HTTPS Everywhere. The problem is that https pages can become duplicates of the same http pages. Pick one version as the canonical one and redirect the other to it, so the content doesn’t get indexed twice.
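Spotting these duplicates from a list of indexed URLs is straightforward. A sketch; the indexed list is hypothetical:

```python
from collections import defaultdict

def scheme_duplicates(urls):
    """Group indexed URLs that differ only by http vs https."""
    groups = defaultdict(set)
    for url in urls:
        scheme, _, rest = url.partition("://")
        groups[rest].add(scheme)
    return [rest for rest, schemes in groups.items()
            if {"http", "https"} <= schemes]

indexed = [
    "http://www.yoursite.com/pricing",
    "https://www.yoursite.com/pricing",
    "http://www.yoursite.com/about",
]
print(scheme_duplicates(indexed))  # ['www.yoursite.com/pricing']
```

Every address this returns needs a redirect from the non-canonical scheme to the canonical one.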
Google Site Links
You’ve seen these before, they are the sublinks that are included when a site is the first to show in the Google search results. Usually, it’s two or four subpages and little explanations, more ways for people to find what they are looking for faster, and great for qualifying the traffic coming to your pages.
All you have to do here is register your site with Google Search Console. Google picks the site links automatically, but you can demote the ones you don’t want so the visibility goes to stronger pages. Be careful: the site links should reinforce the theme of your site and its content, and that sort of visibility should not be wasted on the weaker or less relevant pages.
Get rid of subdomains
Subdomains might be all the rage today but search engines don’t like them. In fact they often consider subdomains like subdomain.yoursite.com as separate sites entirely. Therefore you lose out on having supporting content for your primary domain.
Instead, URLs should be written as http://www.yoursite.com/subdomain/etc. This might seem counter-intuitive since it makes the URL longer, but that’s just the way it is!
Use the right language
Many sites are written in multiple languages, but it’s rare for a site to be 100% plurilingual. Often only parts of sites are translated. This can cause issues, since a search engine indexing the site in English could run into French pages and not know what to do with them. It’s even worse when the URLs themselves are written in multiple languages. If you have, say, 5 pages in French and 100 in English, those French pages could cause problems. But if almost all 100 pages are also available in French, search engines treat the site as serving two different countries or populations.
So check to make sure that your site is either entirely available in each of its languages or kept to a single language.
Diversify Your Backlinks
Once upon a time, when search engines were suckling infants, they measured a site’s authority and popularity based on the number of links that pointed to that site. The logic was simple: the more links, the more important a site must be. This worked well until webmasters discovered that they could get their site to the top ranking just by creating bogus sites that contained nothing but links to their site. Link farms were born.
Search engines got smart and started penalizing sites that used abusive backlinking. Webmasters started to freak out and had to go back and delete their backlinks one by one. For sites with tens of thousands of backlinks, it was a bitter pill to swallow.
Backlinks are still very important today, but it’s hard to fake those links with today’s search engines. You should check the backlinks that point to your site; you can use a free backlink checker to do that. If you have a couple thousand links to your site and 90% of them come from one particular domain, search engines might be penalizing you! It’s best to have a balance, and even to remove some of those redundant backlinks if the profile looks too skewed. You can disavow backlinks directly with Google.
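The skew is easy to measure once you have an export from a backlink checker. A sketch; the referring domains are hypothetical:

```python
from collections import Counter
from urllib.parse import urlsplit

def domain_share(backlinks):
    """Share of backlinks coming from each referring domain."""
    domains = Counter(urlsplit(url).netloc for url in backlinks)
    total = sum(domains.values())
    return {d: n / total for d, n in domains.most_common()}

# Hypothetical export from a backlink checker:
links = (["http://spammy-directory.com/page%d" % i for i in range(9)]
         + ["http://respected-blog.com/review"])
shares = domain_share(links)
print(shares)  # spammy-directory.com dominates at 90%
```

If one domain dominates like this, that’s the profile to rebalance or disavow.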
Up your netlinking strategy
OK, the other side of the backlink coin, what happens when you don’t have that many links pointing to your site? You need to develop a netlinking strategy. This is where you start to create content that includes links to your site but is posted on other sites. You should start at places like wikis (Wikipedia, Wikitravel, etc) depending on the relevant content. Then check out forums (yes people still use them and yes they are still important). Find relevant subjects and create valuable content that points towards your site. Quora is a good place to be.
The good news is that social media counts! The more people post links to your site and talk about your content, the more highly search engines will index your site. You know what that means, get your Grandma on Facebook and get her sharing!
Navigate away from your 404 page
An ideal site would not have any links that end on the dreaded 404 page, but what happens if someone manually types http://www.yoursite.com/pagethatdoesntexist? You need to have a 404 page. Normally a 404 page is a dead zone, and if a crawler comes across it, it might stop indexing your site. You should always have a menu, or at least a button, to bring someone back to the homepage should they stumble upon a 404 page. Remember: SEO today is as much about user experience as about keywords and content; make it easy for people and you make it easy for search engines to rank you highly.
Use all Meta tags in the Head
In the head section of the HTML of each page, there is space for the meta tags, which are things like the description, the author, and keywords. This information doesn’t show up for your visitors, but it does tell search engines exactly what a page is about. The description is especially important, since it often becomes the snippet shown under your link in the results (Google has said it ignores the keywords tag, but it costs nothing to fill in). It’s relatively easy to add this information:
- <title>The Ultimate Guide to SEO Audits</title> – this part shows up in the browser tab of the internet user
- <meta name="description" content="everything you need to know to master search engine optimization" />
- <meta name="author" content="Tony Hymes" />
- <meta name="keywords" content="SEO, marketing, digital strategy" />
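Before signing off on this item, it’s handy to check every page programmatically. A sketch using Python’s built-in HTML parser; the required tag list and the sample page are assumptions:

```python
from html.parser import HTMLParser

class HeadAuditor(HTMLParser):
    """Record which meta tags a page's markup declares."""
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.found.add("title")
        elif tag == "meta":
            name = dict(attrs).get("name")
            if name:
                self.found.add(name)

def missing_meta(html, required=("title", "description", "author")):
    """List the required meta tags the page does not declare."""
    auditor = HeadAuditor()
    auditor.feed(html)
    return [tag for tag in required if tag not in auditor.found]

page = '<head><title>SEO Audits</title><meta name="description" content="..."></head>'
print(missing_meta(page))  # ['author']
```

Run it over every landing page and you get a per-page punch list of missing tags.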
Use the Schema.org tags
Schema.org is a shared vocabulary that lets you mark up content to clarify it for search engines. Rather than inventing its own HTML tags, it works through attributes like itemscope, itemtype, and itemprop added to your existing markup (or through a JSON-LD block in the head), signaling things like where an article begins or which text is the name of the person who wrote it. There are endless Schema types depending on the purpose of your site; the best thing to do is check the list and apply any and all that correspond.
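JSON-LD is usually the easiest format to add, because it lives in a single <script type="application/ld+json"> tag instead of being woven through the markup. A sketch that generates a minimal schema.org Article block; the headline and author values are just examples:

```python
import json

def article_jsonld(headline, author):
    """Build a schema.org Article block for a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
    }
    return json.dumps(data, indent=2)

print(article_jsonld("The Ultimate Guide to SEO Audits", "Tony Hymes"))
```

Drop the output into the page head inside a script tag of type application/ld+json.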
Add a News Section
Search engines love fresh content, and they will decide how often to come crawl your site depending on how often you add new content. If your site is rather static, this might mean they come by once a month, so any changes you make to your SEO game will take a long time to have an impact. Adding a News section where you regularly post new content (articles of at least 250 words) will bring the crawlers back on a more regular basis, which means faster indexing that takes into account all of the changes you’ve been making!
But make sure the content is relevant! Don’t just write fluff to pad the page; think of original content that search engines will consider valuable. Otherwise, you head into the sad situation of having a computer think that what you’re writing is dumb! How embarrassing!
Make sure your Events are up to date
If you feature an event section on your site, and you’re still featuring events that happened in 2014, you need to get rid of those events, or create a new section for past events. Nothing signals inactivity on a site like seeing an event from 2014 in the Upcoming Events section. This is too easy for the search engines, so either organize it or take those pages down!
Add Alt Image Tags
The alt tag for images is the alternative text that is displayed when an image fails to load, and that screen readers read aloud. Alt tags are important for accessibility, for example when blind visitors come to your site. But they are also crucial for SEO because of image searches. When you add an image to your site, make sure you add an alt tag, and make sure it actually describes the image rather than repeating the page title. It’s amazing how much traffic and how many visits come from image searches today. Don’t miss out on it!
It’s easy to add it in to the HTML, just put it after the source URL of the image: <img src="imageurl.com" alt="clever image tag">
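To audit an existing site, you can scan the HTML for images that lack the attribute. A sketch using Python’s built-in parser; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that are missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and not dict(attrs).get("alt"):
            self.missing.append(dict(attrs).get("src", "?"))

page = ('<img src="shoes.jpg" alt="handmade leather shoes">'
        '<img src="logo.png">')
checker = AltChecker()
checker.feed(page)
print(checker.missing)  # ['logo.png']
```

Every src it reports is an image that needs descriptive alt text written for it.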
Transcribe Your Videos

Videos are awesome. Not only does everyone love video, but embedding YouTube videos in your site creates a backlink with YouTube, one of the most reputable domains on the planet. Great. There’s just one thing: not everyone can take advantage of video, specifically deaf people who can’t hear what’s being said. A great practice is to transcribe the content of a video so that it becomes more accessible. Not only is it a cool thing to do to widen your audience, but it also provides you with a mass of original written content for the search engines to index.
One day, search engines will probably be able to index videos automatically, in the same way they do for written content. But that’s not the reality yet, so get to transcribing your videos.
Link via Social
As I mentioned somewhere in the 4,000 words above, social media counts for a lot. Not only does it add to your Page and Domain Authority, it counts for netlinking too. But that’s only if there are links in your posts! If people are sharing your content but there are no actual URLs to your site in those posts, you are missing out on valuable SEO power.
Make sure that everything you post has a link to your site in it, so that if something gets shared widely, you benefit directly.
Use Supporting Hashtags
Become part of a greater conversation by using hashtags. Go all the way back to your primary keywords, find related hashtags and start sharing your content and site links with those hashtags. If the hashtags are complementary, you will gain visibility to your links and traffic via social media, which will in turn help to boost your SEO.
Alright, you’ve got quite an arsenal now to go out there and optimize the hell out of your site. You’ve seen how SEO is as much (if not more) about the technical side as about the content. You’ve seen how social media plays an important role. You’ve gotten to know the hacks and tricks, and you know what to avoid. Now go out there and own SEO!
“Whoa, whoa whoa,” you say, “you forgot about something!”
Did I? Let me know by getting in touch, and I’ll add your expert tip to this post.