How To Check Your Website’s SEO Health

I recently finished auditing one of my websites. It took quite a while, and it was not painless, but I managed to find and fix problems I didn't even know existed.

There are several reasons you might need to audit the SEO health of your own site. Maybe you own the site but haven't given it a close look for some time, and you'd rather do a DIY audit than pay for a relatively expensive external one.

Maybe it's a website you bought recently and you want to be sure everything is in order before investing any more in it. Or maybe you're an auditor yourself, and clients rely on you to check their sites and fix them.

Whatever your reason for checking the site, you need to be thorough and examine a number of factors. Leave one out and your SEO may stall, or a bigger problem may surface later. The following checklist will help you know what to look for:

Author note: This guide assumes you're auditing your own site or a client's, so pay close attention. For off-site SEO or competitive analysis, you'll want to consult other guides.

Look for broken links with Screaming Frog

Screaming Frog will give you crucial information that helps not only in this step but in several of the steps that follow. For this step, run a crawl of your site to confirm the integrity of its internal links. If you come across a link pointing to a page that no longer exists, either remove the link entirely or update it, for example by pointing it at the current version of the destination or at a better page.
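If you'd like to script a quick spot check alongside the Screaming Frog crawl, here is a minimal sketch using the third-party requests and beautifulsoup4 packages; the start URL is a placeholder, and it only checks links found on that one page:

```python
# Minimal broken-link check for a single page (placeholder URL).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

START_URL = "https://www.example.com/"  # hypothetical start page

resp = requests.get(START_URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(START_URL, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript:, anchors, etc.
    try:
        r = requests.head(link, allow_redirects=True, timeout=10)
        if r.status_code >= 400:
            print(f"{r.status_code}  {link}")
    except requests.RequestException as exc:
        print(f"ERROR {link}: {exc}")
```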

Look at your sitemap's integrity

Ideally, your sitemap lists every page on your site that you want search engines to find. You need to make sure it is well-formed and actually includes every one of those pages. There are several ways to do this, including Xenu Link Sleuth. Since the process can get technical, you have two practical options: review your current sitemap, or generate a new one. The latter is best done only after the rest of the audit is complete and the necessary fixes have been made.
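For a rough programmatic check, the following sketch fetches a sitemap (assuming it is a plain urlset at the conventional /sitemap.xml path, not a sitemap index) and confirms each listed URL still responds with a 200:

```python
# Verify every URL listed in a sitemap still returns HTTP 200.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
print(f"Checked {len(urls)} sitemap URLs.")
```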

Look for 404s in Google Webmaster Tools

Once your site is linked to your Webmaster Tools account, go to the Crawl Errors report. It shows the non-existent pages Googlebot found while crawling your site. Each one is a good opportunity to either redirect the URL to the page that should be loading there or create a real page in its place.
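If you export those URLs to a one-column CSV (the filename below is just an assumption), a short script can tell you which ones are still returning 404 and therefore still need a redirect or a replacement page:

```python
# Re-check URLs exported from the Crawl Errors report.
# Assumes a one-column CSV of URLs named crawl_errors.csv (hypothetical).
import csv
import requests

with open("crawl_errors.csv", newline="") as fh:
    urls = [row[0] for row in csv.reader(fh) if row]

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    label = "still broken" if status == 404 else f"now returns {status}"
    print(f"{url}: {label}")
```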

Look for duplicate content with Copyscape

Duplicate content anywhere can hurt your site's ranking. Copyscape is a good fit for detecting content that is duplicated online. You're likely to run into a few scenarios:

  • Duplicate content within your own site. If the same content appears in several places, either fix it, especially if the duplicates sit on thin pages, or canonicalize it, especially if dynamic search generates several URLs for the same result. (The sketch after this list shows one way to flag internal duplicates.)
  • Duplicate content on another site, part 1: the other site published the content first. This happens, for example, when you copy a manufacturer's product descriptions or when a writer lifted content wholesale from the other site. Replace the copied content with original content.
  • Duplicate content on another site, part 2: the other site copied from you. This is unlikely to earn you a penalty, but it's wise to report it to Google.
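Copyscape covers the cross-site cases; for duplicates within your own site, a minimal sketch like this one (the URL list is a placeholder, in practice you'd feed it your sitemap URLs) can flag pages whose visible text is identical by hashing it:

```python
# Flag pages on your own site that share identical body text.
import hashlib
import requests
from bs4 import BeautifulSoup

PAGES = [                                   # placeholder URLs
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]

seen = {}
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text().split())        # collapse whitespace
    digest = hashlib.sha256(text.encode()).hexdigest()
    if digest in seen:
        print(f"Duplicate content: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```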

Look for thin content

"Thin content" is admittedly a nebulous term, but more often than not a page with fewer than 300 words of content will be classified as thin. Pages that consist mostly of navigation, headers, and footers also fall into the thin-page category. For thin pages you have two options: remove them entirely, or add enough content to make them valuable.
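Using that same rough 300-word threshold, a quick sketch like this (placeholder URLs) will surface candidates for removal or expansion:

```python
# Flag pages whose visible text falls under a rough 300-word threshold.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://www.example.com/some-page"]  # placeholder URLs
THIN_THRESHOLD = 300

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()                      # drop navigational boilerplate
    words = len(soup.get_text().split())
    if words < THIN_THRESHOLD:
        print(f"Thin page ({words} words): {url}")
```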

Look for errors in content

Content can contain grammatical errors, factual errors, or both. Grammatical errors have a simple remedy: proofread and fix them as you go. It's no big deal; work your way through your sitemap so you can be sure every page has been checked.

Factual errors are harder to fix. If a page is simply outdated, you can add a disclaimer noting that the advice no longer applies, or you can write new, updated content. Which choice makes the most sense is up to you.

Look at the number of indexed pages

This step again uses Webmaster Tools. Pull up the Index Status report under Google Index, then run a site: search for your domain on Google. Each will give you a page count.

Compare those numbers against the number of pages you actually have. If Google is indexing roughly everything, good. If the index count is lower than expected, you may have problems with robots directives or crawlability. If it is higher, your URLs are probably generating duplicate content. Either way, it needs to be fixed soon.

Look for well-formed meta tags

There are three major meta tags to look for on each page of your site:

  • Meta title: Keep it brief, no more than 70 characters. Put branding details at the end rather than at the beginning, as most people do. Make the title descriptive enough that it doesn't read as bait-and-switch. It should contain a keyword, though not necessarily every keyword you're optimizing for. Finally, keep every title unique to avoid duplication errors.
  • Meta description: The description isn't a direct SEO factor, but it improves your search preview and your click-through rate. Keep it short and precise, and make sure it matches the page content.
  • Meta robots directives: Most directives should be handled at the site level; contradicting them with page-level directives is likely to spoil the whole thing.

Content split across multiple pages should have rel=prev/next tags, which also helps ensure proper canonicalization. And if your pages still contain meta keywords, the best thing you can do is remove them; nowadays, meta keywords are left to spam sites. (The sketch below checks several of these issues at once.)
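Here is a minimal sketch (placeholder URL) that reports titles over 70 characters, missing descriptions, leftover meta keywords, and any page-level robots directives:

```python
# Report common meta-tag problems for a single page (placeholder URL).
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/some-page"
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
if not title:
    print("Missing <title>")
elif len(title) > 70:
    print(f"Title too long ({len(title)} chars): {title!r}")

def meta(name):
    tag = soup.find("meta", attrs={"name": name})
    return tag.get("content", "").strip() if tag else None

if not meta("description"):
    print("Missing meta description")
if meta("keywords") is not None:
    print("Meta keywords present - consider removing them")
robots = meta("robots")
if robots:
    print(f"Page-level robots directives: {robots}")
```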

Look for an optimized 404 page

Very few visitors will ever see your 404 page, but once in a while it will happen, so make sure the page is optimized. Avoid simply redirecting it to the site's homepage.
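One quick way to verify the basics is to request a URL that definitely doesn't exist and confirm the server answers with a real 404 status rather than a 200 or a redirect to the homepage. A minimal sketch (made-up URL) looks like this:

```python
# Check that a non-existent URL returns a genuine 404, not a "soft 404".
import requests

bogus = "https://www.example.com/this-page-should-not-exist-12345"
resp = requests.get(bogus, allow_redirects=False, timeout=10)

if resp.status_code == 404:
    print("Good: server returns a real 404 status.")
elif resp.status_code in (301, 302):
    print(f"Soft 404: redirects to {resp.headers.get('Location')}")
else:
    print(f"Soft 404: non-existent page returns {resp.status_code}")
```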

Look for proper canonicalization

Canonicalization can get technical. It's easy to end up with many URLs pointing to the same page, for example when URL parameters are appended or when session information is stored in the URL. To a search engine, a dozen dynamic URLs can look like a dozen different pages that all happen to carry duplicate content, when in fact they lead to the same page. Canonicalization is the correct way to avoid that duplication.
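A quick way to spot-check this is to fetch several URL variants that should resolve to the same page and confirm they all declare the same rel=canonical target. A minimal sketch (placeholder URLs) follows:

```python
# Confirm URL variants of the same page all share one canonical target.
import requests
from bs4 import BeautifulSoup

VARIANTS = [                               # placeholder URL variants
    "https://www.example.com/widgets",
    "https://www.example.com/widgets?sort=price",
    "https://www.example.com/widgets?utm_source=newsletter",
]

canonicals = set()
for url in VARIANTS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", attrs={"rel": "canonical"})
    canonicals.add(tag["href"] if tag else f"(no canonical on {url})")

if len(canonicals) == 1:
    print(f"OK, all variants point to: {canonicals.pop()}")
else:
    print(f"Inconsistent canonicals: {canonicals}")
```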

Look at your site's load times with Pingdom

Users won't wait, so don't make them. Pingdom is a great help for checking the load time of individual pages as well as the site as a whole. Ideally, no page on your site should take more than 3 seconds to load. Your fastest pages should load in milliseconds, and your slowest should stay under 5 seconds. If any page takes more than 10 seconds, odds are there's an error you need to find and repair.

There are several possible causes for long page load times: a broken plugin, a broken script, or the hosting itself, in which case you may need to review your web hosting setup and make some changes.
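Pingdom gives you the full waterfall; for a rough first pass you can also time responses yourself, as in this minimal sketch (placeholder URLs). Keep in mind it measures only the HTML response, not full page rendering, so treat it as a lower bound:

```python
# Rough load-time check: time the HTML response of each page.
import time
import requests

PAGES = ["https://www.example.com/"]       # placeholder URLs

for url in PAGES:
    start = time.monotonic()
    requests.get(url, timeout=15)
    elapsed = time.monotonic() - start
    flag = " <-- investigate" if elapsed > 3 else ""
    print(f"{elapsed:.2f}s  {url}{flag}")
```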

Look for errors in robots.txt

A robots.txt file is a great help in directing well-behaved search engine bots. Just be careful not to accidentally block bots from your entire site. If some pages are blocked, make sure it's intentional and for a good reason.
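Python's standard-library robotparser makes a quick sanity check easy. The sketch below (placeholder domain and paths) verifies that Googlebot is allowed to fetch the pages you care about:

```python
# Confirm robots.txt isn't blocking important pages from Googlebot.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()

IMPORTANT_PATHS = ["/", "/products/", "/blog/"]   # placeholder paths
for path in IMPORTANT_PATHS:
    url = "https://www.example.com" + path
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'BLOCKED'}: {url}")
```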

Look for proper redirects

There are different types of redirects, and each one serves its own purpose. You probably have a number of redirects on your site already, especially if you've changed your site's architecture or, as in another step, altered your URL structure. It's up to you to make sure all of your redirects are implemented correctly.
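To verify that old URLs land where they should, you can follow each redirect chain and check both the hops and the final destination. A minimal sketch (placeholder URLs) follows:

```python
# Follow redirect chains for old URLs and report hops and final status.
import requests

OLD_URLS = ["https://www.example.com/old-page"]   # placeholder URLs

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    print(url)
    for hop in resp.history:
        print(f"  -> {hop.status_code} {hop.url}")
    print(f"  final: {resp.status_code} {resp.url}")
    if len(resp.history) > 1:
        print("  note: redirect chain - consider a single 301 hop")
```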

Look at your URL format

Your SEO benefits from a normal URL structure that humans can read. The best URLs are semantic, or branded, URLs, which are the most common format nowadays: they read like www.example.com/ followed by descriptive words, not a meaningless string of numbers and letters. If you haven't adopted semantic URLs yet, this is an important change to make. Since it likely involves a lot of redirects and a lot of work, be very careful when implementing the change on your site.

Look for image alt tags

Every image featured on your site deserves a brief description, even if it's just the business logo displayed in a corner. Alt text is enormously useful: any time an image fails to load, the alternative text takes its place, which matters for users with slow or unreliable connections. Alt text also helps drive traffic by helping your images rank in image search. With that in mind, it's worth crafting a keyword-optimized alt description for every single image you use.
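A short sketch like this (placeholder URL) will list every image on a page that has no alt text at all:

```python
# List images on a page that are missing alt text.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/some-page"          # placeholder
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")
```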

Look for proper use of H tags

Every primary title should have an H1 tag. You may use H2 tags for subtitles, but you don't have to use the rest of the H tags at all. Using an H2 as the page's first heading, or an H3 where an H2 belongs, are common mistakes you should avoid at all costs. Think of headings as nested elements, not as items in a flat, bulleted list.

While you're at it, make sure you don't skip a level: never have an H3 appear before an H2, and don't bring in lower-level headings on pages that haven't started with an H1. This may seem like a minor point, but well-formed code matters, and every small bit counts.
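A quick structural check (placeholder URL) can confirm the page starts with a single H1 and never skips a heading level:

```python
# Check heading structure: one H1 first, and no skipped levels.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/some-page"          # placeholder
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
levels = [int(h.name[1]) for h in headings]

if not levels or levels[0] != 1:
    print("Page does not start with an H1")
if levels.count(1) > 1:
    print("More than one H1 on the page")
for prev, cur in zip(levels, levels[1:]):
    if cur > prev + 1:
        print(f"Skipped level: H{prev} followed by H{cur}")
```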

Look for keyword cannibalization

This is common on small, newer sites and downright rampant on large, older ones. Because any given site has a limited set of valuable keywords, blogs tend to repeat themselves: the same keyword ends up targeted across several pages. When that happens, the keyword's cumulative value is effectively split among those pages, so each page is less effective than a single combined page would be.

The result is that your separate pages might rank 6th, 7th, or 9th on Google instead of the number-one position a single combined page could hold.

To find keyword cannibalization, you'll need to dig into the keyword each individual page is targeting, and confirm that no pages duplicate or overlap one another.
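As a rough starting point, you can group your pages by the words that appear in their titles, which quickly exposes clusters of pages chasing the same term. This is only a heuristic; a minimal sketch (placeholder URLs, naive tokenizing) looks like this:

```python
# Group pages by shared title words to surface cannibalization candidates.
import re
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [                                      # placeholder URLs
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]
STOPWORDS = {"the", "a", "an", "and", "of", "for", "to", "in", "your"}

by_word = defaultdict(list)
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string if soup.title and soup.title.string else ""
    for word in re.findall(r"[a-z]+", title.lower()):
        if word not in STOPWORDS:
            by_word[word].append(url)

for word, urls in sorted(by_word.items()):
    if len(urls) > 1:
        print(f"'{word}' targeted by {len(urls)} pages: {urls}")
```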

Look for sound site architecture

If you drew your site's structure on a piece of paper, how would it compare to a spider's web? There is real technical substance behind this analogy: it's a way to check whether your site violates some fundamental architecture rules, such as burying content too many clicks away from the home page. If it does, a redesign may be in your future.

Look for search penalties

If you've just purchased the site, or it's old, or it has been poorly maintained, you need to be on the lookout for search penalties.

Begin by searching for hints of a possible penalty, such as lower rankings than expected or pages missing from the index.

Secondly, check whether there is a real penalty. If what you suspected was a penalty turns out to be a few pages that simply aren't indexed, that's easy to handle. If an actual penalty is in place, you have quite a bit of work to do.

Fix whatever caused the problem, whether it's code, content, or something else. Getting it fixed is the only way to have your rankings restored.

You can also file a reconsideration request with Google, which gives Google the chance to withdraw any undue penalty it may have applied. A reconsideration request certainly can't hurt.

Look at the quality of your external links

Pull a profile of all the links on your site that point to other domains. By now you should already have fixed any broken links; for all the rest, take the time to review them thoroughly and decide whether they're worth keeping. Could they be pointing at sites that have changed hands, been parked, or been hacked?

If so, replace them with the right links. Do they point at sites you consider low quality? If so, consider removing them entirely or adding a nofollow attribute. If they're still of good quality, simply leave them intact.
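A minimal sketch (placeholder URL) can at least inventory each outbound link, its current status, and whether it already carries a nofollow attribute, so you know which ones to review by hand:

```python
# Inventory outbound links on a page: status code and nofollow attribute.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGE = "https://www.example.com/some-page"     # placeholder
own_host = urlparse(PAGE).netloc

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    href = a["href"]
    if not href.startswith("http") or urlparse(href).netloc == own_host:
        continue                               # only external links
    nofollow = "nofollow" in (a.get("rel") or [])
    try:
        status = requests.head(href, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    print(f"{status}  nofollow={nofollow}  {href}")
```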

Summary

Essentially, your page stats in Google Analytics will clearly point out which pages are affected. After auditing your old pages, you may find that your content creation process improves substantially, and you may discover obsolete content that should simply be removed. The best way to audit a site is to make sure every possible fix gets made: audit every aspect of the site's content, and use tools and software that simplify the whole task. If you don't want to perform the audit yourself, it might be a good idea to hire this SEO agency to help you. As this article shows, a thorough search engine optimization audit takes a lot of time to get done.