Whether you own a small, three-page hobbyist website or a monster e-commerce store, there are several things you can check to ensure you are not placing barriers in the way of your search engine optimisation campaign.
Below you will find the five most commonly checked areas when a website is critiqued by an outsider. Any webmaster can check these and, depending on your content management system, you should be able to fix them.
- Duplicate Titles
- Content Duplication
- Blocking Your Website
- Non-Crawlable Navigation
- XML Sitemaps
As the search engines rely on your title tag to tell them what a page on your website is about, it's imperative that each page on your website has a unique title tag.
You can check this manually by browsing your website, or you can simply go to Google and do a search for site:yourdomain.com. If you see duplicate titles then it’s time to collate a list of pages on your website that need their titles changed.
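The manual check above can be automated. Here is a minimal sketch using only Python's standard library that maps each title to the URLs using it and reports duplicates; the URLs and page markup are hypothetical samples, whereas in practice you would fetch each page from your own site over HTTP.

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collects the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """Map each title to the URLs that use it; return only the offenders."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}

# Hypothetical sample pages from your own domain
pages = {
    "/": "<html><head><title>Acme Widgets</title></head></html>",
    "/about": "<html><head><title>Acme Widgets</title></head></html>",
    "/contact": "<html><head><title>Contact Us - Acme Widgets</title></head></html>",
}
print(find_duplicate_titles(pages))
# {'Acme Widgets': ['/', '/about']}
```

Any title that appears against more than one URL goes straight onto your list of pages needing a rewrite.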
Following on from the duplicate title check, it's also imperative that you have unique content. Until recently, duplication was unavoidable for pages that carried "print-friendly" versions, but with the introduction of the rel="canonical" tag you can specify the URL that you would like the search engines to credit as the original content.
By using the site:yourdomain.com operator again you can spot duplicate content. However, you may find that browsing your website manually will give you a better insight.
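As a quick sanity check, you can confirm that a print-friendly page actually declares its canonical URL. This sketch uses Python's standard-library HTML parser against a hypothetical example.com page:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Finds the href of a <link rel="canonical"> tag, if one is present."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# A print-friendly page pointing the engines back at the main version
html = (
    '<html><head>'
    '<link rel="canonical" href="https://www.example.com/widgets">'
    '</head></html>'
)
parser = CanonicalParser()
parser.feed(html)
print(parser.canonical)  # https://www.example.com/widgets
```

If `canonical` comes back as `None` on a duplicate page, the search engines have no way of knowing which version to credit.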
This is a "doh!" tip, but you would be surprised how often it comes up. Have you had your website developed recently, or has it just been launched?
Check your source code to see if your developers have left a robots meta tag in there and, more importantly, whether they have left content="noindex,nofollow" in it. If so, get this changed to "index,follow" and open the door to the search engines.
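This check is also easy to script. Here is a minimal sketch, again using the standard-library parser against a hypothetical page, that flags a blocking robots meta tag:

```python
from html.parser import HTMLParser

class RobotsParser(HTMLParser):
    """Reads the content attribute of a <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives = attrs.get("content", "")

# A page a developer forgot to unblock before launch
html = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
parser = RobotsParser()
parser.feed(html)
if parser.directives and "noindex" in parser.directives:
    print("Warning: this page blocks the search engines")
```

Run this against your homepage first; if the homepage is blocked, nothing else matters until it's fixed.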
Search engines rely on their spiders being able to crawl your website through navigation menus and internal linking. Is your navigation stopping them? Menus built entirely in Flash or JavaScript, for example, can leave a spider with no links to follow.
Also, increase the internal linking within your current website copy. The more pages the search engines can find, the more pages your potential customers can find.
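To see your navigation the way a spider does, extract only the plain `<a href>` links from your markup. In this sketch the nav markup is a hypothetical example; note how the JavaScript-driven item leaves nothing for a crawler to follow:

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects the href of every plain <a> tag -- the links spiders can follow."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])

# The onclick item is invisible to a crawler that only follows anchors
html = (
    '<nav>'
    '<a href="/products">Products</a>'
    '<a href="/blog">Blog</a>'
    '<span onclick="goTo(\'/hidden\')">Hidden page</span>'
    '</nav>'
)
parser = LinkParser()
parser.feed(html)
print(parser.links)  # ['/products', '/blog']
```

If an important page never shows up in output like this anywhere on your site, the spiders can't reach it through your navigation either.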
Why not help the search engines discover your pages by giving them a list of every one?
Using a free service such as xml-sitemaps.com, you can generate a sitemap.xml without creating one by hand. Simply upload it to your web server and tell Google, Bing and Yahoo where your XML sitemap is in their respective Webmaster Tools.
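If you would rather build the file yourself, a sitemap is just a small XML document. Here is a minimal sketch using Python's standard library; the example.com URLs are placeholders for your own pages:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages on your own domain
sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
print(sitemap)
```

Save the output as sitemap.xml at the root of your site; the full protocol also supports optional tags such as `<lastmod>` and `<changefreq>`, but `<loc>` alone is enough for the engines to find your pages.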
Whilst these tips are basic in many SEOs' eyes, they are always a fantastic starting point when trying to solve the mystery of why your website might not be performing as well as you think it could.