Lately, there has been a lot of talk in the SEO world about keyword research and how to do it properly. There are many expensive tools designed for keyword research, including Micro Niche Finder, Long Tail Pro, Niche Finder, WordTracker, and others. But the truth is, keyword research is not a difficult task, and it certainly does not require any sophisticated software to do well. A marketer only needs two things to find great keywords: the Google Keyword Tool and Google.com.
All of the tools I mentioned above derive their data from the Google Keyword Tool, so why pay to access that data through someone else's software when the tool itself is free? The only claim to fame of the paid keyword tools is their supposed ability to judge a keyword's competition, but in my experience they tend to be poor at gauging competition in the organic search results, largely because they lean too heavily on raw numbers and statistical data.
Don’t get me wrong: statistical analysis is very useful and can do wonders when sorting and filtering lists of potential keywords, but when it comes to actually gauging organic competition in the search results, automated tools fall flat. The main reason is that far too many factors go into determining the rankings for any given keyword. Most automated keyword tools simply look at how many results are returned for a keyword and use that as the main indicator of competition, which couldn’t be further from an accurate measure. To determine what actually indicates competitiveness, we have to look at the factors that determine how webpages are ranked.
Search engines look at two main areas when deciding where a webpage will rank: on-page and off-page data. On-page data includes the title and meta tags, H1/H2 tags, the domain and URL, how many times the keyword appears in the content, LSI (related) keywords, and so on. If the keyword appears in the title, the H1 tag, and the domain or URL, is mentioned fairly often (but not too often) throughout the content, and related keywords appear alongside it, there is a very good chance the page is relevant to the keyword, and that is a strong positive signal to the search engines.
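If you want to speed up checking these on-page signals on a competing page, a short script can do the legwork. This is just a rough sketch in Python using the requests and BeautifulSoup libraries (pip install requests beautifulsoup4); the URL and keyword are placeholders, and a plain substring check is only a crude stand-in for how search engines actually parse a page.

```python
# Rough sketch of the on-page checks described above.
# Assumes requests and beautifulsoup4 are installed; URL/keyword are placeholders.
import requests
from bs4 import BeautifulSoup

def on_page_signals(url, keyword):
    """Report simple on-page signals for one competing page."""
    keyword = keyword.lower()
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = (soup.title.string or "") if soup.title else ""
    h1_tags = [h.get_text(" ", strip=True) for h in soup.find_all("h1")]
    body_text = soup.get_text(" ", strip=True).lower()

    return {
        "keyword_in_title": keyword in title.lower(),
        "keyword_in_h1": any(keyword in h.lower() for h in h1_tags),
        "keyword_in_url": keyword.replace(" ", "-") in url.lower()
                          or keyword.replace(" ", "") in url.lower(),
        "keyword_mentions": body_text.count(keyword),
    }

if __name__ == "__main__":
    # Hypothetical competitor page and keyword, purely for illustration.
    print(on_page_signals("http://example.com/some-competing-page", "best widget reviews"))
```

A page that returns False for the title and H1 checks and only a handful of keyword mentions is exactly the kind of weakly optimized result we are looking for.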
The other factor, off-page data, refers to the quantity and quality of the backlinks pointing to the webpage and the anchor text of those links. The quality of a backlink is determined mainly by where the link sits on the page and the PageRank of that page: links in the sidebar, blogroll, or footer are worth far less than links placed within the actual content (contextual links). Ideally, backlinks should come from a wide range of domains and IP addresses with a diverse set of anchor texts, which matters even more after the latest Google algorithm update.
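Diversity of referring domains and anchor texts is easy to check once you have a backlink export from whatever backlink tool you use. The sketch below assumes that export is a simple list of (linking page URL, anchor text) pairs; the sample data and the 50% threshold are invented for illustration.

```python
# Quick sketch of a backlink diversity check.
# Assumes a backlink export as (linking_page_url, anchor_text) pairs; sample data is made up.
from collections import Counter
from urllib.parse import urlparse

backlinks = [
    ("http://blog-one.example/post-about-widgets", "best widget reviews"),
    ("http://blog-two.example/resources", "click here"),
    ("http://blog-one.example/another-post", "best widget reviews"),
]

referring_domains = Counter(urlparse(url).netloc for url, _ in backlinks)
anchor_texts = Counter(anchor.lower() for _, anchor in backlinks)

print("Referring domains:", len(referring_domains))
print("Most common anchors:", anchor_texts.most_common(3))

# One anchor text dominating across only a handful of domains is the kind of
# unnatural pattern the recent algorithm update penalizes (threshold is a rough rule of thumb).
top_anchor, top_count = anchor_texts.most_common(1)[0]
if top_count / len(backlinks) > 0.5:
    print("Warning: over half of the anchors use '%s'" % top_anchor)
```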
So now that we have a pretty good idea of which factors determine how Google ranks webpages, all we need to do is look at the pages that actually show up in the search results for a keyword to decide how competitive it is. If those results consist of pages that don’t have the keyword in their title or H1 tags, or that have very few or low-quality links pointing to them with irrelevant anchor text, that is a very good sign the keyword has low competition and is worth targeting. Metrics like the total number of results, the number of results with the keyword in quotes, or the “allintitle” and “allinanchor” counts (all of which many keyword tools rely on) are largely irrelevant, because only the top 10 results truly matter.
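Once you have eyeballed each of the top 10 results, a simple tally makes the final call explicit. The entries below are hypothetical notes of the kind you might jot down during the manual review (keyword in title? keyword in H1? strong backlink profile?), and the "half the results" cutoff is just one possible rule of thumb, not a fixed rule.

```python
# Simple tally for the manual top-10 review described above.
# The entries are hypothetical notes taken while eyeballing the first results page.
top_ten = [
    {"url": "http://result-1.example", "kw_in_title": False, "kw_in_h1": False, "strong_links": False},
    {"url": "http://result-2.example", "kw_in_title": True,  "kw_in_h1": False, "strong_links": False},
    # ...one entry per result in the top 10...
]

weak = [r for r in top_ten
        if not (r["kw_in_title"] and r["kw_in_h1"]) and not r["strong_links"]]

print("%d of %d results look weak" % (len(weak), len(top_ten)))
if len(weak) >= len(top_ten) // 2:
    print("Low competition: worth targeting this keyword.")
```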
And that’s basically all there is to it. If that sounds like too much work, there are shortcuts in the SERPs that speed up the process. For example, if you see pages from free article directories (EzineArticles, Buzzle, etc.) or free blogging platforms (Blogspot, Tumblr, etc.), which typically do not attract many quality backlinks, that is a very good sign the keyword has low competition. Just a couple of weeks ago I started a niche site targeting a keyword with more than 1 million search results for the keyword in quotes, because the top 10 consisted of unoptimized pages with very few backlinks. Today I’m ranking on the third page for that keyword, and I haven’t even done any link building! I would never have found this keyword with automated tools. Gauging competition manually by analyzing the SEO of the top 10 pages, rather than being a slave to automated tools and statistical data, is not only far more effective; it also gives marketers and SEOs firsthand experience of how Google rankings work.
Bio: Rob is a webmaster, marketer, and SEO aficionado who enjoys writing about various topics in online marketing. His current project is a web hosting review site, which you can find at bestwebsitehostingservices.org.