Currently, search engines rank web pages based on a number of signals, including the number of visitors, keyword usage, how long the domain has been in operation, how many backlinks the website has, its PageRank and a few others. However, it looks like this is about to change. Here, we will explore whether that is the case.
Google filed a patent on 24th February 2011 which detailed how this new system would work, and which acknowledged that the current system has a few issues. Google is worried that, at this point, there is too much focus on keyword manipulation.
The patent states that the focus should switch to usage data and frequency of visits, although it does say that other signals will continue to be used. The patent was filed on the same day that Google Panda launched, which suggests these two moves were timed together to stop websites bending the current rules.
What the Patent Says
The patent itself is written very clearly, and here is what it says:
- The search engine forms a list of results based on link scores and IR (information retrieval) scores
- These pages are then ranked, in whole or in part, on usage statistics
- Usage stats are gathered on a live basis, meaning that rankings could change from second to second
- Rankings also take into account the number of unique visitors
- This usage-based ranking is combined with PageRank, keywords and other signals to form a 'fair rank'
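The combination described above can be pictured as a weighted blend of the three signal types. The following sketch is purely illustrative: the weights, score names and `fair_rank` function are assumptions for the sake of the example, not details taken from the patent.

```python
# Hypothetical sketch of blending IR, link and usage signals into one rank.
# The weights below are arbitrary assumptions, not values from the patent.

def fair_rank(ir_score, link_score, usage_score,
              w_ir=0.4, w_link=0.3, w_usage=0.3):
    """Blend the three signals into a single 'fair rank' (higher is better)."""
    return w_ir * ir_score + w_link * link_score + w_usage * usage_score

# Rank a small result list by the blended score.
pages = [
    {"url": "a.com", "ir": 0.9, "link": 0.5, "usage": 0.2},
    {"url": "b.com", "ir": 0.6, "link": 0.8, "usage": 0.9},
]
ranked = sorted(pages,
                key=lambda p: fair_rank(p["ir"], p["link"], p["usage"]),
                reverse=True)
print([p["url"] for p in ranked])  # b.com's strong usage signal lifts it first
```

Because the weights are free parameters, the same framework covers everything from "no change at all" (zero weight on usage) to a complete overhaul (usage dominating), which matches how broadly the patent is worded.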
Of course, this patent means nothing on its own, because it is worded in a way that allows Google to do anything from a complete overhaul to no change at all. The frequency of visits, number of users and additional user data can all be gathered in different ways, which means that Google can essentially keep the same system or alter it in whatever way it believes will be most successful.
This could be:
- The total number of times a page has been visited (including repeat visits)
- The number of times the page has been visited in a specific timeframe
- The difference in the number of times a page has been visited from month to month
- The number of unique visitors to the page
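To make the four metrics above concrete, here is a small sketch that computes each of them from a simple visit log. The log format, field names and example data are all assumptions made for illustration; the patent does not specify how such a log would be structured.

```python
# Hypothetical visit log: (page, visitor_id, month) records.
visits = [
    ("a.com", "ip1", "2011-02"), ("a.com", "ip1", "2011-03"),
    ("a.com", "ip2", "2011-03"), ("b.com", "ip3", "2011-03"),
]

def total_visits(log, page):
    """All visits to a page, including repeat visits by the same visitor."""
    return sum(1 for p, _, _ in log if p == page)

def visits_in_period(log, page, month):
    """Visits to a page within one specific timeframe (here, a month)."""
    return sum(1 for p, _, m in log if p == page and m == month)

def month_over_month_change(log, page, prev, curr):
    """Difference in visit counts between two months."""
    return visits_in_period(log, page, curr) - visits_in_period(log, page, prev)

def unique_visitors(log, page):
    """Distinct visitors to a page, regardless of repeat visits."""
    return len({v for p, v, _ in log if p == page})

print(total_visits(visits, "a.com"))                                   # 3
print(month_over_month_change(visits, "a.com", "2011-02", "2011-03"))  # 1
print(unique_visitors(visits, "a.com"))                                # 2
```

Note how repeat visits by "ip1" count twice for total visits but only once for unique visitors, which is exactly why the choice between these metrics matters.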
This information would be gathered from the visitor's IP address and host, or alternatively from a combination of cookie information and other machine-based data that associates the visitor with the website. However, the issue here lies with bots and repeat visitors inflating a site's numbers.
This means that the search engine (based on the patent) would have to choose some combination of these three approaches (term-based, link-based and usage-based), with the usage signals then combined with the IR and link scores to give an accurate ranking.
There is no doubt that the way pages are ranked will change in the future. At present there are too many ways to bend the system, which leaves those not in the know well behind. Hopefully, with this new system, more people will understand how to successfully boost their rankings.
This article was written by Kristian from the RFK Solutions writing team. RFK Solutions offers a range of Search Engine Optimisation services to clients all over the UK.