This system has worked well, though not without criticism (it tends to discriminate against newer websites), and there is now a problem looming for Google that is making its PageRank model less accurate.
I work for a nonprofit organisation, and more than 50% of the organic links back to our website are marked as nofollow. The rel="nofollow" attribute tells search engines not to follow a link, and Google does not count nofollow links as a vote for your site in its ranking algorithms. The trend towards nofollow links started with Wikipedia, where the site's open nature created a serious risk of link spamming. I believe that some blogging software also defaults to nofollow, which means that bloggers linking to our website content and campaigns don't contribute to our Google PageRank or rankings.
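For readers unfamiliar with the markup, here is a small sketch of what a nofollow link looks like in HTML and how a crawler might count them. The page snippet and the counter class are purely illustrative; this is not how Google's crawler actually works.

```python
# Count followed vs. nofollow anchors in an HTML document.
from html.parser import HTMLParser

class NofollowCounter(HTMLParser):
    """Tally <a> tags with and without rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.split():
            self.nofollow += 1
        else:
            self.followed += 1

# A hypothetical page linking to a campaign, one link marked nofollow.
page = """
<p>Support our campaign:
  <a href="https://example.org/donate">donate</a>
  <a href="https://example.org/petition" rel="nofollow">sign the petition</a>
</p>
"""

counter = NofollowCounter()
counter.feed(page)
print(counter.followed, "followed,", counter.nofollow, "nofollow")
# -> 1 followed, 1 nofollow
```

Only the first link would count as a vote under Google's current model; the second is invisible to it.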
The outcome of this is that Google's method of ranking web pages is being left behind. It's a shame, and it's also becoming a problem for users, who find it harder to find relevant content, especially in local areas or in subjects where there is a lot of professional SEO activity.
Rather than just criticising Google, I have a few ideas about how this move to nofollow could be addressed. It's a brainstormed list, and I am not suggesting one idea over the others (a rough sketch of how they might combine follows the list):
- Count nofollow links but reduce their weight by a factor of ten. This would make link spamming via sites that use nofollow links much less rewarding.
- Track links and increase their value with age, so a link's vote becomes stronger the longer it has existed. This would mean that spammy links which get cleared regularly could never pollute the real value of the page they point to.
- Count all nofollow links as normal links, but downrate the referring value of the site they come from if it is observed to carry a lot of nofollow links (this is just a variation on #1, but might be better at dealing with spammy link sites).
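To make these ideas concrete, here is a minimal sketch of a combined link-weighting function. The factor-of-ten discount, the one-year age ramp, and the site-level penalty are all invented numbers for illustration; nothing here reflects Google's actual ranking code.

```python
# An illustrative link-weighting scheme combining the three ideas above.
# All constants are assumptions for the sake of the sketch.
from dataclasses import dataclass

NOFOLLOW_DISCOUNT = 0.1   # idea 1: nofollow counts at a tenth of the weight
AGE_CAP_DAYS = 365        # idea 2: a link's vote grows until it is a year old

@dataclass
class Link:
    nofollow: bool
    age_days: int                  # how long the link has existed
    source_nofollow_ratio: float   # fraction of the source site's links marked nofollow

def link_weight(link: Link) -> float:
    """Return an illustrative vote weight for a single inbound link."""
    weight = 1.0

    # Idea 1: nofollow links still count, but at a heavy discount.
    if link.nofollow:
        weight *= NOFOLLOW_DISCOUNT

    # Idea 2: ramp a link's value up with age, so short-lived spam
    # that gets cleared regularly never reaches full strength.
    weight *= min(link.age_days / AGE_CAP_DAYS, 1.0)

    # Idea 3: downrate links from sites that are mostly nofollow,
    # a crude proxy for spam-prone open sites.
    weight *= 1.0 - 0.5 * link.source_nofollow_ratio

    return weight

# Example: a two-month-old nofollow link from a blog where 80% of
# links are nofollow counts for only a small fraction of a full vote.
print(link_weight(Link(nofollow=True, age_days=60, source_nofollow_ratio=0.8)))
```

Each idea attacks a different spam vector: the discount caps the payoff of spamming, the age ramp punishes links that get churned, and the site penalty isolates spam-prone open sites, which is why they might work better in combination than alone.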
This is an issue that needs attention, and I hope this article adds to the discussion.