PageRank has long been a confusing topic for webmasters and forum owners, and many myths surround it. The biggest myth is that PageRank refers to the "rank" of web pages in search engine results; among the ill-informed, the term has become synonymous with SEO.
When Google first set up as a search engine, the net was, compared with today, a small place. As the net grew larger and content was duplicated, Google needed a way to automatically work out which websites were the most reliable. Pages and sites had started to link to each other through keywords and URLs, and Google needed a way of analysing these links; in other words, it needed a "link analysis algorithm". The problem was solved by Google co-founder Larry Page, and the algorithm he developed was named after him.
What Larry Page developed was a calculation that assigns a numerical weighting to each element of a hyperlinked set of documents, with the purpose of measuring its relative importance within that set.
This weighting is also used to gauge the importance of the set within the net as a whole.
Over the years the algorithm has been refined and extended. It now gives greater weighting to relevance, and can even add extra weighting to sites with relevant backlinks and content. What it actually does now is assess the position of a site within a given set, and then assess that set within the scope of the net as a whole.
The algorithm works mainly on percentages, and on the importance of the page a link is coming from.
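To make that idea concrete, here is a minimal sketch of the classic PageRank calculation, written as plain Python. The graph, function name, and iteration count are all illustrative assumptions, not anything from Google itself; the key point is that each page passes an equal share of its own rank along each of its outgoing links, so a link from an important page carries more weight.

```python
# Illustrative sketch only: a tiny hypothetical link graph, not real data.
# "links" maps each page to the list of pages it links to.
# d is the damping factor (commonly quoted as 0.85).

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start from a uniform spread
    for _ in range(iterations):
        new = {p: (1.0 - d) / n for p in pages}  # everyone gets the "random jump" share
        for p, outgoing in links.items():
            if outgoing:
                share = d * rank[p] / len(outgoing)
                for q in outgoing:               # each link passes on an equal share
                    new[q] += share
            else:
                for q in pages:                  # page with no links: spread evenly
                    new[q] += d * rank[p] / n
        rank = new
    return rank

# Hypothetical example: D links only to A, so A inherits D's whole vote.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["A"]}
ranks = pagerank(graph)
```

Because rank is handed out as percentages of the linking page's own rank, the values always sum to 1, and a page with a single link from an important page can outrank a page with many links from unimportant ones.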
The whole issue is confused further because the algorithm itself is called PageRank, while pages also receive a ranking worked out by that same algorithm.
- Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. In the presence of damping, Page A effectively links to all pages in the web.
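The "random surfer" described in the caption can be sketched directly as a simulation. The graph below is a hypothetical stand-in, not the network from the figure, and the step count and seed are arbitrary choices; the point is that the fraction of time the surfer spends on each page settles towards that page's PageRank.

```python
# Rough Monte Carlo sketch of the random surfer: with probability 0.85
# follow a random outgoing link, otherwise jump to a page chosen
# uniformly at random from the whole (hypothetical) web.
import random

def simulate_surfer(links, steps=200_000, d=0.85, seed=42):
    rng = random.Random(seed)
    pages = list(links)
    visits = {p: 0 for p in pages}
    page = rng.choice(pages)
    for _ in range(steps):
        visits[page] += 1
        out = links[page]
        if out and rng.random() < d:
            page = rng.choice(out)     # follow a link 85% of the time
        else:
            page = rng.choice(pages)   # otherwise jump anywhere
    return {p: visits[p] / steps for p in pages}

# Same illustrative graph as above: D is linked to by nobody.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["A"]}
freq = simulate_surfer(graph)
```

With no damping (d = 1.0), the surfer would get trapped in the A–B–C loop and never revisit D, which mirrors the caption's point that undamped surfing drives every page outside the core loop down to a PageRank of zero.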