
Tuesday, October 11, 2011

How Search Engines Rank Web Pages

Search for anything using your favorite crawler-based search engine. Nearly instantly, the search engine sorts through the millions of pages it knows about and presents you with ones that match your topic. The matches are even ranked, so that the most relevant ones come first.

Of course, search engines don't always get it right. Non-relevant pages make it through, and it can sometimes take a little digging to find what you are looking for. But by and large, search engines do an amazing job.

As WebCrawler founder Brian Pinkerton put it: "Imagine walking up to a librarian and saying, 'travel.' They're going to look at you with a blank face."

OK - a librarian isn't really going to stare at you blankly. Instead, they will ask you questions to better understand what you are looking for.

Unfortunately, search engines can't ask a few questions to narrow your search the way a librarian can. Nor can they rely on judgment and past experience to rank web pages the way humans do.

So how do crawler-based search engines determine relevance when faced with hundreds of millions of web pages to rank? They follow a set of rules known as an algorithm. Exactly how a particular search engine's algorithm works is a closely guarded trade secret. However, all the major search engines follow the general rules below.

Location, location, location and frequency ...
One of the most important rules in a ranking algorithm involves the location and frequency of keywords on a web page. Call it the location/frequency method, for short.

Remember the librarian mentioned above? They need to find books to match your request of "travel," so it makes sense that they first look at books with travel in the title. Search engines work the same way. Pages with the search terms appearing in the HTML title tag are often assumed to be more relevant to the topic than others.

Search engines also check whether the keywords appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the start.

Frequency is the other major factor in how search engines determine relevance. A search engine analyzes how often the search terms appear in relation to the other words on a page. Pages with a higher frequency are often deemed more relevant than other pages.
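
To make the location/frequency idea concrete, here is a minimal sketch in Python. The weights, the 50-word "top of the page" cutoff, and the sample pages are assumptions invented for this example; real engines keep their actual formulas secret.

import re

def location_frequency_score(query_terms, title, body):
    """Toy location/frequency score with made-up weights, for illustration only."""
    words = re.findall(r"[a-z]+", body.lower())
    score = 0.0
    for term in query_terms:
        term = term.lower()
        # Location: a term in the HTML title tag counts most heavily.
        if term in title.lower():
            score += 3.0
        # Location: a term near the top of the page (first 50 words) earns a bonus.
        if term in words[:50]:
            score += 1.5
        # Frequency: how often the term appears relative to all words on the page.
        if words:
            score += words.count(term) / len(words)
    return score

# A page about travel should outscore an unrelated page for the query "travel".
travel_page = location_frequency_score(
    ["travel"],
    title="Budget Travel Tips",
    body="Travel light and travel often. These travel tips cover packing and flights.",
)
other_page = location_frequency_score(
    ["travel"],
    title="Gardening Basics",
    body="Grow tomatoes at home. Travel is mentioned only once here.",
)
print(travel_page > other_page)  # True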

Off the page factors
Crawler-based search engines now have plenty of experience with webmasters who constantly rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters may even go to great lengths to "reverse engineer" the location/frequency systems used by a particular search engine. Because of this, all the major search engines now also make use of "off the page" ranking criteria.

Off-the-page factors are those that a webmaster cannot easily influence. Chief among these is link analysis. By analyzing how pages link to each other, a search engine can determine both what a page is about and whether that page is deemed "important" and thus deserving of a ranking boost. In addition, sophisticated techniques are used to screen out attempts by webmasters to build "artificial" links designed to inflate their rankings.
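
As a rough illustration of link analysis, the sketch below runs a simplified PageRank-style calculation over a made-up four-page link graph: each page passes a share of its score to the pages it links to, so a page that many others point at ends up looking "important." The damping factor and the graph itself are assumptions for the example, not any engine's real data.

def link_scores(links, damping=0.85, iterations=20):
    """Simplified PageRank-style link analysis over a dict of page -> outgoing links."""
    pages = list(links)
    score = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new = {page: (1 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = score[page] / len(outgoing)  # each outgoing link passes on an equal share
            for target in outgoing:
                new[target] += damping * share
        score = new
    return score

# Hypothetical four-page web: every page links to "hub", so it ends up most "important".
graph = {
    "hub": ["a"],
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub", "a"],
}
scores = link_scores(graph)
print(max(scores, key=scores.get))  # hub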

Another off-the-page factor is clickthrough measurement. In short, this means a search engine may watch which results people select for a particular search, then eventually drop high-ranking pages that aren't attracting clicks, while promoting lower-ranking pages that do pull in visitors. As with link analysis, systems are used to compensate for artificial clicks generated by eager webmasters.
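
The sketch below shows one hypothetical way clickthrough data could nudge rankings: each result's original relevance score is blended with its observed clickthrough rate, so a high-ranking page that nobody clicks slides down while a lower-ranking page that attracts clicks moves up. The field names, numbers, and blending rule are all invented for illustration.

def rerank_by_clicks(results):
    """Toy re-ranking: blend each result's relevance score with its observed clickthrough rate."""
    blended = []
    for r in results:
        ctr = r["clicks"] / r["impressions"] if r["impressions"] else 0.0
        # Hypothetical blend: half the original score, half the clickthrough signal.
        blended.append((0.5 * r["score"] + 0.5 * ctr, r["url"]))
    return [url for _, url in sorted(blended, reverse=True)]

results = [
    {"url": "a.example", "score": 0.90, "impressions": 1000, "clicks": 20},   # ranked high, rarely clicked
    {"url": "b.example", "score": 0.60, "impressions": 1000, "clicks": 450},  # ranked lower, clicked often
]
print(rerank_by_clicks(results))  # ['b.example', 'a.example']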
