Millions of people all over the world use them every day, but few know how search engines work. If your business has a website, a basic knowledge of how they function may give you some insight into how to improve your site's ranking on search engines.
Most search engines use one of two methods to build the index of sites that is searched. The most common method uses what is called a spider or crawler; the other relies on manual input.
Human-powered search engines are by far the most time-consuming: they require website owners or reviewers to write a short description of each site. These descriptions are submitted to the search engine's database, or index. The search engine software then searches the index for matching keywords and returns the results.
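In outline, a human-powered index is just a collection of hand-written descriptions matched against query keywords. The sketch below illustrates this with an invented index; the sites, descriptions, and matching rule are all assumptions for illustration, not any real engine's behaviour.

```python
# A toy human-powered index: each entry is a short, manually written
# description, and a query simply matches its keywords against those
# descriptions. All URLs and descriptions here are invented.

def search(index, query):
    """Return URLs whose description contains every query keyword."""
    keywords = query.lower().split()
    return [url for url, description in index.items()
            if all(kw in description.lower() for kw in keywords)]

index = {
    "example.com/gardening": "Tips and tools for home gardening and composting.",
    "example.com/baking": "Recipes and techniques for baking bread at home.",
}

print(search(index, "home baking"))  # ['example.com/baking']
```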
Spiders are automated programmes that visit sites, store the content and then follow links to other sites. The URL (Uniform Resource Locator, or web address) and content information are stored in the index. This index is what the search engine software searches when someone enters a query. The software ranks the pages according to two criteria: relevancy and the number of times the keywords appear in the content.
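The visit-store-follow loop a spider performs can be sketched in a few lines. To keep the example self-contained, an in-memory dictionary stands in for real HTTP fetches; the URLs and page contents are invented for illustration.

```python
# A toy spider over an in-memory "web". Starting from a seed URL, it
# stores each page's content in the index and follows links to pages
# it has not yet visited.
from collections import deque

WEB = {  # url -> (page content, outgoing links); stands in for the real web
    "a.example": ("welcome page about widgets", ["b.example"]),
    "b.example": ("widget prices and reviews", ["a.example", "c.example"]),
    "c.example": ("contact page", []),
}

def crawl(seed):
    index, queue, seen = {}, deque([seed]), {seed}
    while queue:
        url = queue.popleft()
        content, links = WEB[url]
        index[url] = content              # store the URL and content
        for link in links:                # follow links to other sites
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("a.example")
print(sorted(index))  # ['a.example', 'b.example', 'c.example']
```

A real spider would fetch pages over HTTP, respect robots.txt, and throttle its requests, but the store-and-follow structure is the same.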
Most search engines search the title and headings for keywords before searching the rest of the page. Some don't look at the content at all; instead, they look at the website's descriptors, or meta tags, for keywords.
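Meta tags are ordinary HTML, so pulling the title and keyword descriptors out of a page takes only a small parser. This sketch uses Python's standard-library HTML parser on an invented page.

```python
# Extract the <title> and the "keywords" meta tag, the descriptors
# that some engines read instead of the page body.
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = """<html><head><title>Widget World</title>
<meta name="keywords" content="widgets, gadgets, gizmos"></head>
<body>Our catalogue.</body></html>"""

parser = MetaExtractor()
parser.feed(page)
print(parser.title)     # Widget World
print(parser.keywords)  # ['widgets', 'gadgets', 'gizmos']
```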
Once the search engine has found the websites that contain the keywords used in the search string, it uses an algorithm to rank the pages. There are different kinds of algorithms, and not every search engine uses the same method. The most common method looks at the location and frequency of the keywords on the website: sites that have the keywords in the title and headings, and that use them frequently, are usually ranked higher.
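A location-and-frequency method can be illustrated by counting keyword occurrences and weighting the title most heavily, then headings, then the body. The weights and the sample page below are invented for illustration, not any real engine's values.

```python
# A simplified location-and-frequency score: occurrences of the
# keyword in the title and headings count for more than those in
# the body. The 3/2/1 weights are purely illustrative.

def score(page, keyword):
    kw = keyword.lower()
    return (3 * page["title"].lower().split().count(kw)
            + 2 * sum(h.lower().split().count(kw) for h in page["headings"])
            + 1 * page["body"].lower().split().count(kw))

page = {
    "title": "Gardening basics",
    "headings": ["Gardening tools", "Planting"],
    "body": "Start gardening with simple tools. Gardening rewards patience.",
}
# 1 title hit (x3) + 1 heading hit (x2) + 2 body hits (x1) = 7
print(score(page, "gardening"))  # 7
```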
Relevancy is a little harder to quantify. Essentially, the algorithms take a site's links into account when determining whether it is relevant to the search. A site that links to other sites with similar information, or that is linked to from sites with similar information, is usually considered more 'relevant'. These links are combined with the keywords on the site to determine its overall relevancy.
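One very rough way to combine links with keywords is to give a page credit for each inbound link that comes from a page sharing one of its keywords. The sketch below does exactly that over an invented three-page web; real engines use far more sophisticated link analysis.

```python
# A rough link-based relevancy score: a page earns one point for each
# inbound link from a page that shares at least one of its keywords.

def relevancy(pages, target):
    target_kw = pages[target]["keywords"]
    return sum(1 for url, page in pages.items()
               if target in page["links"] and target_kw & page["keywords"])

pages = {  # invented pages: keywords and outgoing links
    "a": {"keywords": {"tea", "brewing"}, "links": {"c"}},
    "b": {"keywords": {"coffee"}, "links": {"c"}},
    "c": {"keywords": {"tea", "kettles"}, "links": set()},
}

# Only "a" both links to "c" and shares a keyword with it ("tea").
print(relevancy(pages, "c"))  # 1
```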
'Clickthrough measurement' is another way that page ranks are determined. With this method, a website is ranked according to how many people have clicked on its link. A website that originally ranked low because of low keyword frequency can move up in the rankings if enough users click on it. This lets websites with useful information rank higher, even when their keywords and links alone would not place them there.
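The effect of clickthrough measurement can be shown by blending a base keyword score with a click count. The 0.5 click weight and the example scores below are invented; the point is only that enough clicks can lift a low-scoring page past a higher-scoring one.

```python
# An illustrative clickthrough adjustment: each result's base keyword
# score is increased by a fraction of its click count, then results
# are re-sorted. The 0.5 weight is an assumption for illustration.

def rerank(results, clicks, click_weight=0.5):
    """results: {url: base keyword score}; clicks: {url: click count}."""
    adjusted = {url: base + click_weight * clicks.get(url, 0)
                for url, base in results.items()}
    return sorted(adjusted, key=adjusted.get, reverse=True)

results = {"popular.example": 2.0, "wordy.example": 5.0}
clicks = {"popular.example": 10}

# popular.example: 2.0 + 0.5 * 10 = 7.0, overtaking wordy.example's 5.0
print(rerank(results, clicks))  # ['popular.example', 'wordy.example']
```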
Search engines are not perfect, and some sites will always slip through the cracks. To find out how to help your website get noticed by the search engines, see our Tips for what you should, and should not, do to make your site stand out.