Thursday, January 23, 2014

Search Engines and their Working

There are millions of pages on the internet providing information about various topics. A search engine is a software program that helps users find the information they need on the internet. It returns relevant results for a search query by searching its own database, and it uses complicated algorithms that weigh many factors to determine the ranking of websites for every search term.

Search engines such as Google and Bing use an automated robot known as a spider (or crawler) to index web pages. The crawler retrieves the pages of a site by following the links it finds on each page; pages excluded by the site owner in a robots.txt file are not indexed. For every page crawled, the search engine collects information such as its content, title, and meta tags, and stores this data in its database for later retrieval. The spider returns to already-crawled pages on a regular basis to check for updates and index any new data.
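The crawl-and-index process described above can be sketched as a small toy model. This is an illustrative sketch only, not any real engine's code: the site, its pages, and its robots.txt rules are all simulated here as in-memory data, whereas a real spider would fetch URLs over HTTP and parse the HTML for links.

```python
# Hypothetical site: each page has a title, some text, and outgoing links.
PAGES = {
    "/": {"title": "Home", "text": "welcome to the demo site",
          "links": ["/about", "/private"]},
    "/about": {"title": "About", "text": "information about search engines",
               "links": ["/"]},
    "/private": {"title": "Private", "text": "hidden page", "links": []},
}

# Paths disallowed by the site's robots.txt (simulated).
ROBOTS_DISALLOWED = {"/private"}

def crawl(start):
    """Follow links breadth-first, skipping disallowed pages, and
    store each page's content in an index (the 'database')."""
    index = {}
    queue = [start]
    seen = set()
    while queue:
        url = queue.pop(0)
        if url in seen or url in ROBOTS_DISALLOWED:
            continue
        seen.add(url)
        page = PAGES[url]
        index[url] = {"title": page["title"], "text": page["text"]}
        queue.extend(page["links"])  # discover new pages via links

    return index

index = crawl("/")
print(sorted(index))  # ['/', '/about'] -- /private was excluded by robots.txt
```

Note how the crawler only ever discovers pages through links, which is why a page nobody links to will not be indexed, and how the robots.txt check happens before the page is fetched at all.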


When a user submits a query, the search engine analyzes the data stored in its database and returns a list of matching websites. Each result usually shows the title and the meta description (if one is given) of the corresponding page. For any given query there may be millions of indexed pages, so the engine weighs a number of SEO factors to present the most relevant results first. Both the results displayed and the methods used to rank them change regularly over time.
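Query lookup over the stored data is commonly done with an inverted index: a map from each word to the pages that contain it. The following is a minimal sketch under that assumption; the page URLs, titles, and the word-frequency "ranking" are all hypothetical, and real engines combine far more signals than this.

```python
from collections import defaultdict

# Hypothetical indexed pages: url -> (title, meta description, body text).
DOCS = {
    "example.com/seo": ("SEO Basics", "Intro to search ranking",
                        "search engines rank pages"),
    "example.com/cooking": ("Pasta Guide", "How to cook pasta",
                            "boil water add pasta"),
}

# Build the inverted index: word -> set of URLs containing it.
inverted = defaultdict(set)
for url, (title, desc, body) in DOCS.items():
    for word in f"{title} {desc} {body}".lower().split():
        inverted[word].add(url)

def search(query):
    """Return (title, meta description) of pages matching every query
    word, ranked here simply by how often the words appear."""
    words = query.lower().split()
    if not words:
        return []
    matches = set.intersection(*(inverted.get(w, set()) for w in words))

    def score(url):
        text = " ".join(DOCS[url]).lower()
        return sum(text.count(w) for w in words)

    return [(DOCS[u][0], DOCS[u][1])
            for u in sorted(matches, key=score, reverse=True)]

print(search("search ranking"))  # [('SEO Basics', 'Intro to search ranking')]
```

The result list mirrors what the paragraph above describes: each hit carries the page's title and meta description, and ranking decides the order in which matches are shown.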
