Monday, February 20, 2012

What is Search Engine Optimization (SEO), and How Do Search Engines Work?

Search Engine Optimization (SEO) is the process of shaping a site's content and pages so that the site appears in the top positions of search results and draws visitors from the searches made on search engines. Through changes and updates to a site, its content is brought into line with what search engines and web catalogs rank highly. Search engines index the sites on the Internet and help users reach them. Statistics from every country show that search engines are among the most-visited pages on the web; Google, Yahoo, Ask, and Bing are the leading examples, and all of the major search engines work on broadly the same logic.


How Do Search Engines Work?

Search engines are special programs running on computers with very strong processing power. These machines run programs much like ordinary computers do, but the resemblance ends there: they have far more advanced hardware, very fast connections, larger hard disks that can store enormous amounts of data, very large memories, and usually more than one processor. The computers are linked together so they can share the load; Google, for example, is estimated to run on clusters of around 1000 interconnected computers. Since thousands of people are searching at the same moment, no single machine could get out from under that burden. A search itself happens in stages: the user types the appropriate word into the search box, clicks the Search button, and is immediately presented with a list of pages and web addresses. So where do these web addresses come from?
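The reason results come back instantly is that the query is answered from an index built ahead of time, not by scanning the web on the spot. Here is a toy sketch of that idea (the function names and sample pages are invented for illustration, not any real engine's code):

```python
def build_index(pages):
    """Map each word to the set of page URLs that contain it.
    `pages` is a dict of URL -> page text, standing in for the
    engine's stored database of crawled pages."""
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Return the pages containing every word of the query.
    This is just dictionary lookups plus set intersection --
    no page is fetched or scanned at query time."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical pages, already collected by the crawler:
pages = {
    "example.com/a": "search engines answer queries from an index",
    "example.com/b": "spiders crawl the web to fill the index",
}
index = build_index(pages)
print(search(index, "index"))        # both pages contain "index"
print(search(index, "crawl index"))  # only example.com/b has both words
```

Real engines add ranking, stemming, and far more compact index structures, but the core trade remains the same: do the expensive work once when indexing, so each query is a cheap lookup.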

When the user clicks the Search button, the search engine scans its own database; it does not search the live Internet. So how did millions of web addresses get into that database? According to one study, even a person who opened a new web page every second for the rest of their life could not visit all the pages on the Internet, so filling the database by hand is clearly impossible. Instead, the database is filled by software known as spiders, crawlers, or robots. The spider is the part of the mechanism that performs indexing: it stores a record of every page and website in the database according to its contents. To do this, the spider requests a web address from the server exactly as an ordinary user would, then visits every link found on that page. Repeating this process for every page of every website, the search engine builds up its database. If the spider requests a web page and receives no response, the engine deletes that page from its database. Search engine developers are constantly improving their spiders.
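The crawl-and-follow-links process described above can be sketched in a few lines. This toy spider (hypothetical code, with an invented in-memory "web" instead of real HTTP requests) starts from a seed address, follows every link it finds, and records each reachable page in its database:

```python
import re
from collections import deque

# A stand-in for the real web: URL -> HTML. The "orphan" page exists
# but is never linked from anywhere, so the spider will never find it.
WEB = {
    "site.com/home":  'Welcome <a href="site.com/about">about</a> '
                      '<a href="site.com/news">news</a>',
    "site.com/about": 'About us <a href="site.com/home">home</a>',
    "site.com/news":  "Latest news, no links here",
    "site.com/orphan": "Never linked, so the spider never visits it",
}

def crawl(seed):
    """Visit pages breadth-first starting from `seed`, storing each
    page's content in the database and queueing its links."""
    database = {}
    queue = deque([seed])
    while queue:
        url = queue.popleft()
        if url in database or url not in WEB:
            continue  # already indexed, or the page does not respond
        html = WEB[url]
        database[url] = html
        # enqueue every link found on the page, like a spider does
        queue.extend(re.findall(r'href="([^"]+)"', html))
    return database

db = crawl("site.com/home")
print(sorted(db))  # the orphan page is absent: never linked, never indexed
```

A real spider would also respect robots.txt, throttle its requests, and re-visit pages periodically to notice changes and dead links, but the skeleton is the same: fetch, store, follow links, repeat.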

