Thursday, November 10, 2016

Search engines and web crawlers


If you’re doing any type of business on the internet, then search engines like Google and Yahoo! can play a large role in the success or failure of your business.

Even though most internet users have experience using search engines, many don’t know how the engines work.

For website owners it is important to understand, at least at a high level, how search engines work so they can take advantage of the website traffic that only search engines can deliver.

Search Engines are special sites on the Web that are designed to help people find information stored on other sites.

There are differences in the ways various Search Engines work, but they all perform three basic tasks:

– They search the Internet or select pieces of the Internet based on important words,

– They keep an index of the words they find, and where they find them, and

– They allow users to look for words or combinations of words found in that index.


Early Search Engines held an index of a few hundred thousand pages and documents, and received maybe one or two thousand inquiries each day.

Today, a top Search Engine will index hundreds of millions of pages, and respond to tens of millions of queries per day.

Before a Search Engine can tell you where a file or document is, it must be found.

To find information on the hundreds of millions of Web pages that exist, a Search Engine employs special software robots, called spiders, to build lists of the words found on Web sites.

When a spider is building its lists, the process is called web crawling.

In order to build and maintain a useful list of words, a Search Engine’s spiders have to look at a lot of pages. How does any spider start its travels over the Web?

The usual starting points are lists of heavily used servers and very popular pages.

The spider will begin with a popular site, indexing the words on its pages and following every link found within the site.

In this way, the spidering system quickly begins to travel, spreading out across the most widely used portions of the Web.
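To make that a little more concrete, here is a minimal spider sketch in Python. It starts from a handful of seed URLs (standing in for the “heavily used servers and popular pages” mentioned above), records the words on each page, and follows every link it finds. The seed list, page limit, and word pattern are illustrative assumptions, not how any particular engine actually does it.

```python
import re
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkAndTextParser(HTMLParser):
    """Collects the words and the outgoing links found in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(re.findall(r"[a-z0-9]+", data.lower()))

def crawl(seed_urls, max_pages=10):
    """Breadth-first crawl: start from seed pages, follow every link found."""
    to_visit = list(seed_urls)
    seen = set()
    pages = {}                                # url -> words found on that page
    while to_visit and len(pages) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                          # skip pages that cannot be fetched
        parser = LinkAndTextParser()
        parser.feed(html)
        pages[url] = parser.words
        # follow every link found within the page, as described above
        to_visit.extend(urljoin(url, link) for link in parser.links)
    return pages

# Example starting point: a small list of popular, heavily linked pages.
pages = crawl(["https://example.com/"], max_pages=5)
```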

To increase the likelihood that the spiders find your website, make sure you are listed on highly ranked websites like Merchant Circle.

You can create a free business page and blog that will not only increase your chances of being found but also be another place for potential visitors to find out about your website.

Once the spiders have completed the task of finding information on Web pages, the Search Engine must store the information in a way that makes it useful.

There are two key components involved in making the gathered data accessible to users:

– The information stored with the data, and

– The method by which the information is indexed.


In the simplest case, a Search Engine could just store the word and the URL where it was found.

In reality, this would make for an engine of limited use, since there would be no way of telling whether the word was used in an important or a trivial way on the page, whether the word was used once or many times, or whether the page contained links to other pages containing the word.

In other words, there would be no way of building the ranking list that tries to present the most useful pages at the top of the list of search results.

To make for more useful results, most Search Engines store more than just the word and URL.

A Search Engine might store the number of times that the word appears on a page.

The engine might assign a weight to each entry, with increasing values assigned to words as they appear near the top of the document, in sub-headings, in links, in the META tags or in the title of the page.

Each commercial Search Engine has a different formula for assigning weight to the words in its index.

This is one of the reasons that a search for the same word on different Search Engines will produce different lists, with the pages presented in different orders.
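As a rough illustration of what storing more than just the word and URL can look like, here is a small sketch of a weighted inverted index in Python. The zone weights for titles, headings, links, and body text are made-up values chosen to show the idea; they are not any real engine’s formula.

```python
from collections import defaultdict

# Illustrative weights only; each commercial engine uses its own formula.
ZONE_WEIGHTS = {"title": 5.0, "heading": 3.0, "link": 2.0, "body": 1.0}

def build_index(pages):
    """pages maps url -> list of (word, zone) pairs, e.g. ("crawler", "title").
    Returns an inverted index: word -> {url: accumulated weight}."""
    index = defaultdict(lambda: defaultdict(float))
    for url, tokens in pages.items():
        for word, zone in tokens:
            index[word][url] += ZONE_WEIGHTS.get(zone, 1.0)
    return index

# A word in the page title counts five times as much as the same word in the body.
pages = {
    "http://example.com/a": [("crawler", "title"), ("crawler", "body"), ("index", "body")],
    "http://example.com/b": [("crawler", "body")],
}
index = build_index(pages)
print(dict(index["crawler"]))  # {'http://example.com/a': 6.0, 'http://example.com/b': 1.0}
```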

You can get a customized Page Critic Analysis Report from Webs 4 Small Business that tells you how your web pages should be set up to make sure you give the search engines what they are looking for when they review your page.

An index has a single purpose: it allows information to be found as quickly as possible.

There are quite a few ways for an index to be built, but one of the most effective ways is to build a hash table. In hashing, a formula is applied to attach a numerical value to each word.

The formula is designed to evenly distribute the entries across a predetermined number of divisions.

This numerical distribution is different from the distribution of words across the alphabet, and that is the key to a hash table’s effectiveness.
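Here is a toy version of that idea in Python: each word is hashed to one of a fixed number of buckets, which spreads the entries evenly regardless of where the words fall in the alphabet. The bucket count and hash choice are just assumptions for illustration.

```python
import hashlib

NUM_BUCKETS = 8   # a real index would use far more divisions

def bucket_for(word):
    """Map a word to one of NUM_BUCKETS divisions via an MD5 digest,
    so the same word always lands in the same bucket."""
    digest = hashlib.md5(word.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

# Words that sit next to each other alphabetically usually land in different
# buckets, which is what spreads the entries evenly across the divisions.
for word in ["apple", "apply", "banana", "crawler", "index"]:
    print(word, "->", bucket_for(word))
```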

When a person requests a search on a keyword or phrase, the Search Engine software searches the index for relevant information.

The software then provides a report back to the searcher with the most relevant web pages listed first.
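Putting the pieces together, a lookup can be as simple as the sketch below: fetch each query word’s entry from the index, add up the weights per page, and sort so the highest-scoring pages come first. This is a toy ranking built on the weighted index sketched earlier, not how any commercial engine actually scores results.

```python
def search(index, query):
    """Toy lookup: sum each page's weights for the query words and sort
    so the highest-scoring (most relevant) pages come first."""
    scores = {}
    for word in query.lower().split():
        for url, weight in index.get(word, {}).items():
            scores[url] = scores.get(url, 0.0) + weight
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Using the small weighted index built in the earlier sketch:
for url, score in search(index, "crawler index"):
    print(url, score)
```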

Not sure what keywords you should use?

Picking the right ones, or the wrong ones, can make or break your chances at online success.

Consider getting a customized keyword report from Webs 4 Small Business that tells you which keywords you should and should not consider for your site.

These keywords are specific to your website, not a generic list of keywords.

So, why are search engines important to business website owners?

Simple.

You need traffic and Search Engines have lots of it.

And their purpose is to drive traffic to sites that they think match their visitors’ needs.

The way to get a portion of that traffic is to optimize your website.

That way, when someone searches for a keyword that is relevant to your website, the search engines include a link to your website in the search results.

That puts traffic and potential visitors just a click away from your website.

