
Monday, December 27, 2010

Is a link from an orphan page useful or not?

No.
The page is not linked from any other page of the website, so you cannot reach it through any referral link. For more details:
http://www.searchenginejournal.com/…
A link from an orphan page is never very useful: since the page is not connected to the site, its chances of being crawled are low. If you find such a page cached or ranking, it probably got cached or ranked while it was still connected to the website. Orphan pages are mainly used for link-building purposes, and I think that falls under black-hat techniques, though I am not sure about it.
Okay, that is very helpful. But now tell me: suppose there is a page "www.xyz.com/abc" that is not linked from any page of the domain "www.xyz.com". If that page gets a link (maybe more than one) from a very reputable site, will it help www.xyz.com in any way?
I still would not take a link from that page. Even if we build backlinks for it and it eventually ranks, it remains an orphan/dead page: how would anyone reach it from the rest of the site?
No, it is not at all good to take a link from an orphan page. A link from an orphan page looks like artificial link building, which is not at all good in Google's eyes.
We have some orphan pages ranking well, because they are linked from some of the website's forum pages plus external links. If the domain is good, do get the link, and also add a few internal links to the orphan page so that Google can cache it; that way the link becomes really useful. As always, ask: 1) Who is linking to me (the value of the domain)? 2) In what context are they linking to me (page and link position)? If the "who" part is strong, go for it. This also helps with trust ranking.

Which is more important for a site: content or links?

It is like asking: which is better, air or water?


I believe both are important. Links give a favourable vote to our site, while content matters because it gives Google a reason to index our website, which in turn earns us natural links. For best practice, both content and links are essential.

If I had to choose one, I would go for links, because of a fundamental principle that persists: all content is reached through links.
Hmm, but in my view content is more important: without it a site is like a "naked site", and links are placed on the basis of a site's content.
Some SEO experts say that content is king: Google is after fresh content, and that is the thing that makes your site look fresh and list higher on the search engines.
But recently I read a post saying that when backlinks and site content are pitted against each other, the site carrying more backlinks wins in the rankings.
Even with content, we still need backlinks. And wherever there is a page, there will obviously be content; the real questions are content optimization and content quality. You said the whole backlinking procedure is based mainly on the site's content, so what is the need for backlinks? I understand that good content is a source of a natural flow of links. But links for what? The answer: for ranking.
As far as linking strategy goes, there is the OBL (outbound links) parameter. We don't read the content of a page before proceeding; rather, we check its theme and its OBL.

I still say that content and links are complementary. For best practice, both are essential.
The conclusion from my end: the purpose of a page is 1) to rank well (bring visitors) and 2) to sell or satisfy well (convert visitors). A page that satisfies well but brings no visitors is like a huge shop in a remote village: it is not profitable. We have to bring people in. If we bring people using ads only, we can forget about search engines, content, and links altogether. But if we need traffic from search engines, then we need to understand the formula, which is roughly (value of the content x value of the links). The value of the content is capped at a certain level, so in a highly competitive market content alone will not suffice; content will hardly play any role for very competitive keywords, and links will rule. For low-competition keywords, content alone is enough. Why do people say content is king? Because an informative article brings in natural links and also satisfies people. So: 1) write content to rank, so that you score well on content value; 2) get links, so that you score well on link value. It is never a competition over which is bigger, content or links; both play their role. These days content is playing a major role in search engine rankings. Example: http://www.google.com/… used to rank here, but now only Google CSE ranks, as that page has the keyword in it. Enjoy the discussion.

Friday, December 17, 2010

Dynamic page optimization

Unlike a normal HTML website, where the content of static pages doesn't change unless you code the changes into your HTML file (open the file, edit the content, save it, and upload it back to the server), a dynamic web page is a template that displays specific information in response to queries. The database is connected to the website, and the response is generated from that database. Dynamic sites are easy for webmasters to update, since they are driven directly by a database where one change is reflected across all pages. The trouble with dynamic sites began when content-rich sites failed to rank well in search engines, and the problem lies in their very advantage: as noted above, dynamic pages execute on a query.
Users send queries through search forms, or the queries are already coded into links on the page. But a search engine spider doesn't know how to use your search function, or what questions to ask. Dynamic scripts often need certain information before they can return the page content: cookie data, a session ID, or a query string are some of the common requirements. Spiders usually stop indexing a dynamic site because they cannot answer those questions.
Search engines care about content, not the flashy elements of your website. Crawlers are programmed to read only text: they ignore elements such as pictures, frames, and video, treat them as empty space, and move on. Some search engines may not even be able to locate a dynamic page easily. But if we make websites only search-engine friendly and not user friendly, we will most likely end up losing visitors. This presents a big problem for marketers who have done very well in the rankings with static pages but wish to switch to a dynamic site, and it is the reason SEOs came up with advanced techniques to optimize dynamic pages.

Here are few methods you can use to optimize dynamic pages:

1. Use of software
2. Use of CGI/Perl scripts
3. Re-configuring your web servers
4. Creation of a static page linked to an array of dynamic pages
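Method 4 above can be sketched in a few lines: generate a static HTML page whose links point at the dynamic URLs, so a crawler can reach every dynamic page without having to issue a query itself. This is only an illustration; the `example.com` URL, the `product.php?id=` URL pattern, and the product catalogue are all hypothetical placeholders.

```python
# Sketch of method 4: build a static HTML index page that links to an
# array of dynamic pages. A crawler following this page reaches each
# dynamic URL directly, with no form or session required.
# The base URL and product data below are hypothetical.

def build_static_index(products, base_url="http://www.example.com/product.php"):
    """Return static HTML that links to each dynamic product page."""
    links = "\n".join(
        '<li><a href="{0}?id={1}">{2}</a></li>'.format(base_url, pid, name)
        for pid, name in products
    )
    return "<html><body><ul>\n{0}\n</ul></body></html>".format(links)

catalog = [(101, "Red Widget"), (102, "Blue Widget")]
page = build_static_index(catalog)
print(page)
```

Uploading the generated page and linking to it from the home page gives the spider a plain-text path into the dynamic content.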

Thursday, November 25, 2010

Content Optimization

Different aspects of the optimization process gain and lose importance over time, and content optimization is no exception. Through the many algorithm changes that take place each year, the weight given to the content on your pages rises and falls. Currently, incoming links appear to confer a greater advantage than well-written and optimized content.
While having a bunch of incoming links from high-PageRank sites will currently do well for you on Google, you must consider what will happen to your rankings when the weight given to incoming links drops, and how your website fares on search engines other than Google that do not place the same emphasis on incoming links.
While many characteristics of your content feed into the algorithmic calculations, a few consistently hold relatively high priority. These are:
1. Heading Tags
2. Special Text (bold, colored, etc.)
3. Inline Text Links or Anchor text
4. Keyword Density
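Of the four items above, keyword density is the one with a simple arithmetic definition: occurrences of the keyword divided by the total number of words on the page. A minimal sketch (the sample text is made up for illustration):

```python
# Keyword density = occurrences of the keyword / total words on the page.
# A naive whitespace split is used here; real tools also strip markup
# and punctuation before counting.

def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "SEO guide: this SEO tutorial covers basic SEO concepts"
print(round(keyword_density(sample, "seo"), 3))
```

Here "seo" appears 3 times among 9 words, a density of about 33%; most practitioners aim far lower than that to avoid looking like keyword stuffing.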

Heading Tags

A heading tag is code that tells both the visitor and the search engines what the topic of your page, or of a subsection of it, is. You have six predefined heading tags to work with, ranging from H1 to H6.
By default these tags appear larger and bolder than standard text in a browser. Their appearance can be adjusted using font tags or Cascading Style Sheets (CSS).
Because unethical webmasters and SEOs have abused them, heading tags do not carry the weight they could; even so, the content between these tags is given more weight than standard text. There are rules for the use of heading tags that must be adhered to: if you use heading tags irresponsibly, you run the risk of having your website penalized for spam, even if the abuse was unintentional.

When using your heading tags, try to follow these rules:
• Never use the same tag twice on a single page
• Try to be concise with your wording
• Use heading tags only when appropriate. If bold text will do then go that route
• Don't use CSS to mask heading tags

Never use the same tag twice on a single page. The H1 tag holds the greatest weight of all the heading tags, and its purpose is to act as the primary heading of the page; if you use it twice, you are clearly not using it to define the main topic of the page. If you need another heading tag, use H2, then H3, and so on. In general, try not to use more than two heading tags on a page.
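The "one H1 per page" rule is easy to audit automatically. A small sketch using Python's standard-library HTML parser, run against a deliberately broken sample page:

```python
from html.parser import HTMLParser

# Count every heading tag on a page and flag a duplicate <h1>,
# which violates the rule discussed above.

class HeadingCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

page = "<h1>Main Topic</h1><h2>Subtopic</h2><h1>Another H1</h1>"
parser = HeadingCounter()
parser.feed(page)
print(parser.counts)                          # {'h1': 2, 'h2': 1}
print("duplicate h1:", parser.counts.get("h1", 0) > 1)
```

Running this over every page of a site is a quick way to catch accidental heading abuse before a search engine does.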

Special Text (Optimization using Bold, Italics, Underlines etc.)

When a search engine scans a page, it looks for several signals to determine what is important and what is not. In particular, it looks for formatting tags such as bold, underline, and italics to help rank the page.
The reason is quite simple: because the text is specially formatted, search engines assume it is important to users, and hence it is important to search engines as well.
"Special text" is any content on your page that is set to stand out from the rest: bold, underlined, colored, highlighted, resized, or italic text. Such text is given more weight than standard content, and rightfully so; bold text, for example, is generally used to mark sub-headings or to pull content out on a page to ensure the visitor reads it. The same can be said for the other kinds of "special text".

Friday, November 19, 2010

How Social Bookmarking Helps with Link Building

Social bookmarking does not just help with traffic generation; we can also gain a great many links for our articles through social bookmarking websites. Popular social bookmarking sites such as Digg, StumbleUpon, and Reddit are famous for driving huge traffic, but have you ever considered that you can gain thousands of links from them as well?

What is social bookmarking? In simple words: "In a social bookmarking system, users save links to web pages that they want to remember and/or share with others."

There are many famous social bookmarking websites, such as Digg, Reddit, StumbleUpon, Mixx, and Delicious. Each site has its own bookmarking method: you just need to create an account and start saving your bookmarks. Plenty of articles are available on the web about how to work with social bookmarking websites. In this article we would like to share how social bookmarking can help with link building and how you can gain thousands of links from these sites.

How do you build links through social bookmarking websites? Mixx, Propeller, Digg, Reddit, and StumbleUpon are among the best-known social bookmarking sites for building backlinks. Simply create an account and follow the instructions below. Here I am sharing an example using Mixx.

1. Create accounts on Mixx and the other websites under the same name, if it is available. Search for available usernames on 400 websites; a consistent name helps you become known in social media.
2. You can use one Gmail ID to create the accounts, and add that Gmail ID to all your profiles.
3. Spend around one to two hours a day on account activities such as growing your network and bookmarking.
4. Start adding social media friends on Gmail.
5. Follow some friends and ask them to follow you back.
6. Once you submit your articles on these websites, try to get some votes from friends or by sharing on social networking sites such as Twitter or Facebook.
7. If your article is good, Mixx will promote it on their home page under various categories, and you will receive hundreds of links for one article.

Thursday, November 18, 2010

The Importance of Content

Content is what makes a website good or bad, and content is the major factor that attracts more and more visitors to a site. Search engines also give very high importance to sites that are full of relevant content. Since the main reason behind a website is information, search engines emphasize content more than anything else.
But just writing some simple text is not enough to rank a page in the search engines. There are several factors on which a search engine evaluates content and ranks a site. The main thing any search engine looks for in content is its benefit to the visitor. Original, well-written content is also highly valued, because search engines believe that users are always looking for new and original content. Submitting articles is likewise becoming popular because of its ability to earn links from other sites. The quality of the content also plays a vital role in search engine rankings.
Let's look at the advantages that well-written, original content provides. Before writing any content, keep in mind that content is the backbone of the site: the main purpose of creating a website is to provide all the relevant information in as much depth as possible.
Therefore, before writing the content you should know exactly who your target audience is and what their interests are.
Original & Well-Written Content:

1. What does your audience want to find?
2. Will you have to do additional research?
3. Are you an expert writer or do you have one on staff?

What Does Your Audience Want To Find?
Assessing what your potential visitors want does not require a crystal ball. If you have completed and spent quality hours on step one of this series, fully researching your keywords, you are already well on your way. Delving into those keywords, you will often find hints that push you in the right direction.

Tuesday, November 9, 2010

Anchor Text Optimization

As discussed earlier, anchor text is the visible hyperlinked text on a webpage. Anchor text links to a page where the same or a related topic is discussed; in other words, it takes the visitor to another site or page where he or she can get more insight into the topic being reviewed.
Anchor text is very important from an SEO point of view. The page an anchor text links to is considered highly relevant by search engines; and not only the linked page but also the page containing the anchor text gains some weight in the SERP rankings, provided it uses keywords.
Anchor text optimization can also be done during a link-building campaign. Since the presence of keywords in the links pointing to your site also carries great importance, we need to optimize that text as well. It is therefore advisable to put keywords in the title and vary the description accordingly.

Anchor text has been observed to be a deciding factor in ranking pages. A site will rank fairly even if you don't optimize anchor text, but it will rank much higher if you do. Take the 'miserable failure' case: the biography of George Bush was linked with 'miserable failure' and ranked number one for that keyword. That was achieved without any on-page optimization, purely through anchor text optimization.
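Since the text between `<a>` and `</a>` is what search engines associate with the target page, a first step in auditing anchor text is simply extracting each link's href and visible text. A small sketch with the standard-library parser (the sample page and URL are made up):

```python
from html.parser import HTMLParser

# Collect (href, anchor text) pairs: the anchor text is the signal
# search engines read for the linked page, as described above.

class AnchorExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []        # list of (href, anchor text)
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

page = '<p>Read our <a href="/seo-guide.html">SEO guide</a> today.</p>'
parser = AnchorExtractor()
parser.feed(page)
print(parser.links)
```

Anchors that come back as "click here" or a bare URL are the ones worth rewriting around a keyword.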

Sunday, October 24, 2010

Blog Commenting: Do-Follow and No-Follow

When you post a comment on someone's blog, you need to enter your name, email address, and a website URL into a form in order to write your comment. Also, look at the theme of the blog before commenting.

Take a brief look at the comments posted before yours and notice the names being used. Are commenters using their real names or pen names? If they are using their real names, so should you; if not, try using your targeted keyword instead of your real name. The URL you enter will be attached to your name, so using a keyword can boost your ranking.

Below are a few working guidelines that you can follow while commenting on blogs:
1. Comment on blogs in a niche relevant to yours, because links back from relevant sites carry more weight than links from non-relevant ones.
2. Make a useful comment that will help others. No one likes to see or approve "superb blog", "thanks for sharing", or "see my blog" in the comments section.
3. Comment only when you can add some useful information to the topic discussed in the post or the previous comments; otherwise, move on to another post.
4. Use deep links to posts in your comments. If you have written a similar post on your own blog that is relevant, point back to it: this makes it easier for search engines to crawl your blog, and it invites readers to read your version too.
There are a few people who comment only on do-follow blogs, but I suggest you comment on all blogs that are relevant to yours, whether do-follow or no-follow.

Saturday, October 23, 2010

USE OF HTML Tags:

HTML Tags:

As we know, spiders (crawlers) can only read text. When a spider scans a website, it actually scans the HTML code of that website. There are several HTML tags, such as the Title tag, Meta tags, and Alt tags, which should be integrated with the keywords for better optimization of the website.

Title Tags:
The Title tag plays a crucial role in optimizing any website. It is the HTML element that supplies the words shown in the title bar at the top of the browser, and it is generally the first element of the page, followed by the Meta tags and header tags. Title tags carry the theme of the website, so search engines give this tag substantial importance: it is the crawlers' first impression of the site, and all the major search engines evaluate the relevance of the website on the basis of the keywords present in the Title tag. The tag is also displayed on the SERP, where results show the text it contains. Because it holds significant weight, the Title tag must be crafted carefully, so that it is both maximally effective for SEO and appealing to searchers.

Search engine crawlers consider this tag the most important element. Search engines generally read about 80-90 characters, so the Title tag should not be longer than that. In addition, keyword density should be kept in mind while working on the Title tag, because it has to be relevant to the particular webpage rather than to the website as a whole.
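The 80-90 character guideline above is trivially checkable before a page ships. A minimal sketch, using 90 characters as the cut-off (the sample title is invented):

```python
# Length check for a proposed <title>, using the rough 80-90 character
# reading limit mentioned above (90 chosen as the hard cut-off here).

def check_title(title, limit=90):
    """Return (ok, length) for a proposed title tag."""
    return (len(title) <= limit, len(title))

ok, length = check_title("SEO Basics: On-Page Optimization Factors Explained")
print(ok, length)
```

Anything over the limit is not fatal, but the excess characters are likely to be truncated in the SERP display.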

Meta Description Tags:

The Meta Description tag is the part of the HTML code that allows you to give a short, concise summary of your web page's content. The words placed in this Meta tag are often shown in the search engine results pages, just below the Title tag, as a brief description of your page.
Some search engines prefer to ignore your Meta Description tag and instead build the description summary for the SERP on the fly from the search term, usually picking up the parts of your page text where the search terms appear. The only exceptions are Flash, frame, or all-image sites that have no text content, and some high-importance websites where the search term is not found in the text. In such cases, Google picks up your entire Meta Description tag and displays it.
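When auditing a set of pages for missing or duplicate descriptions, the first step is pulling the tag out of the markup. A small sketch with Python's standard-library parser (the sample page is invented):

```python
from html.parser import HTMLParser

# Read the Meta Description tag, i.e. <meta name="description"
# content="...">, out of a page's HTML.

class MetaDescription(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content")

html_doc = ('<head><title>Widgets</title>'
            '<meta name="description" content="Affordable widgets online."></head>')
parser = MetaDescription()
parser.feed(html_doc)
print(parser.description)
```

A `None` result flags a page with no description at all, leaving the snippet entirely up to the engine.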

Friday, October 22, 2010

Keywords

A keyword list is a list of significant, descriptive words that will be used to surface your content to users in searches. The words should match the theme of the website and should be easy to integrate into the site's content. Keywords are the mental images tied to what lies in the heart of your customer, and keyword selection is based on persuasive consumer research. It depends mainly on the theme of the website, your target audience, which search engine they are likely to use, and finally what keywords they might use to find your product or service.
There are many tools available to generate keywords, known as keyword suggestion tools. The most commonly used are Wordtracker, Overture, and Google Suggest. A keyword suggestion tool helps you choose relevant, popular terms related to your selected key terms.

Wordtracker: Wordtracker helps website owners and search engine marketers identify keywords and phrases that are relevant to their (or their clients') business and are most likely to be used as queries by search engine visitors.

Overture: Overture's keyword research tool is designed primarily for pay-per-click advertising, where bidding takes place for top rankings in the SERPs of major search engines and websites including Yahoo!, MSN, AltaVista, AllTheWeb, Dogpile, CNN, and ESPN.

Thursday, October 21, 2010

On-Page Optimization Factors

Search engine optimization involves some factors that can be changed and modified by the webmaster, and some that cannot. The first are called on-page factors, the second off-page factors. In this chapter we are going to study on-page factors.

On page factors include:
• Keywords
• HTML tags
• Content
• CSS
• URL rewrites

As we know, on-page factors relate directly to the content and structure of the website. This normally means pages written in HTML, but it also applies to other document formats indexed by search engines, such as .pdf or .doc. Along with the aspects listed above, it may also include reducing the redundant HTML code produced by web page authoring tools and restructuring the site to produce better-linked, better-focused page content.

Monday, October 18, 2010

How Google Defines Spam

As part of their Webmaster Guidelines, Google outlines techniques to use to help Google locate, index and rank your website. They also specifically state that the following techniques may lead them to remove your site from the Google index:
• Hidden text or hidden links.
• Cloaking or sneaky redirects.
• Automated queries to Google.
• Pages loaded with irrelevant keywords.
• Multiple pages, subdomains, or domains with substantially duplicate content.
• "Doorway" pages created just for search engines, or other "cookie cutter" approaches
such as affiliate programs with little or no original content.

However, you should keep in mind that these are not the only practices Google disapproves of. In general, Google does not like its results manipulated by deceptive practices. Its recommendation for webmasters is:
Webmasters who spend their energies upholding the spirit of the basic principles listed above will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.
To combat common search engine spam practices employed by rogue SEOs, Google has also posted a list of practices that should raise a red flag when you are looking for a search engine optimizer. According to Google, feel free to walk away from an SEO who:
• Owns shadow domains.
• Puts links to their other clients on doorway pages.
• Offers to sell keywords in the address bar.
• Doesn’t distinguish between actual search results and ads that appear in search results.
• Guarantees ranking, but only on obscure, long keyword phrases you would get anyway.
• Operates with multiple aliases or falsified WHOIS info.
• Gets traffic from "fake" search engines, spyware, or scumware.
• Has had domains removed from Google's index or is not itself listed in Google.

Tuesday, October 12, 2010

Spamming

Invisible Text: Hiding keywords by using the same color for the font and the background is one of the oldest tricks in the spammers' book. These days, it is also one of the most easily detected by search engines.
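Part of why invisible text is so easily detected is that the naive version leaves an obvious fingerprint: the declared text color equals the declared background color. A deliberately simplified sketch of that check on inline styles (real detectors also resolve stylesheets, inherited colors, and off-screen positioning):

```python
import re

# Naive invisible-text check: flag an inline style whose foreground
# color equals its background color, the classic trick described above.

def looks_invisible(style):
    """True if an inline style sets color equal to background-color."""
    color = re.search(r"(?<!-)color\s*:\s*([^;]+)", style)
    bg = re.search(r"background(?:-color)?\s*:\s*([^;]+)", style)
    return bool(color and bg and color.group(1).strip() == bg.group(1).strip())

print(looks_invisible("color:#fff;background-color:#fff"))   # True
print(looks_invisible("color:#000;background-color:#fff"))   # False
```

The lookbehind keeps the `color` pattern from matching inside `background-color`; even this toy version catches the textbook case.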
Keyword Stuffing: Repeating keywords over and over again, usually at the bottom of the page (tailing) in tiny font or within Meta tags or other hidden tags.
Unrelated Keywords: Never use popular keywords that do not apply to your site's content. You might trick a few people searching for those words into clicking your link, but they will quickly leave your site when they see you have no information on the topic they were originally searching for. If you have a site about medical science and your keywords include "Shahrukh Khan" and "Britney Spears", those count as unrelated keywords.
Hidden Tags: The use of keywords in hidden HTML tags such as comment tags, style tags, http-equiv tags, hidden value tags, alt attributes, font tags, author tags, option tags, and noframes tags (on sites not using frames).
Duplicate Sites: Content duplication is also considered search engine spamming. Sometimes people copy the content and give the site a different name, but search engines can detect this easily and mark it as spam. Don't duplicate a web page or doorway page, give the copies different names, and submit them all: mirror pages are regarded as spam by all search engines and directories.
Link Farms: A link farm is a network of pages on one or more websites, heavily cross-linked with each other, with the sole intention of improving the search engine ranking of those pages and sites.
Many search engines consider the use of link farms or reciprocal link generators to be spam, and several are known to kick out sites that participate in any link exchange program that artificially boosts link popularity. Links can be used to deliver both types of search engine spam, i.e. both content spam and Meta spam.

Wednesday, October 6, 2010

Search Engine Spam

Search engine spamming is the unethical practice of optimizing a site to rank high on the SERP by tricking search engines with tactics such as repetitive keywords, hidden text, hidden links, and so on. All search engines penalize websites that use spam. Since time immemorial, or at least since the Internet first began, webmasters have used these stratagems to dupe search engines into giving irrelevant pages high search engine placement.
There are overall sixteen tactics that are considered search engine spam. These techniques are:

• Keywords unrelated to site
• Redirects
• Keyword stuffing
• Mirror/duplicate content
• Tiny Text
• Doorway pages
• Link Farms
• Cloaking
• Keyword stacking
• Gibberish
• Hidden text
• Domain Spam
• Hidden links
• Mini/micro-sites
• Page Swapping (bait & switch)
• Typo spam and cyber squatting

Friday, September 24, 2010

Components of Search Engine

Broadly, search engines are divided into two main categories:
a) Crawler-based search engines
b) Human-powered directories
Any crawler-based search engine is made up of three basic components:
a) Crawler or spider
b) Index
c) Search engine software
All these components work one after another to list pages in the search engine. Search engines find websites in two ways:
1. By accepting listings sent in by webmasters
2. By crawlers that roam the Internet, storing links to and information about each page they visit
Once a site is found by the search engine, crawlers scan the entire site: the crawler visits a web page, reads it, and then follows links to other pages within the site. Major search engines like Google, Yahoo, and MSN run multiple crawlers simultaneously.
Google is reported to use four spiders, which crawl over 100 pages per second and generate around 600 KB of data each second.
Then the index program takes over from the crawler. Once a web page is crawled, its contents are transferred to the database. The index contains a copy of each web page scanned by the crawler; if a web page changes, the index is updated with the new information. It is very important that your pages are added to the index: until a page is indexed, it is not available to people searching with the search engine.
The search engine software performs the task of producing relevant listings. It searches the entire database of indexed pages, matches them against the search, then ranks and lists the most relevant matches. These listings depend on how the search engine software is programmed: it delivers listings according to what it believes is the most relevant content.
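The index and search-software components described above can be illustrated with a toy inverted index: a mapping from each word to the set of pages containing it, queried by intersecting the sets for the query's words. The two sample pages are invented, and real engines add ranking on top of this retrieval step.

```python
# Toy versions of the "index" and "search engine software" components:
# an inverted index maps each word to the pages that contain it, and
# a query returns the pages containing every query word.

def build_index(pages):
    index = {}
    for url, text in pages.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Return the pages containing every word of the query."""
    results = None
    for word in query.lower().split():
        hits = index.get(word, set())
        results = hits if results is None else results & hits
    return results or set()

pages = {
    "/a": "seo tips for beginners",
    "/b": "advanced seo link building",
}
idx = build_index(pages)
print(search(idx, "seo tips"))
```

A page missing from `pages` never enters the index, which is exactly why an unindexed page is invisible to searchers.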

Tuesday, September 14, 2010

Directories

A Web Directory is a web search tool compiled manually by human editors. Once websites are submitted with information such as a title and description, they are assessed by an editor and, if deemed suitable for addition, will be listed under one or more subject categories. Users can search across a directory using keywords or phrases, or browse through the subject hierarchy. Best examples of a directory are Yahoo and the Open Directory Project.
The major difference between a search engine and a directory is the human factor. A directory indexes a website based on an independent description of the site. Directories perform many of the same functions as a search engine, but their indexing format is different: they do not spider your site to gather information about it. Instead, they rely on a few text entries, typically a site title, domain name, and description, to determine which keywords describe your site. While sites in search engines are scanned and listed by a program (the crawler), in directories they are edited manually. Directories group websites by theme or industry: automobile-related sites are placed in one sub-directory, sports sites in another, and so on, which effectively helps organize thousands of websites together. A directory contained inside another directory is called a subdirectory of that directory; together, the directories form a hierarchy, or tree structure.
There are 5 types of directories namely Human Edited, User Categorized, User Classified, Independently Classified and Pay Per Click (PPC). DMOZ and Yahoo! are the largest directories in the world today.

Friday, September 10, 2010

Main Search Engines

1. Google:
• Google's main search results are provided solely by Google's own search technology, offering results from no other engine or source.
• The Google Directory comprises listings from The Open Directory Project (ODP, DMOZ).
• Google provides results to AOL, Netscape, IWon, Teoma, AskJeeves and Yahoo! Web Results.
2. Yahoo!:
• Accepts both paid and free submissions.
• Paid results come from Overture.
• Provides main results to HotBot, Excite, Go.com, MSN and Infospace, and backup results to LookSmart and Overture.
3. MSN:
• MSN provides sponsored results from paid advertising sources.
• MSN provides primary results from LookSmart.
• Secondary results are provided by Inktomi.
4. AOL:
• AOL "Recommended Sites" results are listings that have been hand-picked by AOL editors.
• AOL "Sponsored Sites" are supplied by Google AdWords.
• AOL "Matching Sites" are supplied by Google results. The results in AOL may not always match those in Google, as Google updates its database more frequently.
• AOL directory listings are provided by the ODP.

5. Alta Vista:
• Alta Vista receives sponsored listings from Overture and its own advertisers.
• Alta Vista uses results from its own database for the main search results.
• Alta Vista obtains its directory results from LookSmart.
6. HotBot:
• HotBot results contain three categories: Top 10 Results, Directory Results and General Web Results.
• Top 10 results include popular sites and searches.
• Directory results are hand-picked by human editors.
• Web Results are provided by Inktomi.
• HotBot offers the capability to search the HotBot database, Lycos, Google and/or AskJeeves, all from one location with the click of a button.
7. IWon:
• IWon Spotlight results comprise web pages found within IWon, or web sites with which IWon has a direct business partnership.
• IWon Sponsored Listings are provided by a variety of paid advertisements through third-party pay-for-performance listings, including Google AdWords and Overture.
• IWon Web Site Listings are powered by Google.
• IWon Shopping Listings are provided by Dealtime.com.
8. Lycos:
• Lycos provides directory results from The Open Directory Project.
• Lycos provides sponsored listings from Overture.
• Lycos provides Web Results from FAST and from the Lycos network.
9. Netscape:
• Netscape's sponsored links are provided by Google AdWords.
• Netscape's matching results include sites hand-picked by ODP editors, mixed with results powered by Google.
10. AllTheWeb:
• AllTheWeb crawls and indexes ODP results.
• AllTheWeb powers the main search results in Lycos.
• AllTheWeb also powers the Lycos advanced search feature, the FTP search feature and their MP3 specialty engine.
Directories:
1. DMOZ: Directory listings are provided to AOL, Google, Lycos, Netscape and many other web sites, directories and portals.
2. Yahoo!: Yahoo! Directory listings are supplied by Yahoo! editors and require a fee for commercial sites. Yahoo! directory results are provided to Alta Vista.

Introduction to Search Engines

As the Internet started to grow and became an integral part of day-to-day work, it became almost impossible for a user to fetch exact or relevant information from such a huge web. This is the main reason why 'Search Engines' were developed. Search engines became so popular that now more than 80% of web-site visitors come from them. What exactly is a search engine? According to Webopedia, a search engine is "a program that searches documents for specified keywords and returns a list of the documents where the keywords were found".
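Webopedia's definition can be rendered almost literally in code. The handful of documents below are made up for illustration; the point is only the shape of the operation: take keywords in, return the documents containing them.

```python
# Literal reading of the definition: search documents for specified
# keywords and return a list of the documents where the keywords were found.
documents = {
    "doc_a": "introduction to search engines and directories",
    "doc_b": "search engine optimization basics",
    "doc_c": "history of the internet",
}

def find_documents(keywords):
    """Return the names of documents containing every given keyword."""
    return [name for name, text in documents.items()
            if all(kw.lower() in text.lower().split() for kw in keywords)]

print(find_documents(["search"]))  # doc_a and doc_b contain the word "search"
```

Everything beyond this (crawling, indexing, ranking) is what distinguishes one real search engine from another.
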

Search engine market share:

Four times voted Most Outstanding Search Engine, Google is the undisputed market leader of the search engine industry. Google is a crawler-based search engine, known for providing both comprehensive coverage of web pages and highly relevant information. It attracts the largest number of searches, up to 250 million searches every day.

Major Search Engines-

(1) Google
(2) Yahoo
(3) MSN
(4) Ask Jeeves / Teoma

Monday, September 6, 2010

About SEO

SEO (search engine optimization) is the process of increasing the popularity and improving the ranking of a website in search engines, primarily Google. As an Internet marketing strategy, SEO considers how search engines work and what people search for. Search engine optimizers may offer SEO as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to a site's HTML source code, SEO techniques may be incorporated into web site development and design.
SEO helps in the promotion of sites, and at the same time it requires technical knowledge – at least familiarity with basic HTML. SEO helps to increase the traffic to a site; SEO is not advertisement.
