Tuesday, November 29, 2005

Google site map

Google Sitemaps seems like an interesting way to do things.

They have lots of info
http://www.google.com/webmasters/sitemaps
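
A Google sitemap is a simple XML file listing the URLs you want crawled. A minimal sketch follows; the URL and dates are hypothetical, and the namespace shown is the one Google used at the time of writing, so check their page above for the current schema:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-11-29</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You upload the file to your site's root and tell Google where it lives via the page linked above.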

Monday, November 28, 2005

When Should You Submit Your Website to Google?

As soon as you register your domain name, submit it to Google! Even if you have not built your site, written any copy, or even thought about your content, submit your domain name to Google. In fact, even if you have not fully articulated your business plan and marketing plan, submit your domain name to Google.

Do not wait!

There are two reasons for this. Firstly, getting onto the search engines has always taken a long time for a new site. Even assuming you do everything right, it takes months before your site is even indexed, and more months before it starts to rank well. As a rule of thumb, do not expect to rank highly within six months of submitting your site to Google.

The second reason is a recent phenomenon called the "Google Sandbox". Many SEO experts believe that Google 'sandboxes' new websites: whenever it detects a new website, it withholds the site's rightful ranking for a period while it determines whether the site is genuine, credible and long term. It does this to discourage the creation of spam websites (sites that serve no useful purpose other than to boost the ranking of some other site).

By submitting your domain name to Google as soon as you register it, you are establishing a site history even if the site has no content. By the time you have built your site, written your copy, and developed the rest of your content (and written your business and marketing plans), Google will probably see no need to sandbox you.

If you wait until launch day to submit your site, you will spend a month or two (maybe more) sitting in the sandbox watching potential customers spend their money elsewhere.

Wednesday, November 23, 2005

What is Link Popularity?

Link popularity is defined as the number of web sites or web pages pointing to a site. It is a measuring factor used by many search engines in determining the relevancy and popularity of a web page. Google uses link popularity as the single most important factor in achieving top search engine rankings; content is the next most important factor Google uses to rank a web site.

Role of Link popularity in search engine optimization

In the past, search engines ranked a web page based on its meta content: the keywords, key phrases, title and description tags. Keywords and key phrases were the principal rating factor. As time passed, many web sites exploited this by spamming pages with unnecessary and irrelevant keywords and phrases. Search engines had difficulty analyzing these spammed pages, which resulted in improper rankings.

After months of study and analysis, search engine specialists and researchers concluded that a page could best be ranked on the basis of its popularity: the number of other websites with links pointing to it, weighted by the popularity of those linking websites. This strategy proved effective and paved the way for the concept of link popularity.

The technique is relatively safe because it is difficult for a website to attract the attention of many similar websites, and there is no easy way for site owners or spammers to manipulate the links pointing to their site. Google was the first search engine to implement link popularity as part of its ranking strategy. Almost all other search engines later followed suit, establishing link popularity as a primary factor in search engine optimization and in achieving top search engine rankings.

How to achieve Link popularity?

We have just seen that link popularity holds the key to ranking a web page, and that search engines accord top rankings by looking at the number of sites pointing to it. However, achieving link popularity for a site is not simple.

A website will only link to yours if it finds useful and meaningful content there worth pointing its visitors to. So good content is the first and foremost thing a site should provide in order to increase link popularity. Content always forms a fundamental feature of search engine optimization.

The second approach is reciprocal links, or link exchange programs, in which two website owners each include a link pointing to the other's site. In this way, both sites are assured of a link from the other side. One can also submit to Free For All (FFA) pages, which place your URL on a common page containing the URLs of many other sites. Submitting to and monitoring FFA pages is a painstaking exercise: sometimes you have to resubmit weekly or even daily, and only if you are fortunate do the links stay pointed at your site for any length of time. Banner exchanges are another means of obtaining pointers to your site. But all of these methods are treated as spamming techniques by search engines.

Link popularity is not just a measure of the number of links to a site; it is more than that. The following factors are useful in measuring link popularity.

Number of Links - The more links pointing to your site (inbound links), the higher the link popularity.

Relevance - Link popularity also depends on the relevance of the inbound link. The incoming link should share the subject of the current site, as search engines discount irrelevant links.

Link text - The text in the links can also affect the ranking of your site. Search engine spiders check whether the text in links pointing to your site is relevant to its context. So if linking websites use keywords or phrases that coincide with those of your website, there is a greater chance of achieving top search engine rankings.

Search engines do not merely consider the quantitative aspect of link popularity but also the quality of inbound links. In other words, hundreds of irrelevant links pointing to your site are of no use. A few quality links, relevant to your subject and theme, will go a long way toward ensuring top search engine rankings.

Search for similar sites that share the same subject matter as yours. Study each site and send an email requesting a link back to your site, clearly explaining the purpose of the link and of your site. It is a good idea to send the link text that you want to appear on the other website, including the HTML code, and to recommend including your keywords and phrases in the link text. Also suggest where to position the link on their site, whether on the home page or a secondary page.

Each search engine has a different approach to measuring the link popularity of a site. For instance, one search engine may show that a site has 200 links pointing to it, while another may display only 50. A site gets a boost in ranking if the sites linking to it are already indexed by search engines and directories, so cross-check whether each site linking to yours has been submitted to, and indexed by, the search engine in question. Incorporating outbound links to relevant sites may also add to link popularity. Finally, ensure that no dead or broken links exist on your site.

Finally, here is a brief list of do's and don'ts for achieving good link popularity.

Do's

  • Study the links of your competitors and contact them
  • Use quality links, rich with keywords
  • Link to relevant sites only
  • Place the link in the home page if possible as it boosts your ranking
  • Check for broken and dead links
  • Ensure that all sites linking to your site are indexed by the search engines
  • Submit pages with new links to search engines on a timely basis

Don'ts

  • Do not use FFA pages
  • Do not use exchange programs
  • Do not use banner exchanges
  • Avoid using images as links

The reason is that the above practices are treated as spamming techniques and artificial link popularity by the popular search engines.

From the factors above, it is clear how link popularity forms a basis for search engines to rank web pages, and how important it is in formulating an effective search engine optimization strategy.

Link Management for effective search engine optimization

A website may be huge in size and rich in content, running to a large number of pages, but it serves no purpose if a visitor cannot navigate easily to the page he desires. The designer, along with the search engine optimization expert, should formulate a link management strategy to achieve two ends.

  • Ensure smooth and intuitive navigation
  • Ensure search engine optimization through proper placement of links, for proper link management is a key to top search engine rankings

Link management forms an important basis for a robust search engine optimization strategy. It is a broad concept in the sense that it is not confined to the linking of pages; it encapsulates concepts such as link popularity, click popularity and reciprocal linking, which greatly aid in achieving top search engine rankings.

First, study the website and draw up a site map which clearly depicts the linkage between the different web pages. Working from the site map, start connecting the pages with appropriate links. The link text should give an indication of the topic of the destination page; in other words, the text in the link should be closely related to the content of the destination page, as most search engines rank a page higher when the link text speaks to the subject of interest. Use keywords and key phrases in the link text wherever appropriate. In this way one can attain good keyword density and keyword frequency, enabling top search engine rankings. It is also a good idea to use appropriate tool-tip text for the links, as some search engines consider this while indexing, and do not forget to include keywords in the tool-tip text.
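
As a naive illustration of keyword density, the share of a page's words that a keyword accounts for, here is a small sketch; the sample text and keyword are invented for the example:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return occurrences of `keyword` as a percentage of all words in `text`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# 2 of the 5 words are "seo", so the density is 40%
print(keyword_density("seo tips and seo tricks", "seo"))  # 40.0
```

Real search engines index text far less crudely than a whitespace split, but the ratio being computed is the same idea.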

Let us now discuss the different types of links a web page can have, and how each affects search engine optimization. There are two main types: internal links and external links. Internal links point to pages within the website itself; external links point to pages outside the website. The external links a web page can carry include inbound links, outbound links, broken links, dead links and reciprocal links. An inbound link is a link from a page outside the website pointing to a page within it. An outbound link is a link from a page within the website to a page outside it.

It is necessary to have proper inbound and outbound links, as they greatly influence a website's position in search engine rankings. Inbound links determine the number of sites pointing to yours: the more inbound links, the more popular the website; the more popular the website, the more traffic it receives; and the more traffic it receives, the greater the chance of generating business, which is the ultimate objective of any website. In search engine optimization terms, inbound links add to link popularity, defined as the number of links outside the site pointing to it. Outbound links to sites with similar subjects and content can also boost the link popularity of the site.

Take the utmost care while creating links and ensure that no broken or dead links exist on the site. Broken links return an error when clicked instead of displaying the required page. Dead links perform no action at all when clicked: they neither display the required page nor return an error. Both types are dangerous because they lower link popularity.

Take the time to build an error-free link management system. The links in the site should be created so that the user can freely navigate the entire site from any page, like traffic signs clearly directing the user to the destination of his choice.

We can therefore conclude that link management is an essential feature of an effective search engine optimization strategy and a very important factor in achieving top search engine rankings.

The basic tenets of link management, as far as search engine optimization is concerned, are:

  • Have as many inbound links as possible
  • Have as many internal links as possible on each page
  • Use keywords and key phrases in internal link text
  • Wherever a keyword appears, create a link pointing to the page with more information about that keyword
  • Name the destination page so as to include keywords and key phrases. For example, instead of naming a destination page services.html, name it search_engine_optimization_services.html
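
The naming and link-text advice above can be sketched in HTML; the filename and keywords here are hypothetical examples, not taken from any real site:

```html
<!-- Internal link whose text, tool-tip and destination filename
     all carry the target keyword phrase -->
<a href="search_engine_optimization_services.html"
   title="Search engine optimization services">
  Search engine optimization services
</a>
```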

RSS Feeds are Quickly Becoming Mainstream

RSS feeds are quickly becoming mainstream, but publishers, advertisers and consumers are just scratching the surface. Recent data from the Pew Internet Research Foundation shows that a mere 9 percent of the Internet population has a good idea of what RSS is. Don't be concerned about the numbers quite yet.

RSS is the new email newsletter

RSS is poised to become an important content delivery mechanism in mainstream media. It will soon represent a permanent and fundamental change in the way information is shared, viewed and acted upon online. It will reshape the way people interact with the web for several reasons.

1. E-mail SPAM has devastated the sending of legitimate customer communication - RSS is "SPAM free"

2. Many publishers catering to the early adopter and tech markets are seeing 40% month-over-month growth in their RSS traffic. Some are seeing 50% of their traffic come from their RSS feed, with a corresponding decline in email subscriptions

3. RSS is easy to use (after the subscription process). Consumers will gravitate to anything that saves them time

RSS was popularized by blogs

RSS has been around for a while, but it hasn't been until very recently that there has been a surge in its use. Why now? I believe there are two reasons.

1. Explosion of blogs
2. Demand for consumer control

Technorati reports over 900,000 blog posts are created daily. Blog software tools make publishing to the web as simple as typing an email. RSS makes it easy to stay up-to-date with the volume of blog posts. The content comes to you. You no longer have to search for it.

A common misperception is that you must have a blog to have an RSS feed. This is not so. RSS has been adopted by major publishers such as CNET, the BBC, Yahoo, Motley Fool, InfoWorld, The New York Times, the Christian Science Monitor, Wired News, The Wall Street Journal and many others, including a rapidly growing contingent of local and regional newspapers.

Era of Consumer Control

From TV and digital video recorders to radio and Podcasts, consumers are demanding control over their media consumption. For over 50 years, TV and radio have remained the same. With the advent and popularity of TiVo and Podcasting, both will see dramatic changes in how people interact with and consume these media. Consumers will watch and listen on their own time, skip commercials and create their own personal information-gathering networks.

Because of RSS, online content consumption is changing too. You can now get your favorite content delivered right to your desktop and read it on your time - without the threat of SPAM clogging your inbox. In future posts, we'll explore how advertising will change in the era of consumer control.

RSS is in its Infancy

However, RSS is not perfect. It has a lot of growing up to do. Here are just a few things that will need to change before we see widespread adoption of RSS.

1. Subscribing to feeds is cumbersome and not intuitive
2. Receiving feeds requires adopting another tool (a news aggregator)
3. Getting subscriber counts and data requires deploying new enterprise software

Look where we are today with banners: animation, Flash, behavioral targeting, Fatboy ads, affiliate programs and so on.

In the future, RSS will carry more than text. Today it is already the primary distribution channel for podcasts. In the near future, much of the content delivered in the era of the much-touted Web 2.0 will come on the back of RSS feeds.

Content provided by Jerry Hart, Hart Creative Marketing, Inc. and Bill Flitter Chief Marketing Officer at Pheedo.

RSS and E-mail: How Can They Work Together?

For most online marketers, e-mail is still the key marketing and communication tool, with uses ranging from e-zine publishing, direct sales messages and loyalty campaigns to internal communication between team members.

But getting e-mail through spam filters, and past spam itself, is becoming increasingly difficult, while anti-spam legislation is putting even legitimate e-mail marketers at risk.

With 100% content delivery ratios, is RSS a replacement for e-mail?

At least right now, certainly not. It has, however, become the key supplement to e-mail delivery. While many internet users are starting to ignore e-mail subscriptions and subscribe only to RSS content, the majority are just starting to explore the world of RSS.

As such, the time to get started with RSS is now, if you want to gain the upper hand over your competition before RSS reaches the mainstream, and at the same time test the RSS marketing approaches that work for you.

Using RSS as a supplementary content delivery channel, next to e-mail, is one of the places to get started. But to use RSS in conjunction with e-mail, you first need to understand some of the basic relationships between these two tools and e-zines and blogs.

A) UNDERSTANDING RELATIONS BETWEEN RSS, E-MAIL, E-ZINES AND BLOGS

How do these four really relate and what does this mean for your internet marketing strategy?

The most common misconception is comparing blogs with e-mail, with many bloggers actually touting blogs as a replacement for e-mail. The truth is, there is no comparison at all; it is like comparing apples and oranges.

The second misconception is believing that RSS and blogs are somehow strongly related, or even that RSS is good only for delivering blog content. The result, on one side, is marketers who do not see RSS as a full-powered communication channel, and on the other, bloggers who refuse to see e-mail as a viable content delivery vehicle.

Let's set the record straight in the simplest possible terms.

Blogs and e-zines or newsletters are "the what" --- what you publish online ... the content side.

RSS and e-mail are "the how" --- how you get that content or information to the reader ... the delivery side.

RSS/e-mail and blogs/e-zines cannot be directly compared. Blog content and e-zine content can both be delivered via RSS and e-mail, and there is no direct business/logical relation between, for example, blogs and RSS.

What makes sense, for example, is comparing e-zines and blogs. Blogs are "personal" conversations, opinions and news, delivered in a linear structure, usually written in a more personal style, and confined to a limited number of content types.

E-zines on the other hand are more similar to magazines or newspapers, carrying content presented in a complex non-linear content structure, and having the ability to carry many different content types that do not mix well together if provided through a linear content structure. For example, a typical e-zine might include an editorial; a leading article, representing the prevailing topic of a specific e-zine issue; supporting articles, clearly structured to show they are secondary to the leading article; links to the most relevant forum topics and posts; a news section; different advertisements (banner ads, textual ads, advertorials etc.); a Q&A section; a featured whitepaper; etc.

Providing all of this content demands a complex content structure and a strong and experienced editor. The blog format simply does not provide the level of structure needed to effectively present such a complex content mix.

B) INTEGRATING RSS INTO YOUR E-MAIL MARKETING STRATEGY

Once you understand these basic relations, you can see how to integrate RSS into your e-mail marketing strategy as a supplement to e-mail as a delivery tool.

For now, here are some of the most basic generic opportunities in using RSS together with e-mail:

1. Use RSS to announce each new issue of your e-mail e-zine, which you make available in full on your website.

2. Provide a separate RSS feed for the articles you publish in your e-mail e-zine and get them to your subscribers as soon as the articles become available, without them having to wait to receive them in your e-mail newsletter. The same goes for your news section, if you have one.

3. If you publish a lot of content in different topic categories in your e-zine, provide a separate RSS feed for each of those topics. Take another look at the elements listed above that a typical e-zine might include: each of them could in fact become a standalone RSS feed.

4. If you're doing e-mail autoresponder marketing, provide those very same autoresponders as RSS feeds, allowing your visitors to subscribe either to the e-mail or RSS delivery channels to receive the very same content.

5. If you have your own affiliate program, make sure that your affiliates can also subscribe to your affiliate notices via an RSS feed, not just e-mail. Basically, all you will be doing is duplicating the same content you're sending out via e-mail in an RSS feed.

6. If you're sending out special notices or updates to your existing customers via e-mail, create a special limited-access RSS feed to deliver those same updates via RSS as well.

These should be enough to get you started thinking in the right direction.
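
As a concrete sketch of item 1, a minimal RSS 2.0 feed announcing a new e-zine issue might look like this; every name and URL here is a hypothetical placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Marketing E-zine</title>
    <link>http://www.example.com/ezine/</link>
    <description>Announcements of new e-zine issues</description>
    <item>
      <title>Issue 42: Getting Started with RSS</title>
      <link>http://www.example.com/ezine/issue42.html</link>
      <description>This month's issue covers integrating RSS with e-mail.</description>
      <pubDate>Mon, 28 Nov 2005 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Each new issue becomes one more `<item>` at the top of the channel; subscribers' aggregators pick it up automatically.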

By Rok Hrastnik
Expert Author

GoogSpy or Google Spy

You will love this tool at www.googspy.com. It's free for now, so use it while it lasts.

Ever wonder if your competitors are targeting niche phrases that you are missing out on? A new tool offered by Velocityscape allows you to do a little bit of competitive research to make certain that your competitors aren't leaving you behind.

Known as GoogSpy, the tool works by scraping the search results of over half a million phrases each day and then storing them in the GoogSpy database. Marketers use the tool by entering the domain name of a competitor and viewing a list of the search terms that they rank well for. The tool also displays the top 25 competitors for the company that you search on. Entering a search term will bring up a list of companies that are purchasing Adwords, along with the text of their ads.

"I read all those Search Engine Optimization Tips & Tricks articles," says Michael J. Roberts, Velocityscape's President, "and each one said 'Find out which search terms your competitors use.' I thought, 'That seems obvious, but how?' Apparently, the 'best practice' was guess-and-check. There had to be a better way. With Web Scraper Plus+ we extracted a million search results and built a proof of concept in a few days."

Roberts' team has done an excellent job in creating this tool. I've used it myself for several clients and have added it to my arsenal of handy tools. The tool is limited in scope (I tried several niche phrases for a current client and there were very few results), but it has excellent reach on some of the more common terms.

Using the tool is fairly simple.

1.) Go to GoogSpy

2.) In the search box, type "usairways.com"

3.) Look at the listing of companies and click on usairways.com

On the results page, you'll see a listing of the search terms that usairways.com ranks well on organically. Among the results, you'll see that they rank first for the phrase "dividend miles" and fourth for the phrase "flight check".

Scroll down a bit and you'll see a section that says "usairways.com Pays for these Google Adwords." Select the phrase "flights discount" and you'll suddenly see a list of companies that are bidding for this phrase on AdWords. You can find the ad title and description that they use, as well as a listing of other phrases that these types of companies bid on.

Another option is simply to enter a search phrase in the initial search box on GoogSpy. Doing this with the phrase "cheap flights" reveals a list of about two dozen URLs and over one hundred variations of that keyword phrase. Selecting one of the phrases (rather than one of the URLs) takes you straight to the page listing the companies that bid on that term.

The tool isn't just handy from the optimization and pay-per-click side of things; it also has uses in determining whether your competitors have beaten you to the market. Trying to convince your boss that you need the funds to start a PPC campaign? Showing him that four of his top competitors are already advertising there may be the incentive he needs to make a budget adjustment.

The tool appears to be actually scraping data, rather than using the Google API, so there's a chance that Google may decide they're not too fond of it. That said, the tool has been around for a few months now and has been covered on several blogs, so there's a good chance that Google sees it as a potential revenue builder. After all, why would they argue with a tool that encourages folks to spend more money on Google?

http://www.searchengineguide.com/laycock/004899.html


Handling Robots

What exactly is a Robot?

In search engine optimization terminology, a robot is a search engine software program which visits a page on a website, follows the links from that page, and indexes some or all of the pages on the website.

Why do we need robots?

Every day, search engines receive hundreds of new website submissions. It would be cumbersome and time-consuming for a human to review each whole website, judge whether it meets the search engine's standards, and index it. This is where our friend the robot comes in. Robots are software programs which crawl an entire website, checking the relevancy, consistency and significance of the site, and index it into the search engine's database far faster than a human reviewer could, so a robot can index many more sites per day. Though robots are not something you control directly, it is advisable to understand and accommodate them.

Controlling a Robot

Normally, a robot visits the home page of a site and follows the links present on that page, scanning each link and page. Sometimes we do not want a robot to index particular pages on our site. For instance, you might want a series of pages to be viewed in sequence, with only page one indexed. To achieve this, there is a special kind of meta tag known as the robots meta tag. It is placed, like other meta tags, in the head of the document, and tells the robot which pages to index, which pages not to index, which links to follow and which links not to follow.

A typical robots meta tag could not be reproduced here, because this blog's software strips the raw tag from posts.

The tag in question tells the robot to index the page it is visiting but not to follow the links on that page.
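
For reference, a robots meta tag that indexes the page but does not follow its links is conventionally written as:

```html
<head>
  <!-- Index this page, but do not follow any links on it -->
  <meta name="robots" content="index,nofollow">
</head>
```

The `content` attribute takes combinations of `index`/`noindex` and `follow`/`nofollow`.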

The other significant part of controlling robots is the robots.txt file. It is used primarily to control areas or portions of the website by excluding them from the robot's visits. Whenever a robot visits a site, it first checks the robots.txt file.

The robots.txt file,

  • is a plain text file created in Notepad or any text editor
  • should be placed in the top-level directory (root) of the website or server space
  • should be named in all lowercase letters

Through the robots.txt file we can,

  • Exclude all robots from visiting the server
  • Allow complete access to all robots
  • Exclude robots from accessing a portion of the server
  • Exclude a specific robot
  • Exclude certain types of files from access by specifying their file extensions.
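
A small robots.txt covering several of these cases might look like this; the directory names and robot name are hypothetical:

```text
# Applies to all robots
User-agent: *
Disallow: /private/
Disallow: /drafts/

# This particular robot is excluded from the entire site
User-agent: BadBot
Disallow: /
```

An empty `Disallow:` line (or an empty file) grants complete access, while `Disallow: /` under `User-agent: *` excludes all robots from the whole server.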

Finally, we can conclude that the robots.txt file acts as a filter, providing control over which parts of your site a search engine robot visits.
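
If you want to check how a robots.txt file will be interpreted, Python's standard library includes a parser for exactly this format. The sketch below uses made-up rules and a made-up domain rather than any real site's file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, supplied as lines of text
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# Pages under /private/ are blocked for all robots; everything else is allowed
print(parser.can_fetch("*", "http://www.example.com/private/page.html"))  # False
print(parser.can_fetch("*", "http://www.example.com/index.html"))         # True
```

In practice you would point the parser at a live file with `set_url()` and `read()`, but parsing the rules directly makes it easy to test a draft before uploading it.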

While talking of robots I would like to mention the revisit tag, a meta tag that tells the search engine how long it should wait before visiting your site again. If you change your site's content frequently, the revisit period should be short, say a week; otherwise it can be longer. Some believe that rankings dip if a search engine visits a second time and finds that the content has not changed significantly. Not all search engines honor the revisit tag, but if you are keen on top search engine rankings it is advisable to use it judiciously.
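
The revisit tag is conventionally written as follows; the seven-day interval is just an example, and, as noted above, many engines ignore this tag entirely:

```html
<meta name="revisit-after" content="7 days">
```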

How to choose your Search Engine expert?

There are a lot of websites touting themselves as search engine experts and assuring you of top rankings in the search engines. Most of these are one-person outfits with little wherewithal to support you in the long run. The best approach is to type keywords such as the following into any popular search engine of your choice.

Example:
  • Search engine optimisation services for UK clients
  • Top search engine ranking
  • Search engine positioning
  • Search engine experts

You may use other keywords of your choice, which are similar to these keywords like SEO services, better search engine rankings etc.

Pick out the first five results that satisfy these keywords and repeat the search on another popular search engine. A website which figures in both search engines for these keywords is obviously a company that knows a thing or two about search engine optimization services.

Contact these companies by e-mail and see the response time. If the website provides testimonials, check them out. Be wary of companies which have only PO boxes, or which are one-person SOHO entities; you never know when they might close down. And don't believe in quick fixes: search engine optimization is a continuous and strategic process.

SEO Glossary

ALT TEXT - HTML attribute that provides alternative text for images. Search Engine spiders or robots can read ALT text but not the images.

BANNER EXCHANGE - A network where participating sites display banner ads in exchange for credits which are converted (using a predetermined exchange rate usually 1:2) into ads to be displayed on other sites.

BUTTON EXCHANGE - Network where participating sites display button ads in exchange for credits which are converted (using a predetermined exchange rate) into ads to be displayed on other sites.

DEEP LINKING - linking to a web page other than a site's index or main page.

DESCRIPTION TAG - An HTML tag used on the web page to provide a description for search engine listings.

DOORWAY PAGE - A page made specifically to rank well in search engines for particular keywords, serving as an entry point through which visitors pass to the main content. Usually dynamic pages are provided with a doorway page.

FFA - Free-for-all links list

INBOUND LINK - A link from a site outside of your site.

KEYWORD - A word used in a search engine to perform a search. Targeted keywords should appear naturally throughout the site's content.

KEYWORD DENSITY - Keywords as a percentage of indexable text words.

KEYWORDS TAG - META tag used to define the primary keywords of a Web page.

LINK POPULARITY - A measure of the quantity and quality of sites that link to your site.

MANUAL SUBMISSION - Adding a URL to the search engines individually by hand and not automatically.

META TAG GENERATOR - Tool that will output META tags based on input page information.

META TAGS - Tags to describe various aspects about a Web page.

OUTBOUND LINK - A link to a site outside of your site.

PAY PER CLICK SEARCH ENGINE - Search engine where results are ranked according to the bid amount, and advertisers are charged when a searcher clicks on the search listing.

RECIPROCAL LINKS - Links between two sites, often based on an agreement by the site owners to exchange links.

SEARCH ENGINE OPTIMIZATION - The process of choosing targeted keyword phrases related to a site, and ensuring that the site places well when those keyword phrases are part of a Web search.

SEARCH ENGINE SPAM - Excessive manipulation to influence search engine rankings.

SEARCH ENGINE SUBMISSION - The act of supplying a URL to a search engine in an attempt to make a search engine aware of a site or page.

SERP - Search engine result page.

SPIDER - A program that automatically fetches Web pages. Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web. Other terms for these programs are webcrawler and robot.

TEXT LINK EXCHANGE - Network where participating sites display text ads in exchange for credits which are converted (using a predetermined exchange rate) into ads to be displayed on other sites.

TITLE TAG - HTML tag used to define the text in the top line of a Web browser, also used by many search engines as the title of search listings.

TOP 10 - The top ten search engine results for a particular search term.

URL - Location of a resource on the Internet.

WEB RING - A means for navigating a group of related sites, primarily by going forward and backward.

SEO Unique Ideas

Here are a few tips to promote your site efficiently and get effective returns. They may look frivolous, but they can really bring in a lot of traffic to your site.
  • Make the site navigation easy.
  • Many first time visitors to your website will be directed through search engines, so maximizing the placements in search engines effectively is well worth the time, effort and investment.
  • Continuously study how the search engines work and, if required, adjust your site's keywords to maximise search rankings.
  • Periodically submit and re-submit your site to directories and search engines.
  • Study how search engines such as Google scan pages, and word your headings so that the page appears for the search criteria you are targeting.
  • Adding some HTML hyperlinks to the home page that lead to major inside pages or sections of your website is worthwhile. Place them at the bottom of the page; the search engine will find them and follow them.
  • Participate in mailing lists and discussions that are relevant to your business, and subtly plug your site and yourself.
  • Display the banners or small hot links of any networks you belong to, such as Link Exchange or Smart Clicks.
  • Give each page in your website different strategic keywords that reflect that page's content. Strategic keywords should always be two or more words long, and should be used in the title and description meta tags without repetition. Meta tags help control your site's description in search engines.
  • Pick phrases of two or more words, and you'll have a better shot at success. Make sure your strategic keywords appear in the crucial locations on your web pages. Make sure that your HTML text is visible.
  • Use ALT Tags for images. The spider or the robots cannot read images but they can read the text in the ALT tags.
  • Work on the page often: update it with new links, related stories, and links to other available pages on the subject. Fresh, relevant content helps keep your page from looking like spam.
  • Users rarely view more than the first 10-30 entries, or 2-3 pages, of search results, even though a typical keyword search may return thousands of pages containing the keyword or phrase.
  • Links from other sites, including portals, online yellow pages and directories, can be a significant source of traffic, if done effectively.

How Important is Content for a Website?

The main factors that determine the success of a website are effective search engine optimisation, Google PageRank, and high-quality content.

Search Engine Optimisation (popularly known as SEO) is the process of tweaking a page's tags (the title tag, description tag, keywords tag, and alt attributes), placing keywords in the right positions, and making the site search-engine friendly so that spiders can access all the pages of the website and index them.
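For concreteness, here is what the tags mentioned above look like in a page's markup. The shop name, file names, and wording are invented for illustration:

```html
<html>
<head>
  <!-- Title tag: shown in the browser's top bar and as the search listing title -->
  <title>Hand-Made Oak Furniture | Smith &amp; Sons</title>
  <!-- Description tag: often used as the snippet in search listings -->
  <meta name="description" content="Hand-made oak furniture, built to order.">
  <!-- Keywords tag: the page's primary keywords -->
  <meta name="keywords" content="oak furniture, hand-made furniture">
</head>
<body>
  <!-- Alt text: readable by spiders, which cannot read the image itself -->
  <img src="oak-table.jpg" alt="Hand-made oak dining table">
</body>
</html>
```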

Google PageRank (PR) depends mainly on link popularity. In other words, PR depends on the number of other sites that link back to your site: the more links back to your site, the better the PR.

There is an interesting situation with regard to Google, since Google is built upon the founders' (Sergey Brin and Lawrence Page) PageRank. PageRank treats incoming links as a kind of "vote" or "count" for the importance and relevance of a page. Google's algorithms place a great deal of weight on inbound links when ranking pages for position in their search results. In fact, Google will not keep a site in its index unless there is at least one inbound link to it.
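The idea of links as votes can be sketched with a toy power iteration. This is only a minimal illustration of the concept: the three-page link graph is hypothetical, and real PageRank runs over billions of pages with many refinements.

```python
# Minimal sketch of the PageRank idea: a page's score is fed by the
# scores of the pages linking to it. The tiny web below is hypothetical,
# and every page here has at least one outbound link (no dangling nodes).
links = {               # page -> pages it links out to
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": ["a.html"],
}

damping = 0.85          # chance of following a link vs. jumping anywhere
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):     # iterate until the scores settle
    new_rank = {}
    for page in links:
        # Each linking page passes on its score, split among its outlinks.
        inbound = sum(rank[src] / len(outs)
                      for src, outs in links.items() if page in outs)
        new_rank[page] = (1 - damping) / len(links) + damping * inbound
    rank = new_rank

print(rank)  # c.html has the most inbound weight, so it scores highest
```

Note how c.html, which receives links from both other pages, ends up with the highest score; this is the "inbound links as votes" effect the paragraph describes.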

Despite this, the "Content is King" saying persists in some circles. The phrase has become monotonous, and it often carries the claim that anyone who participates in aggressive link building is in some way cheating Google's algorithm.

The only long-term route to a good search engine rank is quality content. Content is easy to edit, and you can work in strategic keywords as and when required; you do not have to be an expert to do it. Know your keywords and use them in the content, but take care that the copy does not look foolish through endless repetition. Focus on addressing the needs of the reader. Good content also invites the search engine spiders, since you are feeding them exactly the right keywords.

Some search engine marketers cram web pages full of keywords and tuck small text at the bottom of the page as fodder for the spiders, seldom giving a thought to usability or to how readers want to view the page. At best this is a short-term strategy: many search engines treat it as spam, and all the effort is lost. Remember that spiders are smart and difficult to fool.

Consider a site whose content is riddled with spelling mistakes and grammatical errors. It looks sloppy, and a customer will think twice before handing over an email address or credit card details. It hardly matters that the site ranks well and draws heavy traffic; at the end of the day, sales are what matter to any webmaster. The site needs to convert a visitor into a buyer, and a buyer needs to return to buy more or to refer the site to friends. Good content serves all readers: spiders, customers and referrals.

Now consider a door-to-door salesman. He travels all the way to your house and just wants to get inside so he can convince you to become his customer; the way he communicates is the only means he has to make an impact. Online, fortunately, customers come to your site of their own accord, spending their own time and bandwidth. The content needs to communicate so well that readers move deeper into the website, or buy. Content is what invites customers into the inner pages and holds their attention; it is the one thing that turns the leads your advertising campaigns bring in into sales. When it comes to content you need to hit the bull's eye, and a webmaster should never compromise on it.

High-quality content signals high value to any reader; it is the main criterion by which readers judge your site, and what convinces them to become customers and regular visitors. Good content should also be unique and updated regularly, so that people come back to your site often to see what is new.

The easiest way to update content regularly these days is to create a blog. Blogs are inexpensive and very simple to integrate into a website, and search engine marketers are rushing to add them to their sites because they are proving very spider friendly. After all, what the search engine spider wants to see is exactly what searchers want: good, relevant, updated content.

How to write Quality Content?

Visitors should have an easy time understanding the flow of your website. If they have to hunt through multiple pages to find the information they are looking for, they will not stick around for long, let alone become repeat visitors.

An average visitor takes about four to five seconds to decide whether to click on an inner link or button. A clear idea of what your site is about should be evident straight away, followed by easy navigation to other pages that cover further topics in more detail.

Search engines love relevant content. Keyword relevancy is a very important component of SEO. The more relevant the content is to a specific search term, the more likely the page will rank at the top of search results for that search query.

Keyword density is another big factor with search engines. There is an optimal ratio of key terms to the overall amount of text for search engine optimisation purposes. Keyword density refers to how often a given keyword appears relative to the total words on a page, and it is extremely important to SEO strategy.

The more unrelated words (adjectives, articles, filler phrases) used throughout the content, the lower the keyword density becomes.
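The density calculation described above is simple to make concrete. This is a bare sketch using whole-word counting; the page text and target keyword are made-up examples:

```python
# Sketch: keyword density as keyword occurrences divided by total words.
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    hits = words.count(keyword.lower())
    return hits / len(words) if words else 0.0

# Hypothetical page text: "oak" appears 2 times out of 10 words.
page = "oak furniture is durable and oak furniture lasts for decades"
print(round(keyword_density(page, "oak") * 100, 1))  # -> 20.0 (%)
```

Adding unrelated filler words to `page` would raise the word count without raising the hit count, which is exactly why padding drives the density down.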

The content of a website forms an integral part of an effective search engine optimisation strategy. Content is the heart and soul of a website, and the workhorse behind a top search engine position. If you build a website, give it high-quality content with key phrases in the right places: Google ranks sites in its result pages on the basis of that content, combining the words found in the title, the headings on the page, and the body text.

Finally, content plays a constructive role in achieving a top search engine rank; together with other important factors such as text formatting, keyword density, keyword frequency, keyword positioning, keyword proximity and link popularity, it adds up to a top search engine ranking.

So give high importance to the content on your website, alongside anything else you do to promote the site online or offline. Good content converts your visitors into customers and helps your search engine ranking as well; if you do not win the business, you are losing it to a competitor. Quality content prompts visitors to act on the website, and increasing conversions is what matters in any business. If you take the time to write relevant content and update it regularly, you will have a better chance of people finding you than with all the other ranking tricks combined. Search engines are becoming smarter day by day; they understand what people want to find and are getting cleverer at it all the time, and that is good news!