To better understand Internet marketing, one must first possess a basic knowledge of how the internet works and some of the terminology that is used when discussing Internet marketing.
Search engines are basically computer programs that allow people to search for things from a large database of information using a web interface; Google and Yahoo! are prime examples. For a search engine to work it must have access to some amount of data about the various web sites on the internet. When a search is conducted on the engine it is this data that is searched (not the internet in general, as this would not be practical). To collect this data most engines employ programs that look at web sites and collect data about them - these programs are affectionately referred to as Spiders or Bots. The spider will look at a web site and collect a subset of the information on the site as well as a list of all the links on that site (this is one way it finds the next web site to visit). This entire process is known as 'indexing'. When a web site is submitted to a search engine, it is just being added to the spider's list of sites to visit. Each search engine has its own flavour of spider that extracts data from sites in a slightly different manner.
Similar to search engines, directories are large databases of information about web sites; the difference is that these are compiled by humans rather than spiders/bots. To get listed in a directory a web site must be submitted and then reviewed by one of the directory editors. Directories are a very important internet marketing tool because fraudulent sites that may succeed in getting listed on some search engines through "creative" submission are usually caught by the directory editors - the information in their database is therefore typically more accurate and is treated as such by savvy internet users. Many search engines also pull from the databases of directories when conducting a search. The best known and therefore most important directories are currently Yahoo! and dmoz.org (the Open Directory Project).
Of all the places where a web site should be listed, listing/portal sites are arguably among the most important; however, it usually costs money to be listed. Listing sites are generally specific to a particular industry segment and are often the first results returned by the search engines when someone does a generic search - which is precisely why they are so important. They spend all their time and effort (and often a fair amount of money) to be listed in a top position on the search engines - so if a business site is listed on that portal it can indirectly reap the rewards of their efforts.
Many customers report that they get more traffic from a single listing on a relevant listings/portal site than from all the search engines combined.
These are the words and phrases that represent the business or organization. Any Internet marketing plan will require that a small number of key phrases be "targeted": the goal is that if someone performs a search on the search engines for one of those key phrases, the web site targeting them will be returned near the top of the result list by the search engine. It is impossible to successfully target more than two or three different phrases for any single web site, so carefully choosing appropriate key phrases is a critical component of any Internet marketing plan.
The web gets its name from the way sites are linked from one to another, creating a "web" of interconnected information. Links can be outbound from a site, inbound to that site, 'reciprocal' (in which case the two sites share links between them), or internal (a link to another page on the same site).
Cascading Style Sheets: A method used to apply style (fonts, colours, spacing, etc...) to content within a web page.
HyperText Markup Language: the code initially used to create web pages, superseded by XHTML, but still in use.
eXtensible HyperText Markup Language: the latest code used to create web pages. It improves on HTML by ensuring pages are well formed and tightens up some of the 'loose' concepts in HTML. Most modern web pages are being designed using a combination of XHTML and CSS.
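To illustrate how the two work together, here is a minimal sketch of a block of XHTML content styled by a CSS rule (the class name and text are invented for illustration):

```html
<!-- CSS rule: applies font and colour to any paragraph with the class "intro" -->
<style type="text/css">
  p.intro { font-family: Georgia, serif; color: #333333; margin: 1em 0; }
</style>

<!-- XHTML content: note that every tag is properly closed, as XHTML requires -->
<p class="intro">Fresh bread baked daily in Victoria, BC.</p>
```

In practice the CSS would normally live in a separate style sheet file linked from the page head, keeping style fully separated from content.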
The goal of any web site, in effect, is to sell something. That may be a product, but it may also be just a concept or idea. In all cases, however, the site should be designed to direct the visitor toward some conclusion, perhaps one of many. Web sites that only present information but do not provide any calls to action fail in this regard. It is therefore critical that the design of the site itself is considered a part of the marketing plan.
A web site, in and of itself, is just a beginning. It is a tool that assists in an ongoing internet marketing plan. To make your site perform (and thus generate new business or awareness for your organization) is going to require work - partly from your designer/internet marketing partner and partly from members within your organization. This document is designed to help web site managers understand both ongoing tasks and what must already have been done during the design and development stages to achieve those goals.
In addition to discussing how to market your web site this document will look at a few of the common internet marketing pitfalls and scams to avoid. Through a clear understanding of all the factors involved in web site marketing it should be straightforward for a business or organization to formulate a realistic Internet marketing plan that will fit within both their time and budget constraints.
The marketing techniques set forth herein are designed to promote a web site for the long term: to achieve a strong and stable search engine ranking, build visitor loyalty, and increase visitor satisfaction. Many marketing firms specialize in short term marketing of sites, and therefore employ techniques that show results more rapidly but are shorter lived and more prone to failure as the Internet grows and changes. These techniques, while giving the appearance of success, have proven to show a poor overall return on investment as they are usually quite expensive, requiring a continual outlay to remain effective.
When compiling a web site it is critical to always be mindful of the top 3 key areas for both Human and Computer (search engine 'Spiders' aka 'Bots') visitors to the site:
| Human Visitors | Search Engine Bots/Spiders |
|---|---|
| Simple & Logical | URL (address) |
| Interesting/Useful | Titles & Headings |
| Attractive/Fun | Content & Links |
As the table above illustrates, the two groups of visitors are looking for different things from a web site. This dichotomy is an important aspect of web site marketing and design and will be examined in detail throughout this document.
Although frequently used as the primary measurement of the success of a web site, overall traffic is not always the most important thing to strive for when marketing. If a site receives thousands of visitors each day but is not generating a corresponding conversion from those visitors, is the marketing plan to be considered successful? This document will attempt to deliver a plan that will bring targeted traffic to a web site, which depending on the market may never be thousands of visitors per day - but whatever number of visitors the site achieves, they should be appropriate visitors.
Some mention was made in the introduction that one must always consider the two classes of visitor that any web site can expect - humans and the search engine indexing programs known as "Spiders" or "Bots". These two classes of visitors are looking for radically different things when visiting a web site, and any web marketing plan must be designed to please both. This makes marketing more difficult, but there are many ways to achieve a happy medium and create a web site that will appease the wants of both.
Humans want a web site that contains something of value to them, and that is easy to navigate. These two desires are far more important than the appearance of the web site - although it is important that the site not be so unattractive that it becomes a detractor. Frequently web sites are designed entirely to be attractive and/or "cool" and lose so much of the simple and easy-to-navigate factors in the process that human visitors, while initially pleased with the appearance of the site, rapidly lose interest and never return. Worse yet, they may even become frustrated with the speed and/or complex navigation and actively avoid the site. It may be surprising, but word-of-mouth marketing is still as significant a force on-line as it is in the analog world.
A design can achieve the traits of being simple, logical, and informative, in part, by avoiding some common mistakes:
- Excessive graphics (especially animated ones)

Sites should always allow human visitors to choose if they want rich media, such as Flash, audio, or video content, instead of forcing it upon them - a proven way to make some site visitors leave instantly.
Humans also tend not to want to read. They prefer to rapidly scan material - looking at headings, buttons, and lists, and avoiding paragraphs of text - so they can rapidly continue forward to their goal. They will, however, read detailed information that is of direct interest to them. When adding content to a web site, keeping this facet of human behaviour in mind is critical.
Spiders have a very different goal when looking through the content of a web site than do humans. Their goal is to extract key details of what the site contains in order to build a structured database for whichever search engine has sent them forth. They look at some areas of the content in the web site and completely ignore others. The entire game of "Search Engine Optimization" is built around tweaking these key areas to specifically meet the goals of the spiders and thereby ensure that certain data is included in the search engine databases. Sites thus achieve a high ranking for the key words and phrases that have been "targeted" in this manner.
Spiders can't understand, and therefore completely ignore, all of the following:

- Flash content

Instead, spiders look closely at the structure of a web site and extract textual information primarily from the content and some of the meta areas (this will be explained in more detail later on).
Most search engines actually provide quite a bit of detail about how they would like to see web sites built in order to rank well on their engine. Where some marketing companies choose to try and trick the search engines into giving their customers' sites a higher rank, we suggest following the rules and suggestions put forth by the search engines - thus working with them to achieve a higher rank. Experience has shown this to be, by far, the better long term solution.
For now, it is interesting to compare the list of items that the spiders ignore, with the list of things that can turn your human visitors away. The conclusions are self evident - a web site should be well structured, simple, logical, and avoid excessive use of multimedia content in order to satisfy both humans and spiders.
With the vast amount of information available on the Internet, some form of structure is critical. The same is true for each individual web site. The search engines attempt to provide that structure by using databases that contain information extracted from web sites and converted into their own structure. Creating a web site that is well structured assists the search engines in their goals and has the attractive side-effect of improving the web site's ranking. A well structured site is also more pleasing to human visitors because it makes information both easier to find and easier to understand once it is located. This section will look at key structural components of a good web site.
There are now many standards for the way web pages should be created. These have been developed to ensure that the many different client programs (known as 'User Agents') can interpret the information contained on web pages and display it correctly. The standards body responsible for the web (and many other related technologies) is the W3C (World Wide Web Consortium), which can be found online at www.w3c.org.
When designing a web site in today's world - where it may be viewed by humans or computers, from virtually any country (perhaps even from space), and on many different platforms (computers, palm tops, cell phones) - these standards are essential, and adhering to them will allow a web site to stand above and apart from others that do not.
There are still vast numbers of web sites that are not standards compliant; the main reason for this is WYSIWYG (What You See Is What You Get) type editors that generate code for web pages automatically. This code is typically replete with errors and renders the site non-compliant. Most modern web browsers have extensive built-in error handling to try and display pages correctly despite all these errors, but even if a non-compliant web site displays correctly in some browsers, this by no means ensures it will not be rendered inaccessible to some - including the search engine spiders (which are just a different type of User Agent).
Standards compliant pages are designed to display in any User Agent. Depending on the User Agent viewing the page, perhaps only certain aspects of the page will be visible, but the textual content is always available (even to screen-reading software used by those with visual impairments).
There are many tools available online to test a web page to see if it is compliant (see the Tools Appendix). While being standards compliant alone will not ensure a higher rank than selected non-standards compliant sites, it will ensure a web site is visible to a wider audience, and will certainly improve the ability of spiders to successfully index the site.
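For reference, a minimal sketch of a standards compliant page skeleton - here assuming the XHTML 1.0 Strict standard, with placeholder title and content - looks roughly like this:

```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
  <head>
    <!-- Placeholder title; see the section on titles below -->
    <title>Placeholder Title</title>
  </head>
  <body>
    <h1>Placeholder Heading</h1>
    <p>Placeholder content.</p>
  </body>
</html>
```

A page built on a skeleton like this should pass the validation tools mentioned in the Tools Appendix with no errors.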
Standards define the structure of the code used to build the page, but this is usually something that only experienced web designers can influence. Regardless of whether a site is standards compliant or not, the structure of the page and content will dramatically affect the site's ability to rank for given key words and phrases.
As was noted, both humans and spiders want to see logically structured content. Humans want this so that they can rapidly scan the information and find what they need, and the spiders so that they can apply their algorithms and extract information for the search engines' databases.
This structure can be provided by coding headings correctly - using heading tags in either HTML or XHTML - and by including correctly delimited paragraphs and lists. It is difficult to explain this coding in detail without embarking on an exhaustive explanation of (X)HTML coding. The web designer should know what is meant by heading tags and should ensure that a web site employs them where necessary.
What is more important within the confines of this document is the text that appears in those headings. Search engines apply special consideration to page titles, headings, lists, and even bolded text. Thus whenever a heading is designed for a site the careful insertion of key words and phrases into the heading can easily result in a higher rank for the site.
Think for a moment about how many web sites use the words "Welcome to ..." in their main home page heading. When one considers the importance of using key phrases in a heading tag, the error in this approach is obvious.
| Bad Headings | Better Headings |
|---|---|
| Welcome to Bob's | Bob's Victoria BC Bakery |
| Page 7 | Best Fly Fishing Sites in BC (m-n) |
| Enjoy your stay... | Luxury Hotel and Spa Accommodations |
The table above contains three commonly seen errors when creating headings:
In the first example, "Welcome to Bob's" at least contains Bob's name, but unless customers already know the name, they will not be searching for it. It is far more likely that they will be searching for a "Bakery" in "Victoria BC" in some order or another - the better heading includes Bob's name as well as a keyword, "Bakery", and a location.
In the second example, frequently seen on content-managed web sites or sites created automatically from presentations or printed reports, there is no mention whatsoever of what the page content may relate to. The better heading assumes that this was a list of good fishing sites in BC and that page 7 listed places starting with m-n. Now the heading again contains keywords and the key phrase "Fly Fishing" while still keeping it separate from the other sections by including the "(m-n)" component.
In the third example, seen all too often, an accommodations provider has chosen a heading that expresses a feeling rather than a title for the page or content. The better heading includes 4 key words. Human visitors instantly know what the site is about, and the search engine's spider will record the 4 keywords in its database.
Search engines also prioritize the headings on web sites in order of their prominence. (X)HTML code provides for up to 6 levels of heading prominence, with H1 being the most prominent and H6 the least (most search engines look at headings 1 through 3).
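As a rough sketch, the "better" bakery heading from the table above might be coded like this (the h2 and h3 section headings are invented for illustration):

```html
<!-- The most prominent heading (h1) carries the page's main key phrase -->
<h1>Bob's Victoria BC Bakery</h1>

<!-- Less prominent headings (h2, h3) introduce sections, still laced with key words -->
<h2>Fresh Artisan Bread Baked Daily</h2>
<h3>Wedding Cakes and Custom Orders</h3>
```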
Headings are visible components of the page content, but the page title is displayed outside the content of the page (in the title bar of most web browsers). Notice how in the example below the title contains many key words and phrases:
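A hypothetical title along these lines, reusing the bakery example (the exact wording is invented for illustration), might be coded as:

```html
<!-- The title element lives in the page head, outside the visible page content -->
<title>Bob's Bakery - Fresh Bread, Cakes and Pastries in Victoria BC</title>
```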
It should also be considered that many search engines use the title as the text to display when returning your web site as a result, and it is the only thing visible on a visitor's computer when the web browser is minimized.
Overall the title is the most important visible element of any web page (after the web address) as far as the search engines are concerned, followed by the headings. Therefore choose titles and headings carefully to ensure they always contain key words and phrases that match the content of the page in question.
In addition to the visible components of any web page, there are also many invisible components that the search engines will look at. Unfortunately, because of extensive abuse of these elements by unscrupulous Internet marketers, the search engines have been forced to implement stringent abuse-checking algorithms and limit the weight of these elements in their calculation of search results. However, if a web site is following the rules, these elements can still be a very important area to consider when marketing.
The best known "meta information" on a web page is the so-called "meta tags", which for many refers only to the "keywords" and "description" tags. There are actually many different meta tags, and several of these have various formats. Meta tags can contain many different things, including:
- Keywords

A detailed discussion of the meta tags and how each works in their various formats is beyond the scope of this document; however, any experienced web designer should automatically include several formats of most of the meta tags mentioned above. The most important thing to ensure when developing a marketing plan is that the keywords and description included in the meta tags for each page on a web site are correct for that page. While not many of the search engines actually look at the keywords tag any more, many do use the description as the text presented in search engine results.
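By way of illustration, the two best-known meta tags might look like this (the keyword list and description text are placeholders, continuing the bakery example):

```html
<!-- Keywords: largely ignored by many engines now, but harmless if accurate -->
<meta name="keywords" content="bakery, Victoria BC, fresh bread, wedding cakes" />

<!-- Description: often used as the snippet shown in search engine results -->
<meta name="description" content="Bob's Bakery offers fresh artisan bread, cakes and pastries in Victoria BC." />
```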
Meta tags are not the only meta information on a web page. Along with the new XHTML standard and a push from the W3C to ensure web sites are accessible to persons with disabilities, many new areas within the content of web pages have been opened up for meta information, including but not limited to:
- Alternate text for images

A well designed web site should use all of these meta information locations as places to supply more key words and phrases. This also meets the goal these areas were designed for in the first place: ensuring that the web site is more accessible, which extends the site's audience.
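For example, alternate text on an image and a title attribute on a link are two such locations (the file names and text below are invented for illustration):

```html
<!-- Alternate text: read aloud by screen readers and indexed by spiders in place of the image -->
<img src="storefront.jpg" alt="Bob's Bakery storefront in downtown Victoria BC" />

<!-- Title attribute: additional descriptive text attached to a link -->
<a href="wedding-cakes.html" title="Custom wedding cakes in Victoria BC">Wedding Cakes</a>
```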
As mentioned before, the web got its name due to its interlinked nature, the ability for one page of information to directly reference another and allow a user to click through and view the related information instantly.
A concept brought forth by the Google search engine called "Page Rank" used this to reinvent the search engine world. Google started to measure the number of times a site was linked to by other sites, in a way viewing each link as a vote for that site. Sites with more links thus had more votes, and therefore a higher "Page Rank". Theoretically, in the Google engine two pages with identical content but different "Page Rank" would be listed in the results with the site having the higher "Page Rank" first.
Since its inception the "Page Rank" concept has proven to provide better results and is largely responsible for Google's success. It has also caused a frenzy of discussion about sites needing links to achieve a better rank.
First and foremost, it is critical to remember that only incoming links to a web site will improve its rank. A site with a thousand links to other places will not benefit at all in terms of "Page Rank" (in fact some search engines may actually lower a site's rank if pages with nothing but links are found). Therefore, any time other web sites can be encouraged to link to yours, it is a benefit. Google's "Page Rank" takes this one step further though: it also looks at the relevance of links. Google wants to see links that come from related industry sites and use link text that contains your key words and phrases.
For example:
A web site about fly fishing in BC would do better with links from companies promoting fly fishing and/or fly fishing equipment using the text "BC Fly Fishing" rather than from a bakery in the US using the text "click here".

A somewhat exaggerated example, but it identifies the way that Google's "Page Rank" categorizes links.
The "Page Rank" concept, while invented by Google has now been adopted, under a variety of different names, by virtually all the other major search engines with the same basic rules and structure.
Any time a web site provides a link to somewhere else, it provides an avenue for a customer to leave! That may sound harsh, but it is true. There are ways to limit the chance that they will never return, such as opening the link in a new window so that when that window is closed the linking site is back on the customer's screen, but it is important to always consider that customers can be lost through an outgoing link.
Furthermore, any link to another web site is, in a visitor's eyes, a recommendation for that site. So be sure that any sites being linked to meet all appropriate standards of decorum and do not belong to or promote your competitors. That said, one might think it best to include no outgoing links on a web site - unfortunately some search engines may now penalize web sites they term "black holes" with no outgoing links.
So how do you win? Through careful selection of relevant links (hopefully reciprocal - see below) to informative sites that will by inference promote your product, service, or organization. Include a paragraph of textual content along with the link to clearly explain to your visitors why the link is there - this avoids your visitor blindly leaving your web site. It also provides some fodder for the search engine spiders along the way. This way your visitor should understand why the link is there and will likely return once they have finished gleaning what they can from it. Finally, always open the link in a new window - just to be safe.
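A minimal sketch of such an outgoing link, with a short explanatory paragraph and the link opened in a new window, might look like this (the organization, URL, and wording are invented for illustration):

```html
<!-- A short paragraph explaining why the link is there, laced with key phrases -->
<p>
  The (hypothetical) BC River Reports site publishes up-to-date river conditions
  that complement our guided BC fly fishing trips - check conditions there before
  booking your trip.
</p>

<!-- target="_blank" opens the outgoing link in a new window, so this site stays open behind it -->
<a href="http://www.example.com/river-reports/" target="_blank">BC River Condition Reports</a>
```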
As discussed previously, to increase the "Page Rank" of a site with the search engines you want incoming links - but other sites are out looking for exactly the same thing. Thus was born the "reciprocal link" concept.
There has been a lot of press recently about the importance of reciprocal links for web site marketing - however, there are risks. Every link off your site is precisely that - a way out. Links effectively encourage customers to leave your web site, precisely what you don't want them to do! However, the incoming link may bring you new customers - so in the end it's a trade-off. Effectively the same rules apply as for 'outgoing links' above, with the advantage that at least there is a built-in return path and a positive effect on your site's "Page Rank".
Over the course of marketing your web site you will invariably start receiving email from other web sites requesting a reciprocal link. Do not just jump in and say yes to all requests. Remember that an outgoing link is a vote for that web site - check the site and be sure this is a web site that is worthy of your recommendation. Also ensure that your link back from that site is not from a page so deeply buried that it could never be found by a normal visitor.
It may seem strange but links within a site are also very important. There are several reasons:
First: they can help your site visitors navigate your web site. They should be part of your basic site navigation and clear and concise enough to allow visitors to know what to expect before they click the link.
Second: Google and other "Page Rank" type engines will actually look at these internal links as well when calculating "Page Rank" - although they are weighted far less than links from other sites.
Third, and most importantly: all search engines look at the text of the links on your site as a place to find key words and phrases. This is where many web sites fall into the "Click Here" syndrome - 'click here' is most certainly not a key word or phrase for most of the web sites that use it for links, and therefore it should be avoided. Instead use text that describes the content of the page being linked to - this will help site visitors and help with search engine rank at the same time, as sketched below.
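A quick sketch of the difference (the page name is invented for illustration):

```html
<!-- Avoid: the link text itself carries no key words -->
<a href="fly-fishing-trips.html">Click here</a> to see our trips.

<!-- Better: the link text contains the targeted key phrase -->
See our <a href="fly-fishing-trips.html">guided BC fly fishing trips</a>.
```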
In addition to these three main benefits, internal links can also provide direct access to areas deep within a web site from top level pages (perhaps even the home page). Some search engine spiders will only navigate 3 levels deep (i.e. three clicks from the home page) when indexing a web site, so content below this level may be missed. Providing direct links from the home page, for example, will allow them to index this content. At the same time, because a web site's human visitors want information quickly and through scanning, these same "deep" internal links may encourage them to explore the site more thoroughly.
The Web, historically, was designed to facilitate a free exchange of information. Despite the proliferation of internet advertising and related clutter, the main goal for most people using the web has not changed. The typical person out searching the web is looking for information about something - You win the visitor if your web site can provide that information and can be found in the search engines. The search engines are effectively trying to help the visitors find sites with the information that they are seeking, so to some extent, just by providing the information a web site is placed in contention.
When most people think of web site content they envision pages of information, which for the most part is true, but that is only scratching the surface of the myriad forms of content that can be provided on a web site. For now, looking only at pages of information: if all the rules set forth for structure, standards compliance, and meta information are followed, any information that is added to a web site as a page of information (with or without images) will be a net benefit for that site, both in terms of the human and spider visitors.
How much content is good? In short: the more the better. A web site can never have too much content - just keep it organized, simple, logical, relevant, and well structured.
Not only is having lots of information important, but the age of that information is also important. Search engines, and human visitors, appreciate sites that show signs of "life". Sites that provide frequent updates, add new content, and refresh existing sections with new information are favoured by the search engines and send the message to human visitors that there is actually someone there, behind the scenes, working for them, bringing them what they want. Create a schedule to ensure that site content freshness is maintained, and stick to that schedule. Assign the task of updating manageable sections of the site to specific people within your organization.
Customers always love to get the latest technology or information or formula or concept or... etc. That's why products on store shelves are always plastered with "New & Improved" logos. The content of your web site is no exception; in fact it's even more important. One of the best ways to keep your site changing and interesting for returning customers is to provide a news section - it also gives them incentive to return. The key here is that it must be updated regularly - in fact a site with a news section where the last article is dated months (or even years) ago is actually doing more harm than good. But there is another benefit: the search engines are also reading the news! Regular news updates to a web site are one of the best ways to ensure search engines consider a web site "fresh", as mentioned above. Provide an RSS news feed and a web site can benefit yet again:
RSS stands for Really Simple Syndication or Rich Site Summary (depending upon whom you ask) and it is the most widely used syndication format on the web today. Effectively it is an XML based data format for encapsulating news headlines, a brief description, and a link back to the complete article in a single small text file. Creating an RSS feed from the news on a web site allows the site's news to be syndicated.
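As a rough sketch, a feed built along the lines of the common RSS 2.0 format contains a channel with one entry per news item; the site name, URLs, and headline below are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Bob's Bakery News</title>
    <link>http://www.example.com/news/</link>
    <description>News and announcements from Bob's Bakery in Victoria BC</description>
    <item>
      <!-- Each item holds a headline, a brief summary, and a link back to the full article -->
      <title>New Gluten-Free Bread Line</title>
      <link>http://www.example.com/news/gluten-free-bread.html</link>
      <description>A brief summary of the announcement, linking back to the full article.</description>
    </item>
  </channel>
</rss>
```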
Syndication is the best part of adding news articles to your site on a regular basis. Through syndication you can effectively broadcast the headlines and a brief summary of news items on your web site to a world audience. This dramatically increases the chance that search engines will regularly index your web site. Some instant messaging clients like 'Trillian' can read RSS based syndicated news content and alert your customers automatically whenever a news item is added to your site. Any good content management system should provide automatic RSS creation for news added to your web site - so all that is needed to take advantage of these benefits is just to regularly add news articles to the site.
There are many other types of content that can be included on a web site to please both humans and spiders alike.
For example:
- Glossary of terms (related to the industry)

Always remember that any content a site provides should, first and foremost, be well structured, standards compliant, simple, logical, and fresh.
Every web site proprietor must ask - what does my web site provide visitors in addition to information about my product, service, and/or organization?
There is no single marketing tool more effective than a free giveaway - it works in all industries and in many forms, and the Internet is no exception. Many of the most popular web sites featured in main-stream news reports are giving away something many people want - for free. There may be a "catch" or an additional fee for the "full version", but in almost all cases people can visit the site and get something useful for nothing.
The search engines themselves provide information for free - this brings them the traffic they need to make money selling advertising. Any way a web site proprietor can find to give away something of use to the general public for free will always assist in generating traffic. Here are some ideas:
- Freeware software (tools/utilities or games)

This is by no means an exhaustive list but should suffice to give most site proprietors some ideas. The key here is that by giving people something for free they will actively promote the site by telling their friends.
S.E.O. stands for Search Engine Optimization, and after reading this document it should strike you as a strange concept - if the web site has been designed correctly in the first place, S.E.O. should not be necessary. Unfortunately many web sites are not designed correctly, and must therefore be corrected, after the fact, at an additional cost. It is of great importance, therefore, to be very selective when choosing a web design firm. A good firm will be well versed in web standards and will design and build a web site that does not require any "optimization", as it will already follow the rules in terms of structure and compliance and should be well received by humans and spiders alike.
As search engines become more and more intelligent they are clamping down on as many of the "tricks" people find to cheat their way to a top ranking as they can. Over the years many tricks have been discovered and, within about a year or so, the search engines clamp down on the sites that employ them. Our experience has shown that a consistent top ranking is easy to achieve by simply providing quality site content. A good example of all this is the news reports concerning "Google Updates", where the Google search engine clamped down on sites employing many different types of tricks and dropped them hundreds of positions, or in some cases dropped them altogether. Our customers by and large had a very different experience. Almost without exception, customers that were following the recommendations set forth in this document increased in rank, or at the very least maintained their position, after the updates. This is because they were trying to match the same goals as the search engine. Logically, if the search engines continue to tune their systems to rank sites that follow their rules higher, sites that do so will in turn rank higher. It is really that simple.
However, as soon as a new site goes online the proprietor will probably start receiving email from companies stating things like "We noticed your site is not listed" or "Your site was broken" or even "We guarantee 'x-thousand' targeted visitors to your site in the first week", asking for payment for their services. Without generalizing too much, some of these companies might be useful for sites designed by amateurs; however, if the site has already been designed correctly these services are completely unnecessary. They may even place a web site into one of the categories of sites that will be dropped when the next "Google Update" rolls along.
Many web site proprietors become obsessed with their search engine rank - when in fact this is only one aspect of a comprehensive Internet marketing plan. What good is being listed first on the engine if no one is searching for that product? This may sound like a strange question, but not all products are things people routinely search for on the internet. Conversely, some things have so many web sites in competition for the same key phrase that achieving a high rank is almost impossible without purchasing placement.
Although their popularity is waning, FFA (Free For All) sites are still around. These sites purport to assist in web site marketing by allowing anyone to post a link to their site, which supposedly increases search engine rank because it counts as a link to their web site - the fact of the matter is that these FFA sites merely collect the email addresses of unsuspecting web masters and sell them to spammers. PawPrint.net once conducted an experiment by submitting a fake web site to several FFA sites; we received over 2000 spam email messages to that account over the next 3 days. Incidentally, many of the S.E.O. companies promising to submit your site to n-thousand search engines and directories are actually submitting your site to thousands of FFA pages!
To some extent the measure of the success of a web site can be quantified by the number of incoming email messages that the site generates. This could consist of questions from customers or notifications of online orders. In any case, you now have a responsibility to check email and respond to your customers as quickly as possible. Your visitors, in turn, will begin to judge your business by the quality and speed of your response.
Respond to email within 24 hours, sooner if possible: visitors appreciate a rapid response. Keep your existing visitors happy, and encourage potential new visitors, by providing rapid, useful email support; this alone could be enough to set your business apart from the competition. Save your responses to frequently asked questions in a text file for a quick copy and paste into subsequent email responses.
Include your web site address in your email signature: This ensures that anyone forwarding your message to a friend or colleague will be able to readily access your web site. It can also make it just that little bit easier for your customers to get back to your site when reading your response email.
Beware the ostrich defence: Some "experts" suggest removing all mention of your email address from your web site as a means of avoiding spam. The idea is that by removing the ability for email harvesting programs to collect your email address, you remove the majority of additional spam that a web site generates. While this might have the desired effect in terms of spam - the important thing to consider is that you also remove the ability for your customers to contact you by email, which seems somewhat analogous to paying for an advertisement in the yellow pages but not listing your phone number!
Mentioning the web site in all advertising media may seem obvious, but it is frequently missed. A good marketing plan can take further advantage of expensive print and media advertising by providing a special area of the web site for visitors to access. This also allows tracking of how many visitors arrived at the web site after hearing about it through one of these media.
For example:
A radio ad mentions the web site address as www.somewhere.com/radio/

Never lose a chance to market the web site address - going so far as to leave the web site as a source for more information on corporate outgoing voicemail messages may satisfy customers while they wait for their call to be returned. A successful Internet marketing campaign includes all avenues for promoting the web site address.
Ensure the web site is listed on all corporate stationery, including business cards, letterhead, and even envelopes.
As the general public becomes more Internet savvy, the effectiveness of banner and pay-per-click advertising campaigns is waning. This type of advertising can be fairly expensive and has only limited results.
Once a web site is up and running, someone should continually check the web site statistics. In many cases the hosting provider or web design firm will provide stats reporting as a service, larger organizations may choose to purchase a dedicated solution. In either case, good statistics software will quantify and analyze the traffic to that web site including where visitors came from, how long they visited the site, and even what words they typed into the search engines to locate the site in the first place.
This last statistic is one of the most important to track because it is a sure indicator of which aspects of the Internet marketing plan are working to drive traffic to the web site. Continue to track statistics and adjust the marketing plan accordingly, perhaps even adjust the site's titles, headings, and content based on findings to achieve a better ranking for the key words and phrases that have been targeted.
Everything up to this point is effectively background information; the challenge now is to combine all of this background understanding and build an effective Internet marketing plan.
Choose words and phrases that, when entered word-for-word into the search engines, should produce this web site as a result. Basically one must identify what the potential visitors might be searching for. One should also be careful to ensure that the chosen phrases are not too general. For example a site selling car parts may choose to target the phrase "car parts" which at first seems logical, but consider that this would pit them against every other car parts retailer in the English speaking world - it would be far better for them to include their location in the phrase and narrow down the competition.
Check the competition - look up any competitors for the industry in question and try to figure out what key phrases they are targeting - this may assist in choosing the phrases to target.
Finally, test the phrases selected - simply type them into some of the major search engines and see who comes up - then look at their pages and determine if your site has a chance of competing against them. Remember that on the internet - physical size is of no consequence, many times small business web sites easily overpower huge corporate ones through effective design and structure alone.
It may already be too late for this step - but if the site does not already have a registered domain try to select a domain that includes one of the key phrases or, at least, some of the keywords that were selected.
Regardless of which domain name is selected, try to register all related names; e.g. an organization registering the name 'widgets.com' should also register 'widgets.net', 'widgets.ca', 'widgets.org', and any other names that unscrupulous competitors may choose to register in order to compete with the web presence of 'widgets.com'. This tactic is known as cyber-piracy and, while illegal, is costly and time consuming to correct - for the limited cost it is far better to register all related domains and eliminate the risk altogether.
Work closely with the web designer, and a copywriter if available, to create a fair amount of content that meets all of the criteria set forth in this document. Ensure that it can be kept well structured (always lace content with the key words and phrases), provide calls to action for visitors (try to direct them through the site), and remember that humans scan - they don't read.
Ensure that the web design firm selected will design a standards compliant web site that follows all the structural and design ideals set forth in this document. Identify people within the organization who will have the responsibility to keep the content current after launch, issue news, and respond to email with rapid useful information.
Find and provide something for free (a draw)
Most web design firms will submit a new web site to the key search engines one time at no charge - ensure the site is complete before starting this procedure. There are several automatic submission software packages that will continue to submit the site on an ongoing basis for a one-time charge, or your organization may elect to pay a company to do this on their behalf (usually far more expensive). In general, repeat submission should be on a quarterly basis (more frequent submission is discouraged by the search engines and in extreme cases can get sites banned).
Also ensure that the web site has been submitted to key free directories - in these cases only one submission is required, but follow the instructions of each directory carefully.
Search engines may take up to 3 months to index a new web site, but don't wait for them before updating content - start a regular update regime as soon as the web site goes live by having the people already identified within the organization commence content updates. This will have the dual effect of on-the-job training and building up site content as new visitors start to browse the new web site.
Start browsing other industry related web sites and look for places to get listed - this is where the bulk of any Internet marketing budget will likely be spent and should achieve the best ROI for that expenditure. In many cases when conducting a search the top few results will be from portal sites - select several of these and investigate the cost of a high visibility listing or feature advertisement on these sites.
Once the key portals have been identified purchase advertising for a limited time so that results can be tracked and the effectiveness of each campaign determined.
Make it easy for other web sites to link to this web site by providing useful content or a free "draw". Judiciously accept reciprocal linking agreements from select sites to build overall "Page Rank".
Continually monitor site stats, especially referrals from sites where advertisements/listings have been purchased. Compare these to referrals from search engines and adjust all aspects of the site, including content, key page titles, headings, and link text, to meet the Internet marketing plan goals.
The following on-line tools and resources will help achieve some of the goals identified in this document.
Check your web site code for errors - this is probably the single most important web tool to help create a standards compliant web page.
Does your web page use CSS? This tool will validate the CSS portion of your web page - rounding out the validation process.
Do you care about making your page accessible for persons with disabilities? These guidelines will help you tune your site to be more universally accessible, we have also found that conforming to these guidelines makes web pages faster and they seem to rank better on the search engines.