Note: this has been updated from the original posting
Building a professional-looking website that represents your company or organization well is an important first step. After that you will want to concentrate on making it easy for your customers to find you; unfortunately, a great-looking site is not enough. There are many techniques that can be used to help search engines find your site. The process of applying these techniques is called Search Engine Optimization (SEO), and unfortunately it is a complex task.
Following is a list of factors that ConX-2-U has identified that many search engines use to rank the relevance of an article during a search. The bad news is there are many rules to consider; the good news is that by following the rules you will most likely not only be found, but you will create content that your users actually enjoy more.
There are many search engines to choose from and to optimize for. Obviously your primary concern should be Google, since it has roughly 80% of the market share; however, some effort should be made to also attract the alternative sites. Bing gets about 7% of web traffic; Yahoo around 6%; Baidu (a popular Chinese search engine) also gets around 6%; Ask, AOL, and the others each have less than 1% of the volume, so you really shouldn’t worry about them much.
There are not only many search engines, but each search engine uses different rules to determine which websites are listed, and in what order they appear in the search results. Knowing the rules can help you design a site that is more easily found, and help you avoid the pitfalls that can get you on a search engine blacklist (search engines, like people, do not like poorly written, redundant, or useless content).
Before moving on it is important to understand the following maxim:
The less you steer your site towards Google the more Google will steer people to your site.
Why? The fact is that Google has a lot of brilliant engineers whose focus is to help people find the information they want. If you provide the information people want most, then odds are Google’s hard-working engineers will find it first.
Google Ranking Factor Checklist
The SEO rules listed in the following pages are not ordered by weight or by relevance.
The term “Keyword” refers to the “Keyword Phrase”, which can be one or more words.
Positive Page Related Rules/Factors
The thing about keywords is that there are many ways to use them, and depending on how you use them they will be more or less effective. Below is a listing of the types of keyword placement you should focus on.
Keyword in Domain name
A clear domain name can count for a lot. Sadly for us, “Conx2u” is an example of a bad domain name. It is supposed to sound like “connects to you”; however, Google doesn’t care about homophones. As many parents say: do as I say, not as I do.
Most good domain names are taken so you may be stuck following my lead; however, keep in mind that hyphens count.
Keyword in URL / Page Name
First word is best, second is second best, and so on. You can use hyphenated filenames, but use hyphens rather than an “_” or “.”, and limit them to one or two; too many hyphens can be bad.
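To make the page-name advice concrete, here is a minimal sketch in Python. The `slugify` helper and its default limit are my own, taken from the one-to-two hyphen rule above; it is an illustration, not a standard library function.

```python
import re

def slugify(title, max_hyphens=2):
    """Turn a page title into a short, hyphenated page name.
    The max_hyphens cap follows the one-to-two hyphen advice above."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    # Keeping max_hyphens + 1 words yields at most max_hyphens hyphens.
    return "-".join(words[:max_hyphens + 1])
```

Run on a title like “Great SEO Tips for Plumbers”, this keeps the first three words and produces a compact, hyphenated name with the most important words first.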
Keyword in Title tag
Title tag should be 10 – 60 characters, no special characters.
Keyword in Description Meta Tag
Provides the theme for the page. Keep it to less than 200 characters.
Google no longer relies upon this tag, but will often use it, as do other search engines.
Keyword in Keyword Meta Tag
Google may no longer use this tag, but others definitely do.
- Shows theme – less than 10 words.
- Every word in this tag MUST appear somewhere in the body text; otherwise the page can be penalized for irrelevance.
- No single word should appear more than twice; otherwise it may be considered spam.
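These tag rules are mechanical enough to check automatically. Below is a minimal sketch in Python; the `audit_meta` helper and its messages are my own, with the thresholds taken from the rules above, not from any search engine’s documentation.

```python
def audit_meta(title, description, keywords, body_text):
    """Flag violations of the title/description/keyword-tag rules above."""
    problems = []
    if not 10 <= len(title) <= 60:
        problems.append("title length outside 10-60 characters")
    if len(description) >= 200:
        problems.append("description is 200 characters or longer")
    kw_words = keywords.lower().split()
    body_words = body_text.lower().split()
    if len(kw_words) > 10:
        problems.append("keyword meta tag has more than 10 words")
    for word in sorted(set(kw_words)):
        # Each keyword must appear in the body, and at most twice in the tag.
        if word not in body_words:
            problems.append(f"keyword '{word}' never appears in the body")
        if kw_words.count(word) > 2:
            problems.append(f"keyword '{word}' appears more than twice")
    return problems
```

An empty result means the page passes all of the checks above; otherwise each string describes one violation to fix.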
Keywords in Content
Overuse of a keyword can bring a negative result. Google looks to see whether the keywords fit naturally into the article. It determines this with complex algorithms; however, there are some guidelines to consider.
Use keywords for 5 – 20% of the overall text (i.e. all keyword occurrences divided by total words in the article).
Use a specific keyword for only 1 – 6% of the overall text (i.e. each keyword divided by total words).
This may sound complex; however, most people would have to actually try to break these rules if they are writing naturally. Just try to use your words in a natural way and you should be good.
Use the words in a phrase; directly adjacent is best. Two or more keywords next to each other can help. As long as you steer clear of density problems you will be good.
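To make the percentages concrete, here is a small sketch in Python; the `keyword_density` helper is my own naming, and it simply computes the fraction of the text covered by occurrences of a keyword phrase.

```python
def keyword_density(text, keyword):
    """Fraction of the words in `text` covered by occurrences of `keyword`.
    A keyword phrase of n words counts n words per occurrence."""
    words = text.lower().split()
    phrase = keyword.lower().split()
    n = len(phrase)
    if not words or n == 0:
        return 0.0
    # Count every position where the full phrase appears.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return (hits * n) / len(words)
```

On an 8-word sentence containing “test manager” twice, the density is 4/8 = 50%, far above the 1 – 6% guideline; that is the point: short snippets always look over-dense, while a real article dilutes the keyword naturally.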
Headers (H1, H2, H3, etc.)
Headers are very important. They not only give the reader an idea of what your main points are, but they will also let Google know.
Note: I have found that it is easy to lose track of keywords when it comes to writing headers. Fitting a keyword into the header may not be as easy as you first think, since a real header is often meant to ask a question or highlight an aspect of a topic rather than the topic itself.
You should use the alt attribute to describe ALL graphics. This will help with normal searches, and definitely help with image searches!
Does word order in the page match word order in the query? Try to anticipate query, and match word order.
Using the word in the early part of the article is better than in the middle or the end.
Strong is treated the same as bold, italic is treated the same as emphasis.
Keyword Stemming is taking the stem of a word and building additional words by adding a prefix or suffix, or using pluralization (Example: stem, stems, stemmed, stemmer, stemming).
Stemming can be important since users will often use unexpected phrases. For instance, when I was promoting my QMetry products we focused primarily on the keyword “test manager” and even changed the official name to “QMetry Test Manager” so the keyword would be in our name. This worked very well for people searching for “test manager”; however, we originally ignored “test management”. Our competitors did not. Worse, our competitors used both “test manager” and “test management” (along with many other keywords) and actually started scoring higher than us for “test manager” natural searches.
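A crude way to catch near-miss variants like “manager” versus “management” is to compare word stems. The sketch below is a naive shared-prefix check of my own invention, not a real stemming algorithm such as Porter’s; it only illustrates the idea.

```python
def share_stem(a, b, min_stem=4):
    """Heuristic: two words likely share a stem if they share a
    sufficiently long common prefix. Real stemmers are smarter."""
    a, b = a.lower(), b.lower()
    common = 0
    for x, y in zip(a, b):
        if x != y:
            break
        common += 1
    return common >= min_stem
```

A check like this, run over your keyword list and your competitors’ visible keywords, can surface variants you forgot to target.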
Like the domain name the linked page should also contain the keywords. (Note: you should validate all links. Use a free link checker.) Use an efficient Tree Structure so that users are only two clicks away from any page – and no page is deeper than 4 clicks.
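Link checking is easy to automate with the standard library. In the sketch below, `collect_links` gathers anchor targets from a page, and `check_links` takes a `fetch` callable that maps a URL to an HTTP status code, so in practice you could plug in `urllib.request`; all the names here are mine, for illustration.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather every href from the page's <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def collect_links(html):
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

def check_links(links, fetch):
    """Return the links whose HTTP status indicates an error (400+).
    `fetch` maps a URL to a status code (e.g. via urllib.request)."""
    return [url for url in links if fetch(url) >= 400]
```

Injecting `fetch` as a parameter keeps the checker testable without network access, and lets you add retries or timeouts in one place.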
Use appropriate links between lower-level pages. This may have a smaller priority for SEO; however, you want users to be able to navigate your site well, which drives your traffic up, and that will help SEO.
Use descriptive, on-topic anchor text for outgoing links. Link only to good sites. Do not link to link farms. Links can and do go bad, which can then result in site demotion. Validate all links periodically and avoid “Link Churn”.
Keep total outgoing links under 100. Too many links can easily make your website look like a “link farm”, which will result in a site demotion. Larger sites can get away with more links.
Keep your file size under control. Try not to exceed a 100K page size. It is better to have lots of smaller pages than a few large pages; however, use common sense and don’t cut your pages up too small either.
Search engines, like users, like fresh pages. Active websites are more attractive to some search engines. So update your articles periodically, when it makes sense. Freshness is also judged by the ratio of old pages to new pages. Having fresh content is good, old content does not hurt, and a good ratio is best.
It is always best to keep the URL length small, and hopefully easy to read (at least for human eyes). The fewer characters the better, but as a general rule never go over 100.
Site Size and Age
Larger sites are presumed to be better funded, better organized, better constructed, and therefore better sites; however, search engines are also looking at content, and they will avoid large “spam sites” that contain computer-generated or duplicate pages, or copies of content found elsewhere.
Even in the internet age, the older you get the more respect you are given; however, newer pages on an older site will get faster recognition, similar to how all the attention centers on a baby when it enters a room of adults.
Negative Page Related Rules/Factors
As I mentioned earlier when talking about density, too much of a good thing can be bad. You should not develop a page by focusing only on the keywords. Write good content and add the keywords that make sense.
Avoid targeting too many unrelated keywords on a page; this detracts from theming and reduces the importance of your REALLY important keywords.
This should be obvious, but no text means no keywords, which makes a page invisible to search engines.
This was actually a problem with one of my last site revamps. The original focus was on keywords, site structure, and navigation. I made a lot of progress with my UI/UX lead and had mapped out a nice framework; however, we didn’t have a graphic designer on the team yet. When the CEO saw the framework he hated it (he is a very visual guy). The site was taken over and put into the hands of a graphic designer. I worked with him, and I have to say the graphics were excellent. Even the UX was good, and with our feedback it became great.
Unfortunately the CEO, who had little time to read content let alone review it, kept the focus on the graphics; the keywords were never inserted back in, and the content was lost. The site ended up looking fantastic and was released without a content review.
The result was a great-looking site that impressed visitors when they found it, but with far lower traffic, since we dropped out of multiple natural keyword search rankings. Until we could prove the problem to the CEO, we couldn’t fix it and start pulling our rankings back up.
Affiliate sites are sites with massive inter-linking and little unique content. Google in particular went after cookie-cutter affiliate sites with a vengeance. Don’t link to “link farms”.
Periodically you should double-check your links to make sure they did not go bad (some failed company sites are taken over by link farms, or even adult sites, which can then affect your ranking).
Do not immediately send your visitor to another page using meta refresh.
If you have many sites with the same web host, linking them together can indicate that you are actually one entity, and therefore the links won’t count the same as links from an external site. Cross-linking is easy to spot, and to penalize.
Stealing information from other websites is a great way of getting blacklisted.
Some search engines, most notably Google, use multiple caches for their search engine. If a comparison between two caches brings inconsistent results, like a completely different set of keywords, this can affect your ranking.
Frequency of Content Change
While you want to keep your pages fresh, don’t overdo it. Like the object of an over-eager suitor’s attention, Google can tell when you just want to get noticed; don’t try so hard when you are trying to woo her. Too frequent = bad.
Use of Frames
Most of you probably don’t remember this early invention of the internet. That is a good thing. In addition to just being a bad user experience, frames can affect your search results too. Don’t go retro.
Invisible Text
This was another old technique from the early days of the internet, and it worked for many years. Essentially the author makes the text the same color as the background, so the viewer doesn’t see it but the search engine does. You might not get caught, but if you do it will hurt you.