Apr 6, 2011

51 SEO points for OnPage Optimization

  1. Place your keyword phrases in the title tag. Keep the title under 65 characters.
  2. Place the most important keyword phrase closest to the beginning of the page title.
  3. Put your main keywords in the keywords Meta tag.
  4. Write a good description for the Meta description tag; the description must be unique to each page.
  5. Keep the Meta description short and meaningful; write only one or two sentences.
  6. Target the most important competitive keyword phrase on the home page.
  7. Target one or two keyword phrases per page.
  8. Use only one H1 header per page.
  9. Place the most important keyword in the H1 tag.
  10. Use H2 and H3 for subheaders where required.
  11. Use bold / italic / underline on your keyword phrases for extra weight in content.
  12. Use bulleted lists to make content easier to read.
  13. Use the ALT attribute for images, so that crawlers can know what the images show.
  14. Don’t use Flash on your website, because crawlers can’t read Flash.
  15. Keep your website navigation simple and easy to follow.
  16. Use text-based navigation.
  17. Use CSS to create navigation menus instead of JavaScript.
  18. Use keyword phrases in file names; use hyphens (-) to separate the words in file names.
  19. Create a valid robots.txt file.
  20. Create an HTML site map for crawlers and users.
  21. Create an XML site map for the Google crawler.
  22. Add text links to other pages in the footer of the site.
  23. Use keyword phrases in anchor text.
  24. Interlink all the pages of your site.
  25. Use keyword-rich breadcrumb navigation to help search engines understand the structure of your site.
  26. Add a feedback form and place a link to this form on all pages.
  27. Add a bookmark button.
  28. Add a subscription form on every page to grow your mailing list.
  29. Add an RSS feed button so that users can subscribe easily.
  30. Add Social Media sharing buttons.
  31. Use images on every page, but don’t forget the ALT attribute.
  32. Use videos on your site that are related to your niche.
  33. Write informative, fresh, unique, useful content on your site.
  34. Keep page content between 300 and 500 words.
  35. Keep keyword density between 3% and 5%.
  36. Don’t copy any content from other websites; fresh and unique content is the key to your success.
  37. Add deep links to related articles.
  38. Regularly update your website with fresh content.
  39. Use CSS to improve the look of your website.
  40. Write your content for humans, not for robots.
  41. Buy a country-level domain name if your website targets local users.
  42. Use a good keyword suggestion tool to find good keyword phrases for your website.
  43. Use a 301 redirect to redirect http://www.domainname.com to http://domainname.com, so only one version is indexed.
  44. Try to buy local hosting for your website if your site targets local people.
  45. Use keyword-rich URLs instead of dynamic URLs.
  46. Break your article into paragraphs if it is long.
  47. Add your full contact address on the contact page, with a map and directions.
  48. Validate your XHTML and CSS at http://validator.w3.org/.
  49. Don’t use hidden text or hidden links.
  50. Avoid graphic links, because text in an image cannot be indexed by spiders.
  51. Don’t create multiple pages with the same content.
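Several of the checks above (title length, a single H1, ALT attributes on images) can be automated. Here is a minimal sketch using only Python's standard-library HTML parser; the class and function names are my own invention, not part of any SEO tool:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the page title, the H1 count, and images missing ALT text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_page(html):
    """Run the basic on-page checks from the list above."""
    checker = OnPageChecker()
    checker.feed(html)
    return {
        "title_ok": 0 < len(checker.title) <= 65,     # point 1
        "single_h1": checker.h1_count == 1,           # point 8
        "all_images_have_alt": checker.images_missing_alt == 0,  # point 13
    }

page = """<html><head><title>Buy Blue Widgets - Widget Shop</title></head>
<body><h1>Blue Widgets</h1><img src="widget.jpg" alt="A blue widget"></body></html>"""
print(check_page(page))  # all three checks pass
```

A real audit would cover far more (Meta description uniqueness, heading hierarchy, link text), but the same parsing approach extends to those checks.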

Dec 8, 2010

Top Ten of the Most Common SEO Mistakes

1. Bad Titles
Titles are the most important element of a web page. Search engine spiders see the title of your page first and make a general assessment of the page based upon it. They then scan the page, check that the content matches the title, and assign rankings based on how well the two fit. The title tag is the best way to tell the search engine what your page is about, and it plays an important part in ranking if used properly. Place your main identified keywords in the title and obey the rules of leftward placement and stemming. The leftward rule: the closer a word is to the start of the title, the more important it is. Stemming: words can be associated even with other words in between, so "Great Blue Widgets", when stemmed, would also match "Great Widgets". Make sure your title matches your page content, description and Meta keywords, and is unique within your website. Every page's title and content should be unique; otherwise the page may be ranked as supplemental.

2. Filename of the Page
If you are using a dynamic website like Joomla or another content management system, it is important that a search engine friendly URL translator is installed. Because I use Joomla every day, I will describe what I mean in Joomla terms. In Joomla (and most other dynamic websites), URLs like "index.php?option=com_mtree&task=listcats&cat_id=1766&Itemid=35" drive the website. The problem with these URLs is that they are unintelligible to humans and to search engines alike. Installing a search engine friendly URL component changes these pages to something more meaningful, and including your keywords in the page names can help with SEO. The transformed search engine friendly URL would look like "buy-blue-widgets.html". If you do not have a dynamic website, ensure your page name is short, to the point, and contains your main keywords for that page, perhaps with a call to action like "buy" and "blue widgets".
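The transformation from title to file name is mechanical enough to sketch in a few lines of Python; the function name here is illustrative, not from any particular URL-rewriting component:

```python
import re

def seo_filename(title, extension=".html"):
    """Turn a page title into a short, keyword-rich, hyphen-separated file name."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    slug = slug.strip("-")                   # no leading/trailing hyphens
    return slug + extension

print(seo_filename("Buy Blue Widgets!"))  # buy-blue-widgets.html
```

This matches the checklist advice earlier in this archive: hyphens between words, keywords in the file name, nothing extraneous.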

3. Duplicate or Bad Content
Ensuring your content is unique and relevant is vital to SEO success. Websites often just duplicate content or bring in plagiarized content from other websites. In my experience, these websites fail dismally in search engine rankings. Write decent, unique content yourself, or search online for someone who can write decent, unique content for you for a fee. Make it good, friendly, unique content that people will want to link to. If you are listed in Google's supplemental index, making your titles, descriptions, Meta keywords and on-page content unique is the only way to drag your pages out of it.

4. No Links
Having covered a few of the most important on-page SEO tips, we now look at links, which are an off-page SEO concept. Each website that links to yours is a vote for your site, and if the linking site has a high PageRank itself, the vote carries more weight. Good keywords, titles and text on your page are only part of SEO; a unique, perfectly optimized web page does not automatically earn a decent PageRank. To ensure SEO success you have to attract incoming links, which is probably the hardest part of search engine optimization, because you are effectively selling your pages all over the Internet. One way to gain links quickly (within a few weeks) is to pay for text advertising; a search on Google will uncover some of the best text linking services. Link exchanges are bad, because for every vote you receive you give one away, and if you happen to link to a "black listed" site, Google will penalize you significantly. One-way links are the ultimate SEO link campaign, so seek out directories (paid or free, you need to evaluate each), post in forums, write and submit articles, and ask other friendly webmasters if they can help you out with a link. The best way to gain links is to have unique content that other people actually want to link to.

5. Incoming Link Anchor Text
Having pointed out how important it is to gain incoming links, it is also important to point out that the text those links are anchored to matters. The anchor text should target your main keywords, and the page it points to should feature those keywords relevantly and prominently. The website you gain a link from should also be relevant to your own. As an example, I own a web design company in Melbourne, Australia, so I gain links from web hosting providers and open source Joomla sites in or around my region. Getting 1000 links from a casino-related website would serve no benefit to my website and would not make it rank higher for "web design" keyword search terms. Use professional common sense when gaining links.

6. Bad Internal Page Links
We return to on-page factors that you can work on within your website. Make sure the anchor text linking to pages within your own website is relevant to the target page, and fill in the title attribute for each link. You have the greatest control over links within your own web site, so make sure they are relevant and that the link title and the on-page copy match the main keywords of the target page.

7. Live Links
External tools such as the link checker the W3C offers are worth using. Harness as many tools as you can to ensure all of the links on your website are live and working, and not sending people to 404 error pages. It is also important to ensure your page markup (HTML, XHTML) is valid; the W3C provides a tool for this as well. Make use of both of these freely available tools.

8. Impatience
Search engine optimization is not a short-term task. It is refinement after refinement and hour after hour of working on your website: producing unique content, checking and validating it, and making sure the search engines have it listed correctly. Do not think for one second that your site can ever be thoroughly and finally optimized, and never assume that being number one for a search term means you will stay there indefinitely. Search engine companies are always changing their ranking and rating systems, and there are always other people gunning for your top spot. Keep looking to improve, keep learning, and don't sit back and wait for others to take your place. Be proactive: seek out new content and links, and keep ahead of the curve.

9. Keyword Selection
This is a very common mistake among beginners to SEO. People often base their keyword selection on what they *think* is right, but is that what people actually search for when they are looking for your product or service? Often the answer to this question is no. Doing proper keyword research and settling on the right keyword list before you optimize any pages is an important step. Tools such as Overture, Google AdWords and paid keyword tools like WordTracker (the de facto standard in keyword finding) are vital to finding good keywords. Localize your target market and figure out what they actually search on when they are looking for the products or services you offer. Do the research, and be as specific as possible while still being broad enough to capture some high keyword traffic.

10. Keyword Spamming and Stuffing
If you sell “Blue Widgets”, then every page of your site does not need “Blue Widgets” in the title, description and Meta tags. Try to be objective and analyze your website. Focus each page on a specific group of keywords you have identified from your keyword list. Sometimes finding niche markets locally first is best; for example, “Buy Blue Widgets Australia” or “Buy Blue Widgets Melbourne”. These are good second- or even third-tier search keyword phrases, but you can make good sales with these niche terms because the searcher is specifically looking to buy these items.
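One way to catch stuffing before a search engine does is to measure keyword density yourself. The earlier checklist in this archive suggests keeping density between 3% and 5%; the sketch below is a simple word-window count, one of several reasonable ways to define density:

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count non-overlapping-agnostic sliding-window matches of the phrase.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

copy = "buy blue widgets online blue widgets are cheap blue widgets"
print(keyword_density(copy, "blue widgets"))  # 60.0 - far too stuffed
```

A density like the 60% above is exactly the kind of page this section warns against; well-written copy rarely needs the phrase more than a handful of times.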

Source: seo-news.com

Nov 16, 2010

What is better: SEO or PPC?

SEO or PPC

Many webmasters are unsure whether they should advertise their website with SEO (search engine optimization) or PPC (pay per click advertising).

Actually, most commercial websites work best if you use both SEO and PPC. The exact mix depends on your goals.

Pay per click advertising (PPC)

Advantages:
  • You get instant results. If you advertise your website on pay per click search engines, then you will get traffic now and not several months later.
  • PPC ads are perfect for time limited offers such as holiday sales.
  • You can stop PPC ads at any time.
  • PPC ads make it easy to test different keywords and landing pages.
  • PPC ads also work with websites that are not very well designed and wouldn’t get good search engine rankings.
  • PPC ads allow you to bid on a large amount of keywords, including misspellings and other keyword variations that you cannot put on your web pages.
Disadvantages:
  • PPC advertising can become very expensive if you bid on the wrong keywords or if you don’t calculate the maximum bid price correctly.
  • Click fraud can be a problem. Not all clickers are potential customers.
If you advertise your website with PPC ads, then you should use an ROI tracking tool to make sure that you don’t waste your money.
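The "maximum bid price" mentioned above follows from simple arithmetic: a click is worth your conversion rate times your profit per sale. This is a minimal sketch under that assumption; the function name and the target-ROI parameter are my own framing, not from any particular PPC platform:

```python
def max_profitable_bid(conversion_rate, profit_per_sale, target_roi=0.0):
    """Highest cost-per-click at which a click still meets the target return.

    conversion_rate: fraction of clicks that convert (e.g. 0.02 = 2%)
    profit_per_sale: profit per conversion, before advertising costs
    target_roi: desired profit per dollar of ad spend (0.0 = break even)
    """
    return conversion_rate * profit_per_sale / (1.0 + target_roi)

# With a 2% conversion rate and $50 profit per sale, the break-even bid is $1.00.
print(round(max_profitable_bid(0.02, 50.0), 2))   # 1.0
# Demanding $1 of profit per $1 of spend halves the acceptable bid to $0.50.
print(round(max_profitable_bid(0.02, 50.0, 1.0), 2))  # 0.5
```

Bidding above this ceiling is how PPC "becomes very expensive", as the disadvantage list puts it; real tracking also has to account for click fraud inflating the denominator.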

Search engine optimization (SEO):

Advantages:
  • Traffic through organic search engine results is almost free if the up-front work has been done.
  • After optimizing your website you can use your money for different things and the optimized site will still run.
  • Unlike PPC, a larger number of visitors and search result clicks does not increase your costs.
  • Search engine optimization delivers long term results that don’t require permanent financial input.
Disadvantages:
  • SEO can be relatively time-consuming up-front.
  • SEO can require a redesign of your web pages to make your website search engine friendly. However, this usually also results in a better user experience.
Search engine optimization delivers lasting results and costs considerably less in the long term. However, you must make sure that you optimize your website correctly if you want to get high search engine rankings.

Pay per click advertising and search engine optimization both contribute to the success of your website. If you use both wisely, you can get many new visitors and customers without spending a fortune. See the recommended resources below for PPC and SEO software tips.

Source – free-seo-news.com

What is Robots Text File?

The robots.txt file is a set of instructions for visiting robots (spiders) that index the content of your web site pages. For those spiders that obey the file, it provides a map of what they can and cannot index. The file must reside in the root directory of your web site. The URL (web address) of your robots.txt file should look like this:

http://www.seoconsultants.com/robots.txt

Opened in Notepad, a basic allow-all robots.txt file (not recommended; see the note further down) looks like this:

User-agent: *
Disallow:

Definition of the above robots.txt file:

User-agent: *
The asterisk (*) or wildcard represents a special value and means any robot.

Disallow:
The Disallow: line without a / (forward slash) tells robots that they can index the entire site. An empty value indicates that all URLs may be retrieved. At least one Disallow field must be present in a record; written without the / (forward slash) as shown above, it allows everything.


The presence of an empty "/robots.txt" file has no explicit semantics; it will be treated as if it were not present, i.e. all robots will consider themselves welcome.

The Disallow: line without the trailing slash (/) tells all robots to index everything. If you have a line that looks like this:

Disallow: /private/

It tells the robot that it cannot index the contents of that /private/ directory.

Summarizing the Robots Exclusion Protocol - robots.txt file

To allow all robots complete access:

User-agent: *
Disallow:

Important Note:

The above format is the common, accepted standard for allowing all spiders access to a site. We've recently learned (2002-06-09) that the practice of having just User-agent: * and Disallow: without a trailing forward slash (an "empty" robots.txt file) may not be recommended: some spiders may incorrectly interpret it as blocking all content. You'll notice that we disallow the _private, css, and JavaScript folders in the example below, and we do not recommend an empty file.

2003-05-13 - Do not disallow your /css/ directory. Recent issues with Google suggest that disallowing your css directory could be a flag for a manual review to see whether you are using CSS to deceive the indexing robots (spiders).

[Screen shot of a robots.txt file that disallows the _private, css, and JavaScript folders]


To exclude all robots from the server:

User-agent: *
Disallow: /

To exclude all robots from parts of a server:

User-agent: *
Disallow: /private/
Disallow: /images-saved/
Disallow: /images-working/

To exclude a single robot from the server:

User-agent: NamedBot
Disallow: /

To exclude a single robot from parts of a server:

User-agent: NamedBot
Disallow: /private/
Disallow: /images-saved/
Disallow: /images-working/

Note:

The asterisk (*) or wildcard in the User-agent field is a special value meaning "any Robot" and therefore is the only one needed until you fully understand how to set up different User-agents.

If you want to Disallow: a particular file within the directory, your Disallow: line might look like this one:

Disallow: /private/top-secret-stuff.htm

Keep in mind that using the above example excludes that specified page (top-secret-stuff.htm) but will not exclude the entire /private/ directory. If you have files that you do not want indexed, then you should put them in a private folder and Disallow: the entire directory, or put them in a password protected directory, or don't put them on the web at all!

You should validate your robots.txt file: enter the full URI of the robots.txt file on your server into a validator. The robots.txt file always resides at the root level of your site.
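You can also test how a rule set behaves using Python's standard-library robots.txt parser. This sketch parses the "exclude parts of a server" example from above directly, rather than fetching it over HTTP:

```python
import urllib.robotparser

# The same rules as the "exclude all robots from parts of a server" example.
rules = """\
User-agent: *
Disallow: /private/
Disallow: /images-saved/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/index.html"))                    # True - not disallowed
print(rp.can_fetch("*", "/private/top-secret-stuff.htm"))  # False - inside /private/
```

Note that, just as the section above warns, this only models well-behaved spiders; robots.txt is a request, not access control, so genuinely sensitive files belong behind a password.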

Web Site Design Guidelines

1. Your main page should specifically let your visitors know exactly what you're offering. If your potential customer can't find your product or service, they definitely won't waste a lot of time looking for it. They'll go on to the next site and probably never return. They're visiting your site for a specific purpose. They want something your site offers.

2. Create a page to display your "Privacy Policy" regarding the personal information you collect from your visitors, such as email address, Internet Service Provider, etc. Explain your reasons for collecting the information and let them know how it will be used.

3. Create a page about you and/or your company. Include your name, company name, photograph, biography, address, phone number and email contact information.

4. Display your copyright information at the bottom of each page.

5. Keep in mind, your visitors may enter your site from pages other than your main page, so make sure you include good navigational links on every page. Place your navigation links together at the top, bottom, left or right side of the page. Use tables to neatly align your links and maintain a nicely organized and uniform appearance throughout. Try to keep the number of clicks required to get from your main page to any other page on your site down to four, and place your company logo on each page.

6. Use caution when selecting your background and text colors. Busy backgrounds make text difficult to read and draw attention away from the text. In addition, always be consistent with your background theme on each page of your site. Keep in mind, colors affect mood and will have an effect on your visitors as well. Bright colors such as yellow and orange feel cheerful and energetic, while colors such as blue and purple have a calming effect. Dark colors such as brown and black can feel heavy or somber. A good rule of thumb is to choose colors based upon the type of effect you're trying to achieve.

7. ALWAYS check and double-check your site for spelling errors and make sure your images and links are all working properly. If you have several errors, this will make your site appear to be unprofessional. If you are designing your site using an HTML editor, use spell check. Proper grammar is also very important.

8. If you must use frames, use them sparingly. Frames, if not used properly, can make your site look unprofessional. Avoid making your visitors have to scroll from side to side to view your content. This can be very irritating and cause your visitors to leave.

9. If you must use Java on your site, use it sparingly. Java can be slow and has a tendency to crash browsers.

10. If you're using pop-up windows to display special offers or ezine subscription information, try to use a JavaScript that utilizes cookies. This way, the window will only be displayed to your visitors the first time they visit your web site.

11. View your web site through different browsers and screen resolutions so you will see how your visitors will view your site.

Visit:
SiteOwner- Check your web pages for HTML validity and browser compatibility.


NetMechanic - Provides a variety of free services for your web site including: browser compatibility testing, graphic file size reduction, link check, HTML check, load time check, spell check and more.


12. Continually add new content to your site. Give your visitors a reason to keep coming back.

Web Design Mistakes to Avoid:

  • Large fonts
  • Large scrolling text across the page
  • Large slow loading graphics
  • Large Welcome banners
  • Multiple banners and buttons
  • Multiple colored text
  • Multiple use of animated graphics
  • Multiple use of different fonts
  • No contact information
  • No Meta tags
  • Over powering music set to AutoPlay
  • Over use of Java
  • Pages scrolling to oblivion
  • Poor browser compatibility
  • Poor content
  • Poor load time
  • Poor navigation
  • Poor organization
  • Poor overall appearance
  • Poor use of frames
  • Poor use of mouse over effect
  • Poor use of tables
  • Pop up messages
  • Scrolling text in the status bar
  • Spelling/Grammar mistakes
  • Text difficult to read
  • Too many graphic and/or line dividers
  • Too many graphics
  • Too much advertising
  • Under construction signs
  • Animated bullets
  • Broken links and graphics
  • Busy, distracting backgrounds
  • Confusing
  • Different backgrounds on each page
If you've never designed a web page, it would be wise to become familiar with HTML (Hypertext Markup Language).

A great place to start is NCSA Beginner's Guide to HTML: HTML Premier

Take some time to research and plan your web site. Your success depends upon it. The simple, well-designed sites make the sales.