10 Tips to Avoid SEO Mistakes

By Nanang Suryana | Monday, March 22, 2010

With so much misinformation out there, along with a lack of knowledge about how SEO works, you could end up getting your website banned from the search engines. Learn how to avoid the most common mistakes with these 10 simple do's and don'ts.

1. Do Create Fresh Content

Search engines are famous for penalizing websites that publish duplicate content. With plagiarism on the rise and the availability of content-checking tools such as Copyscape, marketers have become more cautious. Yahoo is considered to be among the harshest of the search engines when it comes to this penalty. Add fresh content to your website to help build visitor interest and credibility with the search engines.


2. Do Take Advantage of Google Analytics

2009 was the year when web analytics gained momentum. Google Analytics added advanced metrics and intelligence-report features that revolutionized free analytics tools, and companies realized the benefits of using web analytics to extract the data that matters to them. Implement Google Analytics to analyze that data and build a 2010 plan to increase traffic and conversion rates.
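For reference, here is a minimal sketch of the classic asynchronous ga.js tracking snippet of that era; the property ID UA-XXXXX-X is a placeholder you would replace with your own, and the block goes just before the closing </head> tag of every page:

    <script type="text/javascript">
      // Queue tracker commands; the _gaq array is processed once ga.js loads
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // replace with your own property ID
      _gaq.push(['_trackPageview']);

      // Load ga.js asynchronously so it does not block page rendering
      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
                 '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>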

3. Do Perform Competitive Intelligence

Before starting your search engine optimization program, visit the competing websites in the top results. Research these types of questions:
  • How many websites are competing for the same keyword?
  • How old are the websites in the top search engine results pages?
  • How many backlinks do the top-ranking websites have?
  • What types of social media are used by the competing sites?

4. Don't Link to Low Quality Websites

Link building is a crucial aspect of search engine optimization. Search engines treat the number of incoming links to a website as an indication of its popularity and give it priority in the rankings. Many beginners fail to realize that it is links from authoritative, quality websites that matter, and they mistakenly link to low quality websites in pursuit of higher rankings. This tactic can damage the website's credibility with search engines and, in some cases, get the website banned.

5. Do Maintain a Uniform URL Structure

If your website is dynamic, you need to rewrite the URL structure of its web pages. This keeps the URLs uniform and helps searchbots understand which page they are indexing. Maintaining a clean URL structure on a dynamic website is easy: blogging platforms like WordPress provide a permalinks option, and customized dynamic websites can use URL rewriting in the .htaccess file, as in the sketch below.
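As a rough illustration, assuming an Apache server with mod_rewrite enabled and a hypothetical product.php script, a rewrite rule in .htaccess could look like this:

    # Assumes Apache with mod_rewrite enabled
    RewriteEngine On
    # Serve the clean URL /products/123 from the hypothetical script product.php?id=123
    RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L]

Visitors and searchbots then see the short, uniform /products/123 address instead of the query-string version.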

6. Do Include Long Tail Keywords

With millions of websites competing for short tail keywords, it can take more than 6 months to rank in the top 20 for a competitive keyword. This is where long tail keywords come in handy. Long tail keywords are more specific and can contain the name of a particular product, brand or city. Ranking for long tail keywords is comparatively easier, and their conversion rate is better than that of short tail keywords. Do include those keywords in your title tags, as in the example below.
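For instance, a store targeting a long tail phrase might use a title tag along these lines (the product, city and store name here are made up for illustration):

    <title>Handmade Leather Messenger Bags in Austin | Example Store</title>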

7. Do Target the Correct Keywords

Targeting the wrong keywords is a common mistake many optimizers make, and even veteran SEO professionals do it. Marketers select keywords that they think describe their website, but the average searcher does not think in those same terms. Picking the right or wrong keywords can make or break your SEO campaign. A first-class keyword suggestion aid, such as the Google search-based keyword tool, will help you find keywords that are appropriate for your site.

8. Do Implement a Robots.txt File

The primary purpose of a robots.txt file is to gain complete control over what the searchbots index. Implement a robots.txt file only when you want to prevent unwanted web pages from being indexed. A robots.txt file is always placed in the root folder of the website, where the searchbots can access it easily.
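A minimal sketch of such a file, blocking a couple of hypothetical folders and pointing bots at a sitemap, would live at http://www.example.com/robots.txt:

    # Applies to all searchbots
    User-agent: *
    # Hypothetical folders the bots should not crawl
    Disallow: /cgi-bin/
    Disallow: /tmp/
    # Optional: tell bots where the sitemap lives
    Sitemap: http://www.example.com/sitemap.xml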

9. Don't Use Too Much JavaScript

Searchbots are not designed to read and understand JavaScript code. If a website keeps a few lines of text inside JavaScript code, chances are the searchbots will ignore the entire block of code along with the text. This is especially true of JavaScript menus. Keep the use of JavaScript to a minimum, and where it is unavoidable, move it into an external JavaScript file, as sketched below.
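A minimal sketch of that approach for a navigation menu, using hypothetical file and link names: the links stay in plain HTML so searchbots can follow them, while the behaviour lives in an external file.

    <!-- Menu behaviour moved out of the page into a hypothetical external file -->
    <script type="text/javascript" src="/js/menu.js"></script>

    <!-- The links themselves remain plain HTML that searchbots can crawl -->
    <ul id="nav">
      <li><a href="/products/">Products</a></li>
      <li><a href="/services/">Services</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>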

10. Don't Use Flash for SEO

Flash websites are very eye-catching, but a search engine spider cannot read Flash content and therefore cannot index it. If a Flash-centric website is unavoidable and you need search engines to index it, you will have to offer an HTML version too.
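One common way to provide that HTML version is to put plain HTML fallback content inside the <object> tag that embeds the movie; a minimal sketch with a hypothetical intro.swf and placeholder text:

    <object type="application/x-shockwave-flash" data="intro.swf" width="800" height="600">
      <param name="movie" value="intro.swf" />
      <!-- Fallback shown to searchbots and visitors without Flash -->
      <h1>Example Company</h1>
      <p>A plain HTML summary of what the Flash movie presents,
         with crawlable links to the rest of the site.</p>
      <a href="/index.html">Browse the HTML version of the site</a>
    </object>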

These 10 simple tips can help you avoid potentially dangerous SEO mistakes, ensure your site gets indexed, and boost your rankings.

