Natural Web Design
SEO Tips & Tricks They Don't Want You to Know About

Take Control by Managing Your Content

Dynamic Web sites and Content Management Systems are prone to duplicate content issues. It is up to each Web site owner to manage duplicate content proactively, before Google's crude computer algorithms penalize the site for it.

You basically have two choices: either you decide for yourself which webpages you want in Google's index, or you let Google's crude computer algorithms decide for you. A quick tour of the comments on any SEO-related blog should provide all the proof anyone needs that, nine times out of ten, Google's algorithms will make the wrong decision.

Content Management Tools

Your content is managed with your robots.txt file and with robots noindex meta tags, which specifically request that certain content not be indexed. If a page is already indexed, adding a noindex meta tag will drop it from the Google index the next time the page is crawled. (A robots noarchive meta tag serves a narrower purpose: the page stays in the index, but Google will not store a cached copy of it.)
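As a quick sketch of the two tools, here is a robots.txt rule that blocks crawling of a whole directory, and the meta tags that go in an individual page's head section. The directory and page are illustrative only:

```
# robots.txt -- keep crawlers out of a hypothetical /print/ directory
User-agent: *
Disallow: /print/
```

```html
<!-- In the <head> of a page you do not want indexed -->
<meta name="robots" content="noindex">
<!-- To keep a page indexed but prevent a cached copy -->
<meta name="robots" content="noarchive">
```

Note that robots.txt works site-wide by URL path, while the meta tags must be added to each individual page.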

Duplicate Content that Exists on Your Web site

No matter how justified it may seem to you at the moment, Google positively does not like two different domains displaying identical content. So, if you attempt to capture foreign markets using country-specific domains, you had better plan on showing different content in each of your domains.

Multiple format versions of the same content will cause duplicate content issues. The two most common problems are caused by offering a version targeted at cell phones and other mobile devices, and by offering printer-formatted versions of your articles. You should specify which version of your content you want indexed, and which versions should not be indexed.
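For example, on a printer-formatted page you can either point search engines back at the full version with a canonical link, or simply keep the alternate version out of the index. The URLs below are illustrative only:

```html
<!-- In the <head> of the printer-formatted page -->
<link rel="canonical" href="http://www.example.com/article.html">
<!-- Or, alternatively, just keep this version out of the index -->
<meta name="robots" content="noindex">
```

Either approach tells Google which copy is the one you want indexed; use one or the other, not both, on the same page.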

Blogs should syndicate only short excerpts of their posts. Allowing other blogs to freely scrape entire posts from your Web site will only result in duplicate content issues. Likewise, copying entire posts from other blogs will create a duplicate content issue on your own site.

Duplicate Content that Exists ONLY in Google

Changing the URL of a webpage creates duplicate content issues in the Google index that do not exist on your Web site. You must always use a 301 redirect whenever you move any of your webpages to a new URL. Ignore the problem, or use the wrong form of redirection, and Google will treat your moved webpages as duplicate content.
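On an Apache server, a 301 redirect is a one-line directive in your .htaccess file. The old and new paths below are hypothetical:

```
# .htaccess (Apache) -- permanently redirect a moved page
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 status code tells Google the move is permanent, so the old URL's standing is transferred to the new one instead of the two being treated as duplicates.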

Referring to your home page, or any directory index, inconsistently throughout your Web site, with more than one version of a URL, can likewise create duplicate content issues that exist only in the Google search engine index.
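Beyond always linking to one form of each URL (for example, "/" rather than "/index.html"), you can collapse the www and non-www versions of your domain into one with an Apache rewrite rule. The domain below is illustrative only:

```
# .htaccess (Apache) -- 301-redirect non-www requests to the www host,
# so only one version of each URL gets linked and indexed
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this rule in place, every non-www request is answered with a permanent redirect to the matching www URL, so Google only ever sees one address per page.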

While some top-level webpages can get away with almost no text on the page, your lower-level webpages should always have at least 200 words of text. Manage undeveloped content by requesting that it not be indexed: attach a robots noindex meta tag to all stub articles.
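For stub articles specifically, a noindex,follow combination keeps the thin page out of the index while still letting crawlers follow its links to the rest of your site:

```html
<!-- In the <head> of a stub article: stay out of the index,
     but let crawlers follow the links on the page -->
<meta name="robots" content="noindex,follow">
```

When the article is fully developed, simply remove the tag and the page will be indexed on the next crawl.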




 
