
Using subdomains to bypass Google's sandbox




By Rob Sullivan

I found a post on SearchEngineWatch's forums very interesting. While I haven't tried it yet, it does seem to have merit. It describes using a subdomain on an existing site to help kick a new domain into the index and out of the "sandbox." There was apparently some talk at SES Chicago about getting a site indexed in the regular results much more quickly. The idea involves taking an established, similar domain (either one you own or one you bought) and using it to help pull a new site out of the "sandbox." The full forum post can be found here.

First, a caveat: the technique sits in the gray-to-black range. That said, if your site has been boxed for a while, it may be an alternative worth considering. It also requires some coding, and it assumes your site is built in PHP, although I would guess similar ASP code exists out there.

First, let's look at how this works. You have an established, related domain and a new domain which is 'boxed. By creating a subdomain on the established site and mirroring the new domain's content there, you get the subdomain indexed more quickly, because it inherits some of the trust of the main domain.

Once the subdomain has established itself, you use some form of redirect (likely a 301) to send crawlers to the new domain. The new domain then inherits whatever link popularity the subdomain gained from the established, trusted domain.

Sounds simple, but there are a few things you need to do.

The first step, obviously, is to find an established domain. If you need to buy an expired but still relevant site (and it's in your budget), the author recommends you do so. According to the author, you should also leave the registrar information unchanged (this is the part that falls into the darker gray range).

You don't want to change the registrar information because Google may notice the change in ownership, and any trust the domain you just bought previously earned would likely be lost.

So let's say you just bought a related domain that's been around for a couple of years and has a PageRank of 5. By leaving the site intact and not changing the registrar information, you essentially ensure the site maintains its existing stature in the engines.

Then you create a subdomain on the site. Here you place a mirrored copy of all the content, navigation, etc. from your new site. Since the new site hasn't been added to the index yet, there will be no duplicate content penalty.
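
To make the mirroring step concrete, here is a minimal PHP sketch of one way to do it. This is my own illustration, not the code from the forum post, and newdomain.example stands in for your actual new domain; a plain static copy of the files would serve just as well.

<?php
// Hypothetical mirror script for the subdomain (not the forum's code).
// Serves each request by fetching the matching page from the new, still
// unindexed site. Requires allow_url_fopen to be enabled;
// newdomain.example is a placeholder for the real new domain.
$path = $_SERVER['REQUEST_URI'];
$html = @file_get_contents('https://newdomain.example' . $path);

if ($html === false) {
    // The page doesn't exist on the new site, so return a 404 here too.
    http_response_code(404);
    exit;
}

echo $html;
?>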

You will also use some PHP code to alter the HTTP headers the webserver sends, so the pages appear to have been created earlier than they actually were (the suggested PHP code is found in the forum post linked above). By having the webserver report the pages as old, you are informing the crawler that the pages are old as well.

This is because the crawler requests this information from the webserver at the time of indexing.
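
As an illustration of the kind of header manipulation being described (again my own sketch, not the code from the forum post, and the date is an arbitrary placeholder), a PHP page can send a backdated Last-Modified header:

<?php
// Hypothetical header-backdating sketch (not the forum's code). Sends a
// Last-Modified header claiming an arbitrary old date, so a crawler that
// asks the webserver for the page's age is told the page is old.
// This must run before any page output is sent.
$backdated = gmdate('D, d M Y H:i:s', mktime(0, 0, 0, 6, 15, 2003)) . ' GMT';
header('Last-Modified: ' . $backdated);
?>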

Because you've created a completely new section within an established domain, the new section will get indexed sooner than the new domain would on its own.

It inherits link popularity and trust from the parent domain, allowing it to establish itself more quickly than the new site could.

Once the subdomain has been fully indexed by Google, you will want to redirect it to the new domain.

By doing this, you've allowed the content to be found by Google, which assumes the pages are properly aged because the webserver has told it the pages are in fact old (even though in reality you created them recently).

By redirecting the subdomain, you pass the link popularity and trust the main domain gave the subdomain on to the new site.
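
For the redirect itself, a 301 at the top of each mirrored page might look like the following sketch (my own illustration, with newdomain.example again as a placeholder); it would replace the mirroring code once the subdomain is indexed:

<?php
// Hypothetical redirect sketch: issue a 301 (permanent) redirect from
// the subdomain page to the matching page on the new domain, passing
// along whatever the subdomain has earned. newdomain.example is a
// placeholder for the actual new site.
header('Location: https://newdomain.example' . $_SERVER['REQUEST_URI'], true, 301);
exit;
?>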

The reason this works is that the established site is already trusted by Google. The vote from the trusted site therefore signals to Google that the new site can be trusted as well.

There are, however, some things to consider with this tactic:

Now that it's been widely publicized, I wouldn't expect it to take long for Google to notice the hole and patch it.

Also, the whole trustbox patent is based partly on authority but also on age. So while a page may appear old (because you've altered the headers served), Google may opt instead to count the page's age from the time it found the page.

In other words, even if a page claims to be a year old, if Googlebot found it only yesterday, then as far as Google is concerned it's one day old. While the patent does say "scoring the document based, at least in part, on the inception date corresponding to the document," it also goes on to say that Google could determine that inception date not from the page's stated date, but from the date it first found the page.

And remember that, as with any blatant manipulation, you risk being penalized by Google. Don't forget that Google engineers visit these forums too, so they are keenly aware when new tactics designed to circumvent the current algorithms are shared.

It is their job to fix those holes, and likely also to find ways to penalize the sites taking advantage of them. While no one can prove or disprove this, I've heard of enough sites that have been removed from the index for doing something they weren't supposed to.

Therefore, while this may sound like a great way to get yourself out of the 'box early, consider the alternatives. What if you do get out of the 'box early but Google catches on in three months, six months, or more? Do you think they might decide to "backdate" any changes to your site if they determine you participated in such a tactic? Then you are not only back where you started; you could be worse off than if you had just taken your lumps and done things properly.

 
 
About the Author
Rob Sullivan is an SEO Consultant and Writer for http://www.textlinkbrokers.com. Please link to this site if you publish this article.

Article Source: http://www.simplysearch4it.com/article/18906.html
 
If you wish to add the above article to your website or newsletter, please include the "Article Source: http://www.simplysearch4it.com/article/18906.html" line as shown above and make it a hyperlink.



  