
Is it possible to A/B test web servers using a PHP redirect and subdomains without breaking all the SEO goodness?

@Phylliss660

Posted in: #ABTesting #Php #Seo

We're trying to figure out if a certain type of web server architecture is better than our current web server. We need to A/B test these two boxes with live traffic to really get a sense of which one performs better on our internal KPIs. The hosts involved can't help with the A/B test, so naturally I'm looking to see if it's possible to roughly split the traffic 50:50 between the servers using PHP redirects and subdomains. For example:

mydomain.com - (randomly redirects each request to one of the following subdomains, maintaining URL structure and query strings...)

www1.mydomain.com (original webserver config)
www2.mydomain.com (new webserver config)
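A minimal sketch of what that redirecting front controller on mydomain.com could look like, using the hostnames from the example above (this is an illustration, not production code):

```php
<?php
// Minimal sketch of the redirecting front controller on mydomain.com.
// Picks one of the two subdomains at random (~50:50) and issues a
// 302 redirect, preserving the request path and query string.

function pick_target(array $hosts): string {
    // mt_rand is plenty for a rough 50:50 split.
    return $hosts[mt_rand(0, count($hosts) - 1)];
}

function build_redirect_url(string $host, string $requestUri): string {
    // REQUEST_URI already contains the path plus any query string.
    return 'https://' . $host . $requestUri;
}

if (PHP_SAPI !== 'cli') {
    $host = pick_target(['www1.mydomain.com', 'www2.mydomain.com']);
    header('Location: ' . build_redirect_url($host, $_SERVER['REQUEST_URI'] ?? '/'), true, 302);
    exit;
}
```

A 302 (rather than 301) tells search engines the move is temporary, which matches a short-lived test like this.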



We would need to run this test for at least a week or two to generate good stats. What I'm not sure about is whether the search engines would see this as bad and affect our rankings.


3 Comments


@Annie201

You could use .htaccess to always redirect Googlebot to the original server, and keep your canonical URLs pointing to that version.

Redirect people with 302 (temporary) redirects, obviously.
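A rough .htaccess sketch of that idea (mod_rewrite assumed; the hostnames are the ones from the question):

```apache
RewriteEngine On

# Googlebot (case-insensitive user-agent match) is always sent to
# the server with the original config.
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^(.*)$ https://www1.mydomain.com/$1 [R=302,L]
```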

If that's not a good solution, maybe there's another road: consider not redirecting all of your traffic to www1. or www2. Keep 34% of your visitors on mydomain.com, then split the rest between www1 and www2 at 33% each.

You'll need more time to reach statistical significance, but that way you keep a good amount of traffic on the default domain, and it would be more like an A/A/B test.
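That 34/33/33 split could be sketched in PHP like this (the bucket logic is a hypothetical illustration; hostnames are from the question):

```php
<?php
// A/A/B split sketch: keep 34% of visitors on the default domain
// (mydomain.com) and send 33% each to www1 and www2.

function assign_bucket(int $roll): string {
    // $roll is a uniform number in [1, 100].
    if ($roll <= 34) {
        return '';                   // 34%: stay on mydomain.com
    }
    if ($roll <= 67) {
        return 'www1.mydomain.com';  // 33%
    }
    return 'www2.mydomain.com';      // 33%
}

if (PHP_SAPI !== 'cli') {
    $bucket = assign_bucket(mt_rand(1, 100));
    if ($bucket !== '') {
        header('Location: https://' . $bucket . ($_SERVER['REQUEST_URI'] ?? '/'), true, 302);
        exit;
    }
}
```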


@Gretchen104

If the content and internal link structure differ between the two servers (I am not talking about the presentation), bots will notice the changes and will likely increase their visits to your site.

They will be very confused if they are being served different content back and forth and this may trigger a manual review. Google wants to be sure about the content being served to users. If algorithms have doubts, they tend to remove sites from search results automatically too.

To avoid such issues, make sure that a given user is always served content from the same server, for example by setting a cookie. Keep in mind that Googlebot doesn't use cookies, but you can detect it and send it to the server with the original config.
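One way to sketch that sticky assignment in PHP (the cookie name, lifetime, and bot pattern are illustrative assumptions):

```php
<?php
// Sticky assignment sketch: a cookie pins each visitor to one server,
// and anything that looks like a search-engine bot (which won't send
// cookies back anyway) is kept on the original-config server.

function is_search_bot(string $userAgent): bool {
    return (bool) preg_match('/googlebot|bingbot|slurp/i', $userAgent);
}

function sticky_host(?string $cookie, int $roll): string {
    // Reuse the stored assignment if the cookie value is valid.
    if ($cookie === 'www1' || $cookie === 'www2') {
        return $cookie . '.mydomain.com';
    }
    // Otherwise assign once, roughly 50:50.
    return ($roll % 2 === 0 ? 'www1' : 'www2') . '.mydomain.com';
}

if (PHP_SAPI !== 'cli') {
    if (is_search_bot($_SERVER['HTTP_USER_AGENT'] ?? '')) {
        $host = 'www1.mydomain.com';  // bots always get the original config
    } else {
        $host = sticky_host($_COOKIE['ab_server'] ?? null, mt_rand(0, 1));
        // Remember the assignment ('www1' or 'www2') for two weeks.
        setcookie('ab_server', substr($host, 0, 4), time() + 14 * 86400, '/', '.mydomain.com');
    }
    header('Location: https://' . $host . ($_SERVER['REQUEST_URI'] ?? '/'), true, 302);
    exit;
}
```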

If you do that, you will be SEO safe.


@Heady270

A/B testing via PHP will not break your SEO as long as you aren't using any kind of SEO hacking "tricks".

Webpages around the web run all kinds of different PHP scripts, and companies like Google are pretty good at figuring out what is going on.

Webpage A: Has structured information and links.

Webpage B: Has (generally) the same structured information and links.

To indexing companies these two pages are the same, but to your users one might be better (finding information, commenting, retention, etc.), and finding out which one (A or B) is the point.

Just keep an eye on your analytics platform and tag, comment, or mark any strange dips.
