Are multiple domain names and links from the same IP address causing poor search engine rankings?

@Ann8826881

Posted in: #Google #Seo

I have an ecommerce website which is not doing so well in Google. I am trying to improve this of course, and am looking at some possibilities for why it isn't doing well.

The website has four domain names, all of which have been indexed by Google.

A few months ago I applied 301 redirects to all requests for two of the domain names, so it is now down to two: a .net and a .com.au (the redirected ones were a .net.au and a .com).
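For reference, a host-level 301 in Classic ASP can be done with a small include at the top of each page. This is only a sketch, with www.example.com.au standing in for the real preferred host name:

```asp
<%
' Hypothetical sketch: permanently redirect any non-preferred host
' to the preferred domain. "www.example.com.au" is a placeholder.
If LCase(Request.ServerVariables("HTTP_HOST")) <> "www.example.com.au" Then
    Dim strPath
    strPath = Request.ServerVariables("URL")
    ' Preserve any querystring so deep links survive the redirect
    If Len(Request.ServerVariables("QUERY_STRING")) > 0 Then
        strPath = strPath & "?" & Request.ServerVariables("QUERY_STRING")
    End If
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www.example.com.au" & strPath
    Response.End
End If
%>
```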

I prefer to use my main domain name (the .com.au), but one of the names has been around for a long time and has more inbound links. According to a PageRank tool, both are PR2.

It is a Classic ASP site and until recently had a lot of querystring parameters. In the last week or so I added URL rewriting, so there are now no parameters for most pages. I don't 301-redirect the old URLs; instead I add a canonical link tag indicating the preferred new URL.
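The canonical hint is actually a link element in the page head rather than a META tag. A minimal sketch, with a placeholder URL:

```html
<head>
  <!-- Hypothetical example: the href is a placeholder for this page's
       rewritten, parameter-free URL -->
  <link rel="canonical" href="http://www.example.com.au/products/blue-widget" />
</head>
```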

At the same time I redesigned the site and improved title tags, META descriptions, and H tags, but it hasn't been long enough for Google to re-index many of these pages.

I also looked at what pages Google has indexed, and strangely there are a lot of pages in the index which are actually keyword searches (more a bunch of random letters than actual words). It is as if someone had typed something into my search box: there are no links to pages like this, and the only way to reach them is to type something into the search box. So I now add a META robots tag with noindex,nofollow whenever I render pages like this.
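The tag described above would look like this, emitted only on internal search-result pages:

```html
<!-- Rendered only on internal search-result pages, to keep them out of
     the index and stop crawlers following the links on them -->
<meta name="robots" content="noindex,nofollow" />
```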

Years ago I set up a fake price comparison site which lists all my products and links back to my site. It has a different keyword rich domain name but is on the same server and same IP address. It's a completely different layout but does have the same product categories and product descriptions (although I have stripped formatting out of them so they are not identical except in text).

I also have a few blog sites which again are on the same server/IP and all have advertising for the website.

My questions are:


1. What should I do with the multiple domains: just use one, or continue with two or more? Could I be penalised for having multiple domains with duplicate content?
2. Should I add 301 redirects, not just the canonical tag?
3. Any idea why Google is indexing my search results pages, and did I do the right thing with the META robots tag?
4. Is the fake price comparison site likely to be causing problems?
5. Are all the links to the site from other domain names on the same IP address likely to be causing problems?


Thanks for any help. Sorry for so many questions in one.






@Yeniel560

Well, that's quite a description.

First, as you mentioned, you have to give things time to happen. As for the 301s, those URLs should no longer be getting much bot traffic, but clients may still use them, especially since one of the removed ones is the .com, which is what people (and browsers) often default to when no extension, or a wrong one, is given.


1. If the content is the same and only the domain names differ, do the redirects, keep them in place, and wait for them to work.
2. Yes; the 301 redirects and the canonical tag are complementary.
3. You did the right thing, although remember that nobody has to follow the instructions you give: a robot could ignore any META tag and any robots.txt and crawl anyway. Googlebot does honour the tags and the .txt file.
4. If the links are properly set, there should not be much of a problem from the technical side: the robot follows the links to the right product and does all its ranking and sorting from there. The shared-IP setup is not necessarily bad in itself, and Googlebot is smart enough to recognise the situation. If people reach your site in a normal way, not tricked by hidden or obscure redirections, and can follow the links from the 'fake' site to the real one, there should be no problem. That said, if the site has already done the job it was intended to do, consider taking it down or improving it - not just with more traffic redirection to your site, but with more information, more content, and some links to other sites.
5. As with the previous question, that should be no problem; Googlebot knows enough to recognise the situation - it did before your changes and it still does.
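On the robots question, a robots.txt rule can back up the META tag by keeping compliant crawlers away from the search pages in the first place. A sketch, assuming a hypothetical /search.asp path:

```
# Hypothetical path - adjust to the real search URL(s)
User-agent: *
Disallow: /search.asp
```

One caveat: a URL blocked in robots.txt is not fetched at all, so any noindex tag on that page will never be seen; for pages only reachable through the search box, blocking the crawl is usually enough on its own.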


Remember that all sites need ongoing improvement to stay up: keep improving the information, add some campaigns or promotions, check how the competitors are doing, check from different computers how your site appears in related searches, etc.

About the change in domains: if you are going to consolidate everything, remember that you want your clients to do the same, so send some newsletters telling them, two or three times over a period of a few months, so it sets in. You may also include some text on your site informing visitors of the preferred link.

Hope this helps.

Bye


