
Spammed subdomain caused huge loss in traffic, HELP

@Yeniel560

Posted in: #Backlinks #Google #Seo

About a month ago I asked a question where I mentioned I created a forum, went on holiday and when I got back the forum got spammed with 1000s of link posts.

However, the plot thickens: the spam posts on my forum have now been linked to by other spammy sites, creating thousands of backlinks pointing at my root domain.

The forum was hosted in a subdomain folder under my public_html, like this: public_html/sportsforum.example.com

All those spammy links point to sportsforum.example.com, but my main website, example.com, is feeling the effects, and competitors are shooting past me.

The site has never done ANYTHING wrong, it had a solid backlink profile, got a large number of organic searches a month and had a number of quality posts.

I have 410'ed the subdomain and don't know what else I can really do. I am kind of freaking out; any help is very welcome.
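For anyone wanting to do the same: if the subdomain is served by Apache, a blanket 410 for every URL on it can be done with a small rewrite rule. This is only a sketch; it assumes Apache with mod_rewrite enabled and uses the sportsforum.example.com host name from the question.

```apache
# .htaccess in public_html/sportsforum.example.com/
# Answer 410 Gone for every request to the spammed subdomain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sportsforum\.example\.com$ [NC]
RewriteRule ^ - [G]
```

The [G] flag is mod_rewrite shorthand for "Gone": it stops processing and returns a 410 response.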

EDIT:

Here is an image of my current link profile after the spam, generated via Majestic:



EDIT2: I do NOT want to use the disavow tool; I have heard it can be very dangerous... Opinions on this?

EDIT3: Currently Search Console is not displaying any of these spam links, but it is often slow. The spam links do, however, show up when I do a site:example.com search.




1 Comment


@Sherry384

If following a link returns a 404 response code, Google retains the URL in the index, and the link along with it. Why? Because a 404 can indicate a temporary condition. Google retries a page that returns a 404 for a period before it considers the page truly gone, which takes quite a while. Even then, Google will periodically re-fetch the target page of a link for as long as the link itself remains, and if it still finds a 404, the problem can persist.

If, however, following a link returns a 410 response code, the URL can be dropped from the index immediately and the link becomes truly broken. Why? Because a 410 is a deliberate statement that the page is gone, whereas a 404 is just the default response.

Imagine a database schema for a minute. When Google indexes a link, at least three data elements are stored: the from (source page), the to (target page), and the link text. The source and target elements are URL IDs referencing rows in a URL table, and each URL row can be assumed to carry the response code from the last fetch. If a URL returns a 404, the page is not resolvable, but the URL remains valid until determined otherwise; the link table still holds a valid target URL ID, so the link stays valid. If the URL returns a 410, the URL row can be removed, which removes the target URL ID from the link table and thus breaks the link. Now imagine the sub-domain no longer resolves at all: every URL associated with it can be removed from the URL table en masse instead of one at a time.
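The schema sketched above can be made concrete with a toy model in SQLite. The table and column names here are illustrative assumptions, not Google's actual schema; the point is just that deleting a 410'ed URL row automatically breaks every link that targets it, while a 404'ed URL keeps its links alive.

```python
import sqlite3

# Toy model of the index described above: a URL table with the
# last-fetch status, and a link table keyed by URL ids.
db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")  # enable cascading deletes
db.executescript("""
CREATE TABLE urls (
    id     INTEGER PRIMARY KEY,
    url    TEXT NOT NULL,
    status INTEGER            -- response code from the last fetch
);
CREATE TABLE links (
    source_id INTEGER REFERENCES urls(id),
    target_id INTEGER REFERENCES urls(id) ON DELETE CASCADE,
    anchor    TEXT
);
""")
db.executemany("INSERT INTO urls VALUES (?, ?, ?)", [
    (1, "https://spammysite.example/page",            200),
    (2, "https://sportsforum.example.com/thread/99",  410),
    (3, "https://sportsforum.example.com/thread/100", 404),
])
db.executemany("INSERT INTO links VALUES (?, ?, ?)", [
    (1, 2, "cheap pills"),
    (1, 3, "more pills"),
])

# A 410 is definitive: the URL row is deleted outright, and the
# cascade breaks every link pointing at it.
db.execute("DELETE FROM urls WHERE status = 410")

# A 404 is possibly temporary: the URL row stays, so the link to it
# remains "valid" until the crawler gives up on retries.
remaining = db.execute("SELECT COUNT(*) FROM links").fetchone()[0]
print(remaining)  # only the link to the 404 URL survives
```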

While I may have simplified the process, you can see how the various options affect the speed at which your problem resolves itself.
@Goyllo 's suggestion to remove the CNAME record is likely your best bet. Your second-best option is to return a 410 response code for all pages. If you continue to return a 404, the problem may not actually go away for as long as the links remain.


