Website being bombarded by non-existent URLs

@Ogunnowo487

Posted in: #Backlinks #Google #Googlebot #Links #WebCrawlers

I noticed a newly registered domain is being hit with requests for non-existent URLs like


/pithy/597/47363-1117.doc
/pithy/597/47363-1117.doc
/2015/150728.html
etc...


It's being hit thousands of times per day, which is causing high server load.

I also noticed the requests appear to be coming from a Google IP address, 66.249.79.85.

I figure these are probably old links left over from the previous domain owner, or the site could simply be under some other kind of attack.

Is there a way I can stop this bombardment from happening?

@Angela700

I noticed a newly registered domain is being hit with requests for non-existent URLs... It's being hit thousands of times per day, which is causing high server load.


There are ways to mitigate the load from all of the sources (including Google) that you believe are causing it.


1. Cut down on the amount of code in the error pages.


This means no error pages with fancy templates and no error pages with graphics. An error page consisting of a single line of text is sufficient. I serve exactly that to anyone probing restricted sections of my server: the cost is maybe 29 bytes per request instead of several kilobytes. The end result is a bandwidth saving, which may translate into a lower server bill.
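
If the site happens to run on Apache, for instance, a single directive is enough to replace a templated error page with a one-line plain-text response; this is only a sketch, and the status codes below are just the ones most likely to apply here:

# Serve a bare one-line message instead of a templated error page
ErrorDocument 404 "Not found."
ErrorDocument 410 "Gone."

These can go in the main server configuration or in .htaccess if overrides are allowed.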


2. Unload unnecessary modules.


Determine which modules you actually need and stick with only those. Every extra module loaded makes the server use additional memory. If the total memory required by all programs, including the web server itself (for example, Apache), exceeds the available RAM, the system starts using the hard drive as additional memory, and that can slow everything down.
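
As a rough sketch on an Apache box (apachectl -M is standard; a2dismod is specific to Debian/Ubuntu-style installations, so on other systems you would comment out the matching LoadModule lines instead):

# List the modules currently loaded
apachectl -M

# Disable a module you do not need (server-status here is just an example), then reload
a2dismod status
service apache2 reload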


3. Add rate limiting.


If your server is Linux-based, you can use iptables to rate-limit connections from particular IP addresses; Windows servers offer similar facilities. Rate limiting means setting the maximum number of connections per second a single IP address may make.
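
A minimal iptables sketch, using the standard connlimit and hashlimit matches; the port and the numbers are only illustrative, so tune them to your own traffic before relying on them:

# Reject a source IP once it holds more than 20 simultaneous connections to port 80
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT --reject-with tcp-reset

# Alternatively, drop new connections from any IP exceeding 10 per second (burst of 20)
iptables -A INPUT -p tcp --syn --dport 80 -m hashlimit --hashlimit-name http --hashlimit-mode srcip --hashlimit-above 10/second --hashlimit-burst 20 -j DROP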

If you are using a free or low-priced hosting service, suggestions 2 and 3 are things your hosting administrator will need to take care of.


I also noticed the requests appear to be coming from a Google IP address, 66.249.79.85.


Google crawls websites all the time, normally at a maximum rate of two requests per second. My server is no exception; it gets checked too.
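
If you want to be sure the traffic really is Googlebot rather than something spoofing its user agent, a reverse DNS lookup on the IP is a quick check; a genuine Googlebot address should resolve to a googlebot.com hostname, and the forward lookup on that hostname should return the same IP:

host 66.249.79.85
# expect a pointer to something like crawl-66-249-79-85.googlebot.com

host crawl-66-249-79-85.googlebot.com
# the forward lookup should come back to 66.249.79.85; if both match, it is really Google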

If Google is making more than two requests per second, someone may have verified your domain in Google Webmaster Tools and cranked up the requests-per-second slider under "Site Settings". In that case, you can try to undo the verification so that Google asks the offender to prove ownership of the site once again. Try the following if you haven't already:


In the document root folder, delete any file whose name begins with google, ends with .html, and contains a string of hexadecimal characters in between, for example google2da9166c91fd3e7e.html.
Remove the meta tag in your homepage that begins with <meta name="google-site-verification"
Remove the DNS TXT record from your domain that contains google-site-verification=
Remove the google analytics asynchronous tracking code from your pages.
Remove the google tag manager container snippet.


I listed half of these options as the reverse of the methods Google Webmaster Tools offers for verifying a domain. You may also want to check Google's help pages or try contacting them.
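
To see whether any verification tokens are still in place before and after cleaning up, a few quick lookups help; example.com, index.html and /var/www/html are placeholders for your own domain, homepage file and document root:

dig +short TXT example.com
# look for a google-site-verification=... record in the output

grep -i "google-site-verification" index.html
# look for the leftover meta tag in the homepage

ls /var/www/html/google*.html
# look for a leftover verification file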
