Googlebot submitting thousands of requests to our map locator and using up the API quota
We have a store locator page on our customer's site. The end user enters their postcode and a search radius, and we display the results on a Google Map.
Recently we noticed that the site was hitting the free map search limit (roughly 25,000 requests per 24 hours) without any notable increase in overall traffic. I turned on some additional logging to try to find the cause.
It turns out that Googlebot is pushing through thousands of searches on this map. This is a small sample:
2017-07-09 23:56:22,719 [7] INFO ShopLanding - [Thread 41] Google Maps: searched G23 received OK from 66.249.66.221
2017-07-09 23:56:35,469 [7] INFO ShopLanding - [Thread 10] Google Maps: searched CA6 received OK from 66.249.66.221
2017-07-09 23:57:24,563 [7] INFO ShopLanding - [Thread 48] Google Maps: searched BN14 received OK from 66.249.66.223
2017-07-09 23:58:00,970 [7] INFO ShopLanding - [Thread 42] Google Maps: searched CB4 received OK from 66.249.66.221
2017-07-09 23:58:13,064 [7] INFO ShopLanding - [Thread 54] Google Maps: searched DY9 received OK from 66.249.66.221
2017-07-09 23:59:18,722 [7] INFO ShopLanding - [Thread 59] Google Maps: searched TS3 received OK from 66.249.66.223
2017-07-09 23:59:53,223 [7] INFO ShopLanding - [Thread 49] Google Maps: searched S45 received OK from 66.249.66.221
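(For reference, these entries look like log4net output. A minimal sketch, assuming log4net and hypothetical names, of the kind of call that would produce them; the timestamp and thread prefix come from the pattern layout, not the call itself:)

using log4net;

public class ShopLanding
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(ShopLanding));

    // Records the searched postcode prefix, the Maps API status, and the client IP.
    // The "2017-07-09 ... [Thread 41] INFO ShopLanding" prefix is added by log4net.
    public void LogMapSearch(string postcodePrefix, string status, string clientIp)
    {
        Log.InfoFormat("Google Maps: searched {0} received {1} from {2}",
            postcodePrefix, status, clientIp);
    }
}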
Is there a way that I can stop Google from pushing through so many requests? This is eating a significant proportion of the free allowance. Legitimate searches seem to be under about 200 per day.
EDIT
The site is built on C# ASP.NET. The store search uses POST, and the URL doesn't change on submit. I can post a sample of the IIS logs tomorrow morning to confirm this behaviour.
To stop Googlebot from searching via Google Maps, put a file named robots.txt in the root of your domain, e.g. www.wikipedia.org/robots.txt.
Sample robots.txt:
User-agent: Googlebot
Disallow: /search-store/
Where /search-store/ is the page that sends the request to Google Maps.
If it turns out to be something other than Googlebot, you can try disallowing all crawling of this page with:
User-agent: *
Disallow: /search-store/
Note that this won't stop misbehaving scripts that ignore robots.txt.
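If you want a server-side backstop as well, you could skip the quota-consuming Maps call for self-identified crawlers. This is a User-Agent check rather than robots.txt, so treat it as a minimal sketch, assuming ASP.NET MVC and hypothetical names (StoreSearchController, Search); adapt it to however your POST handler is actually wired up:

using System;
using System.Web.Mvc;

public class StoreSearchController : Controller
{
    // Self-identified crawler tokens; spoofed user agents will slip through.
    private static readonly string[] BotTokens =
        { "Googlebot", "bingbot", "Slurp", "DuckDuckBot" };

    [HttpPost]
    public ActionResult Search(string postcode, int radius)
    {
        var userAgent = Request.UserAgent ?? string.Empty;

        // Refuse to run the quota-consuming Maps lookup for known bots.
        foreach (var token in BotTokens)
        {
            if (userAgent.IndexOf(token, StringComparison.OrdinalIgnoreCase) >= 0)
                return new HttpStatusCodeResult(403);
        }

        // ... existing code that calls the Google Maps API goes here ...
        return View();
    }
}

Returning 403 to crawlers is safe here because robots.txt should stop Googlebot reaching the page at all; this check just caps the damage from anything that ignores it.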