
403s appearing in WMT, but when I Fetch as Google the same URLs return 200

@Vandalay111

Posted in: #403Forbidden #GoogleSearchConsole #Wordpress

Hi there. Over the last few days Google Webmaster Tools has been showing me 403 errors for a series of URLs. However, when I fetch them they come back as 200 OK, and when I search Google using keywords they appear in the index. Everything seems fine except for the errors. Is there any reason for this? Is there anywhere I should check for the cause? It is a WordPress site.





4 Comments


 

@Deb1703797

Moray Bresnihan, checking the crontab and using ps requires shell access to the server as the root user. This is achievable if you are renting a dedicated server (or equivalent). Depending on your setup, on a properly secured server you would then log in to the shell over SSH on a special port (something other than 22), using the credentials your dedicated server provider supplied for an account other than root.

A good Windows program for getting into your shell is PuTTY. The download page is here:
www.chiark.greenend.org.uk/~sgtatham/putty/

Once logged in, use the su command and type in the root password when prompted. From there, you can run crontab -e to open the default editor with the list of items that will be executed at specified times.
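
For example, a session might look something like this (the port number and hostname are placeholders; use whatever your provider gave you):

ssh -p 2222 youruser@your-server.example.com   # log in on your provider's SSH port
su -                                           # switch to root; enter the root password when prompted
crontab -l                                     # list root's scheduled jobs
crontab -e                                     # edit them in the default editor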

If you are unlucky enough not to have shell access, you will need to contact your hosting provider and ask them to do the above for you.

Maxing out your memory is terrible regardless of how you look at it. You want some RAM left free even when every possible process is running on the server. Running out of free RAM makes the server slower at best, because swap space is then used as memory, and that in turn makes the disk work harder than normal.

You may also want to look into stress testing. This means hitting server email, server FTP, and a PHP page on Apache at least a few hundred times a second (Apache Bench is a good tool for this). When you stress test, memory usage will swing wildly because many processes start and stop frequently under load.
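
As a rough sketch of the web-server part of such a test, assuming Apache Bench (ab) is installed and example.com/index.php stands in for a page on your own site:

ab -n 1000 -c 100 "http://example.com/index.php"

Here -n is the total number of requests and -c is how many run concurrently; watch free -m in another terminal while it runs to see how far your memory drops.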



 

@Harper822

In response to your error log reply, it seems your problem is worse than you're claiming it to be. Based on WHOIS records (see whois.com), the IP address you specified does belong to Google. I pasted the first few lines of the result so you know what to look for:

#
# ARIN WHOIS data and services are subject to the Terms of Use
# available at: www.arin.net/whois_tou.html
#

#
# The following results may also be obtained via:
# whois.arin.net/rest/nets;q=66.249.93.188?showDetails=true&showARIN=false&ext=netref2
#

NetRange: 66.249.64.0 - 66.249.95.255
CIDR: 66.249.64.0/19
OriginAS:
NetName: GOOGLE
NetHandle: NET-66-249-64-0-1
Parent: NET-66-0-0-0-0
NetType: Direct Allocation
RegDate: 2004-03-05
Updated: 2012-02-24
Ref: whois.arin.net/rest/net/NET-66-249-64-0-1

OrgName: Google Inc.
OrgId: GOGL
Address: 1600 Amphitheatre Parkway
City: Mountain View
StateProv: CA
PostalCode: 94043
Country: US
RegDate: 2000-03-30
Updated: 2013-08-07
Ref: whois.arin.net/rest/org/GOGL
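
For reference, the same record can be pulled from a shell, assuming the whois client is installed:

whois 66.249.93.188

The NetName: GOOGLE and OrgName: Google Inc. lines are the giveaway.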

I'm actually surprised the server returned a 403 error; from your error explanation, I would have expected a 500.

Nevertheless, the REAL problem is that you're trying to run too many things on your server and you don't have enough memory to cover everything. You have a few options, starting with the least recommended:


Create a swap file (or partition) and set its size to at least the amount of extra memory you need. I'd only recommend this if you are in an extreme hurry to have the website working in an OK state. The downside to this method is that a disk is used as memory, and accessing a disk is far slower than accessing RAM (a setup sketch follows this list).
Cut back on running processes. If your server is only meant to serve web pages and that's it, then make sure the mail server, and anything else you don't need, never starts.


If you use Linux, check the crontab and disable any scheduled programs you don't need by removing their entries. Also, use the ps command to see what processes are running and the kill command to stop anything that doesn't need to be running (a command sketch follows this list).

If you are still low on memory, check your web server; I'll assume Apache. Lower the MaxClients and ServerLimit values in httpd.conf and then restart Apache (a graceful restart will likely not pick these changes up; see the config sketch after this list).


Install more RAM in the server.
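
A minimal sketch of option 1 on Linux, assuming you want a 1 GB swap file at /swapfile (the size and path are placeholders):

fallocate -l 1G /swapfile       # reserve the space (or: dd if=/dev/zero of=/swapfile bs=1M count=1024)
chmod 600 /swapfile             # a swap file should not be world-readable
mkswap /swapfile                # format it as swap
swapon /swapfile                # enable it immediately
echo '/swapfile none swap sw 0 0' >> /etc/fstab   # optional: keep it across reboots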

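For option 2, a rough sketch of hunting down and stopping processes you don't need (postfix is only an example of a service you might not want running):

crontab -l                       # review the scheduled jobs; crontab -e to remove entries
ps aux --sort=-%mem | head -15   # show the processes using the most memory
kill 1234                        # replace 1234 with the PID of something you don't need
service postfix stop             # stop a whole service outright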

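And a sketch of the Apache tuning mentioned above, for a 2.2-era prefork setup (the numbers are illustrative only; in Apache 2.4 MaxClients is called MaxRequestWorkers):

# in httpd.conf
ServerLimit   64
MaxClients    64

# then do a full stop and start rather than a graceful restart,
# because ServerLimit changes are ignored otherwise
apachectl stop
apachectl start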
Whatever you do, keep an eye on free memory every now and then and only load essential services. On Linux, you can see free memory in megabytes with this command:

free -m


The second number on the -/+ buffers/cache line (the one under the free column) is your free usable memory.

                     total       used       free     shared    buffers     cached
Mem:                  1005        615        390          0         20        220
-/+ buffers/cache:     373        631
Swap:                  243          0        243


On my system, I have 631 MB free usable memory out of 1005 MB. 390 MB hasn't been used by the OS at all.

Also, suPHP is a handler for PHP that spawns a new process for every request, so you'll need enough memory to cover that as well.



 

@Jamie184

A 403 error means Forbidden. It can happen for a number of reasons, from file/directory permissions, to not having DirectoryIndex defined in your configuration, to a code error within your CMS/blogging software. It has nothing to do with the network or your connection; in fact, the server responded, didn't it? It simply means that the request you made was denied for a reason within the scope of a 403.

If you are finding intermittent errors, blame code first. What software are you using on your site? See if there is an update. I say this because I recently went through the same thing with the code on my own site (and yes, I wrote it, thank you very much). It was a code error that threw an exception within Apache, which then resulted in a 403. While no one can tell you why this is happening without putting their fingers on your keyboard, I would inspect the log files, see what requests were made, and start from there. It took me a couple of days to figure out that I was clobbering data between request processes when the requests-per-second count got rather high (multiple dozens). The odds of clobbering data were infinitesimal even then; out of 68,000 requests I only got about 12 403s, and only during very rapid request periods. Still, it was a code issue and simple to fix.

Otherwise, a 403 should be consistent and happen all the time.
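
Either way, the logs are the quickest place to start. On a typical Apache setup you could pull the 403s out of the access log like this (the path and the assumption of the default combined log format are mine; adjust for your server):

awk '$9 == 403 {print $1, $7}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head

That counts 403 responses per client IP and requested path, which quickly shows whether the errors cluster around particular URLs or particular clients.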



 

@Angela700

Timing can be an issue.

At the time the URLs turned up with a 403, they probably referenced files on the server with bad permissions. Make sure that the world has at least read and execute permissions on the file. On Linux, you can use

chmod 755 (path/to/file with issue)
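
If a whole directory tree is affected (for example the WordPress install), a common convention is 755 on directories and 644 on files; /var/www/html below is a placeholder for wherever your site actually lives:

find /var/www/html -type d -exec chmod 755 {} \;   # directories: owner rwx, everyone else rx
find /var/www/html -type f -exec chmod 644 {} \;   # files: owner rw, everyone else read-only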


Your successful fetch probably happened AFTER the correct permissions were set.

Google also has ranges of IP addresses reserved to itself for various operations, and chances are you have a firewall installed that is blocking some of the addresses Google Webmaster Tools crawls from.
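
One way to check whether a blocked or logged address really is Google's crawler (using the IP quoted elsewhere in this thread as an example) is a reverse lookup followed by a forward lookup:

host 66.249.93.188
# if the name returned ends in googlebot.com (e.g. crawl-66-249-93-188.googlebot.com), look it up forwards:
host crawl-66-249-93-188.googlebot.com
# it should resolve back to the same IP address

If both directions match and the name is under googlebot.com, the requests are genuinely from Google and should not be firewalled.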

If that doesn't help, look through all your Apache logs and see if they report errors for certain file accesses. If there is anything unusual in your logs, then chances are your server is being probed or attacked.
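
A quick way to see what Apache itself thought was wrong (the log path is again an assumption; adjust for your distribution):

grep -i 'denied' /var/log/apache2/error.log | tail -20

Apache usually records a 403 in the error log with a message such as "client denied by server configuration", along with the file or directory it refused to serve.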


