Mobile app version of vmapp.org

@Hamm4606531

Posted in: #GoogleSearch #Traffic

Most of the traffic on my website (85%) comes from Google. Most of visits are local to the country of the domain.

I see a suspiciously consistent number of visits each month, around 12,500, with only a small variation. This includes a month when most people in my country go on vacation.

Is Google rationing the number of visits to my site? Would that be legal?




2 Comments


 

@Shanna517

Google's search ranking algorithms are commercial secrets, so it is hard to prove one way or the other whether they are rationing clicks. However, we can do some simple statistical analysis that gives us an indication of whether this may be the case.

If Google were not rationing clicks, we would expect the distribution of clicks over time (e.g. per four-week, 28-day period) to approximately follow a Poisson distribution, unless traffic events are not independent, in which case they would arrive more bunched together, e.g. around events relevant to the website. Either way, if the traffic data are much MORE constant than the Poisson distribution predicts, then we can be reasonably confident that Google is rationing clicks: any interdependence between traffic events, or "organic" traffic growth over time, should make the distribution LESS uniform, not more! Concretely, for a website receiving an average of 20 visits per 28-day period (with randomly distributed traffic events), we would expect:

1–5 visits .007% of the time
6–10 visits 1.07% of the time
11–15 visits 14.57% of the time
16–20 visits 40.26% of the time
21–25 visits 32.87% of the time
26–30 visits 9.87% of the time
31–35 visits 1.27% of the time


etc. On average, we would expect fewer than 16 visits during at least one 28-day period in every seven, and at least 26 visits during at least one 28-day period in every nine. But this is not what I am seeing. Instead, for several years, inbound organic search traffic on the site I'm investigating has never been fewer than 16 or more than 25 visits per 28-day period. Traffic is almost perfectly constant: every 28-day period, whether measured in a rolling window or sequentially (it makes little difference), I see almost exactly 18–22 clicks, never fewer than 16 and never more than 25.
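For anyone who wants to check these figures, here is a short sketch that reproduces the table above from first principles, assuming a Poisson model with a mean of 20 visits per 28-day period (the mean is an assumption; plug in your own site's average):

```python
from math import exp, factorial

MEAN = 20.0  # assumed average visits per 28-day period

def poisson_pmf(k, lam=MEAN):
    """Probability of exactly k visits under a Poisson(lam) model."""
    return exp(-lam) * lam**k / factorial(k)

def prob_range(lo, hi, lam=MEAN):
    """Probability of between lo and hi visits, inclusive."""
    return sum(poisson_pmf(k, lam) for k in range(lo, hi + 1))

# Reproduce the table above.
for lo, hi in [(1, 5), (6, 10), (11, 15), (16, 20),
               (21, 25), (26, 30), (31, 35)]:
    print(f"{lo}-{hi} visits: {prob_range(lo, hi):.2%}")

# Tails: under this model, periods outside the 16-25 band are not rare.
print(f"P(fewer than 16) = {prob_range(0, 15):.2%}")   # roughly 1 period in 6-7
print(f"P(more than 25)  = {prob_range(26, 100):.2%}") # roughly 1 period in 9
```

If your own traffic never strays outside a narrow band, comparing it against these tail probabilities is the quickest sanity check.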

I've seen the same pattern previously on other websites I've developed. Anecdotally, it seems that when you first create a web resource, Google crawls it, and once everything is tidy (no major reasons NOT to send traffic, and the site has several decent pages of unique content), they start drip-feeding the site a little traffic at a constant rate. Once the site has a few dozen decent pages of unique content and starts building a reputation, with some good-quality inbound links and organic traffic that doesn't bounce straight back to Google, traffic suddenly seems to jump from a consistent 20 visits per month to a hundred or more! It is precisely as though Google's internal safety limits are being progressively removed.

Anecdotally, then, we are seeing evidence that Google rations clicks per website. They also have a potential commercial motive to do this: out of sheer frustration with our long-standing inability to break through this apparent click-rate ceiling, we have both gone in search of answers to this question, and in the course of that search have come across Google's advertisements and tutorials for their "paid search" auction services. Whether or not this is actual Google business policy, search algorithms are such tightly held commercial secrets (necessarily so, to help obstruct black-hat SEO) that it might even be possible for some mid-level technical managers to subvert the algorithms into behaving this way, nudging frustrated webmasters into paying Google!

Potentially benign motives for this behaviour might include: to buy time for Google engineers to analyse any Web sites that suspiciously start ranking much higher than they should (pulling them apart for signs of black-hat SEO, before modifying their core ranking algorithms to demote sites on that basis). It's quite possible their ranking algorithms have various layers of defence: core algorithms for analysing quality×popularity, and safety-valves to buy time in their arms-race with the black-hats and to obfuscate the workings of their core algorithms from prying eyes (e.g. probing attacks by competitors seeking to reverse-engineer Google's secrets).

Based on a comparison of the Poisson distribution with the actual time distribution of search traffic sent to the website I am investigating, I could calculate an upper bound on the probability that Google is not manipulating traffic in this way (I estimate that this probability is very small). However, the actual distribution is so constant as to make the calculation almost pointless, and in any case Google only lets me see 90 days of history in their Search Console! (I wonder why…) Personally, I think the policy is unfair to small businesses, which are apparently being made by Google to traverse artificial deserts in search of an oasis of web traffic, potentially in part to make Google's paid advertising services look more desirable!
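As a rough illustration of that upper bound (a sketch only, assuming independent 28-day periods with a Poisson mean of 20 visits, which are modelling assumptions rather than measured facts), the probability that roughly three years of periods all land in the observed 16–25 band can be computed directly:

```python
from math import exp, factorial

MEAN = 20.0  # assumed average visits per 28-day period

def prob_range(lo, hi, lam=MEAN):
    """Probability of between lo and hi visits, inclusive, under Poisson(lam)."""
    return sum(exp(-lam) * lam**k / factorial(k) for k in range(lo, hi + 1))

# Probability that a single 28-day period falls in the observed 16-25 band.
p_band = prob_range(16, 25)

# Probability that roughly three years of independent periods ALL fall
# in that band (about 39 periods of 28 days each).
periods = (3 * 365) // 28
p_all = p_band ** periods

print(f"P(one period in 16-25 band) = {p_band:.4f}")  # about 0.73
print(f"P(all {periods} periods in band) = {p_all:.1e}")
```

Under these assumptions the all-in-band probability comes out on the order of a few in a million, which is why the distribution looks "too constant" to be chance.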



 

@Si4351233

Putting aside conspiracy theories, which very rarely have any basis in reality, there are many reasons why your traffic levels don't seem to vary a huge amount, and why a ranking can drop substantially and take a while to recover.

Core User Base:
Depending on your site's target audience, you will more than likely have a core base of users who visit regularly, some users who visit less often, and new users who visit rarely. This means that, on average, your page-view levels will remain fairly steady. It also depends on how you are calculating the page-view figure, for example whether you are using Analytics or server logs.

Black-Hat SEO Techniques:
Google regularly checks sites to see if black-hat SEO techniques are being used. This is done by automated tools, both via Googlebot and via automated bots that emulate standard browser user-agent strings and come from IP addresses not associated with Google in any way. If Google detects black-hat techniques (such as false links, spam links on other sites, etc.), the SERP rank gained by those techniques is eliminated and the site is penalised, which can take a long time to recover from. Spam links in particular are the most frequent cause of SERP ranking penalties, and Google counts the following as spam links: paid links, comment spam, blog networks, and guest blog posts.

Basically, your rank can and will be penalised for any action that violates the Google Webmaster Guidelines. Note that these penalties can result from a manual review or from algorithm updates such as Google Penguin.


