How should the "AdSense and DoubleClick" phrase in the Google Webmaster Guidelines be interpreted?

@Heady270

Posted in: #Advertising #GoogleSearch #RobotsTxt

I was reading Google Webmaster Guidelines and came across a phrase I do not understand.


Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.


I tried searching Google and here for more explanation, but no luck.

Are they saying that we should be blocking something or should not be blocking something? And blocking what exactly?

I know they want sites to use rel="nofollow" on paid links. But those would be external links that are not controlled by my robots.txt file.

Can someone explain this to me, with examples if possible?


2 Comments


@BetL925

Google does not want paid links to be counted by the search engine as editorial "recommendation" links that pass PageRank and reputation to the page being linked to. When sponsored links are left crawlable for Googlebot, Google treats that as "link selling" and is willing to apply penalties that include:


Lowering the PageRank reported for the site in the Google Toolbar (to make the site look less valuable to those buying links).
Making the site's rankings in the search results worse.
Removing the site entirely from the Google search results.


Here is an article that discusses link selling and its penalties.

When you have sponsored links on your site, Google wants you to take steps to ensure that Googlebot does not crawl those links and view them as your endorsement of the site being linked to. Technical measures that can be employed are:


Serving the ads with JavaScript and blocking those scripts with robots.txt.
Using a rel="nofollow" attribute on each link to prevent it from passing PageRank.
Redirecting links through a tracking script whose URL is blocked in robots.txt (see the sketch after this list).
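
For illustration, here is a minimal robots.txt sketch for the first and third techniques. The /ad-scripts/ and /go/ paths are made up for this example; use whatever paths your ad-serving code and tracking redirect actually live under.

    # robots.txt at the root of your site (paths below are hypothetical)
    User-agent: *
    # Block the JavaScript that renders the sponsored ads
    Disallow: /ad-scripts/
    # Block the tracking/redirect script that sponsored links point at
    Disallow: /go/

With something like that in place, a sold link can point at /go/advertiser-name instead of directly at the advertiser's site, so Googlebot never crawls through it to the destination.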


As Google notes in their document, you don't have to worry about this with most ad networks such as AdSense and DoubleClick; they handle the required blocking for you. They use a combination of these techniques: the ads are served via JavaScript, and both the ad-serving JavaScript and the ad URLs themselves are disallowed in robots.txt.

The only time that you really need to worry about this is if you are selling ads yourself and directly linking to other sites.
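
If you do sell links directly, the simplest of the measures above is usually the nofollow attribute. A minimal HTML sketch (the advertiser URL is made up):

    <!-- Directly sold link; nofollow tells Google not to pass PageRank through it -->
    <a href="https://advertiser.example.net/" rel="nofollow">Advertiser name</a>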


@Ravi8258870

My interpretation of this is that they want you to block any pages that are nothing but advertisements. For example, you may have an ad server on an "ads.example.com" subdomain; you should place a robots.txt file on that subdomain to stop the ads it serves from being crawled.
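
As a sketch of that setup, the robots.txt served at the root of the ads subdomain could simply disallow all crawling (the ads.example.com name comes from the example above):

    # Served as https://ads.example.com/robots.txt
    User-agent: *
    Disallow: /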

The second sentence is slightly confusing, but my interpretation reads: "For example, we [Google] block Google AdSense ads with robots.txt." In other words, "For example, look at how we do it in gan.doubleclick.net/robots.txt."


