My site fails to do content negotiation with Googlebot using the Accept-Language header

@Eichhorn148

Posted in: #Google #Googlebot #Http #Internationalization #Seo

HTTP offers Content Negotiation (by reacting on the Accept-Language request header) to deliver translated representations of a resource under one URI. In January 2015 Google finally implemented this feature and calls it "Locale-aware crawling by Googlebot".

Has anybody managed to get different representations under one URI successfully into the index?

I'm trying this for www.mixcloud-downloader.com/, which can serve a German representation in addition to its default English one. The site has been in Google's index for a month. In addition to the Vary: Accept-Language response header, I also added an indication of which languages are supported:

<link rel="alternate" hreflang="en" href="http://www.mixcloud-downloader.com/" />
<link rel="alternate" hreflang="de" href="http://www.mixcloud-downloader.com/" />


If I search for something from the German representation that is unique on the internet (e.g. "Mixcloud Internetadresse", with quotes), google.de doesn't find it.

What do I have to do to make Google aware of a translated representation under one URI?

Edit: I noticed that my application was sending a bogus Content-Language header, which might have confused Google. I fixed that on 2017-01-19. Previously it responded with, for example, a Content-Language: zh-CN header and English content (if the client requested zh-CN).


1 Comment


@Bryan171

In the Google documentation you rely on, Google recommends using separate URLs for each language version, given how sensitive and error-prone a locale-adaptive setup is.

Further, in this documentation, Google mentions:


You can help Google determine the language correctly by using a single language for content and navigation on each page
...
Keep the content for each language on separate URLs


If you want to stick with this approach come hell or high water, you should rely on both Accept-Language AND geolocation to point visitors and Googlebot to the correct language version, because your troubles begin not with Accept-Language, but with its absence.

Googlebot often crawls without an Accept-Language header at all, and you can't force it to send one. Visitors of your site who, like Googlebot, send no Accept-Language currently get only English content, regardless of their location.

You should extend your setup with a second check, for the visitor's location, something like this (a sketch follows the list):

- Visitor arrives.
- Check 1: Is an Accept-Language header present? If yes, serve the matching content; if no:
- Check 2: Geolocate the visitor's IP address. If it is in Germany, Austria, or Switzerland, serve German; otherwise serve English.
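
Here is a minimal sketch of that two-step routine, assuming a Flask application, placeholder page content, and a stub country_from_ip() helper (a real setup would plug in a GeoIP database); it is illustrative only, not your actual code:

from flask import Flask, request

app = Flask(__name__)

# Placeholder content; a real app would render full translated pages.
TRANSLATIONS = {"en": "English page content", "de": "German page content"}
GERMAN_SPEAKING_COUNTRIES = {"DE", "AT", "CH"}

def country_from_ip(ip_address):
    # Stub for a real GeoIP lookup (e.g. a MaxMind database).
    return "US"

@app.route("/")
def index():
    # Check 1: honour Accept-Language if the client sent a usable value.
    lang = request.accept_languages.best_match(list(TRANSLATIONS))
    if lang is None:
        # Check 2: fall back to geolocating the visitor's IP address.
        country = country_from_ip(request.remote_addr)
        lang = "de" if country in GERMAN_SPEAKING_COUNTRIES else "en"
    response = app.make_response(TRANSLATIONS[lang])
    # Keep Content-Language consistent with what is actually served,
    # and tell caches and crawlers that the response varies by Accept-Language.
    response.headers["Content-Language"] = lang
    response.headers["Vary"] = "Accept-Language"
    return response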


But even with this setup you can never be sure, and can never actively ensure, that Google indexes the content in all languages. The most likely outcome is that Google indexes 100% of the English content, while for the German content... my bet for the best case is 30%.

Your site has only a handful of pages - just create unique URLs for each language and you won't have this headache; the hreflang annotations would then point to distinct URLs, as sketched below.
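
For illustration only, assuming a hypothetical /de/ path for the German version (the exact URL scheme is your choice), the annotations from your question would become:

<link rel="alternate" hreflang="en" href="http://www.mixcloud-downloader.com/" />
<link rel="alternate" hreflang="de" href="http://www.mixcloud-downloader.com/de/" />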

PS: While answering your question I found a neat tool for checking exactly the question's subject: technicalseo.com/seo-tools/locale-adaptive/.
