Side effects of customising a user's currency based upon their IP

@Jennifer507

Posted in: #Seo

I have implemented user IP detection on several of my eCommerce sites so that the currency and delivery charges shown are based upon the user's location. All is good, and this system has been working for a while, but I have begun to notice a worrying side effect with the search engines.

Google seems to only crawl websites using US-based IPs; country searches, such as on google.co.uk, now show prices by default in $ (within the listings), which is reducing the number of click-throughs as prices are no longer being shown in the user's local currency.

The impact is even greater as these are all UK-based sites. Even though they are set up to sell to the global market, we still want to keep our local market strong.

One option would be to exclude the currency/delivery detection for the spider IP range, but the same problem still applies; we have a big following in Europe, the US, Australia and New Zealand who convert better if they see their local currency within the search listings.
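For what the spider exclusion would look like: Google documents a two-step reverse/forward DNS check for verifying Googlebot, which is more reliable than maintaining an IP range list. Here is a minimal sketch (the function names are illustrative, not from the post):

```python
import socket

def hostname_is_google(host: str) -> bool:
    # Genuine Googlebot reverse-DNS names end in googlebot.com or google.com
    return host.endswith((".googlebot.com", ".google.com"))

def is_verified_googlebot(ip: str) -> bool:
    """Two-step check Google documents: reverse-DNS the IP, confirm the
    domain, then forward-resolve the hostname and confirm it maps back
    to the same IP (so a spoofed User-Agent alone can't pass)."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        return hostname_is_google(host) and ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

A request handler could then skip the geo-currency logic whenever `is_verified_googlebot(request_ip)` is true and serve a neutral default instead.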

There are product feeds set up with Google, but these don't filter into the main search listings.

The alternative route is to implement country targeting through subfolders/subdomains (/uk/, /us/, /fr/, /it/), but this seems very clunky on the modern internet. For example, I would have to list every country in Europe, even though they all pay in euros and all have the same delivery price; in effect I'd be creating around 50 extra identical pages for each product (as there are around 50 countries in Europe).

Any suggestions?





2 Comments


 

@Eichhorn148

Hate to disagree with the accepted answer, but why not use the Offer schema type, so you are not messing with CSS for data?

<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<meta itemprop="price" content="229.95" />
<meta itemprop="priceCurrency" content="USD" />
</div>
<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<meta itemprop="price" content="180" />
<meta itemprop="priceCurrency" content="GBP" />
</div>
<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<meta itemprop="price" content="400" />
<meta itemprop="priceCurrency" content="AUD" />
</div>
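If the price list varies per product, blocks like these could be generated rather than hand-written. A sketch under assumed names (the `PRICES` table and `offer_markup` helper are illustrative, not from this answer):

```python
from html import escape

# Hypothetical prices-by-currency table for one product
PRICES = {"USD": "229.95", "GBP": "180", "AUD": "400"}

def offer_markup(prices: dict) -> str:
    """Emit one schema.org/Offer block per currency, mirroring the
    meta-tag pattern shown above."""
    blocks = []
    for currency, price in prices.items():
        blocks.append(
            '<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">\n'
            f'  <meta itemprop="price" content="{escape(price)}" />\n'
            f'  <meta itemprop="priceCurrency" content="{escape(currency)}" />\n'
            "</div>"
        )
    return "\n".join(blocks)
```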


I also think this parses better from a code standpoint, as you can bake in additional information for search engines and parsers, such as eligibleRegion, seller, warranty info, etc.

For display, if this becomes an issue, simply allow the user to choose from a drop-down (which can optionally be marked up separately, using these offers as a data source). Saving the selection with account data is probably a good option too, or at the least appending a query string to the URL. Combining these should be enough to show the relevant currency for humans.
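That selection logic amounts to a precedence order. A minimal sketch, assuming a query-string value beats a saved account preference, which beats the GeoIP guess (the names and supported-currency set here are illustrative):

```python
# Hypothetical resolution order: explicit ?currency= value, then a saved
# account preference, then the GeoIP-derived guess, then a site default.
SUPPORTED = {"USD", "GBP", "EUR", "AUD", "NZD"}

def resolve_currency(query=None, account_pref=None, geoip_guess=None,
                     default="GBP"):
    for candidate in (query, account_pref, geoip_guess):
        if candidate and candidate.upper() in SUPPORTED:
            return candidate.upper()
    return default
```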

I would not use CSS myself, but it's not wrong to use it; it's a design choice. JavaScript would potentially be a good candidate for those concerned about logic load on their sites, but it is very important to use JS only presentationally: no form inputs or user data other than quantity and product ID should be decided by JS, so that your site remains usable in a non-JS state. Server-side scripting logic is the ideal for those concerned with reliability and consistency, but the schema markup included here is, I think, very powerful even in static content and simple websites.



 

@Gloria169

Google's "International" section of their Webmaster Tools documentation describes "Locale-aware crawling by Googlebot". This appears to be relatively new (it seems to have been announced in January 2015), and it is fully automated by Google (emphasis added):


Today we’re introducing new locale-aware crawl configurations for Googlebot for pages that we detect may adapt the content they serve based on the request's language and perceived location.


They are using both geo-distributed crawlers and the Accept-Language request header, which should help, but as they haven't said where these new IPs are coming from, they might still miss the UK.
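On the server side, the Accept-Language header can be used the same way: a region subtag like the GB in en-GB hints at a sensible default currency. A rough sketch (the region-to-currency table is illustrative and deliberately not exhaustive):

```python
# Hypothetical mapping from language-tag region subtags to currencies
REGION_CURRENCY = {"GB": "GBP", "US": "USD", "AU": "AUD", "NZ": "NZD",
                   "FR": "EUR", "IT": "EUR", "DE": "EUR"}

def currency_from_accept_language(header, fallback="USD"):
    """Pick the first language tag carrying a region subtag we know,
    e.g. 'en-GB,en;q=0.8' -> 'GBP'."""
    for part in header.split(","):
        tag = part.split(";")[0].strip()  # drop any ;q= quality weight
        pieces = tag.split("-")
        if len(pieces) > 1:
            currency = REGION_CURRENCY.get(pieces[1].upper())
            if currency:
                return currency
    return fallback
```

This should only ever set a default; the user must still be able to override it, since browser language and preferred currency don't always match.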

One thing that might also help is the use of microdata on your site.

This would allow you to serve all currencies in your markup, hide those that aren't valid for the current user/IP through CSS/JS, and still have Google understand what you're doing:

<div class="curr-gbp" itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<!-- all three prices are present in the markup; the machine-readable
values live in the content attributes (e.g. "USD", "1000.00"),
while CSS decides which currency's spans the user actually sees -->
<span class="usd" itemprop="priceCurrency" content="USD">$</span>
<span class="usd" itemprop="price" content="1000.00">1,000.00</span>
<span class="gbp" itemprop="priceCurrency" content="GBP">&pound;</span>
<span class="gbp" itemprop="price" content="750.00">750.00</span>
<span class="aud" itemprop="priceCurrency" content="AUD">$</span>
<span class="aud" itemprop="price" content="1500.00">1,500.00</span>
</div>


You set the class on the containing div to the currency you've selected for the user and then hide the other options through CSS:

.curr-gbp .usd, .curr-gbp .aud { display: none; }
.curr-usd .gbp, .curr-usd .aud { display: none; }
.curr-aud .usd, .curr-aud .gbp { display: none; }


Google should then recognise the mark-up and display it as appropriate in its listings.
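If more currencies are added later, the mutually exclusive rules above could be generated rather than maintained by hand. A small sketch (the `currency_css` helper is illustrative, assuming one `.curr-xxx` container class and one span class per currency as above):

```python
def currency_css(currencies):
    """For each selected-currency container class (.curr-xxx), hide the
    spans of every other currency, reproducing the hand-written rules."""
    lines = []
    for selected in currencies:
        others = ", ".join(f".curr-{selected} .{other}"
                           for other in currencies if other != selected)
        lines.append(f"{others} {{ display: none; }}")
    return "\n".join(lines)
```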


