The Internet has made it possible to run a website for a truly global audience. This raises the interesting question, though: how do I best serve readers in different countries?
For many webmasters, this isn’t a problem at all. For example, here at Torque our content is identical whether you’re based in the US, UK, France, or anywhere else in the world. As long as you can read English, our content will serve you well.
Sometimes, however, a one-size-fits-all approach isn’t the best option. Consider an eCommerce store, for example. If the eCommerce store sells internationally, it may need to adjust its product pages, languages, prices, shipping costs, and currency, depending on a visitor’s location.
Creating location-dependent content is obviously a far trickier proposition, and there are two options available:
- Use alternative URLs, for example:
  - example.com vs. example.co.uk
  - example.com/us/content vs. example.com/uk/content
  - us.example.com vs. uk.example.com

  If you're concerned about duplicate content penalties, you can add the rel="alternate" hreflang annotation to each page; Google's hreflang documentation explains this in more detail.
- Dynamically display content based on a visitor's IP location or their browser's language settings (location-adaptive pages). In other words, the website detects when you're visiting from, say, the UK and displays the content relevant to a UK audience.
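To make the second option concrete, here is a minimal sketch of how a location-adaptive page might choose a content variant from the browser's Accept-Language header. The function name and the supported locales are illustrative, not part of any particular framework:

```python
# Minimal sketch of a location-adaptive page handler. The site is assumed
# to serve two variants (US and UK English); names here are illustrative.

def pick_variant(accept_language, supported=("en-US", "en-GB"), default="en-US"):
    """Return the best-matching locale from an Accept-Language header."""
    # Parse entries like "en-GB,en;q=0.8" into (locale, quality) pairs.
    candidates = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            locale, q = piece.split(";q=", 1)
            try:
                quality = float(q)
            except ValueError:
                quality = 0.0
        else:
            locale, quality = piece, 1.0
        candidates.append((locale.strip(), quality))

    # Prefer exact matches, then primary-language matches, by quality.
    candidates.sort(key=lambda pair: pair[1], reverse=True)
    lowered = {loc.lower(): loc for loc in supported}
    for locale, _ in candidates:
        if locale.lower() in lowered:
            return lowered[locale.lower()]
        primary = locale.split("-")[0].lower()
        for loc in supported:
            if loc.split("-")[0].lower() == primary:
                return loc
    return default
```

For example, a UK browser typically sends `Accept-Language: en-GB,en;q=0.8`, which this sketch resolves to the `en-GB` variant, while an unsupported language falls back to the default.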
In the past, Google recommended the former: use alternative URLs. This was because the Googlebots it sent scouring the Internet all used US IP addresses. If your website served content dynamically, the US Googlebot would only ever see the US version of the content, which made it difficult to index all your content variations for different countries and languages.
Good news, though: all this is set to change, as from the end of January, Google will be sending out a more advanced Googlebot to do its bidding.
The two significant changes to the Googlebot are:
- Location-dependent crawling: Googlebots will crawl from non-US IP addresses, too (known as geo-distributed crawling).
- Language-dependent crawling: Googlebots will crawl with different language preferences.
These changes will significantly improve the Googlebot’s ability to detect a location-adaptive page and, more importantly, to index it.
What do I need to do?
In a word: nothing.
The changes to the way websites are crawled are applied automatically, which means webmasters have nothing to worry about.
However, it’s worth pointing out that those of you looking to add location-adaptive pages to your website shouldn’t rush out to do so just yet. While these changes are undoubtedly a step forward, they are also in their early stages, and there are bound to be some teething problems that need resolving. It’s unlikely that Google will roll out all IP locations and languages in one go, so some location-dependent pages may still have indexing problems.
It’s also interesting to hear that Google still recommends that you continue to use alternative URLs. According to Google’s Pierre Far: “Separate URLs are better for users, and that’s what really counts.”
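If you do stick with separate URLs, each locale variant of a page should declare its alternates (including itself) via hreflang link tags. Here is a hypothetical sketch that generates those tags, assuming the example.com/us/… URL scheme from the list earlier; the URLs and locale codes are purely illustrative:

```python
# Hypothetical sketch: build the rel="alternate" hreflang <link> tags that
# every locale variant of a page should include in its <head>.
# The domain, path scheme, and locales are illustrative assumptions.

def hreflang_tags(path, locales=(("en-us", "us"), ("en-gb", "uk"))):
    """Return the <link> tags for all locale variants of a page."""
    tags = []
    for hreflang, prefix in locales:
        url = f"https://example.com/{prefix}/{path}"
        tags.append(
            f'<link rel="alternate" hreflang="{hreflang}" href="{url}" />'
        )
    return "\n".join(tags)
```

Note that the full set of tags is identical on every variant: the US page and the UK page both list both URLs, which is how the annotation tells search engines the pages are alternates rather than duplicates.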
At first glance, the latest Google changes look like great news for any website trying to grow its international presence: such sites will no longer be unfairly penalized for using location-adaptive pages, which previously could hurt their visibility in non-US locations. This, in theory, creates a fairer marketplace, as the sites most deserving of a spot at the top of the SERPs are allowed to rise through the ranks. Google helps its users find the best websites, and that benefits everyone.
The locale-aware crawling changes might not be relevant to everyone, but it’s certainly worth keeping up to date with all significant search engine changes. It will be interesting to see the next step Google takes, as it might give us an indication of the direction the company is heading with location-adaptive pages.
As always, we’ll keep you up to date with the latest goings on at Google and the other major search engines, and provide our take on how they will impact WordPress users.
What are your thoughts on the location-aware crawling changes? Let us know in the comments section below!
Shaun Quarton is a freelance blogger from the UK, with a passion for online entrepreneurship, content marketing, and all things WordPress.