This is not a new topic; we've been saying this for probably over a decade, and Matt Cutts even posted a video answer on it back in 2013. But hey – John Mueller from Google answered it again. You are allowed to do geolocation-based redirects, sending US users to your US site and German users to your German site, as long as you don't treat Googlebot differently than a regular user.
Here is John’s tweet:
No — it’s fine to do that, but keep in mind that we’d only index one version, so if there’s something important in there, make sure it’s also crawlable & indexable directly.
— 🍌 John 🍌 (@JohnMu) January 9, 2019
So if you serve the US version to Googlebot, since most Googlebot crawling comes from the US, and you have content on your German version that you want Google to index – make sure you give Google and users a way to access that content from the US.
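One common way to keep every regional version discoverable, even though Googlebot mostly crawls from US IPs, is to annotate each page with hreflang alternates. Here is a minimal sketch; the domains and the helper function are hypothetical placeholders, not from the article:

```python
# Hypothetical sketch: emit hreflang <link> tags so Google can discover
# every regional version of a page. The domains below are placeholders.
ALTERNATES = {
    "en-us": "https://example.com/",
    "de-de": "https://example.de/",
}

def hreflang_tags(alternates):
    """Build a <link rel="alternate"> tag for each regional URL."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```

Because these tags sit in the HTML of every version, a crawler visiting from the US can still find and index the German URLs directly.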
Here is the video from 2013:
Here is the transcript:
Today's question comes from Peter. Peter asks: Is using geo-detection techniques against Google? I am offering useful information and prices to users based on their geolocation. Will Google consider this spam, that is, showing X content to search engines and Y content to users?
So I've said this before, but let me just reiterate: geolocation is not spam.
As long as you're saying, you know, someone's coming from a French IP address, let's redirect them to the French version of my page or the French domain for my business – that's totally fine. Someone comes in from a German IP address, I'll redirect them over to the German version of my page – that's totally fine.
The thing that I would do is make sure that you don't treat search engines any differently than a regular user. So if Googlebot comes in and you check the IP address – imagine it's coming from the United States – just redirect Googlebot to the United States version of your page, or the dot com, whatever it is that you would serve to regular United States users.
So geolocation is not spam; Google does it too. Whenever users come in, we send them to what we think is the most appropriate page based on a lot of different signals, but usually the IP address of the actual user.
So the last part of the question was showing X content to search engines and Y content to users. That is cloaking – showing different content to Google than to users – and that is something I would be very careful about. But as long as you're treating Googlebot just like every other user from whatever IP address it comes from, and you're geolocating, you'll be fine. As long as you don't have special code that looks for the user agent of Googlebot, or special code that looks for the IP address of Googlebot, and you just treat Googlebot exactly like you treat a visitor from whatever country it's coming from, then you'll be totally fine, because you're not cloaking – you're not doing anything different for Google.
You're doing the exact same thing for Google that you would do for any other visitor coming from that IP address. As long as you handle it that way, you'll be in good shape: you won't be cloaking, and you'll be able to return nicely geolocated pages for Google and other search engines without any risk whatsoever.
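The rule Matt describes can be reduced to one property of your redirect logic: the target depends only on the visitor's IP-derived country, never on the user agent. A minimal sketch, with assumed placeholder domains and function names:

```python
# Minimal sketch of geolocation-based redirects that do NOT cloak.
# The domains and country codes below are placeholders, not from
# the article.
COUNTRY_SITES = {
    "US": "https://example.com/",
    "DE": "https://example.de/",
    "FR": "https://example.fr/",
}
DEFAULT_SITE = "https://example.com/"

def redirect_target(country_code):
    """Pick the regional site from the visitor's country code alone.

    Note what is deliberately absent: no User-Agent check, no list of
    Googlebot IPs. A Googlebot request from a US IP is handled exactly
    like any other US visitor, so this is geolocation, not cloaking.
    """
    return COUNTRY_SITES.get(country_code, DEFAULT_SITE)
```

If a `Googlebot` branch ever appears in a function like this, you have crossed from geolocation into cloaking territory.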
Forum discussion at Twitter.