Google Says Disallowing UTM Parameters In URLs Won't Help With Crawling Or Ranking



Google's John Mueller said on Reddit that disallowing URLs with UTM parameters in them won't help you improve crawling or ranking in Google Search. He added that a site should try to keep its internal URLs clean and consistent, but over time, canonical tags should help with external links that carry UTM parameters on them.

When he was asked about disallowing such URLs, John wrote, "I doubt you'd see any visible effects in crawling or ranking from this. (And if there's no value from doing it, why do it?)"

He added:

Generally speaking, I'd still try to improve the site so that irrelevant URLs don't need to be crawled (internal linking, rel-canonical, being consistent with URLs in feeds). I think that makes sense in terms of having things cleaner & easier to track - it's good website-hygiene. If you have random parameter URLs from external links, those would get cleaned up with rel-canonical over time anyway, I wouldn't block those with robots.txt. If you're generating random parameter URLs yourself, say within the internal linking, or from feed submissions, that's something I'd clean up at the source, rather than blocking it with robots.txt.

tldr: clean site? yes. block random crufty URLs from outside? no.
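To illustrate the approach Mueller describes (using a hypothetical example.com URL): instead of blocking UTM-tagged URLs in robots.txt, you serve the same page for the tagged variants and point them all at the clean URL with a canonical tag, so Google consolidates signals there over time.

```html
<!-- Served on the clean URL and on UTM-tagged variants alike, e.g.
     https://www.example.com/page
     https://www.example.com/page?utm_source=newsletter&utm_medium=email
     The canonical tag tells Google which version to index. -->
<link rel="canonical" href="https://www.example.com/page">
```

By contrast, a robots.txt rule such as `Disallow: /*?utm_` would stop Google from crawling those variants at all, which also prevents it from seeing the canonical tag and consolidating the external links they receive.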

This is all similar to previous advice from John Mueller that I quoted in these stories:

Forum discussion at Reddit.
