
An SEO Guide to URL Parameter Handling

URL parameters create duplicate content, waste crawl budget, and dilute ranking signals. Learn six ways to avoid potential SEO issues with URL parameters.


While parameters are loved by developers and analytics aficionados, they are often an SEO nightmare. Endless combinations of parameters can create thousands of URL variations out of the same content.

The problem is we can’t simply wish parameters away. They play an important role in a website’s user experience. So we need to understand how to handle them in an SEO-friendly way.

To do so, this guide explores what URL parameters are, the SEO issues they cause, how to assess the extent of your parameter problem, and six solutions for handling them.

What Are URL Parameters?


Also known as query strings or URL variables, parameters are the portion of a URL that follows a question mark. Each consists of a key and a value, separated by an equal sign. Multiple parameters can be added to a single URL, separated by ampersands.
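As a quick illustration of that anatomy, here is how a query string breaks down into key and value pairs, using Python's standard urllib.parse module:

    from urllib.parse import urlparse, parse_qs, urlencode

    url = "https://www.example.com/widgets?colour=blue&sort=newest"

    # Everything after the "?" is the query string.
    query = urlparse(url).query           # "colour=blue&sort=newest"

    # Each parameter is a key and value pair separated by "=",
    # and multiple pairs are joined by "&".
    params = parse_qs(query)              # {'colour': ['blue'], 'sort': ['newest']}

    # Rebuilding the query string from those pairs.
    print(urlencode(params, doseq=True))  # colour=blue&sort=newest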

The most common use cases for parameters are:

  • Tracking – For example ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
  • Reordering – For example ?sort=lowest-price, ?order=highest-rated or ?so=newest
  • Filtering – For example ?type=widget, ?colour=blue or ?price-range=20-50
  • Identifying – For example ?product=small-blue-widget, ?categoryid=124 or ?itemid=24AU
  • Paginating – For example, ?page=2, ?p=2 or ?viewItems=10-30
  • Searching – For example, ?query=users-query, ?q=users-query or ?search=drop-down-option
  • Translating – For example, ?lang=fr or ?language=de

SEO Issues with URL Parameters

1. Parameters Create Duplicate Content

Often, URL parameters make no significant change to the content of a page. A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.

For example, the following URLs would all return a collection of widgets:

  • Static URL: https://www.example.com/widgets
  • Tracking parameter: https://www.example.com/widgets?sessionID=32764
  • Reordering parameter: https://www.example.com/widgets?sort=newest
  • Identifying parameter: https://www.example.com?category=widgets
  • Searching parameter: https://www.example.com/products?search=widget

That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.

The challenge is that search engines treat every parameter-based URL as a new page. So they see multiple variations of the same page, all serving duplicate content and all targeting the same keyword phrase or semantic topic.

While such duplication is unlikely to cause you to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality as these additional URLs add no real value.

2. Parameters Waste Crawl Budget

Crawling redundant parameter pages drains crawl budget, reducing your site’s ability to index SEO-relevant pages and increasing server load.

Google sums up this point perfectly.

“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”

3. Parameters Split Page Ranking Signals

If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.

This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.

4. Parameters Make URLs Less Clickable

parameter based url clickability

Let’s face it. Parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are less likely to be clicked.

This will impact page performance. Not only because CTR can influence rankings, but also because the URL is less clickable on social media, in emails, when copied and pasted into forums, or anywhere else the full URL may be displayed.

While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.

Poor URL readability could contribute to a decrease in brand engagement.

Assess the Extent of Your Parameter Problem

It’s important to know every parameter used on your website. But chances are your developers don’t keep an up-to-date list.

So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Or know the value they bring to users?

Follow these five steps:

  • Run a crawler: With a tool like Screaming Frog you can search for “?” in the URL.
  • Look in Google Search Console URL Parameters Tool: Google auto-adds the query strings it finds.
  • Review your log files: See if Googlebot is crawling parameter-based URLs (see the sketch after this list).
  • Search with site: inurl: advanced operators: Know how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
  • Look in Google Analytics All Pages report: Search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
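For the log file step, a short script can surface which parameters Googlebot actually requests. This is a rough sketch: it assumes a standard combined log format and a file named access.log, and it matches on the user-agent string only (a thorough audit would verify Googlebot via reverse DNS):

    import re
    from collections import Counter

    # Count which query parameter keys Googlebot requests in an access log.
    param_hits = Counter()

    with open("access.log") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            # Pull the requested URL out of a combined-format log line.
            match = re.search(r'"(?:GET|POST) (\S*\?\S*) HTTP', line)
            if match:
                query = match.group(1).split("?", 1)[1]
                for pair in query.split("&"):
                    param_hits[pair.split("=")[0]] += 1

    for key, count in param_hits.most_common():
        print(key, count)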

Armed with this data, you can now decide how to best handle each of your website’s parameters.

SEO Solutions to Tame URL Parameters

You have six tools in your SEO arsenal to deal with URL parameters on a strategic level.

Limit Parameter-Based URLs

A simple review of how and why parameters are generated can provide an SEO quick win. You will often find ways to reduce the number of parameter URLs and so minimize the negative SEO impact. There are four common issues to begin your review.

1. Eliminate Unnecessary Parameters


Ask your developer for a list of every website parameter and its function. Chances are, you will discover parameters that no longer perform a valuable function.

For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.

Or you may discover that a filter in your faceted navigation is rarely applied by your users.

Any parameters caused by technical debt should be immediately eliminated.

2. Prevent Empty Values


URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.

For example, in a URL such as https://www.example.com?key1=value1&key2=&key3=, the keys key2 and key3 add no value, both literally and figuratively.
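A minimal sketch of the rule, using Python's urlencode: keep a parameter only when its value is non-empty.

    from urllib.parse import urlencode

    def drop_empty_params(params: dict) -> str:
        # A key is only worth keeping when it carries a value.
        return urlencode({key: value for key, value in params.items() if value})

    # key2 and key3 are blank, so they are dropped from the query string.
    print(drop_empty_params({"key1": "value1", "key2": "", "key3": ""}))
    # -> key1=value1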

3. Use Keys Only Once


Avoid applying multiple parameters with the same parameter name and a different value.

For multi-select options, it is better to combine the values after a single key, as shown in the sketch below.
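For example, ?colour=blue&colour=green is better expressed as ?colour=blue,green. A minimal sketch of the merge (assuming your platform can parse comma-separated values):

    from urllib.parse import parse_qsl, urlencode

    def merge_duplicate_keys(query: str) -> str:
        # Collapse repeated keys (?colour=blue&colour=green)
        # into one comma-separated value (?colour=blue,green).
        merged = {}
        for key, value in parse_qsl(query):
            merged.setdefault(key, []).append(value)
        return urlencode({key: ",".join(values) for key, values in merged.items()},
                         safe=",")

    print(merge_duplicate_keys("colour=blue&colour=green&type=widget"))
    # -> colour=blue,green&type=widget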

4. Order URL Parameters


If the same URL parameters are rearranged, search engines are generally able to interpret the pages as equal. As such, parameter order isn’t a major duplicate content risk in itself. But each of those combinations is a distinct URL that burns crawl budget and splits ranking signals.

Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.

In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layer on filtering and reordering or search parameters, and finally tracking.
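As a sketch of that ordering, the script below sorts parameters by a priority map before the URL is generated. The key names here (lang, category, page, and so on) are hypothetical; map your own parameters into the buckets.

    from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

    # Hypothetical priority map: translating first, then identifying,
    # pagination, filtering/reordering/searching, and tracking last.
    PARAM_ORDER = {"lang": 0, "category": 1, "page": 2,
                   "colour": 3, "sort": 3, "q": 3, "utm_medium": 4}

    def normalise_param_order(url: str) -> str:
        scheme, netloc, path, query, fragment = urlsplit(url)
        # Sort by priority bucket, then alphabetically within a bucket.
        pairs = sorted(parse_qsl(query),
                       key=lambda kv: (PARAM_ORDER.get(kv[0], 99), kv[0]))
        return urlunsplit((scheme, netloc, path, urlencode(pairs), fragment))

    print(normalise_param_order(
        "https://www.example.com/widgets?utm_medium=social&sort=newest&lang=fr"))
    # -> https://www.example.com/widgets?lang=fr&sort=newest&utm_medium=social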

Pros:

  • Allows more efficient use of crawl budget.
  • Reduces duplicate content issues.
  • Consolidates ranking signals to fewer pages.
  • Suitable for all parameter types.

Cons:

  • Moderate technical implementation time.

Rel=”Canonical” Link Attribute


The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.

You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying or reordering parameters. But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating or some filtering parameters.
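As a minimal sketch, assuming every parameter on the page should canonicalize to its parameter-free equivalent (true for tracking or reordering parameters, but not for pagination or translation):

    from urllib.parse import urlsplit, urlunsplit

    def canonical_tag(url: str) -> str:
        # Point any parameter-based variant at the clean, parameter-free URL.
        scheme, netloc, path, _query, _fragment = urlsplit(url)
        canonical = urlunsplit((scheme, netloc, path, "", ""))
        return f'<link rel="canonical" href="{canonical}" />'

    print(canonical_tag("https://www.example.com/widgets?sessionID=32764"))
    # -> <link rel="canonical" href="https://www.example.com/widgets" />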

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Consolidates ranking signals to the canonical URL.

Cons:

  • Wastes crawl budget on parameter pages.
  • Not suitable for all parameter types.
  • Interpreted by search engines as a strong hint, not a directive.

Meta Robots Noindex Tag


Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.

URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, Google will eventually nofollow the page’s links.
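A rough sketch of the logic, with a hypothetical list of parameter keys whose pages should carry the tag:

    from urllib.parse import parse_qs, urlsplit

    # Hypothetical parameters whose pages add no SEO value.
    NOINDEX_PARAMS = {"sessionID", "sort", "utm_medium"}

    def robots_meta_tag(url: str) -> str:
        params = parse_qs(urlsplit(url).query)
        if NOINDEX_PARAMS & params.keys():
            # The tag that tells search engines not to index the page.
            return '<meta name="robots" content="noindex">'
        return '<meta name="robots" content="index, follow">'

    print(robots_meta_tag("https://www.example.com/widgets?sort=newest"))
    # -> <meta name="robots" content="noindex">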

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Suitable for all parameter types you do not wish to be indexed.
  • Removes existing parameter-based URLs from the index.

Cons:

  • Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
  • Doesn’t consolidate ranking signals.
  • Interpreted by search engines as a strong hint, not a directive.

Robots.txt Disallow


The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.

You can use this file to block crawler access to every parameter based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be indexed.
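As a rough illustration, either pattern looks like this in robots.txt (the sessionID and sort keys are placeholders for your own parameters):

    User-agent: *
    # Option 1: block every URL containing a query string.
    Disallow: /*?*

    # Option 2: block only specific parameters.
    # Disallow: /*?*sessionID=
    # Disallow: /*?*sort=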

Pros:

  • Simple technical implementation.
  • Allows more efficient use of crawl budget.
  • Avoids duplicate content issues.
  • Suitable for all parameter types you do not wish to be crawled.

Cons:

  • Doesn’t consolidate ranking signals.
  • Doesn’t remove existing URLs from the index.

URL Parameter Tool in Google Search Console


Configure Google’s URL parameter tool to tell crawlers the purpose of your parameters and how you would like them to be handled.

Google Search Console has a warning message that using the tool “could result in many pages disappearing from a search.”

This may sound ominous. But what’s more menacing is thousands of duplicate pages hurting your website’s ability to rank.

So it’s best to learn how to configure URL parameters in Google Search Console, rather than letting Googlebot decide.

The key is to ask yourself how the parameter impacts the page content.

  • Tracking parameters don’t change page content. Configure them as “representative URLs”.
  • Configure parameters that reorder page content as “sorts”. If the sort is optionally added by the user, set crawl to “No URLs”. If a sort parameter is applied by default, use “Only URLs with value”, entering the default value.
  • Configure parameters that filter the page down to a subset of content as “narrows”. If these filters are not SEO relevant, set crawl to “No URLs”. If they are SEO relevant, set to “Every URL”.
  • Configure parameters that show a certain piece or group of content as “specifies”. Ideally, this should be a static URL. If that’s not possible, you will likely want to set this to “Every URL”.
  • Configure parameters that display a translated version of the content as “translates”. Ideally, translation should be achieved via subfolders. If that’s not possible, you will likely want to set this to “Every URL”.
  • Configure parameters that display a component page of a longer sequence as “paginates”. If you have achieved efficient indexation with XML sitemaps, you can save crawl budget and set crawl to “No URLs”. If not, set to “Every URL” to help crawlers reach all of the items.

Google will automatically add parameters to the list under the default “Let Googlebot decide”. The challenge is, these can never be removed, even if the parameter no longer exists.

So whenever possible, it’s best to proactively add parameters yourself, so that if at any point a parameter no longer exists, you can delete it from GSC.

For any parameter you set in Google Search Console to “No URLs”, you should also consider adding it in Bing’s ignore URL parameters tool.

Pros:

  • No developer time needed.
  • Allows more efficient use of crawl budget.
  • Likely to safeguard against duplicate content issues.
  • Suitable for all parameter types.

Cons:

  • Doesn’t consolidate ranking signals.
  • Interpreted by Google as a helpful hint, not a directive.
  • Only works for Google and with lesser control for Bing.

Move From Dynamic to Static URLs

Many people think the optimal way to handle URL parameters is to simply avoid them in the first place. After all, subfolders are better than parameters at helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.

To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.

For example, the URL:

www.example.com/view-product?id=482794

Would become:

www.example.com/widgets/blue
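Such rewrites normally live in your server configuration, but as a hypothetical sketch of the logic in a Python (Flask) route, with a made-up PRODUCT_PATHS lookup:

    from flask import Flask, abort, redirect, request

    app = Flask(__name__)

    # Hypothetical lookup from product IDs to static, keyword-based paths.
    PRODUCT_PATHS = {"482794": "/widgets/blue"}

    @app.route("/view-product")
    def view_product():
        # Permanently redirect the dynamic URL to its static equivalent.
        path = PRODUCT_PATHS.get(request.args.get("id", ""))
        if path is None:
            abort(404)
        return redirect(path, code=301)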

This approach works well for descriptive keyword-based parameters, such as those which identify categories, products, or filter for search engine relevant attributes. It is also effective for translated content.

But it becomes problematic for non-keyword relevant elements of faceted navigation, such as price. Having such a filter as a static, indexable URL offers no SEO value.

It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical, or worse, presents crawlers with low-quality content pages whenever a user has searched for an item you don’t offer.

It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as

www.example.com/widgets/blue/page2

Very odd for reordering, which would give a URL such as

www.example.com/widgets/blue/lowest-price

And it is often not a viable option for tracking. Google Analytics will not acknowledge a static version of a UTM parameter.

More to the point, replacing dynamic parameters with static URLs for things like pagination, onsite search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.

And having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.

Many SEO pros argue it’s possible to provide the same user experience without impacting the URL. For example, by using POST rather than GET requests to modify the page content. Thus, preserving the user experience and avoiding the SEO problems.

But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page. And it is obviously not feasible for tracking parameters and not optimal for pagination.

The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.

So we are left with this. For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.

Pros:

  • Shifts crawler focus from parameter-based to static URLs which have a higher likelihood to rank.

Cons:

  • Significant investment of development time for URL rewrites and 301 redirects.
  • Doesn’t prevent duplicate content issues.
  • Doesn’t consolidate ranking signals.
  • Not suitable for all parameter types.
  • May lead to thin content issues.
  • Doesn’t always provide a linkable or bookmarkable URL.

Best Practice URL Parameter Handling for SEO

So which of these six SEO tactics should you implement?

The answer can’t be all of them.

Not only would that create unnecessary complexity, but often the SEO solutions actively conflict with one another.

For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tag. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.

What becomes clear is there is no one perfect solution.

Even Google’s John Mueller can’t decide on an approach. In a Google Webmaster hangout, he initially recommended against disallowing parameters, but when questioned on this from a faceted navigation perspective, answered “it depends.”

There are occasions when crawling efficiency is more important than consolidating authority signals.

Ultimately, what’s right for your website will depend on your priorities.

[Image: comparison table of the pros and cons of each URL parameter handling option]

Personally, I don’t use noindex or block access to parameter pages. If Google can’t crawl and understand all the URL variables, it can’t consolidate the ranking signals to the canonical page.

I take the following plan of attack for SEO-friendly parameter handling:

  • Do keyword research to understand what parameters should be search engine friendly, static URLs.
  • Implement correct pagination handling with rel=”next” and rel=”prev”.
  • For all remaining parameter-based URLs, implement consistent ordering rules, which use keys only once and prevent empty values to limit the number of URLs.
  • Add a rel=canonical link attribute to suitable parameter pages to combine ranking ability.
  • Configure URL parameter handling in both Google and Bing as a failsafe to help search engines understand each parameter’s function.
  • Double check that no parameter-based URLs are being submitted in the XML sitemap.

No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.


Image Credits

Featured Image: Paulo Bobita
In-Post Images/Screenshots: Created/Taken by author
