An SEO Guide to URL Parameter Handling
URL parameters can be an SEO nightmare: combinations of parameters can create thousands of URL variations from the same piece of content. However, URL parameters also play an important role in a website’s user experience, so it’s important to understand how to use them in an SEO-friendly way.
Check out our SEO guide to URL parameter handling to learn more about what URL parameters are, some of the SEO issues they can cause and some solutions and best practices.
What Exactly are URL Parameters?
Parameters are the part of a URL that follows a question mark. Also known as query strings or URL variables, URL parameters consist of a key and a value separated by an equal sign (key=value). You can add multiple parameters to one web page by joining them with an ampersand (&).
There are a number of use cases for parameters, including tracking, reordering, filtering, identifying, paginating, searching and translating.
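As a minimal illustration of that structure, here is a Python sketch using only the standard library and a hypothetical example URL that combines a filtering parameter with a tracking parameter:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical URL combining a filtering parameter and a tracking parameter.
url = "https://example.com/shoes?colour=blue&utm_source=newsletter"

parts = urlsplit(url)
params = parse_qs(parts.query)  # everything after the "?", split on "&" and "="

print(parts.path)   # /shoes
print(params)       # {'colour': ['blue'], 'utm_source': ['newsletter']}
```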
Which SEO Problems Do URL Parameters Cause?
So exactly which SEO issues do URL parameters cause? URL parameters cause a wide variety of issues, including:
- Parameters create duplicate content
- Parameters waste crawl budget
- Parameters split page ranking signals
- Parameters make URLs less clickable
1 – Parameters create duplicate content:
URL parameters often make no real change to the content of a page, meaning that a URL with tracking tags or a session ID can serve content identical to the original. Because search engines treat each parameter-based URL as a separate page, they end up crawling and evaluating multiple near-identical variations of the same content. Duplicate content is a serious problem in SEO, and these similar URLs can lower Google’s view of your overall website quality.
2 – Parameters waste crawl budget:
Crawling redundant parameter pages wastes crawl budget, reduces your website’s ability to get SEO-relevant pages indexed and increases server load. Overly complex URLs with multiple parameters cause issues for crawlers by creating high numbers of URLs that point to the same or similar content on your website. Google then struggles to fully index all of the content on the site, or consumes more bandwidth than necessary to do so.
3 – Parameters split page ranking signals:
If you have multiple versions of the same page content, links and social shares land on different versions and dilute your ranking signals. This can confuse crawlers, which become unsure which page to index for a search query. As a result, multiple versions of the same page will likely rank lower in the search results, rather than one version moving to the top.
4 – Parameters make URLs less clickable:
Parameter URLs are often ugly or difficult to read, making them look less trustworthy and less likely to be clicked. The appearance of the URL itself affects page performance: click-through rate influences rankings, and these URLs are less clickable on social media or in emails when the full URL is displayed. Every tweet, like, share, email, link and mention matters for your website, and URL readability contributes to your brand engagement.
Now that you know a bit more about some of the problems that URL parameters can cause, it’s crucial that you fully understand the extent of your parameter problem.
Assess the Scope of Your URL Parameter Problem
It’s important to know which parameters are used on your website, and chances are that your developers don’t keep an updated list of all parameters. That is why you need to follow these steps to fully understand the scope of your problem, learn which parameters need handling, better understand how search engines crawl and index these pages and know the value they bring to users.
Read and follow these five steps in order to better understand your parameter problem:
- Run a crawler. Use a crawler tool to search for any instance of the question mark in the URL.
- Look in Google Search Console’s URL Parameters tool. Google typically auto-adds any query strings it finds, so you can often see your parameter-based URLs here.
- Review log files. Check whether Googlebot is crawling parameter-based URLs (see the sketch after this list).
- Search with the site: and inurl: advanced operators. See how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key query.
- Look in the Google Analytics All Pages report. Search for question marks to see how each of the parameters you found is being used. Make sure URL query parameters have not been excluded in the view settings.
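To make the log file step concrete, here is a minimal sketch, assuming a standard combined-format access log at a hypothetical path, that counts how often Googlebot requests parameter-based URLs:

```python
import re
from collections import Counter
from urllib.parse import urlsplit, parse_qs

LOG_FILE = "access.log"  # hypothetical path to your server's access log

param_keys = Counter()
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = re.search(r'"(?:GET|POST) (\S+)', line)  # requested URL in a combined-format log line
        if not match:
            continue
        query = urlsplit(match.group(1)).query
        if query:
            param_keys.update(parse_qs(query).keys())

# The parameter keys Googlebot is spending the most crawl budget on.
for key, hits in param_keys.most_common(10):
    print(f"{key}: {hits} Googlebot requests")
```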
With all of this information, you can decide how to best handle your website’s parameters and address any SEO issues.
SEO Solutions to Issues with URL Parameters
There are a number of different tools you can use to address URL parameters and boost your SEO.
Limit parameter-based URLs
If you review how and why parameters exist, you can often find ways to reduce the number of parameter-based URLs and minimize the negative SEO impact. Consider these four common problems to begin your review.
- Eliminate unnecessary parameters. Ask your developer for an up-to-date list of all website parameters and their functions. More often than not, when you do this, you will discover parameters that are no longer necessary, since they don’t provide a valuable function. SessionID parameters, for example, might no longer be necessary but may still exist on your website, because they were used historically. Eliminate any parameters caused by technical debt right away.
- Prevent empty values. URL parameters should only be added to a URL if they have a function. Do not add a parameter key to a URL if the value is blank.
- Only use keys once. Avoid using the same parameter key multiple times with different values. For multi-select options, it is typically better to combine the values under a single key.
- Order URL parameters. When the same parameters appear in a different order, the content is identical, but search engines see two separate URLs. Those duplicate URLs burn through crawl budget and split ranking signals, causing your pages to rank lower in the search results. You can avoid this problem by asking your developer to write a script that always places parameters in a consistent order (see the sketch below).
This tactic will enable more efficient use of crawl budget, decrease any duplicate content issues and consolidate ranking signals to fewer pages. While this technique does require some technical implementation time, it is suitable for all parameter types.
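As a sketch of what such a script could look like, assuming Python on the server side and illustrative parameter names, the normaliser below drops empty values, merges repeated keys and sorts parameters into a consistent order, covering the last three points in the list above:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def normalise_url(url: str) -> str:
    """Rewrite a parameterised URL into a single consistent form:
    drop empty values, merge repeated keys and sort keys alphabetically."""
    parts = urlsplit(url)
    merged: dict[str, list[str]] = {}
    for key, value in parse_qsl(parts.query, keep_blank_values=True):
        if value:  # prevent empty values such as "?size="
            merged.setdefault(key, []).append(value)
    # Combine multi-select values under a single key, then sort keys for a consistent order.
    query = urlencode(sorted((k, ",".join(v)) for k, v in merged.items()), safe=",")
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

print(normalise_url("https://example.com/shoes?size=&colour=red&brand=acme&colour=blue"))
# https://example.com/shoes?brand=acme&colour=red,blue
```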
Rel=”canonical” link attribute
The rel=”canonical” link attribute tells search engines that a page has the same or similar content to another page, encouraging them to consolidate ranking signals to the URL specified as canonical. This technique is not the best option when the parameter page content is not similar enough to the canonical, as with pagination, searching or translating parameters. The tactic has a number of benefits: easy technical implementation, protection against duplicate content issues and consolidation of ranking signals to the canonical URL. On the other hand, it still wastes crawl budget on parameter pages, search engines treat it as a strong hint rather than a directive, and it is not suitable for all parameter types.
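As a minimal sketch, assuming a Python-rendered template and an illustrative list of tracking parameters, a canonical tag for a parameterised URL might be generated like this:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Illustrative set of parameters that never change page content.
TRACKING_KEYS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonical_link_tag(url: str) -> str:
    """Strip content-neutral parameters and emit the rel="canonical" tag
    to place in the <head> of the parameterised page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_KEYS]
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_link_tag("https://example.com/shoes?colour=blue&utm_source=newsletter"))
# <link rel="canonical" href="https://example.com/shoes?colour=blue">
```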
Meta robots noindex tag
This tag allows you to set a noindex directive for any parameter-based page that adds little or no SEO value. The meta robots noindex tag prevents search engines from indexing the page. URLs with a noindex tag are typically crawled less frequently, and over time Google may also stop following the page’s links. The meta robots noindex tag has several advantages: easy technical implementation, protection against duplicate content and the removal of existing parameter-based URLs from the index. It is suitable for all parameter types you don’t want indexed, but it does not prevent search engines from crawling the URLs and it does not consolidate ranking signals.
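For example, in a sketch assuming Python-rendered pages and illustrative parameter names, you could emit the tag only when a low-value parameter is present:

```python
from urllib.parse import urlsplit, parse_qs

# Illustrative parameters whose pages add little or no SEO value.
NOINDEX_KEYS = {"sessionid", "sort", "search"}

def robots_meta_tag(url: str) -> str:
    """Return a noindex meta tag for low-value parameter pages,
    and a normal index,follow tag otherwise."""
    params = parse_qs(urlsplit(url).query)
    if NOINDEX_KEYS & params.keys():
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_tag("https://example.com/shoes?sort=price"))   # noindex, follow
print(robots_meta_tag("https://example.com/shoes?colour=blue"))  # index, follow
```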
Robots.txt disallow
Search engines typically look at the robots.txt file before crawling your website, and they will not visit any URL it disallows. You can use the robots.txt file to block crawler access to all parameter-based URLs or to specific query strings you don’t want search engines to crawl. This tactic has a very easy technical implementation, allows crawl budget to be used more efficiently, avoids duplicate content issues and is suitable for all parameter types you do not want crawled. On the other hand, it does not consolidate ranking signals and does not remove existing URLs from the index.
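Google supports * wildcards in robots.txt rules, so specific query strings can be disallowed with patterns like the hypothetical ones below. As a rough sketch, here is a simplified approximation of that wildcard matching you could use to test which of your URLs a rule would block:

```python
import re
from urllib.parse import urlsplit

# Hypothetical robots.txt Disallow rules targeting specific query strings.
DISALLOW_RULES = ["/*?*sessionid=", "/*?sort="]

def rule_to_regex(rule: str) -> re.Pattern:
    # Simplified approximation of Google's wildcard matching: "*" matches anything.
    return re.compile("^" + re.escape(rule).replace(r"\*", ".*"))

def is_blocked(url: str) -> bool:
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    return any(rule_to_regex(rule).match(target) for rule in DISALLOW_RULES)

print(is_blocked("https://example.com/shoes?sort=price"))        # True
print(is_blocked("https://example.com/shoes?sessionid=abc123"))  # True
print(is_blocked("https://example.com/shoes"))                   # False
```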
URL parameter tool in Google Search Console
You can configure the URL parameter tool in Google Search Console to tell crawlers the purpose of your parameters and how you want them to be handled. Used carelessly, the tool can cause pages to drop out of the search results, but that is usually less damaging than thousands of duplicate pages dragging down your website’s ability to rank, so it’s worth learning how to configure URL parameters in Google Search Console correctly.
Ask how each parameter impacts your page content, then configure it accordingly:
- Configure tracking parameters as “representative URLs.”
- Configure any parameters that reorder page content as “sorts.”
- Configure parameters that filter the page down to a subset of content as “narrows.”
- Configure parameters that show a certain piece of content as “specifies.”
- Configure parameters that show a translated version of the content as “translates.”
- Configure parameters that show a component page of a longer sequence as “paginates.”
Google will automatically add any parameters it discovers to the list, and these auto-added entries can never be removed, even once the parameter no longer exists, so it’s best to proactively add parameters on your own; then, once a parameter is no longer in use, you can delete it from Google Search Console. Add any parameter that you set to “No URLs” in Google Search Console to Bing’s Ignore URL Parameters tool as well. The benefits of this tactic include no developer time needed, protection against duplicate content issues, more efficient use of crawl budget and suitability for all parameter types. On the other hand, it does not consolidate ranking signals, and it only covers Google, with less control over Bing.
Move from dynamic to static URLs
Some people think the best way to handle URL parameters is to avoid them in the first place. Subfolders help Google understand site structure better than parameters, and static keyword-based URLs are a cornerstone of on-page SEO. If you want to convert parameters into subfolder URLs, you can use server-side URL rewrites. This tactic is a great option for translated content or descriptive keyword-based parameters, such as parameters that identify categories, products or filters for search-relevant attributes. On the other hand, it can be problematic for non-keyword-relevant elements of faceted navigation, like price.
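As a sketch of the rewrite logic, assuming Python and purely illustrative parameter names (in practice the rewrite usually lives in your web server or framework router), a parameter URL could map to a static subfolder URL like this:

```python
from urllib.parse import urlsplit, parse_qs

# Illustrative mapping from keyword-relevant parameters to subfolder segments.
STATIC_KEYS = ["category", "colour"]

def to_static_url(url: str) -> str:
    """Rewrite /products?category=shoes&colour=blue into /products/shoes/blue/."""
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    segments = [params[key][0] for key in STATIC_KEYS if key in params]
    if not segments:
        return url  # nothing keyword-relevant to rewrite
    path = parts.path.rstrip("/") + "/" + "/".join(segments) + "/"
    return f"{parts.scheme}://{parts.netloc}{path}"

print(to_static_url("https://example.com/products?category=shoes&colour=blue"))
# https://example.com/products/shoes/blue/
```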
What is the Best Practice URL Parameter Handling for SEO?
So now that we have gone over all of the possible SEO solutions for issues caused by URL parameters, which of these tactics is best for you?
Unfortunately, you can’t use all of them at once. These SEO solutions can conflict with each other, and combining all of them would add an unnecessary level of complexity. There is no single perfect solution to all of your URL parameter problems.
What is best for you and your website depends on your priorities. Are you looking for more efficient crawling? The consolidation of authority signals? Easy implementation? Do you want to manage your duplicate content?
No matter what you choose, it is crucial that you track your choice and how it affects your metrics.