The Complete Guide to URLs for SEO

When creating an SEO (search engine optimisation) strategy, there are three categories that are always considered. If you have even a little knowledge of digital marketing, you have probably already heard of the ‘big 3’ that should comprise your SEO plans: on-page SEO, off-page SEO, and technical SEO.

URL optimisation and structure are an often overlooked component of SEO that leans towards the technical side of the spectrum. URLs are the driving force behind delivering and indexing web pages in search engine results pages, making them a key factor in optimising your site for search engines.

Let’s jump in with the basics.

URLs: Breaking down the basics

What are URLs?

URL is an acronym that stands for Uniform Resource Locator; a URL is also often referred to as a ‘web address’. It is a string of text that indicates the location of a web page or other resource (PDFs, images, and more) on a computer network, such as the internet. In addition, a URL indicates the method or protocol by which the resource is retrieved, such as HTTP, HTTPS, or FTP.

Generally, a URL comprises three components: the protocol, the host name or address, and the ‘slug’ or text string. Each part is separated by delimiter characters, such as ‘://’ after the protocol and ‘/’ before the slug.
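These three components can be seen by parsing a URL programmatically. A minimal sketch using Python's standard library (the URL itself is purely illustrative):

```python
from urllib.parse import urlparse

# A hypothetical URL, used purely for illustration.
parts = urlparse("https://www.example.com/seo-guide?page=2")

print(parts.scheme)  # protocol: 'https'
print(parts.netloc)  # host name: 'www.example.com'
print(parts.path)    # the 'slug' or text string: '/seo-guide'
```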

Different URL Types

There are varying URL formats and types, as can be expected given their versatility and wide range of uses for delivering content. An understanding of URL types will equip you to identify the most appropriate URL structure for your web pages.

  • Absolute – The entire address that, when typed into a web browser, will take the user to the exact directory or file of a website. This includes all components of a URL, e.g. https://www.example.com/blog/example-page
  • Relative – Doesn’t include the protocol or domain, so browsers and search engines assume these components match the site on which the URL appears, e.g. /exampleone
  • HTTP – Non-secure: no SSL, or an incorrect SSL installation, e.g. http://www.example.com
  • HTTPS – Secure, with a valid SSL certificate, e.g. https://www.example.com
  • WWW – Has ‘www’ within the string of text, e.g. https://www.example.com
  • Non-WWW – ‘www’ removed from the string of text, e.g. https://example.com
  • Static – Not dynamically generated; the URL has no variable strings, so it remains the same. Content remains the same unless hard-coded changes are made in the HTML, e.g. https://www.example.com/mens-shoes
  • Dynamic – The result of a search on a data-driven website, e.g. https://www.example.com/products?category=shoes&sort=price
  • Front Loaded – Keywords added to the beginning of a URL, e.g. https://www.example.com/seo-url-guide
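The practical difference between absolute and relative URLs shows up when a browser (or crawler) resolves a relative link against the page it appears on. A small sketch using Python's standard library, with hypothetical example.com URLs:

```python
from urllib.parse import urljoin

# Hypothetical page on which a relative link appears.
base = "https://www.example.com/blog/some-article"

# A root-relative URL resolves against the domain...
print(urljoin(base, "/exampleone"))  # https://www.example.com/exampleone

# ...while a document-relative URL resolves against the current path.
print(urljoin(base, "exampleone"))   # https://www.example.com/blog/exampleone
```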

Why are URLs important for SEO?

URLs offer search engine bots (also commonly known as ‘crawlers’) a way to retrieve information to then index and rank in search engine results pages. They need to be compliant and formatted accurately to support search engine bots in appropriately indexing the web page. Essentially, they can offer context and understanding of a web page’s content, supporting search engines in ranking the URL for the right search terms, audience, and positions.

A generic URL offers no context to bots or users about a page’s content until they access the on-page copy. Consequently, both the user journey and the crawling process become more time-consuming. Reviewing generic and optimised URLs side by side highlights the value of a keyword-optimised URL structure; a generic URL would simply be grouped with millions of similar URLs completely irrelevant to your site.

Generic – https://www.example.com/page?id=583
Optimised – https://www.example.com/url-seo-guide

URLs optimised for search enhance user experience – a key aspect of making your website SEO-friendly. A URL should be created for users and search engines alike, so it needs to be human-readable. An ideal URL offers context to both users and search engine bots, indicating what the destination page is about. This gives users an easier way to access the content they’re searching for, while also supporting search engines in indexing your web page accurately. Until recently, search engine results pages would often include the URL within site snippets, meaning that web pages with readable URLs might be preferred to help users efficiently filter through results. In 2020, it appears that search engines are replacing this content with the domain and breadcrumbs.

How do search engines read your web page’s URL?

Major search engines, such as Google, attribute a crawl budget to every site. Crawl budget is defined as the number of pages the Googlebot crawls and indexes from a website in a certain time frame. If this process exceeds the crawl budget, the Googlebot may not fully crawl the website, meaning that missed pages will not be indexed or appear in search results pages. URL structure and parameters can influence your crawl budget, as they impact the volume, time, and difficulty it takes for a Googlebot to crawl a site.

Static URLs: What are they and how do they work?

A static URL is one where the content of a web page stays the same, as does the URL text, unless changes are made to the hard-coded HTML of the page. These URLs are simple to form and implement with popular content management systems (such as WordPress), as they are generally built with a preference towards static URLs.

For SEO, it is essential that URLs are static, as this makes them simple for search engine bots to crawl and index. Essentially, with static URLs the bot does not have to continually identify changes or URL variations for a given web page, making it easier for the search engine to accurately index it for the right keywords and user search queries.

Dynamic URLs: What are they and how do they work?

Dynamic URLs have variable text strings that depend on parameters provided to the server delivering the page. These parameters may already show in the URL, or can be generated when users search and interact with the site. Generally, a dynamic URL may be used when site content is stored in a database and pulled for display upon request. On an e-commerce site, for example, when a user searches for a product and filters the results, the site requests and pulls content from a database to display to the user. This causes many variations in URL parameters and structure, as there are endless ways users could request content. Two users could even be given different URLs that display similar or duplicate content, simply because each requested the content in a different way.
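The variable parameters of a dynamic URL live in its query string. A sketch of how they can be pulled apart with Python's standard library (the e-commerce URL is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical dynamic URL produced by a product search with filters.
url = "https://www.example.com/products?category=shoes&colour=black&sort=price"

# parse_qs maps each parameter name to its list of values.
params = parse_qs(urlparse(url).query)
print(params)  # {'category': ['shoes'], 'colour': ['black'], 'sort': ['price']}
```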

Google states that it can now effectively crawl and index dynamic URLs. However, by its own admission and advice:

Google guidelines

There are several reasons why dynamic URLs are strongly discouraged for SEO purposes. They can cause duplicate content issues, as well as limiting the opportunity for keyword appearances in the URL. They also do not effectively exist until a user interaction causes them to be generated automatically, depending on how the user has accessed and requested the web content. As a result, search engine bots find it difficult to identify, crawl, and index them, as new URLs are dynamically generated on a regular basis. Below we explain in more detail how dynamic URLs impact SEO.

  • Crawl Budget – dynamic URLs cause more variations of parameters and create a high volume of different URLs displaying the same web page content. Some argue that if a Googlebot has more URLs to crawl and index in a set timeframe, there is a higher probability of some pages not being indexed in search engine results pages. In short, sites using dynamic URLs are making themselves harder to crawl and less likely to rank for keywords.
  • Keyword Loss – dynamic URLs limit your ability to strategically add keywords into URLs. Because they are dynamically generated, you have less control over what content is displayed in the parameters and text string of a URL as the site generates this automatically when retrieving content from a database to display to users.
  • Context and Relevance Issues – As dynamic URLs have a large volume of variables, the content within them tends to lack information related to the page’s content and topics. Instead, they use multiple parameters focused on retrieving and delivering content. Even when keywords are present, they can become ‘buried’ by the surrounding content, giving them less value than in a static URL.
  • Long Length – Given the different variations of dynamic URLs that need to be generated, these URLs tend to use multiple parameters to retrieve and display content. This is because the URL must account for every way a user requests web page content, which can vary vastly depending on whether they used a site search feature, filtered categories, the site menu, and more. Consequently, the URLs are longer as they carry this information. Optimal URLs for SEO should be short and concise, as this makes them easier and quicker for search engine bots to crawl and index.
  • Unreadable – Dynamic URLs are not human-readable, as they usually display multiple characters and parameters. This makes them hard to understand for users who want to know what the page is about before accessing or loading it. Traffic can be lost as a result, as some users may find the page untrustworthy or unhelpful and opt for a competitor site that is clearer about its content.

Dynamic vs Static URLs: Which is better?

A web developer may have told you dynamic URLs are fine to use, but here is why they are wrong from an SEO perspective! Google has tried to reassure site owners that its bots are robust and advanced enough to crawl dynamic URLs effectively. But (and it’s a big ‘but’), it has been observed time and time again that dynamic URLs do cause more duplicate content vulnerabilities, so static URLs are always encouraged instead.

URL optimisation for SEO

There are many methods you can use to create and implement SEO-friendly URLs, with the end goal of supporting positive ranking movements for target keywords. This list of tips takes into consideration Google’s latest algorithm updates alongside tried and tested techniques for creating strong URLs.

Relevant keywords in URLs

Adding keywords to a web page URL supports both search engine bots and users in understanding the relevance of your web page to a certain topic or search query, making it easier for your page to rank correctly in search engine results pages. The term ‘front loaded URL’ is often used to describe the idea of adding relevant keywords to the start of a URL; the location of your keywords should be prioritised. In 2020, you will find a bunch of borderline scare articles warning you of the penalty risks of keyword stuffing in URLs and on-page content. It is certainly true that Google is more than effective at identifying keyword stuffing and, consequently, penalising or dropping ranks for these web pages – but using keywords appropriately isn’t something you should be discouraged from. Adding a relevant, non-excessive number of keywords to a URL has proven to be a helpful way of showing search engine bots the value of your site content in relation to a search query. By adding keywords to your URL, you are simply creating more ways for your site page to be categorised and indexed for the right keywords, encouraging a higher value audience to access your site – how can there be harm in that?

Secure is always better

As mentioned earlier, the protocol of a URL can be secure or non-secure. Search engines have a preference towards secure URLs with a correctly installed SSL certificate. With data protection laws and privacy regulations becoming increasingly tight, web browsers have become more effective at identifying and warning users of potentially harmful or higher-risk sites. These warnings are enough to make anyone think twice about accessing a web page, and browsers often even block access, requiring a user to manually override a warning to reach the content. These warnings can impact your site traffic, online brand reputation, and trust.

Google Chrome warning for an HTTP site not protected with a valid SSL certificate.


Short & Concise

URLs should be short and concise to make them easier for both bots and users to understand. This one simple rule provides many of the benefits listed above, as relevant keywords are not buried amongst irrelevant terms, making them more valuable. Adding short and unique text strings to URLs for each web page, such as blog articles or category pages, gives you the opportunity to target a spectrum of topics and keywords related to your industry. It also helps minimise duplicate content issues, as every page can be given a ‘slug’ or text string that directly reflects its own unique content.

Examples of unique URL structure:

https://www.example.com/blog/url-structure-guide
https://www.example.com/services/technical-seo
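One way to keep slugs short, unique, and consistent is to generate them from the page title. A minimal sketch (the slugify function and sample title are illustrative, not a standard API):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, lower-case, hyphen-separated slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse anything non-alphanumeric
    return slug.strip("-")

print(slugify("The Complete Guide to URLs for SEO!"))
# the-complete-guide-to-urls-for-seo
```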

Prevent Duplicate Content: Use Canonical Tags

The use of canonical tags aids bots in understanding the original source of duplicated content, ensuring that web page is given the right indexing and attribution. Unfortunately, the internet is a place where copied content thrives, thanks to the ease of Ctrl+C, Ctrl+V. Because of this, search engines need to be robust enough to identify and index the original source of duplicate content wherever possible. This is where canonical tags come into play. They give digital marketers the ability to notify search engines of duplicate content – even across domains – to prevent issues. Canonical tags are implemented via the HTML of a site page and can be self-referencing.

<link rel="canonical" href="https://www.example.com/original-page" />

Lower Case Characters

The scheme and domain of a URL are not case sensitive, but the path or slug can be, depending on the server – meaning /Page and /page may be treated as two different URLs and risk duplicate content issues. For consistency, compliance, and a coherent URL structure, lower-case characters should always be used.

Unfriendly URLs & Unsafe Characters

Google has specified a number of unsafe characters that make a URL unfriendly to search engines and should be avoided. These are as follows:

" < > # % { } | \ ^ ~ [ ] ` and empty spaces
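When such characters cannot be avoided (in query string values, for instance), they must be percent-encoded so the URL stays valid. A sketch using Python's standard library:

```python
from urllib.parse import quote

# Spaces and '%' are unsafe, so quote() percent-encodes them.
print(quote("summer sale 50% off"))  # summer%20sale%2050%25%20off
```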

How to correctly change a site URL

There is an art to changing URLs. By simply editing the slug or string via the CMS, you may accidentally create broken links. Remember that external sites linking to your pages with the old URL will still exist, but those links will now return a 404 error if you only change the URL in the CMS. To prevent this, follow the guide below.

  1. Edit the URL in your CMS for the relevant pages and save the changes
  2. Implement a permanent (301) redirect from the old URL to the new URL
  3. Test the old URL and make sure it lands on the page using the new URL
  4. Request a reindex of the new URL through the URL Inspection tool in Google Search Console
  5. Update your sitemap.xml file with the new URLs and request that it be indexed via Google Search Console
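Step 2 boils down to a lookup from old paths to new ones, answered with a 301 status. A framework-free sketch (the paths and handler are hypothetical; in practice this lives in your web server or CMS configuration):

```python
# Hypothetical mapping of old slugs to their new, optimised slugs.
REDIRECTS = {
    "/old-page": "/new-optimised-page",
}

def handle_request(path: str):
    """Return an HTTP status and the path to serve for a requested path."""
    if path in REDIRECTS:
        # Permanent (301) redirect: browsers and crawlers update to the new URL.
        return 301, REDIRECTS[path]
    return 200, path  # no redirect needed; serve the page as normal

print(handle_request("/old-page"))  # (301, '/new-optimised-page')
```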

Final Takeaways

URLs are a core part of your website development and SEO performance. As an often-neglected tool, they act as a stepping stone in delivering content to millions upon millions of users across the globe – a site wouldn’t work without them!

By showing your URLs some tender loving care, you can unlock the real digital marketing benefits they offer.