As I mentioned in the intro, parameters can be active or passive. Let's look at some examples of each.

Active parameters

Active parameters modify the content of the page in some way.

Filter. Removes some of the content, leaving more specific content on the page that a user wants to see. An example of this is faceted navigation in e-commerce.

Sort. Reorders the content in some way, such as by price or rating.

Paginate. Divides content into a series of related pages.

Search. Queries a website for information that a user is looking for. On our search engine, yep.com, we use the key "q" for the query, and the value contains info about the user query.

Passive parameters

Passive parameters do not change the content.

Affiliate IDs. Passes an identifier used to track where sales and signups come from.

Advertising tags. Tracks advertising campaigns.

Session IDs. It's not common on modern websites to use session IDs to track users.

Video timestamps. Jumps to the designated timestamp in a video.

URL parameters can cause a number of different issues when it comes to SEO, especially in cases where multiple parameters are used. Here are some of the problems you may encounter.

Passive parameters can cause issues with duplicate content. Typically, you want them to be crawled, and each page should have a canonical set to the main version. There may be times when you want to block these parameters from being crawled completely using robots.txt, but only in situations where you have issues with crawl budget.
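To make that last option concrete, here is a sketch of what such a robots.txt rule might look like. This is an illustration only: `sessionid` stands in for whatever passive parameter you would block, and the `*` wildcard is supported by major search engines.

```
# Block crawling of URLs carrying a passive tracking parameter
# (only worth doing when crawl budget is actually a problem)
User-agent: *
Disallow: /*?*sessionid=
```

For the usual case, the parameterized page would instead stay crawlable and declare the main version as canonical, e.g. `<link rel="canonical" href="https://example.com/shoes/">` in its `<head>`.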
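The active/passive distinction above can also be sketched in code. The following minimal Python example strips passive parameters from a URL to recover the main (canonical) version while keeping active ones like `sort` or `q`; the set of passive parameter names here is a hypothetical example, and a real site would need to audit its own tracking and affiliate setup.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Illustrative passive (tracking) parameters; adjust for your own site.
PASSIVE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def canonicalize(url: str) -> str:
    """Drop passive parameters, keeping active ones (filter, sort, q, ...)."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in PASSIVE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/shoes?sort=price&utm_source=news&ref=aff123"))
# → https://example.com/shoes?sort=price
```

Every URL that differs only in passive parameters collapses to the same string, which is exactly why those variants should point a canonical tag at the main version rather than be treated as separate pages.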