Link Masking via the PRG Pattern, Simply Explained

Using Link Juice in a Targeted Way: Link Masking via the PRG Pattern Explained

The PRG pattern is an effective method for link masking: a POST request, a server-side redirect, and a GET request are combined to hide a link from search engines. The advantage is that link juice can be distributed in a more targeted way.

Don’t worry if that didn’t immediately make sense – that is how many people first hear about it. In this article, we explain the PRG pattern as simply as possible. Basic knowledge of HTML and client-server architecture is helpful, but not required.

What is link juice and why should it be controlled?

Let’s start from the beginning. Google’s goal is to show users the best page for their search. As an SEO, it is always important to keep this simple fact in mind. Google therefore evaluates not only on-page factors such as the page content but also off-page factors such as the number and quality of incoming links (backlinks) to your domain. If a website offers real added value, it will eventually be linked to. Each incoming link passes on a little of the good (or bad) reputation of the linking site. This signal is called “link juice” or “link power”. It influences the ranking of your domain and its subpages.

On your website, this inheritance continues. The power you gain through external links is passed on to all internal links on a page. Of course, this link power is limited – many links dilute it. The more links a page has, the less link juice each one receives. Pages that are frequently linked internally typically receive the most link power. Incidentally, you can see which pages these are in the Google Search Console under the menu item “Internal Links”.


Very simplified model of link juice inheritance

In highly competitive search results, link juice can make the difference in ranking. You should therefore make sure the most important pages of your domain are well linked internally. In reality, however, things often look different: links to the imprint, privacy policy, or social media channels sit in a shared footer that appears on every subpage. As a result, less important pages receive the most links – and thus a great deal of link juice.

Simply removing these links is not an optimal solution. After all, on commercially used pages the imprint must by law always be available and directly accessible. In addition, these footer links are in many cases relevant to the user. The better alternative is therefore link masking via the PRG pattern: the link remains accessible to the user as usual, but search engines do not recognize it as a link. No link juice is passed on through it, so more link juice remains for the more important pages.

Comparison of link juice distribution with and without the PRG pattern

By the way: contrary to popular belief, using nofollow does not preserve link juice – it merely lets it evaporate. This method is therefore not suitable for controlling link juice. The nofollow attribute was originally introduced so that site operators could distance themselves from paid links and protect themselves from forum spam. If you mark your internal links with nofollow, you distance yourself from your own content – certainly not a positive signal to the search engine.

Some SEOs will probably hear alarm bells ringing now. “Hiding” something from Google is reminiscent of black-hat methods such as cloaking, where the search engine is shown different content than the user. Google, however, is well aware of this method of link masking – it has neither expressed concerns nor asked site operators to refrain from it. If anything, it is an advantage for the search engine when a site operator actively takes care of the link structure, because Google can then crawl the site more easily.

So the PRG pattern is all about masking links. It allows you to direct link juice to selected links in a targeted way, while at the same time preventing it from being unnecessarily diluted. But what does the technical implementation look like?

Basic operation of the PRG pattern

First, a small crash course in HTML and client-server communication:

In addition to the classic link, which is implemented in HTML with an anchor tag <a>, there are other ways to let a user navigate to a page. Instead of an anchor, the PRG pattern uses a form element <form>.

Forms are used on websites wherever users make an entry, for example when logging in.

Example of a simple form created in HTML with a form tag
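As a sketch, such a login form might look like this in HTML (the action URL and field names here are invented for illustration):

```html
<!-- Minimal login form; action URL and field names are illustrative -->
<form action="/login" method="post">
  <label for="user">Username:</label>
  <input type="text" id="user" name="user">
  <label for="pass">Password:</label>
  <input type="password" id="pass" name="pass">
  <button type="submit">OK</button>
</form>
```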

In the example above, the data entered by the user must of course be transmitted to the server. The server checks whether the username and password are correct and then redirects the user to their landing page. When the user clicks the “OK” button, the form is submitted and the user lands on a new page.

A PRG link also uses a form element, but no user input is expected. Instead, the form sends a predefined instruction to the server when clicked. The server receives the instruction and forwards the user to the desired page. Such a form element can be styled so that it is visually indistinguishable from a normal link – the user notices no difference.

The form element can be given the appearance of a normal link through CSS rules.
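A minimal sketch of such a masked link, assuming a hypothetical /goto endpoint on the server and an invented class name:

```html
<!-- A form that acts as a "link" to the imprint page -->
<form action="/goto" method="post" style="display:inline">
  <input type="hidden" name="target" value="imprint">
  <button type="submit" class="prg-link">Imprint</button>
</form>

<style>
  /* Strip the default button chrome so the button looks like a plain link */
  .prg-link {
    background: none;
    border: none;
    padding: 0;
    font: inherit;
    color: #0645ad;
    text-decoration: underline;
    cursor: pointer;
  }
</style>
```

To the user this renders and behaves like an ordinary text link; technically, clicking it submits a form.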

Post, Redirect & Get

In order to mask a link, it is also essential that the form element sends its instruction to the server via a POST request instead of a GET request.

  • With the GET method, the input data is appended to the URL as parameters and passed to the server. You often find such URLs in a site’s internal search function, for example, or when filtering products in an online shop. Search engines can follow these parameter URLs without any problem, so the GET method is not suitable for link masking.

Example of a search with a parameter URL
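For comparison, a simple GET search form (the field name q and the /search path are assumptions for illustration):

```html
<!-- A GET form: the input is appended to the URL as a parameter -->
<form action="/search" method="get">
  <input type="text" name="q">
  <button type="submit">Search</button>
</form>
<!-- Searching for "shoes" requests the crawlable URL /search?q=shoes -->
```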

  • The POST method transfers data differently. In a POST request, additional data can be carried in the so-called request body. The data is not transmitted via the URL but invisibly within the request body, so no new URL is created. Among other things, this method is used for logins, since the username and password would otherwise be visible in the URL, and for uploads of larger data such as images. Because a POST request does not create a new URL, Google cannot assign link juice to it. Crawlers also ignore forms that use the POST method: POST forms are by default used to send individual user data to the server, and it makes no sense for a crawler to submit them. Link masking therefore only works with the POST method!

That clarifies what the “P” in the PRG pattern stands for. But don’t worry – the redirect and get parts are quickly explained.

As described in the login example, a server can check username and password and then initiate a redirect to the landing page. With the PRG pattern, the form instead simply sends the user’s target to the server, for example “imprint”. The server then sends a redirect to the browser, which follows it and requests the imprint page.

This final page request by the browser is again a normal GET request – in the HTTP protocol, pages are requested via GET by default. With the PRG pattern, this also has the advantage that the server can redirect us to a parameter URL, for example a filtered product page.
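Put together, the browser-server exchange behind a PRG link looks roughly like this (the URLs are illustrative; status 303 “See Other” is the redirect status commonly used for PRG, since it tells the browser to follow up with a GET request):

```
1. POST /goto HTTP/1.1        <- browser submits the masked-link form
   target=imprint                (the target travels in the request body)

2. HTTP/1.1 303 See Other     <- server responds with a redirect
   Location: /imprint

3. GET /imprint HTTP/1.1      <- browser follows the redirect via GET
   HTTP/1.1 200 OK               (the imprint page is delivered)
```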

What is the PRG pattern used for?


There are different uses for the PRG pattern:

  • Faceted filter navigation: Filters often generate countless parameter URLs. Crawlers can follow these and index numerous filtered pages, which strains the crawl budget enormously. To make sure search engines crawl only index-relevant filter pages, individual filters are masked. The function of the filters is not affected.
  • Flyout navigation: To achieve better topic siloing, the internal linking is focused more tightly by masking parts of the navigation. In topic siloing, links that point outside the current topic area are masked, which allows the search engine to capture the actual topic of individual sections more accurately.
  • Footer links: These links are repeated on every page and are often not index-relevant (terms and conditions, privacy policy, etc.), so they can be masked as well. Links to social media channels, which also appear on every page, likewise dilute the link juice.
  • Back button on forms: The PRG pattern is also the classic solution for form submissions. It prevents the form data from being resubmitted when the user presses the back button, which improves usability.

What do you have to keep in mind when using the PRG pattern?

Unlike a conventional link, the PRG pattern does not use an anchor tag <a> but a form tag <form>. You should therefore make sure to style the clickable form element with CSS so that it does not visually differ from the other links on the page.

When using PRG links, make sure that pages that should be indexed remain reachable through at least one crawlable link. For example, if you mask footer links, you may want to forego PRG masking on the homepage so that those pages can still be crawled.

Also, do not be tempted to mask all external links via the PRG pattern in order to keep all the valuable link juice to yourself. A typical website naturally includes external links, and a natural link profile does not consist of incoming links alone.


Admittedly, the PRG pattern can seem daunting. By the time you get to the actual implementation at the latest, some basic knowledge of HTML, CSS, and client-server communication is required. In this guide, we have focused on SEOs dealing with the PRG pattern for the first time rather than on implementation details. To understand the principle better, you should look at the various code examples already available on this topic – it helps to first understand how HTML forms work.

The effort is worth it. Not for nothing is the PRG pattern already used by numerous successful websites, including, for example, the furniture retailer Ambiente Direct and the animal feed distributor Josera. The somewhat more elaborate implementation brings several advantages:

  • More link juice for important pages
  • Savings in crawl budget
  • A leaner index
  • Prevention of duplicate content