
Solving Keyword Cannibalisation Issues Using N-Grams in Screaming Frog’s SEO Spider

Author: Ellie Brett

Last updated: 04/03/2025

Keyword cannibalisation occurs when several pages on a single site target the same set of keywords or phrases, and, as a result, effectively compete against each other in the search engine results pages (SERPs).

When this happens, either one (or more) pages might fall out of the rankings completely, and/or the rankings of all the competing pages may be reduced. This is clearly far from ideal!

In this article, I’ll cover how to use N-Grams in Screaming Frog’s SEO Spider to help identify why two or more pages are cannibalising, and I’ll provide some options for solving cannibalisation issues.

Creating a list of cannibalised pages

To make use of the N-Grams feature in the SEO Spider, you’ll first need to create a list of the pages which are cannibalising on your site. Various tools can be used to identify cannibalised pages; some of our favourites include Sistrix, SEMrush, SEOTesting, and Ahrefs.

Using whichever tool you prefer, export a list of the cannibalised pages and the keywords they compete for. The data will then need to be organised so that you have a clear list of keywords and the URLs targeting them.

Before you begin any analysis:

  • Ensure that the pages are actually cannibalising: i.e. manually check that the pages appear in the SERPs for the same query.
  • Check the average position for each page over the last few months. Have these pages been cannibalising for a while, or is it a new development?
  • Is there a Core Update rolling out? This can cause the SERPs to change frequently as Google decides which pages best fit the user intent for a given query.
  • Has there been a site update recently? Will the page be dropped from the index once Googlebot crawls it?

Tip: If you have a large number of cannibalising keywords, it might be easier to organise your pages into competing keyword groups in order to spot patterns.
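If your export is a flat list of keyword/URL rows, grouping can be scripted rather than done by hand. Below is a minimal sketch of the idea; the URLs and keywords are made-up example data, not from any real export.

```python
from collections import defaultdict

# Hypothetical export rows: (keyword, ranking URL) pairs from your
# cannibalisation tool of choice -- illustrative data only.
rows = [
    ("red trainers", "https://example.com/trainers/"),
    ("red trainers", "https://example.com/blog/best-red-trainers/"),
    ("running shoes", "https://example.com/trainers/"),
    ("running shoes", "https://example.com/running-shoes/"),
]

# First, collect the set of URLs competing for each keyword.
urls_per_keyword = defaultdict(set)
for keyword, url in rows:
    urls_per_keyword[keyword].add(url)

# Then cluster keywords that share the same set of competing URLs,
# so each group of clashing pages is visible at a glance.
groups = defaultdict(list)
for keyword, urls in urls_per_keyword.items():
    groups[frozenset(urls)].append(keyword)

for urls, keywords in groups.items():
    print(sorted(urls), "->", sorted(keywords))
```

Each printed line is one "competing keyword group": a set of pages and the queries they clash on.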

How to use N-Grams in the Screaming Frog SEO Spider

What is an N-Gram?

An N-Gram is a contiguous sequence of items, such as words, in a particular order. For our purposes, we’re using N-Grams to look at the frequency of words or phrases on a set of pages. You can use N-Grams to look at a single word, or a phrase of up to six words.

N-Gram analysis can be a powerful tool for finding out how frequently specific words or phrases are mentioned on a single page, a group of pages, or across a whole site.
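To make the concept concrete, here’s a short sketch of how word-level N-Gram counting works in principle (this is not how the SEO Spider implements it internally; the sample text is invented):

```python
from collections import Counter

def ngrams(text: str, n: int) -> Counter:
    """Count contiguous n-word sequences in a body of text."""
    words = text.lower().split()
    return Counter(
        " ".join(words[i:i + n]) for i in range(len(words) - n + 1)
    )

# Invented body text for illustration.
body = (
    "red trainers for men red trainers for women "
    "buy red trainers online"
)
print(ngrams(body, 2).most_common(3))
```

Running this surfaces the most frequent two-word phrases, with "red trainers" at the top — the same kind of signal the N-Grams tab gives you across a crawled page set.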

How N-Grams can help with cannibalisation

When looking at keyword cannibalisation specifically, N-Grams can be useful for determining how frequently the cannibalised keywords are mentioned on a set of pages. The SEO Spider also has a useful ‘keyword density’ feature, which looks at the frequency of keywords relative to the whole body text.
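For intuition, one common way to define keyword density is the words occupied by a phrase as a percentage of the total word count. The sketch below illustrates that definition only; it is not the SEO Spider’s internal formula.

```python
def keyword_density(text: str, phrase: str) -> float:
    """Words occupied by a phrase as a % of the total word count
    (one common definition of keyword density)."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    # Count every position where the phrase appears in full.
    hits = sum(
        words[i:i + n] == target for i in range(len(words) - n + 1)
    )
    return 100 * hits * n / len(words) if words else 0.0

print(keyword_density("red trainers red shoes", "red"))  # 50.0
```

Comparing this figure for the same keyword across two cannibalising pages shows which page is more heavily optimised for it.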

The word cloud feature can also be used to help you understand which keywords are mentioned the most on a page.

Tip: Consider excluding headers and footers from your crawl to avoid words or phrases that are repeated on every page showing up in your analysis.

How to set up N-Grams

1) Open the SEO Spider and set it to list mode

2) In the extraction menu, enable ‘Store HTML’ / ‘Store Rendered HTML’

3) Insert your list of cannibalised URLs into the SEO Spider

4) Hit ‘Start’ and wait for the URLs to be crawled, then select the first set of URLs you want to analyse. In the menu at the bottom of the SEO Spider, go to the ‘N-Grams’ tab (you might need to hit the arrow in the right-hand corner to find it).

If you want a more detailed tutorial on setting up N-Grams, Screaming Frog has one here: How to Use N-Grams

How to analyse N-Grams to find the words and phrases which are cannibalised

Here you can see the list of words and phrases used within your set of pages. The figure in the left-hand menu is a total across all of the selected pages. From here you can also see body text, keyword density, headings, and the number of linked anchors that include the keyword.

To see the information for each page in your set, click on a keyword or phrase. From here you’ll be able to see which page has the highest keyword density, and the highest number of linking anchors.

The word cloud feature can also be used if you want a quick snapshot of what the main keywords are on each page. This can help you quickly and clearly visualise the page content, and can also be useful to share with other teams.

Comparing the word clouds of two or more cannibalising pages can also be a great way to understand why Google might be favouring one page over another (or others); plus it can help you to understand whether or not the keyword targeting for the set of pages is appropriate.

Fixing cannibalised keywords

There are various ways to solve cannibalisation issues, but when deciding on which route to go down, remember that the type of page being cannibalised may have an impact on which solution is the most appropriate for your business or client.

Before making changes to any cannibalising pages, it’s important to think about the intent of each page, and the extent to which it matches the intent of a given set of queries. Sometimes Google will rank both informational and commercial pages within the SERPs, and if two of your pages rank within the top ten results, that can actually be a good thing: you’re claiming more real estate on the SERP.

Page Alignment Changes

A quick win when fixing cannibalisation issues is to make sure the pages are targeting different keywords in their page titles and H1s. These elements tend to carry more weight when Google is trying to understand the topic of a page, which means changing them can resolve some cannibalisation issues relatively quickly.

De-optimising the content on pages

If you’re unable to remove one (or more) of the cannibalised pages, and/or updating page titles and H1s doesn’t resolve the issue, consider de-optimising the content on one (or more) of the pages.

Determine how the pages can be separated to target different subsets of keywords. For example, one page might target more general, high-value keywords, whilst others target more niche, subject-specific terms.

Redirects, noindexing, and canonicals

One of the questions you should be asking yourself with cannibalised pages is: do all of these pages need to show in the SERPs? When reviewing a set of cannibalised pages you might realise that only one of the pages is required to target the keyword.

If only one page is required, the page (or pages) with a lower traffic value or lower average position can be 301 redirected to the page which returns more value. When redirecting pages, make sure to update all internal links so they point to the final landing page, and consolidate any content that’s missing from the final landing page.
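If you track traffic value for each competing page, choosing the surviving page and building the redirect map can be scripted. This is a minimal sketch with invented URLs and traffic values; your own selection criteria (e.g. links, conversions) may differ.

```python
# Hypothetical traffic values for a set of cannibalising URLs --
# illustrative data only.
pages = {
    "https://example.com/trainers/": 1200,
    "https://example.com/blog/best-red-trainers/": 150,
    "https://example.com/red-trainers-guide/": 90,
}

# Keep the page with the most value; 301 the rest to it.
target = max(pages, key=pages.get)
redirects = {url: target for url in pages if url != target}

for source, dest in sorted(redirects.items()):
    print(f"301: {source} -> {dest}")
```

The resulting map can feed whatever redirect mechanism your site uses; remember the article’s caveat about also updating internal links and consolidating content.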

If both (or all) pages are required from a business perspective, rather than redirecting, consider noindexing and canonicalising instead; users will still be able to access all pages on the site, but they won’t compete in the SERPs.

Final Thoughts

Addressing keyword cannibalisation can be a valuable tactic for improving search engine rankings. Using features like N-Grams in the SEO Spider, alongside your favourite cannibalisation tools, can help you make better decisions on whether to de-optimise, consolidate, or redirect pages.

By carefully reviewing keyword targeting and density, and adjusting your content accordingly, you can ensure that each page targets the appropriate keywords, ultimately enhancing your site’s overall SEO performance.


Ellie Brett - SEO Manager

Ellie is an SEO Manager at Screaming Frog with three years of experience in SEO. She has an interest in finding new ways to analyse websites that are both interesting and accessible to those with little technical SEO knowledge.
