
Around Sept 12–15, 2025, Google stopped honouring the “&num=100” URL parameter that many rank trackers and SERP data providers used to fetch the top 100 results in a single request. All SEO tools now have to paginate 10× to see the same amount of SERP depth, which temporarily broke some datasets and permanently reduces how much beyond-page-one data SEO platforms can fetch at scale. As a side-effect, many sites saw Search Console impressions and keyword counts drop even when traffic and top 10 rankings were fine.

What’s actually changed?

For years, appending “&num=100” to a Google results URL forced 100 organic listings onto one page. In mid-September 2025, Google disabled support for that parameter and confirmed it is not supported going forward. As a result, rank-tracking SEO platforms that depended on it can no longer grab positions 11–100 in one request, making deep keyword tracking harder to report on, at least temporarily.
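To make the scale of the change concrete, here is a minimal sketch of the request maths (URL building only, no fetching). The `num` and `start` query parameters are standard Google results-page URL parameters; everything else, including how real rank trackers actually collect SERPs, is simplified for illustration.

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def old_style_urls(query: str) -> list[str]:
    # Before mid-September 2025: one request could return 100 results.
    return [f"{BASE}?{urlencode({'q': query, 'num': 100})}"]

def new_style_urls(query: str, depth: int = 100, page_size: int = 10) -> list[str]:
    # After the change: ten paginated requests for the same depth,
    # stepping through results with the `start` offset parameter.
    return [
        f"{BASE}?{urlencode({'q': query, 'start': start})}"
        for start in range(0, depth, page_size)
    ]

print(len(old_style_urls("rank tracking")))  # 1 request
print(len(new_style_urls("rank tracking")))  # 10 requests
```

The same depth of data now costs ten network round-trips per keyword, which is the 10× multiplier the tool vendors below refer to.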

Timeline:

  • Sept 12–14: The industry first notices the removal; SEO tools and platforms report disruptions.

  • Around Sept 15–19: Broad coverage follows and SEO platforms release statements, with many dashboards showing drops in keyword rankings and impressions.

What the major SEO tools said

  • Ahrefs: Confirmed industry-wide impact; updates remain frequent, but fewer results beyond page one can be checked while they adapt to the change. (Source: Ahrefs)

  • Semrush: Acknowledged the change and said they’d implemented workarounds while continuing longer-term fixes. (Source: Semrush)

  • AccuRanker: Announced limits (e.g. Top 20 visibility for ongoing tracking) while working on deeper options for the long term. (Source: Accuranker)

  • seoClarity: Said the shift multiplies retrieval work 10× but claimed their architecture kept client data stable. (Source: seoClarity)

  • STAT / Moz Pro: Acknowledged weekend disruptions but reported data stabilised after mitigation on their side.

Why your numbers dipped (even if rankings didn’t)

When a bot loaded 100 results on one page, every listing, even a keyword ranking in position 87, could register an impression. With “&num=100” gone, bots no longer generate those deep-page impressions, so GSC impressions and total “ranking keywords” fall. Average position often improves at the same time, because the low-visibility impressions that disappeared never contributed to tangible metrics like clicks.
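A toy calculation with hypothetical numbers shows both effects at once: stripping out deep-SERP rows cuts total impressions while pulling the impression-weighted average position up.

```python
# (position, impressions) rows for one keyword, roughly how GSC aggregates them.
# The deep rows (positions 47 and 87) were largely bot-driven; numbers are invented.
rows = [(3, 500), (8, 200), (47, 120), (87, 90)]

def summarise(rows):
    """Return (total impressions, impression-weighted average position)."""
    impressions = sum(n for _, n in rows)
    avg_pos = sum(p * n for p, n in rows) / impressions
    return impressions, round(avg_pos, 1)

before = summarise(rows)                             # all rows counted
after = summarise([r for r in rows if r[0] <= 10])   # deep-page rows gone

print(before)  # (910, 18.2)
print(after)   # (700, 4.4)
```

Impressions drop by roughly a quarter, yet average position “improves” from 18.2 to 4.4 with no change at all in where users actually see the site.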

How to tell “measurement blip” vs. real loss

  1. Cross-check in GSC: Compare the two weeks before and after Sept 15. If clicks are steady but impressions and avg position shift, that’s the “&num=100” effect, not a true drop in visibility.

  2. Look at distribution, not totals: If your rankings in the top 3 or top 10 positions are stable but keyword totals collapsed (esp. positions 11–100), the change is down to measurement, not what users actually see. Ahrefs’ latest update specifically noted reduced depth beyond page one for now while they work on a fix.

  3. Check your SEO platform status notes: Some platforms limited depth such as only showing page 1 results temporarily, so ensure you align expectations with each tool’s current ability.
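The first check above can be turned into a rough heuristic. This is not an official GSC feature; the thresholds (10% click tolerance, 15% impression drop) are illustrative assumptions you should tune to your own data.

```python
def classify(before: dict, after: dict, click_tolerance: float = 0.10) -> str:
    """Compare two GSC export summaries (dicts with 'clicks' and 'impressions')
    for the two weeks either side of Sept 15. Thresholds are assumptions."""
    click_delta = (after["clicks"] - before["clicks"]) / before["clicks"]
    impression_delta = (after["impressions"] - before["impressions"]) / before["impressions"]

    if abs(click_delta) <= click_tolerance and impression_delta < -0.15:
        # Clicks steady, impressions collapsed: the classic &num=100 signature.
        return "likely measurement blip (&num=100 effect)"
    if click_delta < -click_tolerance:
        return "possible real loss - investigate rankings"
    return "no significant change"

print(classify({"clicks": 1000, "impressions": 50000},
               {"clicks": 980, "impressions": 32000}))
# likely measurement blip (&num=100 effect)
```

Steady clicks alongside a steep impression drop is the pattern to expect from this change; falling clicks are the signal that warrants a real ranking investigation.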

What this means for reporting (and what to change now)

  • Re-establish your KPIs: Expect lower impressions and fewer “keywords found,” especially on desktop, without a corresponding dip in traffic. Keep a close eye on sessions, clicks, click-through rate, and keyword coverage in positions 1–3 and 4–10.

  • Annotate September 2025: Add a change note around Sept 12–19 in dashboards so future trends aren’t misread and you can compare data with this change in mind.

  • Expect cost trade-offs: What SEO tools used to fetch in a single 100-result request now takes 10 paginated requests. That raises their compute costs and may eventually change what we pay for those tools.
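Some back-of-envelope maths makes the cost trade-off tangible. All the numbers here, including the per-request cost, are hypothetical; real provider pricing varies widely.

```python
# Hypothetical tracking portfolio: figures are illustrative, not vendor pricing.
keywords_tracked = 5_000
checks_per_month = 30
cost_per_request = 0.0002  # assumed unit cost in dollars

def monthly_requests(requests_per_check: int) -> int:
    return keywords_tracked * checks_per_month * requests_per_check

before = monthly_requests(1)   # one &num=100 request per check
after = monthly_requests(10)   # ten paginated requests per check

print(before, after)  # 150000 1500000
print(f"extra monthly cost: ${(after - before) * cost_per_request:,.2f}")
# extra monthly cost: $270.00
```

Even at a fraction of a cent per request, a tenfold request volume adds up quickly across a large keyword portfolio, which is why several vendors have capped tracking depth rather than absorb the full multiplier.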

Did our rankings plummet?

Probably not. Most drops are measurement artefacts (fewer bot-driven impressions and less deep-SERP data), not losses in user-facing visibility.

Why does average position look better while impressions fell?

Because impressions from results far down the SERP, which few users ever see, no longer inflate your numbers, your average position becomes a truer reflection of user reality.

Is this change going to be permanent?

Google has said it doesn’t support the parameter. Assume the change is here to stay and update workflows accordingly.

Jamie Beatty

With more than 8 years of experience, Jamie has helped brands create and execute high-performing SEO strategies. Starting as a Marketing Executive in 2017 and now heading Circulate’s SEO department, Jamie builds comprehensive organic roadmaps that deliver measurable growth, improve user acquisition, and increase revenue. Having worked at multiple agencies over the years, he’s helped big businesses evolve and increase their visibility on search engines. Previous clients of his include The Range, Downsing & Reynolds, and The Inn Collection Group. At Circulate, Jamie is in charge of the entire SEO department, leading the team to help an array of clients meet and exceed their KPIs through on-site and off-site SEO, as well as staying ahead of AI-led shifts in search and user behaviour, anticipating generative SERPs, evolving ranking signals, and cross-platform discovery. Jamie says: “With over 8 years in digital marketing, I’ve seen trends come and go, but disciplined, data-led SEO consistently moves the needle. At Circulate, we turn our clients’ expertise into visibility, building durable search foundations, capturing audience demand for how people search today, and ensuring long-term growth.”
