Google has quietly pulled the plug on a little-known feature that SEO pros and curious users loved – the num=100 parameter. This trick used to let you see 100 search results on a single page instead of the usual 10. While it may seem minor, this change is shaking up how people analyze search results and gather data online.
What Was the num=100 Trick?
The num=100 parameter was like a hidden cheat code for Google search. By adding &num=100 to a Google search URL, users could expand the results page to show 100 links at once. This wasn’t officially documented by Google, but it became a go-to tool for –
- SEO Monitoring – Professionals could track keyword rankings far beyond the first page.
- Data Collection – Researchers and AI developers could scrape large amounts of SERP data quickly.
- Competitive Analysis – Businesses could spy on competitors’ visibility more efficiently.
Essentially, it saved time and effort for anyone who needed more than the first 10 results. The sketch below shows how the URL was typically built.
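
To make the old trick concrete, here is a minimal Python sketch of how such a URL was typically assembled. The query is purely illustrative, and the parameter no longer has any effect on Google's page size:

```python
from urllib.parse import urlencode

def google_search_url(query: str, num: int = 100) -> str:
    """Build a Google search URL with the (now-retired) num parameter."""
    params = {"q": query, "num": num}
    return "https://www.google.com/search?" + urlencode(params)

# Before the change, this single URL returned up to 100 results;
# today Google ignores the parameter and serves the standard page size.
print(google_search_url("technical seo audit"))
# https://www.google.com/search?q=technical+seo+audit&num=100
```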

Why Its Removal Matters
Even though casual users might not notice, this change has a big impact on the digital world –
1. SEO Reporting Gets Tougher
With num=100 gone, tools that track keyword rankings now have to work harder. Many websites are seeing –
- Fewer tracked impressions in Google Search Console.
- Lower counts of ranking keywords, which makes performance look worse than it is.
- Shifts in average position, since deeper-ranked keywords no longer appear in the data.
This forces SEO experts to rely more on actual user interaction data rather than automated snapshots.
2. Data Collection Slows Down
AI developers and researchers who scrape Google results now face hurdles –
- Collecting the same number of results now takes multiple paginated requests (see the sketch after this list).
- Time and cost increase for projects that rely on mass SERP data.
- Some datasets may be incomplete, affecting AI models or research accuracy.
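
As a rough illustration, here is a hedged Python sketch of what "multiple requests" means in practice, using the widely known start offset parameter. Treat it as a picture of the workload, not a production scraper; real collection must also handle consent pages, CAPTCHAs and Google's terms of service:

```python
import time
from urllib.parse import urlencode

import requests  # assumed available; any HTTP client works

def fetch_serp_pages(query: str, depth: int = 100, page_size: int = 10):
    """Fetch raw SERP HTML page by page instead of one num=100 request."""
    pages = []
    for start in range(0, depth, page_size):
        params = {"q": query, "start": start}
        url = "https://www.google.com/search?" + urlencode(params)
        pages.append(requests.get(url, timeout=10).text)
        time.sleep(1)  # be polite: ten requests now replace one
    return pages

# Ten round-trips (and ten chances to be rate-limited) for the same
# 100 results that a single num=100 request used to return.
html_pages = fetch_serp_pages("best crm software")
```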
3. SEO Tools Have to Adapt
Tools like Ahrefs, Semrush and AccuRanker are adjusting by –
- Focusing more on the top 10–20 results.
- Increasing API requests, which raises costs (see the estimate after this list).
- Normalizing reports to reflect new limitations.
The ecosystem is shifting, but these adaptations help maintain useful insights.
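
To see why costs rise, assume a rank tracker that previously fetched 100 results in one call and now pages through ten at a time. Under that assumption, request volume grows roughly tenfold; a back-of-envelope sketch:

```python
def request_multiplier(depth: int = 100, old_page_size: int = 100,
                       new_page_size: int = 10) -> float:
    """Back-of-envelope estimate of the extra request volume."""
    old = -(-depth // old_page_size)   # ceiling division: 1 request
    new = -(-depth // new_page_size)   # ceiling division: 10 requests
    return new / old

# Tracking 1,000 keywords to a depth of 100 jumps from roughly
# 1,000 to 10,000 requests per run under these assumptions.
print(request_multiplier())  # 10.0
```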

Why Google Did This
Google hasn’t officially explained the change, but experts speculate –
- Better Accuracy – Cutting off bulk automated collection means metrics come closer to reflecting real user behavior.
- Improved User Experience – Users may explore results more thoroughly instead of scrolling through long lists.
- Anti-Scraping Measures – Making large-scale scraping harder protects Google’s search infrastructure.
In short, Google seems to be nudging everyone toward more natural, human-focused interactions.

How to Adjust
For marketers, SEO pros and AI researchers, the path forward includes –
- Focus on Engagement Metrics – Clicks, conversions and actual user behavior now matter more than raw rankings (see the Search Console sketch after this list).
- Update Tools and Workflows – Adapt scrapers and tracking methods to the new limits.
- Diversify Data Sources – Consider supplementing Google data with insights from other search engines or data platforms.
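
For the first point, one concrete option is the Search Console API, which reports clicks, impressions, CTR and average position directly. A minimal sketch, assuming you already have OAuth credentials for a verified property and the google-api-python-client package installed; the site URL and dates are placeholders:

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

def top_queries_by_clicks(creds, site_url: str = "https://example.com/"):
    """Pull click-based engagement data from the Search Console API.

    `creds` is assumed to be an authorized OAuth2 credentials object
    for a property you own; obtaining it is outside this sketch.
    """
    service = build("searchconsole", "v1", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": "2025-01-01",
            "endDate": "2025-01-31",
            "dimensions": ["query"],
            "rowLimit": 25,
        },
    ).execute()
    # Each row carries clicks, impressions, CTR and average position,
    # i.e. real user interaction data rather than scraped rank snapshots.
    return sorted(response.get("rows", []),
                  key=lambda r: r["clicks"], reverse=True)
```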
Being flexible and creative is the key to staying effective in this new landscape.
Curious to learn more tips, tricks, and hacks to level up your skills? Check out Free Skills Hub and dive in!
