Unseen Scraper Attacks Leave WordPress Sites Crawling, Analytics Blind
A massive automated scraping campaign has been causing intermittent slowdowns on a high-profile WordPress site for months, and standard analytics tools like Google Analytics never detected it. The culprit: Go-http-client/1.1, the default user agent of the Go programming language's standard HTTP client, hammering the server with more than 67,000 requests, all invisible to browser-based tracking.

"The client reported slowness for months, but every diagnostic showed normal traffic," said Alex Chen, lead engineer at SysWP, the server-monitoring firm that uncovered the attack. "We dug into raw server logs and found a flood of requests from a non-browser user agent, completely missed by Google Analytics, Plausible, and Fathom."
How the Attack Worked
The analysis revealed that a single user agent, Go-http-client/1.1, accounted for 99% of all unknown traffic, generating 67,323 hits in the observation window. The string identifies Go's standard HTTP library, which issues requests without a browser: no JavaScript runs, so analytics snippets never fire.
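The mechanism is easy to see in code. Here is a minimal Go sketch (the URL is a placeholder): the standard library sends Go-http-client/1.1 as the user agent unless the caller overrides it, and the HTML it receives is never executed, so any analytics snippet embedded in the page stays inert.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// http.Get uses Go's default client. With no User-Agent header set,
	// the request goes out as "Go-http-client/1.1" (or "Go-http-client/2.0"
	// over HTTP/2).
	resp, err := http.Get("https://example.com/") // placeholder URL
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	// The body is raw HTML. No JavaScript engine ever runs it, so a
	// Google Analytics or Plausible snippet inside the page never fires.
	fmt.Printf("fetched %d bytes, status %s\n", len(body), resp.Status)
}
```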
Each request consumed a PHP worker, hit the database, and forced a full WordPress page render. On a hosting plan with a limited number of concurrent workers, this caused intermittent resource contention. "Sometimes the site was fine; sometimes it crawled—classic symptoms of a scraper competing with real users for server capacity," Chen explained.
Background: Why Analytics Miss the Threat
Google Analytics and similar tools rely on a JavaScript snippet that executes only in a browser environment. Non-browser requests—from bots, scrapers, or API clients—never trigger the snippet, creating a blind spot. This leaves site owners unaware of huge traffic sources that degrade performance.
Server-level monitoring, such as SysWP Radar, captures all requests at the network layer before any client-side code runs. Only there did the full picture emerge: a hidden ecosystem of malicious and suspicious traffic, including axios/1.15.0 (308 hits) and other Node.js-based scrapers.
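SysWP Radar's internals are not public, but the principle is simple enough to sketch in a few lines of Go: middleware that records every request's user agent at the server, before any HTML or analytics JavaScript ever reaches the client. The handler and port below are illustrative assumptions, not the product's actual code.

```go
package main

import (
	"log"
	"net/http"
)

// logUserAgents records every request before a response is written,
// so bots and scrapers are counted whether or not they execute JavaScript.
func logUserAgents(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ua := r.UserAgent()
		if ua == "" {
			ua = "-"
		}
		log.Printf("%s %s %q", r.Method, r.URL.Path, ua)
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	log.Fatal(http.ListenAndServe(":8080", logUserAgents(mux)))
}
```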

What This Means for WordPress Site Owners
This incident demonstrates that slow site performance can originate from traffic your analytics never sees. The risks go beyond speed: systematic scraping can exhaust PHP worker pools, trigger 503 errors, inflate hosting costs, steal content for AI training or republishing, and even degrade Core Web Vitals scores.
"If you rely solely on Google Analytics, you're flying blind to the most damaging traffic," Chen warned. "Server-level analytics are no longer optional—they're essential for understanding what's really hitting your server."
Immediate Actions to Take
- Enable server-level logging (e.g., access.log) and analyze it for non-browser user agents (a sketch follows this list).
- Use a web application firewall (WAF) to block requests whose user agent matches known scraper libraries such as Go-http-client or axios.
- Consider adding a bot detection service that works at the server level.
- Monitor PHP worker usage in real time to spot contention before users complain.
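For the first item above, here is a minimal Go sketch that tallies user agents from an access.log in the common nginx/Apache combined format, where the user agent is the last double-quoted field on each line. The file path and the Mozilla/ prefix heuristic are assumptions to adapt to your host.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
	"strings"
)

// In the combined log format each line ends with "referer" "user-agent";
// grab the final quoted field as the user agent.
var uaRe = regexp.MustCompile(`"([^"]*)"$`)

func main() {
	f, err := os.Open("access.log") // path is an assumption; adjust for your host
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		if m := uaRe.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}

	type row struct {
		ua string
		n  int
	}
	var rows []row
	for ua, n := range counts {
		rows = append(rows, row{ua, n})
	}
	sort.Slice(rows, func(i, j int) bool { return rows[i].n > rows[j].n })

	for _, r := range rows {
		// Crude heuristic: real browsers almost always identify as "Mozilla/...".
		flag := ""
		if !strings.HasPrefix(r.ua, "Mozilla/") {
			flag = "  <-- non-browser"
		}
		fmt.Printf("%8d  %s%s\n", r.n, r.ua, flag)
	}
}
```

Run against logs like those from this incident, a report of this shape would have surfaced Go-http-client/1.1 and its 67,323 hits at the top of the list immediately.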
For a deeper dive, see the Background section above on why analytics miss this traffic and the What This Means section on the risks to your site.