Running a WordPress site comes with its fair share of unexpected challenges, particularly when it comes to balancing security with performance and visibility. One such case involved the popular plugin All In One WP Security & Firewall, a tool I’ve used faithfully for years—until it unexpectedly blocked Googlebot and wrecked my site’s search indexing almost overnight. This article walks through the problem, how it unfolded, and the detailed steps of the robots debugging checklist that helped piece it all back together.
TL;DR
All In One WP Security & Firewall, when misconfigured, blocked access to Googlebot, resulting in a significant drop in indexing and organic traffic. The issue stemmed from overly strict firewall settings that treated important crawlers like suspicious bots. This article details the diagnosis and the step-by-step debugging checklist used to restore normal search engine access. Follow this guide if you’re facing mysterious de-indexing or crawling issues.
How It All Started
Everything seemed normal until a routine check in Google Search Console revealed a red flag: indexed pages had fallen by more than 60% in just a few days. At first it looked like a temporary crawl delay, but the issue deepened. The Coverage tab showed multiple “Blocked by robots.txt” warnings and 403 (Forbidden) errors across critical URLs.
After verifying that robots.txt hadn’t changed and there were no new disallow directives, the suspicion shifted toward server-level restrictions. Sure enough, All In One WP Security had recently been updated and several new features had gone live. That’s where the investigation began.

A Closer Look at the Robot Blockage
The All In One WP Security plugin offers a wide range of security features—from login protection and IP blacklists to firewall rules. Digging into the Firewall > Basic Firewall Rules section revealed one major culprit: the “6G Blacklist” setting.
This feature, designed to block bad bots, was too aggressive. It included rules that filtered out certain user agents—which, unfortunately, also included Googlebot. Combine that with the “Advanced Character String Filter” and “Deny Bad Query Strings” settings, and it created a perfect storm of legitimate traffic loss masquerading as bot protection.
The Robots Debugging Checklist
To fix the issue and prevent future occurrences, the following checklist was developed and later shared with the team.
1. Verify Robots.txt
Start at the obvious spot by checking that your robots.txt wasn’t modified:
- Go to yourdomain.com/robots.txt (or fetch it from the command line, as shown below)
- Confirm there is no Disallow: / or other excessive path blocks
- Use Google Search Console’s robots.txt testing tool for syntax verification
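A quick spot check along those lines, where yourdomain.com is a placeholder for your own site:
# Fetch the live robots.txt and show any disallow rules
curl -s https://yourdomain.com/robots.txt | grep -i "disallow"
# Confirm the file itself returns a 200 rather than an error page
curl -s -o /dev/null -w "%{http_code}\n" https://yourdomain.com/robots.txt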
2. Inspect Google Search Console Crawl Errors
Check GSC to see what URLs are being affected and what types of error messages you’re receiving:
- Look for “Blocked by robots.txt”, “Access denied”, or “Submitted URL marked noindex”
- Address these warnings directly, especially those under the Indexing > Pages report
3. Test Like a Bot
Use tools to simulate how Googlebot sees your pages:
- TechnicalSEO.com’s Fetch & Render tool
- A curl request that spoofs the Googlebot user agent:
curl -A "Googlebot" https://yourdomain.com
If you’re served a 403 Forbidden, you likely have a user-agent or IP-based block restricting crawler access.
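To get a little closer to what Google actually sends, you can use one widely published form of Googlebot’s full user-agent string and inspect only the response headers (yourdomain.com is again a placeholder):
# Request the headers while identifying as Googlebot's desktop crawler
curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://yourdomain.com
# A 200 status means the crawler gets through; a 403 confirms a user-agent or IP block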
4. Inspect .htaccess Rules
All In One WP Security heavily modifies your .htaccess file. Be sure to:
- Back up and review your .htaccess for blocks on bots or query parameters
- Look for lines like RewriteCond %{HTTP_USER_AGENT} that target bots (a quick grep for these follows below)
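Assuming the file sits in the WordPress root, that review might look like this on the command line:
# Keep a copy before touching anything the plugin wrote
cp .htaccess .htaccess.bak
# Surface user-agent, query-string, and 6G-related rules with their line numbers
grep -n -E "HTTP_USER_AGENT|QUERY_STRING|6G" .htaccess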
5. Disable Security Features Temporarily
One by one, disable specific All In One WP Security features:
- Uncheck “6G Firewall Protection”
- Disable “Deny Bad Bots” (under Firewall > Internet Bots)
- Retest Googlebot crawl using a simulation tool
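After each toggle, a one-line status check shows whether the block has cleared before you move to the next setting (yourdomain.com is a placeholder):
# Prints only the HTTP status code returned to a Googlebot-identified request
curl -s -o /dev/null -w "%{http_code}\n" -A "Googlebot" https://yourdomain.com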
6. Whitelist Googlebot User Agent
If you must keep some firewall protections active, add rules to allow Googlebot:
<IfModule mod_rewrite.c>
RewriteEngine On
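# Keep these rules above any bot-blocking rules: the [L] flag stops further
# rewrite processing for requests whose user agent matches Googlebot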
RewriteCond %{HTTP_USER_AGENT} Googlebot [OR]
RewriteCond %{HTTP_USER_AGENT} Googlebot-Mobile
RewriteRule .* - [L]
</IfModule>
7. Monitor Server Logs
Check raw server logs or use a plugin like WP Activity Log to monitor suspicious activity and see exactly when bots are blocked or allowed. Watching these logs helped me catch 403 responses issued during Googlebot visits.
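On an Apache host, a quick log query along these lines shows whether Googlebot requests are being refused (the log path and format vary by server, so treat this as a sketch):
# List recent requests from a Googlebot user agent that received a 403
grep "Googlebot" /var/log/apache2/access.log | awk '$9 == 403' | tail -n 20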
Getting Reindexed
Once changes were made and Googlebot could freely access the site, a request for indexing was submitted via Search Console’s URL Inspection Tool. Within 72 hours, crawls resumed and index coverage began to recover gradually.
It still took over 10 days to return to previous indexing levels, but at least the slide had stopped and the primary issue was resolved. Updating the XML sitemap and pinging Google helped speed things up by prompting deeper crawling sessions.
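Resubmitting the sitemap can be done straight from Search Console; at the time, Google also accepted a simple ping request like the one below (yourdomain.com is a placeholder, and Google has since retired this ping endpoint):
# Ask Google to re-read the sitemap once crawler access is restored
curl -s "https://www.google.com/ping?sitemap=https://yourdomain.com/sitemap.xml"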
Lessons Learned
This whole situation taught a valuable lesson: “More secure” doesn’t always mean better for SEO. Bot blocking must be balanced carefully so that good crawlers aren’t treated like dangerous threats. Always test your configuration changes carefully, especially after plugin updates or site optimizations.
Here are some habit-level safeguards to implement:
- Always back up your .htaccess and firewall settings before modifying them
- After security updates, test crawl access with a user-agent simulator (a cron-friendly sketch follows this list)
- Set up anomaly alerts in Search Console to catch problems early
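To make that habit automatic, a small script like the following can run from cron and log a warning whenever a Googlebot-identified request stops returning 200 (the site URL and log file name are placeholders):
#!/bin/sh
# Check whether a Googlebot user agent still receives a 200 response
SITE="https://yourdomain.com"
STATUS=$(curl -s -o /dev/null -w "%{http_code}" -A "Googlebot" "$SITE")
if [ "$STATUS" != "200" ]; then
  echo "$(date): Googlebot check failed with HTTP $STATUS for $SITE" >> googlebot-check.log
  exit 1
fi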
FAQ: Common Questions About Bot Blocking and WordPress
- Can plugins like All In One WP Security really block Googlebot?
Yes. If configured aggressively, these plugins can block not just malicious bots but essential crawlers like Googlebot, Bingbot, or even Twitterbot (the crawler behind Twitter Cards).
- What are signs that bots are being blocked?
Index drops in Search Console, increased crawl errors, missing pages from search results, or 403 server errors when simulating a crawl.
- How can I safely block bad bots without affecting good ones?
Use fine-tuned allowlists for good user agents and IPs. Avoid blanket blocks based on user-agent strings alone. Services like Cloudflare also offer more nuanced bot filtering.
- How long does it take to recover indexing after bots are unblocked?
It varies: anywhere from a few days to several weeks, depending on crawl frequency, site authority, and page count. Ensure that sitemaps are accurate and up to date.
- Can blocking bots harm my SEO?
Absolutely. If search engine crawlers can’t access your site, they can’t index or rank your pages. That means lost visibility, lower traffic, and fewer conversions.
Maintaining site security is crucial, but not at the cost of findability. Make sure to audit your site’s bot access regularly and don’t let plugins block the very traffic you’re working to build.
