And I just want to know if you cross-checked the assumption he makes. Always cross-check.
Also, what would have helped in the article is naming a few sites where you can manually check whether your website is blocked at an infrastructure level.
I ran a few user sites through these tools:
- CrawlerCheck: Free Googlebot & AI Crawlability Test
- AI Bot Access Analyzer - Test AI Crawler Accessibility | Max Braglia
They all show up green.
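You can also run a quick manual check yourself. A minimal sketch in Python (the choice of user-agent token and the status codes I treat as "blocked" are my assumptions; real CDN blocks vary):

```python
import urllib.request
import urllib.error

def fetch_as_bot(url, user_agent="GPTBot"):
    """Request a URL with an AI-crawler user-agent and return the HTTP status.
    GPTBot is OpenAI's crawler token; swap in ClaudeBot, PerplexityBot, etc."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def looks_blocked(status):
    # CDN-level blocks and challenges usually surface as 401, 403, or 429
    return status in (401, 403, 429)

# Example (run against your own site):
# print(looks_blocked(fetch_as_bot("https://your-site.example")))
```

If the same URL returns 200 for a normal browser user-agent but 403 for a crawler token, that points to a server- or CDN-level block rather than robots.txt.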
Did a quick search and can confirm the assumption from multiple high-authority sources: AI bots are being blocked at the server and CDN level, often by default. Search for Cloudflare's "AIndependence" feature and the Hostinger data study.
But be aware that there are different types of AI bots. There is also a significant debate around "zero-click searches" and real traffic loss.
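Because each bot identifies itself with its own user-agent token, a robots.txt can block one and allow another. A small sketch with Python's standard robots.txt parser (the rules and URL are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks OpenAI's crawler, allows everyone else
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/page"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True
```

So a "green" result for Googlebot says nothing about whether GPTBot or ClaudeBot is allowed in.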
Another technical issue is JavaScript rendering: unlike Googlebot, most AI bots are not yet great at executing complex JS. This could be a short-term issue with Collections, as we rely on JS there.
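To illustrate what a non-rendering crawler sees: in the raw HTML the container is empty, and the content only exists after the script runs. A toy sketch (the page markup is invented for illustration):

```python
from html.parser import HTMLParser

# Hypothetical page: the visible text is injected by JS at runtime,
# so a crawler that does not execute JS only sees the empty container.
RAW_HTML = """
<html><body>
  <div id="collection"></div>
  <script>
    document.getElementById("collection").textContent = "Item 1, Item 2";
  </script>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script contents like a naive crawler."""
    def __init__(self):
        super().__init__()
        self.text = []
        self.in_script = False
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

p = TextExtractor()
p.feed(RAW_HTML)
print(p.text)  # [] — no visible content without JS execution
```

A JS-executing crawler like Googlebot would see "Item 1, Item 2"; a simple fetch-and-parse bot gets an empty list.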
Since we do not charge for clicks or traffic, though, the constant crawler hits will not cost you more.
Unless you have a CDN like Cloudflare in front of your website, you "should" be fine.
I'm still awaiting an answer from Hetzner on whether they block any AI bots, just to get my "yes" close to 100%.