Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.
Edit: As Thunraz points out below, there’s a footnote that reads “Numbers after + are successful hits on ‘robots.txt’ files” — so it’s not scientific notation.


You just solve it, as per the blog post, because it’s trivial to solve: your browser is literally doing so in a slow language on a potentially slow CPU. By default it’s only solving 5 digits of the hash.
If a phone running JavaScript in the browser has to be able to solve it, you can’t just crank up the complexity. Real humans will only wait tens of seconds, if that, before giving up.
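For anyone curious what “solving 5 digits of the hash” amounts to, here’s a minimal sketch in Python. It assumes an Anubis-style scheme where you brute-force a nonce until the SHA-256 of challenge+nonce starts with N leading zero hex digits; the function and parameter names are illustrative, not the actual implementation.

```python
import hashlib

def solve_pow(challenge: str, difficulty: int = 5) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) begins with
    `difficulty` zero hex digits. Illustrative sketch, not the real code."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1
```

Expected work grows as 16**difficulty hashes on average (~1M attempts at 5 digits), which is exactly why the difficulty can’t just be cranked up: the scraper pays the same cost once per challenge that the phone in your pocket has to pay in-browser.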