Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?

Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.

Edit: As Thunraz points out below, there’s a footnote that reads “Numbers after + are successful hits on ‘robots.txt’ files” and not scientific notation.

  • Thunraz@feddit.org · 2 months ago

    It’s 12181 hits, and the number behind the plus sign is the count of robots.txt hits. See the footnote at the bottom of your screenshot.

    • benagain@lemmy.ml (OP) · 2 months ago

      Phew, so I’m a dumbass and not reading it right. I wonder how they’ve managed to use 3MB per visit?
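      A quick back-of-envelope check using only the figures in the thread (100GB of bandwidth, 12181 bot hits): if crawlers alone accounted for the whole transfer, each hit would average roughly 8MB. The real per-visit number AWStats reports (~3MB) would depend on the total visit count across all user agents, which isn't shown here, so this is just a sanity-check sketch:

      ```python
      # Hypothetical sanity check using the two numbers from the screenshot:
      # 100 GB of monthly transfer and 12181 'Unknown robot' hits.
      bandwidth_bytes = 100e9   # 100 GB (decimal units)
      bot_hits = 12181

      # Average transfer per bot hit, if bots caused all of it.
      per_hit_mb = bandwidth_bytes / bot_hits / 1e6
      print(f"{per_hit_mb:.1f} MB per hit")  # roughly 8.2 MB per hit
      ```

      Even a text-heavy page with a couple of images rarely weighs that much, which is why runaway crawler traffic (re-fetching pages, feeds, and assets without caching) is a plausible culprit for the bandwidth warning.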