• Strawberry@lemmy.blahaj.zone
      16 days ago

      The bots scrape costly endpoints like the entire edit histories of every page on a wiki. You can’t always just cache every possible generated page at the same time.

    • LiveLM@lemmy.zip
      16 days ago

      I’m sure that if it was that simple people would be doing it already…

    • nutomic@lemmy.ml
      16 days ago

      Cache space is finite, so a cache typically holds only the most recently viewed pages. But these bots go through every single page on the website, including old ones that users never view. Since they send only one request per page, caching doesn’t really help.
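
      A minimal sketch of why that is, using a toy LRU cache (the page counts and capacity are made-up numbers for illustration): normal users revisit a small set of popular pages, so most requests hit the cache, while a crawler requesting every page exactly once never hits it.

      ```python
      from collections import OrderedDict

      class LRUCache:
          """Toy LRU page cache that tracks hit/miss counts."""
          def __init__(self, capacity):
              self.capacity = capacity
              self.store = OrderedDict()
              self.hits = 0
              self.misses = 0

          def get_page(self, page_id):
              if page_id in self.store:
                  self.store.move_to_end(page_id)  # mark as recently used
                  self.hits += 1
              else:
                  self.misses += 1
                  if len(self.store) >= self.capacity:
                      self.store.popitem(last=False)  # evict least recently used
                  self.store[page_id] = f"rendered page {page_id}"

      # Normal traffic: users repeatedly view the same 50 popular pages.
      user_cache = LRUCache(capacity=100)
      for _ in range(10):
          for page in range(50):
              user_cache.get_page(page)

      # Crawler traffic: one request per page, across a 100,000-page site.
      bot_cache = LRUCache(capacity=100)
      for page in range(100_000):
          bot_cache.get_page(page)

      user_rate = user_cache.hits / (user_cache.hits + user_cache.misses)
      bot_rate = bot_cache.hits / (bot_cache.hits + bot_cache.misses)
      print(f"user hit rate: {user_rate:.0%}, bot hit rate: {bot_rate:.0%}")
      # user hit rate: 90%, bot hit rate: 0%
      ```

      Every crawler request is a cache miss, so every one of those old pages has to be rendered from scratch, which is exactly the costly work the cache was supposed to avoid.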