We have always had bots visiting our website. Most of them were kind bots, like the crawlers that keep the databases of search engines up-to-date. Those kind bots start by looking at our robots.txt files before doing anything else, and they respect the restrictions set in those files.
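For readers who have never seen one: a robots.txt file is just a plain-text file at the root of a website that tells crawlers which parts of the site they may visit. The sketch below is purely illustrative; the paths and bot name are made up and are not our actual rules. Note that Crawl-delay is a widely recognised extension rather than part of the official standard.

```
# Illustrative robots.txt, served at https://example.org/robots.txt
# Rules for all crawlers that identify themselves honestly:
User-agent: *
Disallow: /cgi-bin/    # keep bots out of expensive dynamic pages
Crawl-delay: 10        # please wait 10 seconds between requests

# A specific crawler can be banned from the whole site:
User-agent: GreedyExampleBot
Disallow: /
```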
However, things have changed. Like other websites, Wikipedia for instance, we are increasingly being visited by AI scrapers: bots that scrape the Internet for anything they can find to train AI applications. They are usually extremely hungry for information, so they download much, much more than an ordinary user would. Moreover, many of them are impolite: they don’t respect the rules set in our robots.txt files, they hide who they really are, and they don’t pause between requests; on the contrary, they hammer our servers with simultaneous requests from a large number of different IP addresses. The result is that parts of mageia.org, like our Bugzilla, Wiki and Forums, become unreachable.
Below you can see the CPU load of one of our most important servers, which hosts, amongst other things, our forums and wiki:

[Figure: CPU load graph of the server]
Even if our infrastructure upgrade had already been finished, this would be very hard to mitigate.
Blocking the offending IP addresses is useless, because the scrapers constantly switch to new ones. One of our sysadmins just told me about a big part of the problem: “mobile proxies”, where bots route their requests through unsuspecting users’ phones. That makes the requests look much more legitimate, and hard to block without also blocking real users. A lot of this happens without users even knowing their phone is being used like this: some applications bundle a proxy with a game or other app and hide that fact in the fine print of the terms of service. Last year, it was reported that Google had removed a number of such applications from its store.
Apart from phones, there are IoT devices and ordinary computers that ended up in botnets because they were not well protected. They can be used for AI scraping, and probably are being used that way right now.
Our sysadmins do succeed, time and again, in mitigating the problem, but it is a cat-and-mouse game, so the problem is likely to recur.
If you know people working on AI applications that need training data, please ask them to make sure their bots read and respect the robots.txt files they encounter. And, of course, please nudge your friends and family, when you think they need it, to make sure their computers and other smart devices get all security updates as soon as they are released.
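For developers who want to do the right thing, respecting robots.txt takes only a few lines of code. Here is a minimal sketch using nothing but Python’s standard library; the bot name and URLs are hypothetical examples, not anyone’s real crawler:

```python
import time
import urllib.request
import urllib.robotparser

USER_AGENT = "ExampleTrainingBot/1.0"  # hypothetical bot name

# Fetch and parse the site's robots.txt before requesting anything else.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.org/robots.txt")
rp.read()

url = "https://www.example.org/forum/some-thread"
if rp.can_fetch(USER_AGENT, url):
    # Identify the bot honestly instead of hiding behind a fake user agent.
    request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(request) as response:
        page = response.read()
    # Honour the site's requested pause between requests (fall back to 5 s).
    time.sleep(rp.crawl_delay(USER_AGENT) or 5)
else:
    print("robots.txt disallows fetching", url)
```

That is all it takes: check the rules, identify yourself, and pause between requests.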