I think it would be a good idea to protect this forum from AI scrapers, bots, and DDoS attacks by putting it behind Anubis.
Quoting from their own description:
> Anubis is a Web AI Firewall Utility that weighs the soul of your connection using one or more challenges in order to protect upstream resources from scraper bots.
>
> This program is designed to help protect the small internet from the endless storm of requests that flood in from AI companies. Anubis is as lightweight as possible to ensure that everyone can afford to protect the communities closest to them.
>
> Anubis is a bit of a nuclear response. This will result in your website being blocked from smaller scrapers and may inhibit “good bots” like the Internet Archive. You can configure bot policy definitions to explicitly allowlist them and we are working on a curated set of “known good” bots to allow for a compromise between discoverability and uptime.
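For instance, allowlisting the Internet Archive's crawler in Anubis' bot policy file would look roughly like this (a sketch based on the documented policy format; the user-agent regex is illustrative, check the bot's real user agent before relying on it):

```yaml
# botPolicies.yaml -- sketch of an Anubis bot policy.
# The regex below is illustrative, not a verified user-agent string.
bots:
  - name: internet-archive
    user_agent_regex: archive\.org_bot
    action: ALLOW
```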
See https://anubis.techaro.lol/docs/design/how-anubis-works/ for how it works under the hood.
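The short version: Anubis runs as a reverse proxy in front of the site and makes each new browser solve a small SHA-256 proof-of-work puzzle in a WebWorker before it gets a session cookie. That is cheap for one human but expensive at scraper scale. Here is a minimal sketch of that style of puzzle (my own illustration in Go, not Anubis' actual code; the encoding details are assumptions):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// solve searches for a nonce such that sha256(challenge + nonce) starts
// with `difficulty` zero hex digits. This is the general shape of the
// proof-of-work that Anubis' client-side JavaScript performs; the exact
// encoding here is illustrative.
func solve(challenge string, difficulty int) (nonce int, digest string) {
	prefix := strings.Repeat("0", difficulty)
	for nonce = 0; ; nonce++ {
		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
		digest = hex.EncodeToString(sum[:])
		if strings.HasPrefix(digest, prefix) {
			return nonce, digest
		}
	}
}

func main() {
	// A real deployment sends the challenge from the server and verifies
	// the returned nonce there; difficulty tunes how costly access is.
	nonce, digest := solve("example-challenge", 4)
	fmt.Printf("nonce=%d digest=%s\n", nonce, digest)
}
```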
Big websites and projects like the Arch Wiki, GNOME's GitLab, UNESCO, the Linux kernel mailing list archives, and Codeberg also use Anubis to protect their resources.
Like anything, this has some downsides (e.g. it requires JavaScript and WebWorkers to access the website, though both are enabled by default in mainstream browsers), but it is more a necessary reaction to recent trends on the internet.
I don’t have access to the technical side of this Discourse instance, but I could probably provide guidance, and I don’t think the setup is too hard.
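For a rough idea of the deployment shape, Anubis would sit between the existing reverse proxy and Discourse itself. Something like the following Docker Compose fragment (a sketch: the image name and the BIND/TARGET/DIFFICULTY settings come from the Anubis docs, but the upstream service name and port are placeholders for this forum's real setup):

```yaml
# Sketch only: Anubis proxying to a Discourse container.
# "discourse:80" is a placeholder for the real upstream address.
services:
  anubis:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8923"                 # where Anubis listens
      TARGET: "http://discourse:80" # upstream it protects (placeholder)
      DIFFICULTY: "4"               # proof-of-work difficulty
    ports:
      - "8923:8923"
```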