Protect your server infrastructure from the growing threat of aggressive AI web scrapers that cause downtime and resource exhaustion. This anti-bot solution uses a Hashcash-style Proof-of-Work scheme to deter automated scraping: by requiring a small computational effort from the visitor's browser, the load stays unnoticeable for individual human users but becomes prohibitively expensive for mass scraping operations.

Key Features:
* Server Protection: Prevent automated bots from overwhelming your hosting resources.
* Proof-of-Work Defense: Use a mathematical challenge system that stops scrapers.
* Advanced Fingerprinting: Benefit from ongoing development aimed at headless browser identification.

Keep your website fast, stable, and accessible to humans without compromising on security.
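The Proof-of-Work idea above can be sketched in a few lines. This is a minimal illustration of a Hashcash-style challenge, not the project's actual implementation: the `solve`/`verify` functions, the nonce encoding, and the `DIFFICULTY_BITS` value are all assumptions for demonstration. The server issues a random challenge, the browser searches for a nonce whose SHA-256 hash has enough leading zero bits, and the server verifies the result with a single hash.

```python
import hashlib
import os
from itertools import count

# Assumed difficulty: each extra bit doubles the average work for the solver,
# while verification always costs one hash.
DIFFICULTY_BITS = 12

def leading_zero_bits(digest: bytes) -> int:
    """Count the number of leading zero bits in a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def solve(challenge: bytes, difficulty: int = DIFFICULTY_BITS) -> int:
    """Client side: brute-force a nonce until the hash meets the difficulty target."""
    for nonce in count():
        digest = hashlib.sha256(challenge + str(nonce).encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce

def verify(challenge: bytes, nonce: int, difficulty: int = DIFFICULTY_BITS) -> bool:
    """Server side: one hash confirms the client did the work."""
    digest = hashlib.sha256(challenge + str(nonce).encode()).digest()
    return leading_zero_bits(digest) >= difficulty

# Demo round-trip with a fresh random challenge.
challenge = os.urandom(16)
nonce = solve(challenge)
assert verify(challenge, nonce)
```

The asymmetry is the whole defense: a single human visit costs a few thousand hashes, which a browser does in milliseconds, but a scraper fetching millions of pages pays that cost millions of times over.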
services:
  deluge:
    image: lscr.io/linuxserver/deluge:latest
    container_name: deluge
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
      - DELUGE_LOGLEVEL=error
    volumes:
      - ./config:/config
      - ./downloads:/downloads
    ports:
      - 8112:8112
      - 6881:6881
      - 6881:6881/udp
      - 58846:58846
    restart: unless-stopped
# No sensitive credentials are required for the default configuration.
# Infrastructure settings (PUID, PGID, TZ) are configured directly in docker-compose.yml.