author    Jason A. Donenfeld <Jason@zx2c4.com>  2013-05-28 14:17:00 +0200
committer Jason A. Donenfeld <Jason@zx2c4.com>  2013-08-12 13:14:10 -0600
commit    23debef62104c70600be2b745ec3957538eeac6e (patch)
tree      3b68eab7624907836b0e614328a529e686224830 /filters/html-converters/txt2html
parent    use favicon by default (diff)
download  cgit-pink-23debef62104c70600be2b745ec3957538eeac6e.tar.gz
          cgit-pink-23debef62104c70600be2b745ec3957538eeac6e.zip
robots.txt: disallow access to snapshots
My dmesg is filled with the oom killer bringing down processes while the
Bingbot downloads every snapshot for every commit of the Linux kernel in
tar.xz format. Sure, I should be running with memory limits, and now I'm
using cgroups, but a more general solution is to prevent crawlers from
wasting resources like that in the first place.

Suggested-by: Natanael Copa <ncopa@alpinelinux.org>
Suggested-by: Julius Plenz <plenz@cis.fu-berlin.de>
Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
Diffstat (limited to 'filters/html-converters/txt2html')
0 files changed, 0 insertions, 0 deletions
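
The commit's approach is to publish a robots.txt that forbids crawling of snapshot URLs, which cgit serves under paths of the form /<repo>/snapshot/<name>. A minimal sketch of such a robots.txt (the exact paths depend on the cgit virtual-root configuration, and the patch contents are not shown in this diffstat):

```
User-agent: *
Disallow: /*/snapshot/*
Allow: /
```

Note that the `*` wildcard inside a path is a de-facto extension to the original robots.txt convention; major crawlers such as Bingbot and Googlebot honor it, but not every crawler does.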