Category: LINUX
2015-03-12 11:23:44
Since I run a site aimed at domestic (Chinese) users, I don't want foreign spiders crawling it, especially certain junk spiders that hit it very frequently. Once this junk traffic adds up, it seriously wastes the server's bandwidth and resources. By checking the User-Agent and blocking these spiders in nginx, we can save some traffic and also fend off some malicious access.
1. Change to nginx's configuration directory, e.g. cd /usr/local/nginx/conf
2. Create an agent_deny.conf configuration file with the following contents:
#Block crawling by Scrapy and similar tools
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
    return 403;
}

#Block the listed user agents as well as requests with an empty User-Agent
if ($http_user_agent ~ "FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|HttpClient|MJ12bot|heritrix|EasouSpider|LinkpadBot|Ezooms|^$") {
    return 403;
}

#Block requests that use methods other than GET, HEAD, or POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
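As an aside, the same rules can also be collected into a single map in the http {} block, which keeps the number of if blocks down. Below is a minimal sketch; the variable name $block_ua and the shortened spider list are just for illustration, and the if block still has to live in a server {} or location {} context:

#In the http {} block: map the User-Agent to a flag variable
map $http_user_agent $block_ua {
    default                                                            0;
    ~*(Scrapy|Curl|HttpClient)                                         1;
    ~(FeedDemon|JikeSpider|AhrefsBot|YisouSpider|MJ12bot|EasouSpider)  1;
    ""                                                                 1;   #empty User-Agent
}

#Then in the server {} or location {} block:
if ($block_ua) {
    return 403;
}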
3. In the relevant site configuration file, insert the line "include agent_deny.conf;", for example inside the PHP location block:
location ~ [^/]\.php(/|$)
{
    try_files $uri =404;
    fastcgi_pass unix:/tmp/php-cgi.sock;
    fastcgi_index index.php;
    include fastcgi.conf;
    include agent_deny.conf;
}
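If you want the blacklist to cover every request to the site and not just PHP requests, the include can also be placed directly in the server {} block, since the if directive is valid in server context as well. A minimal sketch, with the domain and paths as placeholders:

server {
    listen 80;
    server_name www.example.com;
    root /data/wwwroot/example;

    include agent_deny.conf;   #applies to all locations under this server

    #... the rest of the site configuration ...
}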
4. Reload nginx:
/etc/init.d/nginx reload
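If nginx is installed under /usr/local/nginx as assumed in step 1, you can also check the configuration first and then reload through the binary itself:

/usr/local/nginx/sbin/nginx -t          #test the configuration syntax
/usr/local/nginx/sbin/nginx -s reload   #reload without dropping connections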
Use curl to simulate spider requests and verify the rules. (The target URL is omitted from the captured commands below; point them at your own site.)
root@vm-199:~# curl -I -A "BaiduSpider"
HTTP/1.1 200 OK
Server: nginx
Date: Mon, 09 Feb 2015 03:37:20 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Vary: Accept-Encoding
X-Powered-By: PHP/5.5.19
Vary: Accept-Encoding, Cookie
Cache-Control: max-age=3, must-revalidate
WP-Super-Cache: Served super file from PHP
root@vm-199:~# curl -I -A "JikeSpider"
HTTP/1.1 403 Forbidden
Server: nginx
Date: Mon, 09 Feb 2015 03:37:44 GMT
Content-Type: text/html
Content-Length: 162
Connection: keep-alive
root@vm-199:~# curl -I -A ""
HTTP/1.1 403 Forbidden
Server: nginx
Date: Mon, 09 Feb 2015 03:37:52 GMT
Content-Type: text/html
Content-Length: 162
Connection: keep-alive
The effect can also be seen on the nginx side, in the access log.
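Assuming the default combined log format, a blocked request would show up in access.log roughly like this (the IP and timestamp are made up for illustration):

1.2.3.4 - - [09/Feb/2015:11:37:44 +0800] "HEAD / HTTP/1.1" 403 0 "-" "JikeSpider"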
At this point, blocking spider access in nginx by checking the User-Agent is complete. You can add, remove, or modify the spider entries in agent_deny.conf according to your own situation.