
Nginx server


nginx.conf configuration

Tip: locate the corresponding directives in your own nginx.conf and adjust them as needed.
worker_processes  auto;    # number of worker processes: adjust to match the hardware

worker_connections  1024;    # maximum number of connections per worker process


# Configure multiple sites: add this line inside the http block
include /path/*.conf;    # this is the key line to add; skip it if it is already present
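For reference, the include line normally sits alongside the other http-level directives; a minimal sketch, assuming the per-site files live in a vhosts directory (the path is an example, not a requirement):

http {
    include       mime.types;
    default_type  application/octet-stream;

    # assumed example: every .conf in this directory becomes a virtual host
    include /usr/local/nginx/conf/vhosts/*.conf;
}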


# Add a server block to the newly created .conf file
server {
        listen 80;
        server_name your-domain;
        location / {
            root    /path/to/site;
            index   index.html;
        }
        error_page 404 /404.html;
        location = /404.html {
            root   html;
        }
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }
}

 

Reverse proxy and static website

# Reverse proxy
server {
        listen       80;
        server_name  your-domain;
        location / {
            proxy_pass   http://127.0.0.1:8096;
        }
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }
}
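When proxying, the backend usually also needs the original Host header and client IP; a minimal sketch of the headers commonly added inside the same location (the directives are standard nginx, the values are typical defaults):

location / {
        proxy_pass         http://127.0.0.1:8096;
        # pass the original host and client address through to the backend
        proxy_set_header   Host              $host;
        proxy_set_header   X-Real-IP         $remote_addr;
        proxy_set_header   X-Forwarded-For   $proxy_add_x_forwarded_for;
}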
 
 
# Static website
server {
        listen       80;
        server_name  your-domain;
        root    F:/resources;
        index  index.html index.htm;
}

 

Cross-origin requests (CORS)

# Allow cross-origin requests in nginx
# * accepts requests from any origin
add_header Access-Control-Allow-Origin *;
# Whether cookies may be sent (note: browsers refuse credentials when the allowed origin is *, so use an explicit origin in that case)
add_header Access-Control-Allow-Credentials 'true';
# Allowed request methods
add_header Access-Control-Allow-Methods 'GET,POST,OPTIONS';
# Request headers permitted in preflight requests
add_header Access-Control-Allow-Headers 'DNT,X-Mx-ReqToken,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
# Response headers exposed to the client; custom headers cannot be read by scripts unless listed here
add_header Access-Control-Expose-Headers 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Content-Range,Range';
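Browsers send an OPTIONS preflight before non-simple requests; a common pattern is to answer it directly in the location instead of passing it to the backend. A sketch, assuming a hypothetical /api/ location and the backend from the reverse-proxy example above:

location /api/ {
        # answer preflight requests immediately with no body
        if ($request_method = 'OPTIONS') {
            add_header Access-Control-Allow-Origin *;
            add_header Access-Control-Allow-Methods 'GET,POST,OPTIONS';
            add_header Access-Control-Allow-Headers 'DNT,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
            return 204;
        }
        proxy_pass http://127.0.0.1:8096;    # assumed backend
}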

Note: the target URL is simply the domain/IP of the backend service (Tomcat in this example).

Nginx anti-crawler handling

1. Go to the conf directory under the nginx installation directory and save the following as agent_deny.conf:

# Block scraping tools such as Scrapy
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
     return 403;
}
# Block the listed user agents as well as empty user agents
if ($http_user_agent ~* "FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$" ) {
     return 403;
}
# Block request methods other than GET|HEAD|POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
# Block Baidu, Google, and other search-engine crawlers
if ($http_user_agent ~* "qihoobot|Baiduspider|Googlebot|Googlebot-Mobile|Googlebot-Image|Mediapartners-Google|Adsbot-Google|Feedfetcher-Google|Yahoo! Slurp|Yahoo! Slurp China|YoudaoBot|Sosospider|Sogou spider|Sogou web spider|MSNBot|ia_archiver|Tomato Bot") {
    return 403;
}
2. Then insert the following line into the server block of the relevant site configuration:

include agent_deny.conf;
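For example, placed inside the virtual host defined earlier (a sketch; the rules then apply to every request that server block handles):

server {
        listen 80;
        server_name your-domain;
        include agent_deny.conf;    # the blocking rules from step 1
        location / {
            root    /path/to/site;
            index   index.html;
        }
}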

3. After saving, run the following command to gracefully reload nginx:

/usr/local/nginx/sbin/nginx -s reload
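It is worth validating the configuration first; assuming the same installation path as above:

/usr/local/nginx/sbin/nginx -t    # check the configuration for syntax errors before reloading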

 

Load balancing

upstream configuration:

Add an upstream block under the http configuration:

upstream nodes {
    # weight sets the distribution ratio: with weight=2 the first server receives
    # two requests for every one sent to the second (the default weight is 1)
    server 192.168.0.1:8080 weight=2;
    server 192.168.0.2:8080;
}
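Besides weighted round-robin, nginx ships with other strategies such as least_conn and ip_hash; a minimal sketch of ip_hash, which keeps each client on the same backend (useful for sticky sessions):

upstream nodes {
    ip_hash;    # requests from the same client IP always go to the same server
    server 192.168.0.1:8080;
    server 192.168.0.2:8080;
}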


# server configuration
location / {
        proxy_pass http://nodes;
        index  index.html index.htm;
}
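Put together, the proxied location lives inside a server block in the same http context as the upstream; a sketch with an assumed domain:

server {
        listen       80;
        server_name  your-domain;
        location / {
            # requests are distributed across the servers listed in the "nodes" upstream
            proxy_pass   http://nodes;
        }
}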