Reaching the MaxRequestWorkers limit due to bot traffic (ERR_CONNECTION_TIMED_OUT)
I'm getting ERR_CONNECTION_TIMED_OUT: the site went down for a few minutes at a time, repeatedly, over a window of about 1.5 hours. This is what I found in the logs:
[Mon May 21 11:21:51.236380 2018] [mpm_worker:error] [pid 3206:tid 140035442734330] AH00287: server is within MinSpareThreads of MaxRequestWorkers, consider raising the MaxRequestWorkers setting
I went to /etc/apache2/conf/httpd.conf and found this:
MaxRequestWorkers [The number specified for my server]
I just switched servers yesterday, and this is one theory from a web expert about the cause of the problem:
The server is hitting MaxRequestWorkers due to the type of traffic
coming in, which is bot traffic. BingBot and Googlebot are crawling
your site again, likely because of the new server. You can manage the
crawl rate of these bots so that they do not request so much at any
given time.
Does this explanation make sense? I find it hard to believe.
Edit 1: This sounds like a denial-of-service (DDoS) attack launched by Google! There are even articles discussing how bot traffic can affect websites:
- http://support.hostgator.com/articles/specialized-help/telling-google-how-often-to-crawl-your-website
- https://support.google.com/webmasters/answer/48620?hl=en
- http://support.hostgator.com/articles/specialized-help/telling-bing-how-often-to-crawl-your-website
- http://support.hostgator.com/articles/hosting-guide/lets-get-started/how-to-use-robots-txt
- http://support.hostgator.com/articles/specialized-help/technical/apache-htaccess/user-agent-blocks-mainly-for-bots
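The articles above describe throttling crawlers via robots.txt. As a minimal sketch, a `Crawl-delay` directive can slow down Bingbot; note that Googlebot ignores `Crawl-delay` (Google's crawl rate is managed through Search Console instead), and the delay value and disallowed path here are examples, not recommendations:

```
# robots.txt — throttle well-behaved crawlers
# Bingbot honors Crawl-delay; Googlebot does not (use Search Console instead)
User-agent: bingbot
Crawl-delay: 10

# Example: keep all bots out of an area that need not be crawled
User-agent: *
Disallow: /admin/
```

This only helps with crawlers that obey robots.txt; abusive bots would need user-agent or IP blocks as described in the last link above.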
I'm not sure whether bot traffic was really the cause of my problem, but the error message includes the phrase "consider raising the MaxRequestWorkers setting". That is what I did, and everything now appears to be working normally.
Important: if you do this, don't forget to read https://httpd.apache.org/docs/current/mod/mpm_common.html and remember that if you increase MaxRequestWorkers, you must increase ServerLimit accordingly. The documentation explains this.
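Concretely, for the worker MPM, MaxRequestWorkers cannot exceed ServerLimit × ThreadsPerChild, which is why raising one usually means raising the other. A sketch of the relevant directives follows; the numbers are illustrative only and should be sized to your server's RAM and per-thread memory use:

```
# worker MPM sizing (illustrative values, not a recommendation)
<IfModule mpm_worker_module>
    ServerLimit          24    # maximum number of child processes
    ThreadsPerChild      25    # threads each child process creates
    MaxRequestWorkers   600    # must be <= ServerLimit * ThreadsPerChild (24 * 25 = 600)
    MinSpareThreads      75    # the AH00287 warning fires as free threads approach this floor
    MaxSpareThreads     250
</IfModule>
```

If MaxRequestWorkers is set higher than ServerLimit × ThreadsPerChild allows, Apache caps it and logs a warning at startup.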