Have you considered using robots.txt to tell the bots to stay away from those error pages?

Daniel

On 31 Jul 2008, at 11:27, Matthew Rudy Jacobs wrote:

> You mean you get exceptions from bots randomly causing 500s?
>
> Yeah... I don't know what to do about that.
>
> Just because it's a bot doesn't mean its request is invalid.
> For example, Google indexes a lot of pages that you want to work, but that may not get hit very often.
>
> If you have proper regexp inbox rules, then perhaps you can just put mails whose body doesn't match "Mozilla|MSIE|Safari" into an "exceptions (maybe from a bot)" mail folder.
>
> :s
>
> 2008/7/31 Andrea (Q) <q@ptumpa.com>:
>
>> Hi,
>>
>> Every day I receive a lot of exceptions, but most of them are from bots. Is there a way to manage that situation? I don't want to receive the exceptions raised by a bot. I think I could read the request headers and the HTTP_USER_AGENT field, but maybe there is something better.
>>
>> Cheers
>>
>> Andrea
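
For the robots.txt idea, a minimal sketch could look like the following; the paths here are invented for the example, so substitute whatever URLs the bots keep tripping over (and bear in mind that only well-behaved crawlers honour it):

    # hypothetical paths that keep raising exceptions when crawled
    User-agent: *
    Disallow: /search
    Disallow: /archive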
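
And for the HTTP_USER_AGENT idea Andrea mentions, a rough sketch in plain Ruby, not tied to any particular exception-notification plugin; the method name and constant are made up for illustration, and the regexp is the same "Mozilla|MSIE|Safari" one suggested above for mail rules:

    # Heuristic only: treat a request as browser traffic if its
    # User-Agent matches the usual browser tokens. Bots that spoof a
    # browser User-Agent will still slip through.
    BROWSER_PATTERN = /Mozilla|MSIE|Safari/

    def looks_like_a_browser?(user_agent)
      user_agent.to_s =~ BROWSER_PATTERN ? true : false
    end

    # In Rails you would pass in request.env['HTTP_USER_AGENT'] and only
    # send the notification mail when this returns true.
    puts looks_like_a_browser?('Googlebot/2.1 (+http://www.google.com/bot.html)')   # false
    puts looks_like_a_browser?('Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_4)') # true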