1) Are you sure it's not a search engine crawler? If that's the case, you could modify your robots.txt file to warn it off (I forget the finer directives, but the basics are below).
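A minimal robots.txt that asks every compliant crawler to stay out of the whole site looks like this (it's purely advisory, so badly behaved bots will ignore it):

User-agent: *
Disallow: /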
2) You could cut them off at the firewall, which is probably what I would do if I couldn't confirm it was a search spider/crawler. But I know from running my own servers that this is an endless task.
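For example, assuming a Windows server (on Linux, iptables/nftables do the same job), a single rule like this drops all inbound traffic from one address; the rule name and the IP are placeholders:

netsh advfirewall firewall add rule name="Block abusive IP" dir=in action=block remoteip=203.0.113.45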
3) If you want to ensure the program isn't being penetrated, build a file of blocked IPs, and when someone clicks to log in, check it in the ValidateUpdate embed and choose the destination, something like:
BLK:Ip = p_web.RequestData.FromIP                      ! prime the key field with the visitor's IP
SET(BLK:ByIp,BLK:ByIp)                                 ! position on a key ordered by IP (not SysID)
NEXT(BlockedIP)                                        ! fetch the first record at or after that IP
IF ~ERRORCODE() AND BLK:Ip = p_web.RequestData.FromIP  ! blocked IP address match
  !GO SOMEWHERE
ELSE
  !GO TO HOME PAGE
END
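For reference, that snippet assumes a BlockedIP file shaped roughly like this; the driver, field sizes, and key names are illustrative, so match them to your own dictionary:

BlockedIP   FILE,DRIVER('TOPSPEED'),PRE(BLK),CREATE,THREAD
BySysID       KEY(BLK:SysID),PRIMARY        ! original primary key
ByIp          KEY(BLK:Ip),DUP,NOCASE        ! the lookup key used above
Record        RECORD
SysID           LONG
Ip              STRING(20)                  ! plenty for dotted IPv4
              END
            END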
The GO SOMEWHERE is a web page with no other links, headers, footers, menu, etc. Rather than a "Nana, nana, nana... you're busted" page, I'd probably write it to say something like "Experiencing Server Problems..." (see the bare-bones example at the end).

You might also be able to handle this better in the WebHandler proc, to ensure there is no login at all; I was just thinking that the server-problems message doesn't make as much sense if you've already got a login screen up.
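As for the dead-end page itself, something this bare does the job (hypothetical markup; serve it however your setup delivers pages):

<!DOCTYPE html>
<html>
<head><title>Experiencing Server Problems</title></head>
<body>
<p>We are experiencing server problems. Please try again later.</p>
</body>
</html>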