Hillstone iNGFW White Paper: A Hybrid Approach to Detect Malicious Web Crawlers

A web crawler (also called a web spider or web robot) is typically a script or program that browses targeted websites in an orderly, automated manner. Web crawlers are an important means of collecting information on the Internet and a critical component of search engine technology: the most popular search engines, such as Google and Baidu, rely on underlying crawlers (GoogleBot and BaiduSpider, respectively) to keep their view of the web up to date.
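To make that behavior concrete, below is a minimal sketch of such a crawler in Python, using only the standard library. It fetches pages breadth-first from a seed URL, extracts links, and stays on one host; the seed URL, bot name, and page limit are illustrative assumptions, not details from the white paper.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags in a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from seed_url, staying on one host."""
    host = urlparse(seed_url).netloc
    queue = deque([seed_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            # A descriptive User-Agent identifies the bot, as legitimate
            # crawlers such as GoogleBot do. "ExampleBot/1.0" is made up.
            request = Request(url, headers={"User-Agent": "ExampleBot/1.0"})
            with urlopen(request, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Follow only links on the same host, giving the "orderly"
            # site-by-site traversal described above.
            if urlparse(absolute).netloc == host:
                queue.append(absolute)
    return visited


if __name__ == "__main__":
    for page in crawl("https://example.com"):
        print(page)
```

The same traversal loop, driven at high volume or pointed at pages a site owner never intended to expose, is what separates a benign crawler from the malicious ones this white paper addresses.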
