Introduction
Visit records are written to MySQL, giving user-friendly daily statistics of spider (search engine crawler) visits, so you can get a general picture of how Baidu and other search engines are indexing your site.

Usage:
1. Create a database and import zhizhu.sql into it.
2. On every page where spider visits should be counted, add: include dirname(__FILE__).'/Robot.php'; (see the sketch below).
3. To protect the data on the demo site, the delete function has been removed from the demo page. The source code in the compressed package does include this feature, so please feel free to use it.
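The following is a minimal sketch of step 2 and of the kind of logging Robot.php performs. The include line comes from the usage note above; everything inside the hypothetical Robot.php body (the spider_log table, its columns, the zhizhu database name, and the connection credentials) is assumed for illustration and will differ from the actual script and the schema created by zhizhu.sql.

<?php
// --- Any page that should count spider visits (step 2 above) ---
// Including Robot.php at the top of the page is all that is required.
include dirname(__FILE__) . '/Robot.php';
?>

<?php
// --- Hypothetical sketch of what Robot.php might do ---
// Map of user-agent substrings to spider names (assumed list).
$agents = array(
    'Baiduspider' => 'Baidu',
    'Googlebot'   => 'Google',
    'bingbot'     => 'Bing',
    'Sogou'       => 'Sogou',
);

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($agents as $needle => $name) {
    if (stripos($ua, $needle) !== false) {
        // Assumed connection settings and table layout (spider, url, visit_time);
        // the real zhizhu.sql schema may be different.
        $db   = new mysqli('localhost', 'user', 'password', 'zhizhu');
        $stmt = $db->prepare('INSERT INTO spider_log (spider, url, visit_time) VALUES (?, ?, NOW())');
        $url  = $_SERVER['REQUEST_URI'];
        $stmt->bind_param('ss', $name, $url);
        $stmt->execute();
        $stmt->close();
        $db->close();
        break; // record at most one spider per request
    }
}
?>

With records accumulated this way, a daily statistics page only needs to group the rows by spider and by date to show how often each search engine crawled the site.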