Introduction
A web crawler program I developed myself in VC++. It can crawl an entire website and regenerates all the URLs in the downloaded pages.
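The archive ships only the VC++ 6.0 sources listed below, with no further documentation, so the following is a minimal sketch of the two steps the description names: fetching a page and rewriting its URLs. It assumes the WinInet API (plausible for a VC++ 6.0 project, but not verified against DownloadData.cpp), and DownloadPage, RewriteUrls, and the href-prefix rewriting rule are hypothetical names and logic for illustration, not code taken from the package.

#include <windows.h>
#include <wininet.h>
#include <string>
#pragma comment(lib, "wininet.lib")

// Download the raw bytes of a URL with WinInet (an assumption about how
// the project fetches pages; the actual implementation is in DownloadData.cpp).
std::string DownloadPage(const char* url)
{
    std::string body;
    HINTERNET hNet = InternetOpenA("NetCrawler", INTERNET_OPEN_TYPE_PRECONFIG,
                                   NULL, NULL, 0);
    if (!hNet) return body;
    HINTERNET hUrl = InternetOpenUrlA(hNet, url, NULL, 0,
                                      INTERNET_FLAG_RELOAD, 0);
    if (hUrl) {
        char buf[4096];
        DWORD read = 0;
        while (InternetReadFile(hUrl, buf, sizeof(buf), &read) && read > 0)
            body.append(buf, read);
        InternetCloseHandle(hUrl);
    }
    InternetCloseHandle(hNet);
    return body;
}

// Rewrite every href="..." that points inside the crawled site so the saved
// copy links to the local mirror instead. The prefix-stripping rule here is
// a simplification for illustration, not NetCrawler's documented behavior.
std::string RewriteUrls(std::string html, const std::string& siteRoot)
{
    const std::string needle = "href=\"" + siteRoot;
    std::string::size_type pos = 0;
    while ((pos = html.find(needle, pos)) != std::string::npos) {
        // Drop the site prefix, leaving a path relative to the mirror root.
        html.erase(pos + 6, siteRoot.size());   // 6 == strlen("href=\"")
        pos += 6;
    }
    return html;
}

A real site mirror would also need to rewrite src attributes, resolve relative links, and map URLs to filenames; presumably that work is divided between DownloadData.cpp and MainThread.cpp.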
Package: Crawler.rar file list
Crawler\DownloadData.cpp
Crawler\DownloadData.h
Crawler\MainThread.cpp
Crawler\MainThread.h
Crawler\NetCrawler.aps
Crawler\NetCrawler.clw
Crawler\NetCrawler.cpp
Crawler\NetCrawler.dsp
Crawler\NetCrawler.dsw
Crawler\NetCrawler.h
Crawler\NetCrawler.ncb
Crawler\NetCrawler.plg
Crawler\NetCrawler.rc
Crawler\NetCrawlerDlg.cpp
Crawler\NetCrawlerDlg.h
Crawler\ProjectDlg.cpp
Crawler\ProjectDlg.h
Crawler\ReadMe.txt
Crawler\Resource.h
Crawler\StdAfx.cpp
Crawler\StdAfx.h
Crawler\res\NetCrawler.ico
Crawler\res\NetCrawler.rc2
Crawler\res\Thumbs.db
Crawler\Release\DownloadData.obj
Crawler\Release\MainThread.obj
Crawler\Release\NetCrawler.exe
Crawler\Release\NetCrawler.obj
Crawler\Release\NetCrawler.pch
Crawler\Release\NetCrawler.res
Crawler\Release\NetCrawlerDlg.obj
Crawler\Release\ProjectDlg.obj
Crawler\Release\StdAfx.obj
Crawler\Release\vc60.idb
Crawler\NetCrawler.opt
Crawler\res
Crawler\Release
Crawler