A crawler pulls target data off the web so it can be used and analyzed to produce the results you want. Here is a summary of what I learned over the past two days.
Part 1: the crawler's overall structure, which consists of the following steps:
(1) Read the data from the web
(2) Parse the fetched data into the target format, then filter out the data you want
(3) Store the useful data in a local database
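The three steps above can be sketched as a tiny pipeline. The names `fetch`, `parse`, and `store` are hypothetical, and the stubs use canned data in place of the real network read, BeautifulSoup parsing, and Access write:

```python
def fetch(url):
    # Stand-in for the real network read (urllib2 in this post):
    # return a canned page instead of hitting the network.
    return "<html><body><p>42</p></body></html>"

def parse(html):
    # Stand-in for the real BeautifulSoup parsing:
    # pull the text out of the first <p> element.
    start = html.index("<p>") + len("<p>")
    end = html.index("</p>")
    return html[start:end]

def store(record, db):
    # Stand-in for the real Access write: append to an in-memory list.
    db.append(record)

db = []
store(parse(fetch("http://example.com")), db)
print(db)  # -> ['42']
```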
Part 2: the concrete implementation
(1) Reading the data from the web requires the urllib and urllib2 libraries, plus the URL of the resource to crawl.
Fetch the page's entire contents through the URL:

```python
import urllib2

request = urllib2.Request(url)       # build the request object
response = urllib2.urlopen(request)  # send it and read the response
html = response.read()
```
Varying the URL dynamically (one request per day, pausing every 180 requests):

```python
import time

i = 0
for tim in range(1364774400, 1365206400, 86400):  # one Unix timestamp per day
    i = i + 1
    if i % 180 == 0:
        time.sleep(15)  # pause every 180 requests to go easy on the server
    ltime = time.localtime(tim)
    timeStr = time.strftime("%Y-%m-%d", ltime)
    url = "http://wsbs.bjepb.gov.cn/air2008/Air1.aspx?time=" + timeStr
    print url
```
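The timestamp arithmetic can be checked in isolation. This sketch builds the URL list with `time.gmtime` instead of `time.localtime` so the result does not depend on the machine's time zone (a deliberate substitution; the original runs against local Beijing time):

```python
import time

base = "http://wsbs.bjepb.gov.cn/air2008/Air1.aspx?time="
urls = []
# 1364774400 is 2013-04-01 00:00 UTC; step one day (86400 s) for five days.
for tim in range(1364774400, 1365206400, 86400):
    time_str = time.strftime("%Y-%m-%d", time.gmtime(tim))
    urls.append(base + time_str)

print(len(urls))  # -> 5, covering 2013-04-01 through 2013-04-05
```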
(2) Use BeautifulSoup to parse the fetched data into the target format, then filter out the data you want:

```python
from bs4 import BeautifulSoup

soup = BeautifulSoup(html, "html.parser")
trs = soup.find("table", id="DaliyReportControl1_DataGridDataDic")
length = len(trs.contents)
```
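The same parsing step can be tried on a miniature stand-in for the real report table (the station names and values below are hypothetical sample data, not the site's actual output; this assumes the bs4 package is installed):

```python
from bs4 import BeautifulSoup

# A tiny mock of the air-quality report table.
html = """
<table id="DaliyReportControl1_DataGridDataDic">
  <tr><th>Station</th><th>AQI</th></tr>
  <tr><td>Dongcheng</td><td>85</td></tr>
  <tr><td>Haidian</td><td>92</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")
table = soup.find("table", id="DaliyReportControl1_DataGridDataDic")
# Skip the header row, then collect the cell text of each data row.
rows = [[td.get_text() for td in tr.find_all("td")]
        for tr in table.find_all("tr")[1:]]
print(rows)  # -> [['Dongcheng', '85'], ['Haidian', '92']]
```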
(3) Use Access, in three steps: open a database connection --> open a table --> store the data.
```python
import win32com.client

## open the database connection
conn = win32com.client.Dispatch(r'ADODB.Connection')
DSN = 'PROVIDER=Microsoft.Jet.OLEDB.4.0;DATA SOURCE=D:/test.mdb;'
conn.Open(DSN)
## open a recordset
rs = win32com.client.Dispatch(r'ADODB.Recordset')
rs_name = 'aircondition'  # table name
rs.Open('[' + rs_name + ']', conn, 1, 3)
print rs.RecordCount
```
Once all the data has been written, close the connection:

```python
conn.Close()
```
Writing the parsed rows into the table. Some rows have 8 child cells and some have 7 (the useful cells are shifted by one), so the two cases differ only by an offset:

```python
for x in range(2, length - 1):
    cells = trs.contents[x].contents
    if len(cells) == 8:    # rows with an extra leading cell
        offset = 1
    elif len(cells) == 7:
        offset = 0
    else:
        continue           # skip rows that are not data rows
    rs.AddNew()
    rs.Fields('Station').Value = cells[1 + offset].string
    rs.Fields('AQI').Value = cells[2 + offset].string
    rs.Fields('Pollutants').Value = cells[3 + offset].string
    rs.Fields('Grade').Value = cells[4 + offset].string
    rs.Fields('Air_quality').Value = cells[5 + offset].string
    rs.Fields('updatedate').Value = timeStr
    rs.Update()
print "**************" + str(i) + "***********" + str(timeStr) + "**************"
```
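Access via win32com only runs on Windows. As a portable sketch of the same store step, here is the equivalent with the standard library's sqlite3 module (a substitution, not what this post uses; the table and column names mirror the Access ones, and the rows are hypothetical sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB instead of D:/test.mdb
conn.execute(
    "CREATE TABLE aircondition ("
    "Station TEXT, AQI TEXT, Pollutants TEXT, "
    "Grade TEXT, Air_quality TEXT, updatedate TEXT)"
)

# Hypothetical rows standing in for the parsed table cells.
rows = [
    ("Dongcheng", "85", "PM2.5", "II", "Good", "2013-04-01"),
    ("Haidian", "92", "PM2.5", "II", "Good", "2013-04-01"),
]
conn.executemany("INSERT INTO aircondition VALUES (?, ?, ?, ?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM aircondition").fetchone()[0]
print(count)  # -> 2
conn.close()
```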
For lack of time I haven't organized this very carefully; I'll tidy it up when I get the chance.