• Applying a thread pool in a web crawler


    Using a thread pool appropriately can significantly improve crawl speed. The example below demonstrates one way to use a thread pool.

    # Crawl video data from pearvideo.com
    import requests
    import re
    from lxml import etree
    from multiprocessing.dummy import Pool
    import random
    # Instantiate a thread pool object with 5 worker threads
    pool = Pool(5)
    url = 'https://www.pearvideo.com/category_1'
    headers = {
        'User-Agent':'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36'
    }
    # Fetch the listing page and select every video entry with XPath
    page_text = requests.get(url=url, headers=headers).text
    tree = etree.HTML(page_text)
    li_list = tree.xpath('//div[@id="listvideoList"]/ul/li')
    
    # Collect the direct video URL from every item's detail page
    video_url_list = []
    for li in li_list:
        detail_url = 'https://www.pearvideo.com/' + li.xpath('./div/a/@href')[0]
        detail_page = requests.get(url=detail_url, headers=headers).text
        # The real .mp4 address is embedded in the detail page's JavaScript
        video_url = re.findall('srcUrl="(.*?)",vdoUrl', detail_page, re.S)[0]
        video_url_list.append(video_url)

    # The worker functions must be defined before they are passed to pool.map
    def getVideoData(url):
        # Download the binary content of one video
        return requests.get(url=url, headers=headers).content

    def saveVideo(data):
        # Write the downloaded bytes to a randomly named .mp4 file
        fileName = str(random.randint(0, 5000)) + '.mp4'
        with open(fileName, 'wb') as fp:
            fp.write(data)

    # Download all videos concurrently, then save them concurrently
    video_data_list = pool.map(getVideoData, video_url_list)
    pool.map(saveVideo, video_data_list)
    pool.close()
    pool.join()
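
    Stripped of the crawling details, the pattern itself is only a few lines. The sketch below is a minimal, self-contained illustration of multiprocessing.dummy.Pool (a thread-backed pool with the same API as a process Pool); the fetch function and the example URLs are placeholders for illustration, not part of the original crawler.

    # Minimal sketch of the thread-pool pattern used above.
    import requests
    from multiprocessing.dummy import Pool  # thread pool exposing the Pool API

    def fetch(url):
        # Each call runs in one of the pool's worker threads
        return requests.get(url, timeout=10).content

    # Placeholder URLs for illustration only
    urls = ['https://example.com/a', 'https://example.com/b']

    with Pool(5) as pool:               # 5 worker threads
        pages = pool.map(fetch, urls)   # blocks until every download finishes

    print([len(p) for p in pages])

    Threads suit this kind of work because downloading is I/O-bound: the GIL is released while each thread waits on the network, so the worker threads can overlap their requests.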
• Original article: https://www.cnblogs.com/qq631243523/p/10509171.html