Sending QQ Mail from Scrapy
Why send email from a crawler
- Adding email notifications to Scrapy is mainly useful for error alerting, which is a big win when operating existing crawlers in production
- It can also notify the developer when a crawl finishes: "hi, the job is done."
- I am not concerned with whether or not to use Scrapy's built-in mail support; the goal is simply to get it working in the most direct way
- There are many ways to send email; below I demonstrate Python's built-in approach, then Scrapy's built-in one
Method 1: Python's built-in email modules
In settings.py:
########## email ##########
SMTP_SERVER = 'smtp.qq.com'
SMTP_PORT = 465
SMTP_OVER_SSL = True
SMTP_CONNECTION_TIMEOUT = 10
EMAIL_PASSWORD = 'XXX'  # the authorization code generated in QQ Mail settings, not your login password
EMAIL_SENDER = 'XXX@qq.com'
EMAIL_RECIPIENTS = [EMAIL_SENDER]
EMAIL_SUBJECT = 'Email from #scrapydweb'
########## email ##########
# Modules used to send the mail
import smtplib
from email.mime.text import MIMEText
from scrapy.utils.project import get_project_settings

settings = get_project_settings()

# QQ Mail
# 1> The SMTP server host used to send and receive mail.
HOST = settings["SMTP_SERVER"]
# 2> The service port; plain SMTP defaults to 25, SMTP over SSL uses 465.
PORT = settings["SMTP_PORT"]
# 3> Sender and recipients.
FROM = settings["EMAIL_SENDER"]
TO = settings["EMAIL_RECIPIENTS"]
PASSWORD = settings["EMAIL_PASSWORD"]
# 4> Mail subject
SUBJECT = 'This is a test email'
# 5> Mail body
CONTENT = 'This is an email sent by xxxxx. Please check it!'
CONTENT2 = """
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Dcp's python email</title>
</head>
<body>
<p>Dcp's babys:</p>
<h1>Happy New Year!</h1>
</body>
</html>
"""
# Create the sending object.
# Plain, unencrypted sending would be:
# smtp_obj = smtplib.SMTP()
# SMTP over SSL instead, so the data is encrypted in transit;
# SMTP_SSL connects on construction, no separate connect() call is needed.
smtp_obj = smtplib.SMTP_SSL(HOST, PORT)

# Authenticate as the sender; smtp_obj acts as a third-party mail client.
# Third-party clients must log in with the authorization code, not the real
# password, to avoid leaking it.
res = smtp_obj.login(user=FROM, password=PASSWORD)
print('login result:', res)
# Send the mail.
# Building the raw message by hand also works (headers joined by CRLF):
# msg = '\r\n'.join(['From: {}'.format(FROM), 'To: {}'.format(TO), 'Subject: {}'.format(SUBJECT), '', CONTENT2])
# smtp_obj.sendmail(from_addr=FROM, to_addrs=TO, msg=msg.encode('utf-8'))
# MIMEText takes three arguments: the body text, the subtype ('html' renders
# the body as HTML), and the encoding.
msg = MIMEText(CONTENT2, 'html', 'utf-8')
msg["Subject"] = "Your Red Bag is Coming.."
msg["From"] = FROM
msg['To'] = ','.join(TO)  # note: a recipient list must be joined into one header string
smtp_obj.sendmail(from_addr=FROM, to_addrs=TO, msg=msg.as_string())
smtp_obj.quit()
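If the notification should carry data with it, for example a CSV export of the scraped items, the same standard-library modules can build a multipart message with an attachment. This is a minimal sketch using only the stdlib; the helper name `build_report_mail` and the file name `report.csv` are my own placeholders, not part of the original script.

```python
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def build_report_mail(sender, recipients, subject, text, filename, payload):
    """Build a multipart message: a plain-text body plus one file attachment."""
    msg = MIMEMultipart()
    msg['From'] = sender
    msg['To'] = ','.join(recipients)
    msg['Subject'] = subject
    # Part 1: the readable body
    msg.attach(MIMEText(text, 'plain', 'utf-8'))
    # Part 2: the attachment; payload is the raw bytes of the file
    part = MIMEApplication(payload)
    part.add_header('Content-Disposition', 'attachment', filename=filename)
    msg.attach(part)
    return msg


# Sending works exactly as with the MIMEText message above:
# smtp_obj.sendmail(FROM, TO, build_report_mail(...).as_string())
```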
Method 2: Scrapy's built-in MailSender
- Since we want a mail when the crawl finishes, the natural place is a middleware/extension that sends it
- The key is spider_closed, i.e. the hook that runs after the spider finishes
- It can also be written directly as a spider
- You can also send a mail when the spider hits an exception
########## scrapy email ##########
MAIL_HOST = SMTP_SERVER
MAIL_FROM = EMAIL_SENDER
MAIL_USER = EMAIL_SENDER
MAIL_PASS = EMAIL_PASSWORD  # the authorization code
MAIL_PORT = 465
MAIL_SSL = True  # QQ Mail expects SSL on port 465; plain port 25 is often blocked
########## scrapy email ##########
import scrapy
from scrapy import cmdline
from scrapy.mail import MailSender
from scrapy.utils.project import get_project_settings

settings = get_project_settings()


class SendEmailSpider(scrapy.Spider):
    name = 'send_email'
    start_urls = ['https://www.baidu.com/']

    def parse(self, response):
        mailer = MailSender.from_settings(settings)
        # Send the mail; the optional cc argument carbon-copies extra recipients
        body = "some body"
        mailer.send(to=settings["EMAIL_RECIPIENTS"], subject="Some subject", body=body)


if __name__ == '__main__':
    cmdline.execute("scrapy crawl send_email".split())
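The spider_closed hook mentioned above can also be packaged as a small Scrapy extension instead of a one-off spider, so every crawler in the project reports its status, and an error mail goes out on the spider_error signal too. A sketch, under my own naming: the class `SendStatusMail`, the module path in the usage note, and the reuse of the `EMAIL_RECIPIENTS` setting are all assumptions, not part of the original post.

```python
class SendStatusMail:
    """Mail the developer when a spider closes, and on any spider callback error."""

    def __init__(self, mailer, recipients):
        self.mailer = mailer
        self.recipients = recipients

    @classmethod
    def from_crawler(cls, crawler):
        # Imported here so the class itself stays importable without Scrapy installed.
        from scrapy import signals
        from scrapy.mail import MailSender

        ext = cls(MailSender.from_settings(crawler.settings),
                  crawler.settings.getlist('EMAIL_RECIPIENTS'))
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        crawler.signals.connect(ext.spider_error, signal=signals.spider_error)
        return ext

    def spider_closed(self, spider, reason):
        # reason is 'finished', 'cancelled', 'shutdown', etc.
        self.mailer.send(to=self.recipients,
                         subject='spider %s closed (%s)' % (spider.name, reason),
                         body='Spider %s finished with reason: %s' % (spider.name, reason))

    def spider_error(self, failure, response, spider):
        self.mailer.send(to=self.recipients,
                         subject='spider %s error' % spider.name,
                         body='Error on %s:\n%s' % (response.url, failure.getTraceback()))
```

Enable it in settings.py with something like EXTENSIONS = {'myproject.extensions.SendStatusMail': 500} (the module path here is hypothetical).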