• Airflow distributed deployment (5): starting services at boot


    Airflow consists of long-running services, and we want them to restart automatically when they fail. systemd can handle both boot-time startup and automatic restarts. First, create the following files.

    /etc/sysconfig/airflow

    AIRFLOW_CONFIG=/root/airflow/airflow.cfg
    AIRFLOW_HOME=/root/airflow
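
    Because systemd services do not inherit an interactive shell's PATH, it may also help to set PATH here so that the airflow and gunicorn binaries inside the conda environment can be found (see the errors recorded at the end of this post). This line is an assumed addition, not part of the original file; the path is taken from the ExecStart entries below:

    # Optional: expose the conda environment's binaries (airflow, gunicorn) to systemd
    PATH=/root/miniconda3/envs/py36/bin:/usr/local/bin:/usr/bin:/bin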

    /usr/lib/systemd/system/airflow-webserver.service

    [Unit]
    Description=Airflow webserver daemon
    
    [Service]
    EnvironmentFile=/etc/sysconfig/airflow
    User=root
    Group=root
    Type=simple
    ExecStart=/root/miniconda3/envs/py36/bin/airflow webserver
    Restart=on-failure
    RestartSec=5s
    PrivateTmp=true
    
    [Install]
    WantedBy=multi-user.target

    /usr/lib/systemd/system/airflow-flower.service

    [Unit]
    Description=Airflow flower daemon
    
    [Service]
    EnvironmentFile=/etc/sysconfig/airflow
    User=root
    Group=root
    Type=simple
    ExecStart=/root/miniconda3/envs/py36/bin/airflow flower
    Restart=on-failure
    RestartSec=5s
    PrivateTmp=true
    
    [Install]
    WantedBy=multi-user.target

    /usr/lib/systemd/system/airflow-scheduler.service

    [Unit]
    Description=Airflow scheduler daemon
    After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
    Wants=redis.service
    
    [Service]
    EnvironmentFile=/etc/sysconfig/airflow
    User=root
    Group=root
    Type=simple
    ExecStart=/root/miniconda3/envs/py36/bin/airflow scheduler
    Restart=always
    RestartSec=5s
    
    [Install]
    WantedBy=multi-user.target

    /usr/lib/systemd/system/airflow-worker.service

    [Unit]
    Description=Airflow celery worker daemon
    After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
    Wants=mysql.service redis.service
    
    [Service]
    EnvironmentFile=/etc/sysconfig/airflow
    User=root
    Group=root
    Type=simple
    ExecStart=/root/miniconda3/envs/py36/bin/airflow worker
    Restart=on-failure
    RestartSec=10s
    
    [Install]
    WantedBy=multi-user.target
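
    With all four unit files in place, they can be registered and enabled for boot in one pass. A minimal sketch (the individual commands and diagnostics are explained below):

    sudo systemctl daemon-reload
    for svc in airflow-webserver airflow-flower airflow-scheduler airflow-worker; do
        sudo systemctl enable "${svc}.service"
        sudo systemctl start "${svc}.service"
    done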

    After the configuration is in place, run the following commands:

    # Reload all modified unit files
    sudo systemctl daemon-reload
    # Verify the unit file for errors; useful for finding out why a service fails to start
    sudo systemd-analyze verify airflow-webserver.service
    # View the logs of a service that failed to start
    sudo journalctl -u airflow-webserver.service
    # Check the service status
    systemctl status airflow-scheduler.service
    # Check whether the service is enabled to start at boot, and enable it if not
    sudo systemctl is-enabled airflow-worker.service
    sudo systemctl enable airflow-worker.service
    # Restart a service
    systemctl stop airflow-scheduler.service
    systemctl start airflow-scheduler.service
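
    To confirm that the automatic restart actually works, a rough check (assuming the scheduler service is already running) is to kill its process and watch systemd bring it back after RestartSec:

    # Kill the scheduler's process; systemd should restart it within a few seconds
    sudo pkill -9 -f "airflow scheduler"
    sleep 10
    systemctl status airflow-scheduler.service   # should show "active (running)" again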

    Errors encountered

    FileNotFoundError: [Errno 2] No such file or directory: 'gunicorn'

    This happens because the Python 3 environment's bin directory is not on the service's PATH; the fix is to find the Python 3 path and add it to PATH.
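
    For example (a sketch, reusing the conda path from the ExecStart lines above), PATH can either be added to /etc/sysconfig/airflow as shown earlier, or set directly in the unit's [Service] section:

    # Added to the [Service] section of the affected unit (illustrative example):
    Environment="PATH=/root/miniconda3/envs/py36/bin:/usr/local/bin:/usr/bin:/bin"

    After editing, run sudo systemctl daemon-reload and restart the service so the new environment takes effect.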

    Note:

    Another error then appeared: FileNotFoundError: [Errno 2] No such file or directory: 'airflow': 'airflow'

    Despite repeated attempts this was never resolved, so in the end the webserver and scheduler are started via systemctl, while the worker and flower are started directly with airflow worker -D.
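
    For reference, that fallback looks roughly like this (run with the conda environment on PATH; daemonizing flower with -D the same way is assumed):

    # webserver and scheduler managed by systemd
    sudo systemctl start airflow-webserver.service airflow-scheduler.service
    # worker and flower started directly as background daemons
    airflow worker -D
    airflow flower -D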


