• Scaling out TiKV nodes on TiDB 4.0.4 with TiUP


    Environment: CentOS 7, TiDB v4.0.4, TiUP v1.0.8

    Goal: add two TiKV nodes, 172.21.210.37 and 172.21.210.38

    Approach: initialize the two new servers and set up passwordless SSH -> edit the scale-out topology file -> run the scale-out command -> restart Grafana

    1. Initialize the servers and set up passwordless SSH

    1) Synchronize the system clocks on the new nodes (see the sketch after these commands)
    2) Set up passwordless SSH from the control machine to the new nodes:
    ssh-copy-id root@172.21.210.37
    ssh-copy-id root@172.21.210.38
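
    A minimal sketch of the time-sync step (the original post does not show these commands; this assumes chrony on CentOS 7 with a reachable NTP source):
    yum install -y chrony        # install the chrony time service
    systemctl enable chronyd
    systemctl start chronyd
    chronyc tracking             # verify the clock is synchronized against an NTP source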
    

    2. Edit the scale-out topology file

    tiup cluster list                        # list the names of the existing clusters
    tiup cluster edit-config <cluster-name>  # view the cluster topology and copy the relevant settings as a template
    
    vi scale-out.yaml
    tikv_servers:
    - host: 172.21.210.37
      ssh_port: 22
      port: 20160
      status_port: 20180
      deploy_dir: /data1/tidb-deploy/tikv-20160
      data_dir: /data1/tidb-data/tikv-20160
      arch: amd64
      os: linux
    - host: 172.21.210.38
      ssh_port: 22
      port: 20160
      status_port: 20180
      deploy_dir: /data1/tidb-deploy/tikv-20160
      data_dir: /data1/tidb-data/tikv-20160
      arch: amd64
      os: linux
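
    Before running the scale-out, an optional pre-flight check (a sketch, not part of the original post; it assumes /data1, the filesystem used by deploy_dir/data_dir above, is already mounted on the new hosts):
    for h in 172.21.210.37 172.21.210.38; do
        ssh -o BatchMode=yes root@$h 'hostname; df -h /data1'   # confirms passwordless SSH works and /data1 exists
    done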

    3. Run the scale-out command

    This assumes the user running the command already has passwordless SSH access to the new machines. If not, use -p to supply the password of the new machines, or -i to specify a private key file.
    tiup cluster scale-out <cluster-name> scale-out.yaml
    The expected output ends with "Scaled cluster <cluster-name> out successfully", which indicates the scale-out succeeded.
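
    For reference, a hedged sketch of the non-passwordless variants, using the cluster name tidb from this post (exact flag behavior may vary slightly between TiUP versions; the key path matches the one shown in the log below):
    tiup cluster scale-out tidb scale-out.yaml -i /root/.ssh/id_rsa    # authenticate with an explicit private key file
    tiup cluster scale-out tidb scale-out.yaml -p                      # or supply the password of the new hosts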
    
    [root@host-172-21-210-32 tidb_config]# tiup cluster scale-out tidb scale-out.yaml
    Starting component `cluster`:  scale-out tidb scale-out.yaml
    Please confirm your topology:
    TiDB Cluster: tidb
    TiDB Version: v4.0.4
    Type  Host           Ports        OS/Arch       Directories
    ----  ----           -----        -------       -----------
    tikv  172.21.210.37  20160/20180  linux/x86_64  /data1/tidb-deploy/tikv-20160,/data1/tidb-data/tikv-20160
    tikv  172.21.210.38  20160/20180  linux/x86_64  /data1/tidb-deploy/tikv-20160,/data1/tidb-data/tikv-20160
    Attention:
        1. If the topology is not what you expected, check your yaml file.
        2. Please confirm there is no port/directory conflicts in same host.
    Do you want to continue? [y/N]:  y
    + [ Serial ] - SSHKeySet: privateKey=/root/.tiup/storage/cluster/clusters/tidb/ssh/id_rsa, publicKey=/root/.tiup/storage/cluster/clusters/tidb/ssh/id_rsa.pub
    
    
      - Download tikv:v4.0.4 (linux/amd64) ... Done
    + [ Serial ] - RootSSH: user=root, host=172.21.210.38, port=22, key=/root/.ssh/id_rsa
    + [ Serial ] - EnvInit: user=tidb, host=172.21.210.38
    + [ Serial ] - RootSSH: user=root, host=172.21.210.37, port=22, key=/root/.ssh/id_rsa
    + [ Serial ] - EnvInit: user=tidb, host=172.21.210.37
    + [ Serial ] - Mkdir: host=172.21.210.37, directories='/data1/tidb-deploy','/data1/tidb-data'
    + [ Serial ] - Mkdir: host=172.21.210.38, directories='/data1/tidb-deploy','/data1/tidb-data'
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.32
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.39
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.33
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.34
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.32
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.33
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.35
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.32
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.36
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.32
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.32
    + [ Serial ] - UserSSH: user=tidb, host=172.21.210.38
    
    + [ Serial ] - UserSSH: user=tidb, host=172.21.210.37
    + [ Serial ] - Mkdir: host=172.21.210.38, directories='/data1/tidb-deploy/tikv-20160','/data1/tidb-deploy/tikv-20160/log','/data1/tidb-deploy/tikv-20160/bin','/data1/tidb-deploy/tikv-20160/conf','/data1/tidb-deploy/tikv-20160/scripts'
    + [ Serial ] - Mkdir: host=172.21.210.37, directories='/data1/tidb-deploy/tikv-20160','/data1/tidb-deploy/tikv-20160/log','/data1/tidb-deploy/tikv-20160/bin','/data1/tidb-deploy/tikv-20160/conf','/data1/tidb-deploy/tikv-20160/scripts'
    
    
      - Copy blackbox_exporter -> 172.21.210.37 ... ? Mkdir: host=172.21.210.37, directories='/data1/tidb-deploy/monitor-9100','/data1/t...
      - Copy node_exporter -> 172.21.210.37 ... ? CopyComponent: component=node_exporter, version=v0.17.0, remote=172.21.210.37:/data1/t...
      - Copy blackbox_exporter -> 172.21.210.37 ... ? MonitoredConfig: cluster=tidb, user=tidb, node_exporter_port=9100, blackbox_export...
      - Copy node_exporter -> 172.21.210.38 ... Done
    + [ Serial ] - ScaleConfig: cluster=tidb, user=tidb, host=172.21.210.37, service=tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=
    + [ Serial ] - ScaleConfig: cluster=tidb, user=tidb, host=172.21.210.38, service=tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=
    + [ Serial ] - ClusterOperate: operation=StartOperation, options={Roles:[] Nodes:[] Force:false SSHTimeout:0 OptTimeout:120 APITimeout:0 IgnoreConfigCheck:false RetainDataRoles:[] RetainDataNodes:[]}
    Starting component pd
            Starting instance pd 172.21.210.33:2379
            Starting instance pd 172.21.210.32:2379
            Start pd 172.21.210.33:2379 success
            Start pd 172.21.210.32:2379 success
    Starting component node_exporter
            Starting instance 172.21.210.32
            Start 172.21.210.32 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.32
            Start 172.21.210.32 success
    Starting component node_exporter
            Starting instance 172.21.210.33
            Start 172.21.210.33 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.33
            Start 172.21.210.33 success
    Starting component tikv
            Starting instance tikv 172.21.210.35:20160
            Starting instance tikv 172.21.210.34:20160
            Starting instance tikv 172.21.210.39:20160
            Starting instance tikv 172.21.210.36:20160
            Start tikv 172.21.210.39:20160 success
            Start tikv 172.21.210.34:20160 success
            Start tikv 172.21.210.35:20160 success
            Start tikv 172.21.210.36:20160 success
    Starting component node_exporter
            Starting instance 172.21.210.35
            Start 172.21.210.35 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.35
            Start 172.21.210.35 success
    Starting component node_exporter
            Starting instance 172.21.210.34
            Start 172.21.210.34 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.34
            Start 172.21.210.34 success
    Starting component node_exporter
            Starting instance 172.21.210.39
            Start 172.21.210.39 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.39
            Start 172.21.210.39 success
    Starting component node_exporter
            Starting instance 172.21.210.36
            Start 172.21.210.36 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.36
            Start 172.21.210.36 success
    Starting component tidb
            Starting instance tidb 172.21.210.33:4000
            Starting instance tidb 172.21.210.32:4000
            Start tidb 172.21.210.32:4000 success
            Start tidb 172.21.210.33:4000 success
    Starting component prometheus
            Starting instance prometheus 172.21.210.32:9090
            Start prometheus 172.21.210.32:9090 success
    Starting component grafana
            Starting instance grafana 172.21.210.32:3000
            Start grafana 172.21.210.32:3000 success
    Starting component alertmanager
            Starting instance alertmanager 172.21.210.32:9093
            Start alertmanager 172.21.210.32:9093 success
    Checking service state of pd
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:50:31 CST; 2 weeks 5 days ago
            172.21.210.33      Active: active (running) since Fri 2020-10-16 22:50:22 CST; 2 weeks 5 days ago
    Checking service state of tikv
            172.21.210.34      Active: active (running) since Fri 2020-10-16 22:50:19 CST; 2 weeks 5 days ago
            172.21.210.35      Active: active (running) since Fri 2020-10-16 22:50:19 CST; 2 weeks 5 days ago
            172.21.210.36      Active: active (running) since Sat 2020-10-17 02:25:23 CST; 2 weeks 5 days ago
            172.21.210.39      Active: active (running) since Fri 2020-10-16 23:34:13 CST; 2 weeks 5 days ago
    Checking service state of tidb
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:50:49 CST; 2 weeks 5 days ago
            172.21.210.33      Active: active (running) since Fri 2020-10-16 22:50:40 CST; 2 weeks 5 days ago
    Checking service state of prometheus
            172.21.210.32      Active: active (running) since Sat 2020-10-17 02:25:27 CST; 2 weeks 5 days ago
    Checking service state of grafana
            172.21.210.32      Active: active (running) since Fri 2020-10-16 23:55:07 CST; 2 weeks 5 days ago
    Checking service state of alertmanager
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:51:06 CST; 2 weeks 5 days ago
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.38
    + [Parallel] - UserSSH: user=tidb, host=172.21.210.37
    + [ Serial ] - save meta
    + [ Serial ] - ClusterOperate: operation=StartOperation, options={Roles:[] Nodes:[] Force:false SSHTimeout:0 OptTimeout:120 APITimeout:0 IgnoreConfigCheck:false RetainDataRoles:[] RetainDataNodes:[]}
    Starting component tikv
            Starting instance tikv 172.21.210.38:20160
            Starting instance tikv 172.21.210.37:20160
            Start tikv 172.21.210.37:20160 success
            Start tikv 172.21.210.38:20160 success
    Starting component node_exporter
            Starting instance 172.21.210.37
            Start 172.21.210.37 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.37
            Start 172.21.210.37 success
    Starting component node_exporter
            Starting instance 172.21.210.38
            Start 172.21.210.38 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.38
            Start 172.21.210.38 success
    Checking service state of tikv
            172.21.210.37      Active: active (running) since Thu 2020-11-05 11:33:46 CST; 3s ago
            172.21.210.38      Active: active (running) since Thu 2020-11-05 11:33:46 CST; 2s ago
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.32, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/alertmanager-9093.service, deploy_dir=/data1/tidb-deploy/alertmanager-9093, data_dir=[/data1/tidb-data/alertmanager-9093], log_dir=/data1/tidb-deploy/alertmanager-9093/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.36, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.32, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tidb-4000.service, deploy_dir=/data1/tidb-deploy/tidb-4000, data_dir=[], log_dir=/data1/tidb-deploy/tidb-4000/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.32, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/pd-2379.service, deploy_dir=/data1/tidb-deploy/pd-2379, data_dir=[/data1/tidb-data/pd-2379], log_dir=/data1/tidb-deploy/pd-2379/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.37, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.33, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tidb-4000.service, deploy_dir=/data1/tidb-deploy/tidb-4000, data_dir=[], log_dir=/data1/tidb-deploy/tidb-4000/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.35, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.32, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/prometheus-9090.service, deploy_dir=/data1/tidb-deploy/prometheus-9090, data_dir=[/data1/tidb-data/prometheus-9090], log_dir=/data1/tidb-deploy/prometheus-9090/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.34, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.32, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/grafana-3000.service, deploy_dir=/data1/tidb-deploy/grafana-3000, data_dir=[], log_dir=/data1/tidb-deploy/grafana-3000/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.38, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.33, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/pd-2379.service, deploy_dir=/data1/tidb-deploy/pd-2379, data_dir=[/data1/tidb-data/pd-2379], log_dir=/data1/tidb-deploy/pd-2379/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - InitConfig: cluster=tidb, user=tidb, host=172.21.210.39, path=/root/.tiup/storage/cluster/clusters/tidb/config-cache/tikv-20160.service, deploy_dir=/data1/tidb-deploy/tikv-20160, data_dir=[/data1/tidb-data/tikv-20160], log_dir=/data1/tidb-deploy/tikv-20160/log, cache_dir=/root/.tiup/storage/cluster/clusters/tidb/config-cache
    + [ Serial ] - ClusterOperate: operation=RestartOperation, options={Roles:[prometheus] Nodes:[] Force:false SSHTimeout:0 OptTimeout:120 APITimeout:0 IgnoreConfigCheck:false RetainDataRoles:[] RetainDataNodes:[]}
    Stopping component prometheus
            Stopping instance 172.21.210.32
            Stop prometheus 172.21.210.32:9090 success
    Starting component prometheus
            Starting instance prometheus 172.21.210.32:9090
            Start prometheus 172.21.210.32:9090 success
    Starting component node_exporter
            Starting instance 172.21.210.32
            Start 172.21.210.32 success
    Starting component blackbox_exporter
            Starting instance 172.21.210.32
            Start 172.21.210.32 success
    Checking service state of pd
            172.21.210.33      Active: active (running) since Fri 2020-10-16 22:50:22 CST; 2 weeks 5 days ago
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:50:31 CST; 2 weeks 5 days ago
    Checking service state of tikv
            172.21.210.35      Active: active (running) since Fri 2020-10-16 22:50:19 CST; 2 weeks 5 days ago
            172.21.210.39      Active: active (running) since Fri 2020-10-16 23:34:13 CST; 2 weeks 5 days ago
            172.21.210.34      Active: active (running) since Fri 2020-10-16 22:50:19 CST; 2 weeks 5 days ago
            172.21.210.36      Active: active (running) since Sat 2020-10-17 02:25:23 CST; 2 weeks 5 days ago
    Checking service state of tidb
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:50:49 CST; 2 weeks 5 days ago
            172.21.210.33      Active: active (running) since Fri 2020-10-16 22:50:40 CST; 2 weeks 5 days ago
    Checking service state of prometheus
            172.21.210.32      Active: active (running) since Thu 2020-11-05 11:33:53 CST; 2s ago
    Checking service state of grafana
            172.21.210.32      Active: active (running) since Fri 2020-10-16 23:55:07 CST; 2 weeks 5 days ago
    Checking service state of alertmanager
            172.21.210.32      Active: active (running) since Fri 2020-10-16 22:51:06 CST; 2 weeks 5 days ago
    + [ Serial ] - UpdateTopology: cluster=tidb
    Scaled cluster `tidb` out successfully

    4. Check the cluster status and restart Grafana

    Check the cluster status:
        tiup cluster display <cluster-name>
    Restart Grafana:
        tiup cluster restart tidb -R grafana
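
    A hedged sketch of the follow-up checks with the concrete cluster name used here; the second command assumes the TiUP ctl component is available and that PD listens on 172.21.210.32:2379 (one of the PD instances shown in the log):
        tiup cluster display tidb                          # the two new TiKV nodes should appear with state Up
        tiup ctl pd -u http://172.21.210.32:2379 store     # watch leader/region counts grow on the new stores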
    

  • Original post: https://www.cnblogs.com/wukc/p/13902487.html