• <Spark> <Distributed Installation>


    • Download the release with wget
    • Extract the tarball with tar -xzvf
    • Edit the configuration files
      • ./conf/spark-env.sh
      • export SCALA_HOME=/usr/local/scala-2.11.8
        export JAVA_HOME=/usr/local/jvm/jdk1.8.0_91
        export SPARK_MASTER_IP=host99
        export SPARK_WORKER_MEMORY=1g
        export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.2/etc/hadoop
      • ./conf/slaves (one worker hostname per line)
    • vim /etc/profile
    • #Spark environment
      export SPARK_HOME=/usr/local/spark-2.1.0-bin-hadoop2.6
      export PATH="$SPARK_HOME/bin:$PATH"
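The download, extract, and environment steps above can be sketched as one shell session. The mirror URL and the worker hostname in `conf/slaves` are assumptions; the versions, paths, and spark-env.sh values are the ones from this post:

```shell
# Download and unpack Spark 2.1.0 built for Hadoop 2.6
# (mirror URL is an assumption; any Apache archive mirror works)
cd /usr/local
wget https://archive.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.6.tgz
tar -xzvf spark-2.1.0-bin-hadoop2.6.tgz

# Point spark-env.sh at the local Scala / JDK / Hadoop installs
cat >> /usr/local/spark-2.1.0-bin-hadoop2.6/conf/spark-env.sh <<'EOF'
export SCALA_HOME=/usr/local/scala-2.11.8
export JAVA_HOME=/usr/local/jvm/jdk1.8.0_91
export SPARK_MASTER_IP=host99
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.2/etc/hadoop
EOF

# conf/slaves lists one worker hostname per line (hostname is an example)
cat > /usr/local/spark-2.1.0-bin-hadoop2.6/conf/slaves <<'EOF'
host99
EOF

# Make spark-submit and friends available in every login shell
echo 'export SPARK_HOME=/usr/local/spark-2.1.0-bin-hadoop2.6' >> /etc/profile
echo 'export PATH="$SPARK_HOME/bin:$PATH"' >> /etc/profile
source /etc/profile
```

This is an environment-setup fragment, so run it on the master and repeat the install (or rsync the unpacked directory) on each worker listed in conf/slaves.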
    • Test:
      • ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --driver-memory 1g --executor-memory 1g --executor-cores 2 ./examples/jars/spark-examples_2.11-2.1.0.jar 10
      • Runtime log below — note the RMProxy line in particular: 8032 is the YARN ResourceManager port configured in Hadoop.
      • [root@host99 /usr/local/spark-2.1.0-bin-hadoop2.6]$./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --driver-memory 1g --executor-memory 1g --executor-cores 2 ./examples/jars/spark-examples_2.11-2.1.0.jar 10
        17/05/15 10:14:53 INFO spark.SparkContext: Running Spark version 2.1.0
        17/05/15 10:14:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        17/05/15 10:14:54 INFO spark.SecurityManager: Changing view acls to: root
        17/05/15 10:14:54 INFO spark.SecurityManager: Changing modify acls to: root
        17/05/15 10:14:54 INFO spark.SecurityManager: Changing view acls groups to: 
        17/05/15 10:14:54 INFO spark.SecurityManager: Changing modify acls groups to: 
        17/05/15 10:14:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
        17/05/15 10:14:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 56276.
        17/05/15 10:14:54 INFO spark.SparkEnv: Registering MapOutputTracker
        17/05/15 10:14:54 INFO spark.SparkEnv: Registering BlockManagerMaster
        17/05/15 10:14:54 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
        17/05/15 10:14:54 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
        17/05/15 10:14:54 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-f41a3f8d-45a8-4dc8-b03d-220aef5823b8
        17/05/15 10:14:54 INFO memory.MemoryStore: MemoryStore started with capacity 399.6 MB
        17/05/15 10:14:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
        17/05/15 10:14:54 INFO util.log: Logging initialized @2305ms
        17/05/15 10:14:55 INFO server.Server: jetty-9.2.z-SNAPSHOT
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25f9407e{/jobs,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@552518c3{/jobs/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1a69561c{/jobs/job,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59aa20b3{/jobs/job/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@363f6148{/stages,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4b21844c{/stages/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b28f282{/stages/stage,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@138fe6ec{/stages/stage/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e77f0f4{/stages/pool,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@19b30c92{/stages/pool/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@455351c4{/storage,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@29876704{/storage/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4816c290{/storage/rdd,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4940809c{/storage/rdd/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16423501{/environment,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4efcf8a{/environment/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a138fc5{/executors,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@379ab47b{/executors/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@307765b4{/executors/threadDump,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a9e6faf{/executors/threadDump/json,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c95ac9e{/static,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4e4efc1b{/,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@459f7aa3{/api,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cc586a8{/jobs/job/kill,null,AVAILABLE}
        17/05/15 10:14:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7db534f2{/stages/stage/kill,null,AVAILABLE}
        17/05/15 10:14:55 INFO server.ServerConnector: Started ServerConnector@24a1c17f{HTTP/1.1}{0.0.0.0:4040}
        17/05/15 10:14:55 INFO server.Server: Started @2469ms
        17/05/15 10:14:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
        17/05/15 10:14:55 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.3.242.99:4040
        17/05/15 10:14:55 INFO spark.SparkContext: Added JAR file:/usr/local/spark-2.1.0-bin-hadoop2.6/./examples/jars/spark-examples_2.11-2.1.0.jar at spark://10.3.242.99:56276/jars/spark-examples_2.11-2.1.0.jar with timestamp 1494814495180
        17/05/15 10:14:55 INFO client.RMProxy: Connecting to ResourceManager at host99/10.3.242.99:8032
        17/05/15 10:14:56 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
        17/05/15 10:14:56 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (32768 MB per container)
        17/05/15 10:14:56 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
        17/05/15 10:14:56 INFO yarn.Client: Setting up container launch context for our AM
        17/05/15 10:14:56 INFO yarn.Client: Setting up the launch environment for our AM container
        17/05/15 10:14:56 INFO yarn.Client: Preparing resources for our AM container
        17/05/15 10:14:57 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
        17/05/15 10:15:00 INFO yarn.Client: Uploading resource file:/tmp/spark-4441c48b-51a5-4b46-b9c6-87ae83be686b/__spark_libs__3038073669516991455.zip -> hdfs://host99:9000/user/root/.sparkStaging/application_1494404935972_0001/__spark_libs__3038073669516991455.zip
        17/05/15 10:15:04 INFO yarn.Client: Uploading resource file:/tmp/spark-4441c48b-51a5-4b46-b9c6-87ae83be686b/__spark_conf__2861954561840735383.zip -> hdfs://host99:9000/user/root/.sparkStaging/application_1494404935972_0001/__spark_conf__.zip
        17/05/15 10:15:04 INFO spark.SecurityManager: Changing view acls to: root
        17/05/15 10:15:04 INFO spark.SecurityManager: Changing modify acls to: root
        17/05/15 10:15:04 INFO spark.SecurityManager: Changing view acls groups to: 
        17/05/15 10:15:04 INFO spark.SecurityManager: Changing modify acls groups to: 
        17/05/15 10:15:04 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
        17/05/15 10:15:04 INFO yarn.Client: Submitting application application_1494404935972_0001 to ResourceManager
        17/05/15 10:15:04 INFO impl.YarnClientImpl: Submitted application application_1494404935972_0001
        17/05/15 10:15:04 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1494404935972_0001 and attemptId None
        17/05/15 10:15:05 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:05 INFO yarn.Client: 
        	 client token: N/A
        	 diagnostics: N/A
        	 ApplicationMaster host: N/A
        	 ApplicationMaster RPC port: -1
        	 queue: default
        	 start time: 1494814504572
        	 final status: UNDEFINED
        	 tracking URL: http://host99:8088/proxy/application_1494404935972_0001/
        	 user: root
        17/05/15 10:15:06 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:07 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:08 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:09 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:10 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:11 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:12 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:13 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:14 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:15 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:16 INFO yarn.Client: Application report for application_1494404935972_0001 (state: ACCEPTED)
        17/05/15 10:15:17 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
        17/05/15 10:15:17 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> host99, PROXY_URI_BASES -> http://host99:8088/proxy/application_1494404935972_0001), /proxy/application_1494404935972_0001
        17/05/15 10:15:17 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
        17/05/15 10:15:17 INFO yarn.Client: Application report for application_1494404935972_0001 (state: RUNNING)
        17/05/15 10:15:17 INFO yarn.Client: 
        	 client token: N/A
        	 diagnostics: N/A
        	 ApplicationMaster host: 10.3.242.99
        	 ApplicationMaster RPC port: 0
        	 queue: default
        	 start time: 1494814504572
        	 final status: UNDEFINED
        	 tracking URL: http://host99:8088/proxy/application_1494404935972_0001/
        	 user: root
        17/05/15 10:15:17 INFO cluster.YarnClientSchedulerBackend: Application application_1494404935972_0001 has started running.
        17/05/15 10:15:17 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45168.
        17/05/15 10:15:17 INFO netty.NettyBlockTransferService: Server created on 10.3.242.99:45168
        17/05/15 10:15:17 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
        17/05/15 10:15:17 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.3.242.99, 45168, None)
        17/05/15 10:15:17 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.3.242.99:45168 with 399.6 MB RAM, BlockManagerId(driver, 10.3.242.99, 45168, None)
        17/05/15 10:15:17 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.3.242.99, 45168, None)
        17/05/15 10:15:17 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.3.242.99, 45168, None)
        17/05/15 10:15:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@385dfb63{/metrics/json,null,AVAILABLE}
        17/05/15 10:15:21 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.3.242.99:39099) with ID 1
        17/05/15 10:15:21 INFO storage.BlockManagerMasterEndpoint: Registering block manager host99:39991 with 366.3 MB RAM, BlockManagerId(1, host99, 39991, None)
        17/05/15 10:15:24 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.3.242.99:39102) with ID 3
        17/05/15 10:15:25 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
        17/05/15 10:15:25 INFO storage.BlockManagerMasterEndpoint: Registering block manager host99:34903 with 366.3 MB RAM, BlockManagerId(3, host99, 34903, None)
        17/05/15 10:15:25 INFO internal.SharedState: Warehouse path is 'file:/usr/local/spark-2.1.0-bin-hadoop2.6/spark-warehouse'.
        17/05/15 10:15:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15639d09{/SQL,null,AVAILABLE}
        17/05/15 10:15:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52bd9a27{/SQL/json,null,AVAILABLE}
        17/05/15 10:15:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5ed5b321{/SQL/execution,null,AVAILABLE}
        17/05/15 10:15:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7459a21e{/SQL/execution/json,null,AVAILABLE}
        17/05/15 10:15:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58eac00e{/static/sql,null,AVAILABLE}
        17/05/15 10:15:25 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:38
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 10 output partitions
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Parents of final stage: List()
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Missing parents: List()
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
        17/05/15 10:15:25 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 399.6 MB)
        17/05/15 10:15:25 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1172.0 B, free 399.6 MB)
        17/05/15 10:15:25 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.3.242.99:45168 (size: 1172.0 B, free: 399.6 MB)
        17/05/15 10:15:25 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
        17/05/15 10:15:25 INFO scheduler.DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34)
        17/05/15 10:15:25 INFO cluster.YarnScheduler: Adding task set 0.0 with 10 tasks
        17/05/15 10:15:25 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, host99, executor 1, partition 0, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:25 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, host99, executor 3, partition 1, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:25 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, host99, executor 1, partition 2, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:25 INFO scheduler.TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, host99, executor 3, partition 3, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on host99:34903 (size: 1172.0 B, free: 366.3 MB)
        17/05/15 10:15:26 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on host99:39991 (size: 1172.0 B, free: 366.3 MB)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, host99, executor 1, partition 4, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, host99, executor 1, partition 5, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 655 ms on host99 (executor 1) (1/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 707 ms on host99 (executor 1) (2/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, host99, executor 3, partition 6, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 679 ms on host99 (executor 3) (3/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, host99, executor 3, partition 7, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 676 ms on host99 (executor 3) (4/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, host99, executor 1, partition 8, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 90 ms on host99 (executor 1) (5/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, host99, executor 1, partition 9, PROCESS_LOCAL, 6034 bytes)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 89 ms on host99 (executor 1) (6/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 86 ms on host99 (executor 3) (7/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 90 ms on host99 (executor 3) (8/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 78 ms on host99 (executor 1) (9/10)
        17/05/15 10:15:26 INFO scheduler.TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 77 ms on host99 (executor 1) (10/10)
        17/05/15 10:15:26 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 0.865 s
        17/05/15 10:15:26 INFO cluster.YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool 
        17/05/15 10:15:26 INFO scheduler.DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 1.115480 s
        Pi is roughly 3.1415871415871415
        17/05/15 10:15:26 INFO server.ServerConnector: Stopped ServerConnector@24a1c17f{HTTP/1.1}{0.0.0.0:4040}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7db534f2{/stages/stage/kill,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7cc586a8{/jobs/job/kill,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@459f7aa3{/api,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4e4efc1b{/,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2c95ac9e{/static,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4a9e6faf{/executors/threadDump/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@307765b4{/executors/threadDump,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@379ab47b{/executors/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7a138fc5{/executors,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4efcf8a{/environment/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16423501{/environment,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4940809c{/storage/rdd/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4816c290{/storage/rdd,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@29876704{/storage/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@455351c4{/storage,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@19b30c92{/stages/pool/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5e77f0f4{/stages/pool,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@138fe6ec{/stages/stage/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1b28f282{/stages/stage,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4b21844c{/stages/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@363f6148{/stages,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@59aa20b3{/jobs/job/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1a69561c{/jobs/job,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@552518c3{/jobs/json,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@25f9407e{/jobs,null,UNAVAILABLE}
        17/05/15 10:15:26 INFO ui.SparkUI: Stopped Spark web UI at http://10.3.242.99:4040
        17/05/15 10:15:26 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
        17/05/15 10:15:26 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
        17/05/15 10:15:26 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
        17/05/15 10:15:26 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
        (serviceOption=None,
         services=List(),
         started=false)
        17/05/15 10:15:26 INFO cluster.YarnClientSchedulerBackend: Stopped
        17/05/15 10:15:26 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
        17/05/15 10:15:26 INFO memory.MemoryStore: MemoryStore cleared
        17/05/15 10:15:26 INFO storage.BlockManager: BlockManager stopped
        17/05/15 10:15:26 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
        17/05/15 10:15:26 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
        17/05/15 10:15:26 INFO spark.SparkContext: Successfully stopped SparkContext
        17/05/15 10:15:26 INFO util.ShutdownHookManager: Shutdown hook called
        17/05/15 10:15:26 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4441c48b-51a5-4b46-b9c6-87ae83be686b
      • You can also see this job in the Hadoop web UI (port 8088).
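Besides the web UI on port 8088, the same information is available from the YARN command-line client. The application ID below is the one that appears in the log above:

```shell
# List applications known to the ResourceManager, including finished ones
yarn application -list -appStates ALL

# Show the report for the SparkPi run (final status should be SUCCEEDED)
yarn application -status application_1494404935972_0001
```

Both commands need a live cluster and talk to the same ResourceManager endpoint (host99:8032) that spark-submit connected to.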
  • Original post: https://www.cnblogs.com/wttttt/p/6855239.html