• mvn: bundling a custom jar to fix the mongo-spark error

  Submitting the packaged application jar with spark-submit fails at runtime with `java.lang.NoClassDefFoundError: com/mongodb/spark/MongoSpark`, because the MongoDB Spark Connector classes are not on the classpath:


    [root@hadoop1 bin]# ./spark-submit --class myprojectpackaging.App /usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/mycode/myprojectname/target/myprojectname-1.0-SNAPSHOT.jar
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.5/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Hello World!
    17/12/01 10:48:52 INFO spark.SparkContext: Running Spark version 2.2.0
    17/12/01 10:48:53 WARN util.Utils: Your hostname, hadoop1 resolves to a loopback address: 127.0.0.1; using 192.168.2.51 instead (on interface eno1)
    17/12/01 10:48:53 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
    17/12/01 10:48:53 INFO spark.SparkContext: Submitted application: MongoSparkConnectorIntro
    17/12/01 10:48:53 INFO spark.SecurityManager: Changing view acls to: root
    17/12/01 10:48:53 INFO spark.SecurityManager: Changing modify acls to: root
    17/12/01 10:48:53 INFO spark.SecurityManager: Changing view acls groups to: 
    17/12/01 10:48:53 INFO spark.SecurityManager: Changing modify acls groups to: 
    17/12/01 10:48:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
    17/12/01 10:48:53 INFO util.Utils: Successfully started service 'sparkDriver' on port 56112.
    17/12/01 10:48:53 INFO spark.SparkEnv: Registering MapOutputTracker
    17/12/01 10:48:53 INFO spark.SparkEnv: Registering BlockManagerMaster
    17/12/01 10:48:53 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
    17/12/01 10:48:53 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
    17/12/01 10:48:53 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-a5531bf0-ae40-44be-9a91-a2aca6c81267
    17/12/01 10:48:53 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
    17/12/01 10:48:53 INFO spark.SparkEnv: Registering OutputCommitCoordinator
    17/12/01 10:48:53 INFO util.log: Logging initialized @1291ms
    17/12/01 10:48:53 INFO server.Server: jetty-9.3.z-SNAPSHOT
    17/12/01 10:48:53 INFO server.Server: Started @1353ms
    17/12/01 10:48:53 INFO server.AbstractConnector: Started ServerConnector@6548bb7d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
    17/12/01 10:48:53 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7668d560{/jobs,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@438bad7c{/jobs/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4fdf8f12{/jobs/job,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6979efad{/jobs/job/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a67318f{/stages,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@17f9344b{/stages/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54e81b21{/stages/stage,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50cf5a23{/stages/stage/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@273c947f{/stages/pool,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1af1347d{/stages/pool/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@20765ed5{/storage,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2899a8db{/storage/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c1a4620{/storage/rdd,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@130a0f66{/storage/rdd/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@12365c88{/environment,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2237bada{/environment/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5710768a{/executors,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e0d4a8{/executors/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@30272916{/executors/threadDump,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5bf61e67{/executors/threadDump/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b273a59{/static,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@30865a90{/,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@777c9dc9{/api,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22175d4f{/jobs/job/kill,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b809711{/stages/stage/kill,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.2.51:4040
    17/12/01 10:48:53 INFO spark.SparkContext: Added JAR file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/mycode/myprojectname/target/myprojectname-1.0-SNAPSHOT.jar at spark://192.168.2.51:56112/jars/myprojectname-1.0-SNAPSHOT.jar with timestamp 1512096533656
    17/12/01 10:48:53 INFO executor.Executor: Starting executor ID driver on host localhost
    17/12/01 10:48:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41433.
    17/12/01 10:48:53 INFO netty.NettyBlockTransferService: Server created on 192.168.2.51:41433
    17/12/01 10:48:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
    17/12/01 10:48:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.2.51, 41433, None)
    17/12/01 10:48:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.2.51:41433 with 366.3 MB RAM, BlockManagerId(driver, 192.168.2.51, 41433, None)
    17/12/01 10:48:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.2.51, 41433, None)
    17/12/01 10:48:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.2.51, 41433, None)
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@44c5a16f{/metrics/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO internal.SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/bin/spark-warehouse/').
    17/12/01 10:48:53 INFO internal.SharedState: Warehouse path is 'file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/bin/spark-warehouse/'.
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70e3f36f{/SQL,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23e44287{/SQL/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61f3fbb8{/SQL/execution,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@432034a{/SQL/execution/json,null,AVAILABLE,@Spark}
    17/12/01 10:48:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@173373b4{/static/sql,null,AVAILABLE,@Spark}
    17/12/01 10:48:54 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
    Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/spark/MongoSpark
        at myprojectpackaging.App.main(App.java:31)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.ClassNotFoundException: com.mongodb.spark.MongoSpark
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 10 more
    17/12/01 10:48:54 INFO spark.SparkContext: Invoking stop() from shutdown hook
    17/12/01 10:48:54 INFO server.AbstractConnector: Stopped Spark@6548bb7d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
    17/12/01 10:48:54 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.2.51:4040
    17/12/01 10:48:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    17/12/01 10:48:54 INFO memory.MemoryStore: MemoryStore cleared
    17/12/01 10:48:54 INFO storage.BlockManager: BlockManager stopped
    17/12/01 10:48:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
    17/12/01 10:48:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    17/12/01 10:48:54 INFO spark.SparkContext: Successfully stopped SparkContext
    17/12/01 10:48:54 INFO util.ShutdownHookManager: Shutdown hook called
    17/12/01 10:48:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-94c896a5-3f94-48ce-92ff-2e5b94204d0b
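  The `NoClassDefFoundError` (and the underlying `ClassNotFoundException` for `com.mongodb.spark.MongoSpark`) means the connector was available at compile time but is missing from the jar handed to spark-submit. One way to fix this, sketched below for a Maven build as the title suggests, is to declare the connector dependency and shade it into the application jar; the version numbers here are assumptions chosen to match Spark 2.2.0 / Scala 2.11 and should be adjusted to your environment:

    ```xml
    <!-- pom.xml fragment (sketch): connector coordinates assume Spark 2.2.0 / Scala 2.11 -->
    <dependencies>
      <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.2.0</version>
      </dependency>
    </dependencies>

    <build>
      <plugins>
        <!-- maven-shade-plugin bundles dependencies into the application jar,
             so spark-submit can resolve com.mongodb.spark.MongoSpark at runtime -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>3.1.0</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
    ```

  After `mvn package`, resubmit the shaded jar with the same spark-submit command. Alternatively, instead of shading, the connector can be supplied at submit time with `--packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0` (fetched from Maven Central) or with `--jars` pointing at a local copy of the connector jar.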
  • Original article: https://www.cnblogs.com/rsapaper/p/7940915.html