• Exception in thread "main" expected '<document start>', but found BlockMappingStart in 'reader', line 23, column 2: nimbus.host: "master"


    Platform: CentOS 6.3 (i386)
    JDK 7u51
    Storm 0.9.1
    Python 2.6.6
    Hadoop 1.2.1

    I hit this error when starting Storm. A Baidu search turned up the usual advice: add a space in front of nimbus.host: "master". But my file already had that space, and it still failed.

    Later, Google led me to a thread on an overseas site. I downloaded a clean copy of storm.yaml, replaced mine with it, redid the configuration, and the problem was gone.

    The thread: https://groups.google.com/forum/#!topic/storm-user/o7Xx6Oa_XKI

    The storm.yaml file itself: http://pan.baidu.com/s/1kVDEa5T (password: hfyc)

    Important note: the last line of that file is missing one space.
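
    It can also help to run the local conf/storm.yaml through the same parser Storm itself uses (SnakeYAML, shipped in Storm's lib directory as snakeyaml-1.11.jar), so indentation mistakes show up without starting any daemon. The helper below is not from the original post; it is a minimal sketch, and the class name CheckStormYaml and the default path are assumptions based on the install location that appears in the output further down.

    import java.io.FileInputStream;
    import java.io.InputStream;
    import org.yaml.snakeyaml.Yaml;

    // Hypothetical helper: parse storm.yaml with SnakeYAML, the same library
    // that backtype.storm.utils.Utils.findAndReadConfigFile uses, so a bad
    // storm.yaml fails here instead of when the daemons start.
    public class CheckStormYaml {
        public static void main(String[] args) throws Exception {
            String path = args.length > 0 ? args[0]
                    : "/home/hadoop/apache-storm-0.9.1-incubating/conf/storm.yaml";  // assumed install path
            InputStream in = new FileInputStream(path);
            try {
                Object conf = new Yaml().load(in);  // throws the YAML error shown below if the file is malformed
                System.out.println("storm.yaml parsed OK: " + conf);
            } finally {
                in.close();
            }
        }
    }

    Compile and run it against the SnakeYAML jar that Storm bundles, for example from the Storm home directory: javac -cp lib/snakeyaml-1.11.jar CheckStormYaml.java and then java -cp .:lib/snakeyaml-1.11.jar CheckStormYaml.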

       

    Below is the method other people posted, found via Baidu:

     1 # Licensed to the Apache Software Foundation (ASF) under one
     2 # or more contributor license agreements.  See the NOTICE file
     3 # distributed with this work for additional information
     4 # regarding copyright ownership.  The ASF licenses this file
     5 # to you under the Apache License, Version 2.0 (the
     6 # "License"); you may not use this file except in compliance
     7 # with the License.  You may obtain a copy of the License at
     8 #
     9 # http://www.apache.org/licenses/LICENSE-2.0
    10 #
    11 # Unless required by applicable law or agreed to in writing, software
    12 # distributed under the License is distributed on an "AS IS" BASIS,
    13 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    14 # See the License for the specific language governing permissions and
    15 # limitations under the License.
    16 
    17 ########### These MUST be filled in for a storm configuration
    18 # storm.zookeeper.servers:
    19      - "master"
    20      - "slave1"
    21      - "slave2"
    22 #
    23  nimbus.host: "master"
    24  storm.local.dir: "/home/hadoop/apache-storm-0.9.1-incubating/data"
    25  supervisor.slots.ports:
    26     - 6700
    27     - 6701
    28     - 6702
    29     - 6703
    30 #
    31 #
    32 # ##### These may optionally be filled in:
    33 #
    34 ## List of custom serializations
    35 # topology.kryo.register:
    36 #     - org.mycompany.MyType
    37 #     - org.mycompany.MyType2: org.mycompany.MyType2Serializer
    38 #
    39 ## List of custom kryo decorators
    40 # topology.kryo.decorators:
    41 #     - org.mycompany.MyDecorator
    42 #
    43 ## Locations of the drpc servers
    44 # drpc.servers:
    45 #     - "server1"
    46 #     - "server2"
    47 
    48 ## Metrics Consumers
    49 # topology.metrics.consumer.register:
    50 #   - class: "backtype.storm.metrics.LoggingMetricsConsumer"
    51 #     parallelism.hint: 1
    52 #   - class: "org.mycompany.MyMetricsConsumer"
    53 #     parallelism.hint: 1
    54 #     argument:
    55 #       - endpoint: "metrics-collector.mycompany.org"

    Start Nimbus: storm nimbus &

     
    Exception in thread "main" expected '<document start>', but found BlockMappingStart
     in 'reader', line 23, column 2:
         nimbus.host: "master"
         ^

            at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
            at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
            at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
            at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
            at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
            at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
            at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
            at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:138)
            at backtype.storm.utils.Utils.readStormConfig(Utils.java:178)
            at backtype.storm.config$read_storm_config.invoke(config.clj:116)
            at backtype.storm.command.config_value$_main.invoke(config_value.clj:22)
            at clojure.lang.AFn.applyToHelper(AFn.java:161)
            at clojure.lang.AFn.applyTo(AFn.java:151)
            at backtype.storm.command.config_value.main(Unknown Source)
    Exception in thread "main" expected '<document start>', but found BlockMappingStart
     in 'reader', line 23, column 2:
         nimbus.host: "master"
         ^

            at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
            at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
            at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
            at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
            at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
            at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
            at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
            at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:138)
            at backtype.storm.utils.Utils.readStormConfig(Utils.java:178)
            at backtype.storm.config$read_storm_config.invoke(config.clj:116)
            at backtype.storm.command.config_value$_main.invoke(config_value.clj:22)
            at clojure.lang.AFn.applyToHelper(AFn.java:161)
            at clojure.lang.AFn.applyTo(AFn.java:151)
            at backtype.storm.command.config_value.main(Unknown Source)
    Running: java -server -Dstorm.options= -Dstorm.home=/home/hadoop/apache-storm-0.9.1-incubating -Djava.library.path= -Dstorm.conf.file= -cp /home/hadoop/apache-storm-0.9.1-incubating/lib/ring-core-1.1.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.cli-0.2.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/compojure-1.1.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jgrapht-core-0.9.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/json-simple-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-io-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/objenesis-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-servlet-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpclient-4.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clout-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/disruptor-2.10.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-fileupload-1.2.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-client-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/netty-3.6.3.Final.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/meat-locker-0.3.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clj-time-0.4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-codec-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/slf4j-api-1.6.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clojure-1.4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/joda-time-2.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-lang-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/kryo-2.17.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-logging-1.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/carbonite-1.3.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.macro-0.1.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clj-stacktrace-0.2.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-exec-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/storm-core-0.9.1-incubating.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-util-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/guava-13.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/core.incubator-0.1.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/snakeyaml-1.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-devel-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/minlog-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-core-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5-20081211.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/reflectasm-1.07-shaded.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-classic-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/hiccup-0.3.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpcore-4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/asm-4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/zookeeper-3.3.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-framework-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/junit-3.8.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jline-2.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.logging-0.2.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/conf -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/apache-storm-0.9.1-incubating/logback/cluster.xml backtype.storm.daemon.nimbus
     

    The error marker sits on the 'n' of nimbus. After some experimenting it turned out that these configuration keys each need a leading space, i.e.:

    (space)nimbus.host: "192.168.1.101"
    (space)storm.zookeeper.port: 2181
    (space)storm.local.dir: "home/hadoop/storm-0.9.1/data"
    (space)supervisor.slots.ports:
     
    Be careful when you edit storm.yaml: one missing space is enough to keep Storm from starting, which is hard to believe.
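
    The behavior is less mysterious once you look at what SnakeYAML does with mismatched indentation: if the top-level entries do not all start in the same column, it appears to treat the first block as a complete YAML document and then reject the next key with exactly the "expected '<document start>', but found BlockMappingStart" message. The snippet below is not from the original post; it is a rough sketch using made-up storm.yaml fragments and the SnakeYAML library that Storm bundles, intended only to illustrate the failure mode.

    import org.yaml.snakeyaml.Yaml;

    // Illustrative (hypothetical) storm.yaml fragments: one with inconsistent
    // indentation, one with every top-level key starting in the same column.
    public class YamlIndentDemo {
        public static void main(String[] args) {
            // The key line is still commented out, so the list items form a bare
            // top-level sequence; the less-indented nimbus.host line then looks
            // like the start of a second document to SnakeYAML.
            String broken =
                    "#storm.zookeeper.servers:\n" +
                    "     - \"master\"\n" +
                    "     - \"slave1\"\n" +
                    " nimbus.host: \"master\"\n";

            // All top-level keys share one column (here: one leading space) and
            // the list is nested under its key, so the whole file is one mapping.
            String fixed =
                    " storm.zookeeper.servers:\n" +
                    "     - \"master\"\n" +
                    "     - \"slave1\"\n" +
                    " nimbus.host: \"master\"\n";

            Yaml yaml = new Yaml();
            System.out.println(yaml.load(fixed));   // {storm.zookeeper.servers=[master, slave1], nimbus.host=master}
            try {
                yaml.load(broken);
            } catch (Exception e) {
                System.out.println(e.getMessage()); // expected '<document start>', but found BlockMappingStart ...
            }
        }
    }
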
  • Original post: https://www.cnblogs.com/mrchige/p/5907856.html