• Setting Up the Spark Programming Environment


    Building a Spark Development Environment with IntelliJ IDEA

    Building a Spark Development Environment with IntelliJ IDEA — Reference Documentation

    ● Reference: http://spark.apache.org/docs/latest/programming-guide.html

    ● Steps

    a) Create a Maven project

    b) Add dependencies (the Spark dependency, packaging plugins, and so on)

    Building a Spark Development Environment with IntelliJ IDEA — Maven vs. sbt

    ● Use whichever you are more familiar with

    ● Maven can also build Scala projects (a minimal sbt equivalent is sketched below for comparison)
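
    For comparison, the equivalent sbt build is only a few lines; a minimal build.sbt sketch, assuming the same Scala 2.10.5 and Spark 1.6.1 versions used in the pom.xml below:

    name := "test-spark"

    version := "1.0-SNAPSHOT"

    scalaVersion := "2.10.5"

    // "provided" keeps spark-core out of the packaged jar, matching the Maven setup below
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"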

    Building a Spark Development Environment with IntelliJ IDEA — Building a Scala Project with Maven

    ● Reference: http://docs.scala-lang.org/tutorials/scala-with-maven.html

    ● Steps

    a) Build a Scala project with Maven (based on the net.alchim31.maven:scala-archetype-simple archetype; an example command follows below)

    b) Add dependencies in pom.xml (the Spark dependency, packaging plugins, and so on)

    Note: mind the compatibility between the Scala and Java versions
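
    For step a), the project skeleton can be generated from the command line; a sketch (the groupId/artifactId match the pom.xml below, and the archetype's interactive prompts supply the remaining values):

    mvn archetype:generate \
      -DarchetypeGroupId=net.alchim31.maven \
      -DarchetypeArtifactId=scala-archetype-simple \
      -DgroupId=com.dajiangtai.test \
      -DartifactId=test-spark

    For step b), the resulting pom.xml with the Spark dependency and packaging plugins added looks like this: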

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.dajiangtai.test</groupId>
        <artifactId>test-spark</artifactId>
        <version>1.0-SNAPSHOT</version>
        <name>myWordCount</name>
        <inceptionYear>2008</inceptionYear>
        <properties>
            <scala.version>2.10.5</scala.version>
            <spark.version>1.6.1</spark.version>
        </properties>
    
        <repositories>
            <repository>
                <id>scala-tools.org</id>
                <name>Scala-Tools Maven2 Repository</name>
                <url>http://scala-tools.org/repo-releases</url>
            </repository>
        </repositories>
    
        <pluginRepositories>
            <pluginRepository>
                <id>scala-tools.org</id>
                <name>Scala-Tools Maven2 Repository</name>
                <url>http://scala-tools.org/repo-releases</url>
            </pluginRepository>
        </pluginRepositories>
    
        <dependencies>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
                <version>${scala.version}</version>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.4</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.specs</groupId>
                <artifactId>specs</artifactId>
                <version>1.2.5</version>
                <scope>test</scope>
            </dependency>
            <!--spark -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>${spark.version}</version>
                <scope>provided</scope>
            </dependency>
        </dependencies>
    
        <build>
            <!--
            <sourceDirectory>src/main/scala</sourceDirectory>
            <testSourceDirectory>src/test/scala</testSourceDirectory>
            -->
            <plugins>
                <plugin>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                    <executions>
                        <execution>
                            <goals>
                                <goal>compile</goal>
                                <goal>testCompile</goal>
                            </goals>
                        </execution>
                    </executions>
                    <configuration>
                        <scalaVersion>${scala.version}</scalaVersion>
                        <args>
                            <arg>-target:jvm-1.5</arg>
                        </args>
                    </configuration>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-eclipse-plugin</artifactId>
                    <configuration>
                        <downloadSources>true</downloadSources>
                        <buildcommands>
                            <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
                        </buildcommands>
                        <additionalProjectnatures>
                            <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
                        </additionalProjectnatures>
                        <classpathContainers>
                            <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
                            <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
                        </classpathContainers>
                    </configuration>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <version>2.4.1</version>
                    <executions>
                        <!-- Run shade goal on package phase -->
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                            <configuration>
                                <transformers>
                                    <!-- add Main-Class to manifest file -->
                                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                        <!--<mainClass>com.dajiang.MyDriver</mainClass>-->
                                    </transformer>
                                </transformers>
                                <createDependencyReducedPom>false</createDependencyReducedPom>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
        <reporting>
            <plugins>
                <plugin>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                    <configuration>
                        <scalaVersion>${scala.version}</scalaVersion>
                    </configuration>
                </plugin>
            </plugins>
        </reporting>
    </project>

    Developing Your First Spark Program

    ● The Scala version of the Spark program

    package com.dajiangtai.test
    import org.apache.spark.{SparkConf, SparkContext}
    
    /**
      * Created by lifei on 2016-6-19.
      */
    object MyWordCount {
      def main(args: Array[String]): Unit = {
        // check arguments
        if (args.length < 2) {
          System.err.println("Usage: MyWordCount <input> <output>")
          System.exit(1)
        }
        // read arguments
        val input = args(0)
        val output = args(1)
        // create the Scala version of the SparkContext
        val conf = new SparkConf().setAppName("myWordCount")
        val sc = new SparkContext(conf)
        // read the input data
        val lines = sc.textFile(input)
        // split lines into words, pair each word with 1, then sum the counts per word
        val resultRdd = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
        // save the result
        resultRdd.saveAsTextFile(output)
        sc.stop()
      }
    }
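
    For quick debugging inside the IDE, the master can be set directly on the SparkConf instead of going through spark-submit. A minimal sketch, assuming local mode; MyWordCountLocal and the data/words.txt path are illustrative, not part of the project above:

    package com.dajiangtai.test
    
    import org.apache.spark.{SparkConf, SparkContext}
    
    // Local-mode variant for IDE debugging; the input path is an assumption for illustration.
    object MyWordCountLocal {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("myWordCount")
          .setMaster("local[*]") // use all local cores; omit when submitting to a cluster
        val sc = new SparkContext(conf)
        val result = sc.textFile("data/words.txt")
          .flatMap(_.split(" "))
          .map((_, 1))
          .reduceByKey(_ + _)
        result.collect().foreach(println) // print the counts instead of writing files
        sc.stop()
      }
    }

    Note that spark-core is marked provided in the pom.xml, so running this inside the IDE may require temporarily removing that scope (or configuring IDEA to include provided dependencies).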

    ● The Java version of the Spark program

    package com.dajiangtai.test;
    
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.api.java.function.Function2;
    import org.apache.spark.api.java.function.PairFunction;
    import scala.Tuple2;
    
    import java.util.Arrays;
    
    /**
     * Created by lifei on 2016-6-19.
     */
    public class MyJavaWordCount {
        public static void main(String[] args) {
            // check arguments
            if (args.length < 2) {
                System.err.println("Usage: MyJavaWordCount <input> <output>");
                System.exit(1);
            }
            // read arguments
            String input = args[0];
            String output = args[1];
    
            // create the Java version of the SparkContext
            SparkConf conf = new SparkConf().setAppName("MyJavaWordCount");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // read the input data
            JavaRDD<String> inputRdd = sc.textFile(input);
            // split each line into words
            JavaRDD<String> words = inputRdd.flatMap(new FlatMapFunction<String, String>() {
                public Iterable<String> call(String line) throws Exception {
                    return Arrays.asList(line.split(" "));
                }
            });
            // pair each word with 1, then sum the counts per word
            JavaPairRDD<String, Integer> result = words.mapToPair(new PairFunction<String, String, Integer>() {
                public Tuple2<String, Integer> call(String word) throws Exception {
                    return new Tuple2<String, Integer>(word, 1);
                }
            }).reduceByKey(new Function2<Integer, Integer, Integer>() {
                public Integer call(Integer x, Integer y) throws Exception {
                    return x + y;
                }
            });
            // save the result
            result.saveAsTextFile(output);
            // stop the context
            sc.stop();
        }
    }
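
    Note: this code targets the Spark 1.6 Java API. From Spark 2.0 onward, FlatMapFunction.call returns an Iterator rather than an Iterable, so the flatMap above would need to return Arrays.asList(line.split(" ")).iterator() instead.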

     

     

    Running Your First Spark Program

    ● Package the Spark Maven project

    mvn package
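
    Given the pom.xml above, this produces target/test-spark-1.0-SNAPSHOT.jar, with the project's dependencies shaded in by maven-shade-plugin (spark-core itself is excluded because its scope is provided). Copy the jar to the cluster, e.g. under ~/testspark/, before submitting.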

    ● Submit to the Spark cluster and run

    Submitting the Scala version of WordCount

    bin/spark-submit --class com.dajiangtai.test.MyWordCount ~/testspark/test-spark-1.0-SNAPSHOT.jar ~/testspark/words.txt ~/testspark/result

    Submitting the Java version of WordCount

    bin/spark-submit --class com.dajiangtai.test.MyJavaWordCount ~/testspark/test-spark-1.0-SNAPSHOT.jar ~/testspark/words.txt ~/testspark/result1
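
    As written, these commands use spark-submit's default master. To target a specific cluster, pass a master URL explicitly; a sketch, assuming a standalone master at spark://master:7077 (the host name is illustrative):

    bin/spark-submit --master spark://master:7077 \
      --class com.dajiangtai.test.MyWordCount \
      ~/testspark/test-spark-1.0-SNAPSHOT.jar \
      ~/testspark/words.txt ~/testspark/result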
