Getting Started with Spark


Download spark-1.6.1-bin-hadoop2.6.tgz

Extract the archive

Configuration

mv spark-env.sh.template spark-env.sh
vi spark-env.sh
Add the following settings to the file:
export JAVA_HOME=/usr/java/jdk1.7.0_45
export SPARK_MASTER_IP=mini1
export SPARK_MASTER_PORT=7077
Save and exit.
Rename and edit the slaves.template file:
mv slaves.template slaves
vi slaves
List the child nodes (the Worker nodes) in this file:
mini2
mini3
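
A standalone cluster needs the same Spark installation on every node, so unless the directory sits on shared storage it has to be copied to the workers. A minimal sketch, assuming Spark was extracted to /usr/local/ on each machine (the path is an assumption; adjust it to your layout):

scp -r /usr/local/spark-1.6.1-bin-hadoop2.6 mini2:/usr/local/
scp -r /usr/local/spark-1.6.1-bin-hadoop2.6 mini3:/usr/local/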

Start the cluster

sbin/start-all.sh
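
If startup succeeded, jps should show a Master process on mini1 and a Worker process on each of mini2 and mini3; the master's web UI listens on port 8080 by default:

jps
# Master on mini1; Worker on mini2 and mini3
# web UI: http://mini1:8080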

bin/spark-shell starts a standalone local spark-shell; this instance does not register with the cluster, so it will not show up in the master's web UI.
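
Inside the shell the SparkContext is already bound to sc; a quick smoke test that runs entirely in local mode:

sc.parallelize(1 to 1000).reduce(_ + _)   // returns 500500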

Start a spark-shell connected to the cluster:

bin/spark-shell --master spark://mini1:7077 --executor-memory 512m --total-executor-cores 1

--master spark://mini1:7077 specifies the address of the Master

--executor-memory 512m gives each executor 512 MB of memory

--total-executor-cores 1 caps the total number of CPU cores used across the whole cluster at 1

Word count

First, verify that jobs can be submitted to the cluster with the bundled SparkPi example:

bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://mini1:7077 --total-executor-cores 1 --executor-memory 612m lib/spark-examples-1.6.1-hadoop2.6.0.jar 50

sc.textFile("hdfs://mini1:9000/wc/sparkInput").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_,1).sortBy(_._2,false).saveAsTextFile("hdfs://mini1:9000/wc/sparkOutput2/")


import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WC")
    val sc = new SparkContext(conf)
    // args(0) = input path, args(1) = output path
    sc.textFile(args(0))
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .sortBy(_._2, false)
      .saveAsTextFile(args(1))
    sc.stop()
  }
}
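
To run this class on the cluster, package the project and hand the fat jar to spark-submit. A sketch under the pom below: the jar name helloSpark-2.0.jar follows the pom's artifactId and version, cn.my.spark.WordCount matches the mainClass commented out in the shade plugin, and the output path passed as args(1) is a fresh example directory:

mvn clean package
bin/spark-submit --class cn.my.spark.WordCount --master spark://mini1:7077 --executor-memory 512m --total-executor-cores 1 target/helloSpark-2.0.jar hdfs://mini1:9000/wc/sparkInput hdfs://mini1:9000/wc/sparkOutput3/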

pom.xml


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"><modelVersion>4.0.0</modelVersion><groupId>cn.my.spark</groupId><artifactId>helloSpark</artifactId><version>2.0</version><properties><maven.compiler.source>1.8</maven.compiler.source><maven.compiler.target>1.8</maven.compiler.target><encoding>UTF-8</encoding><scala.version>2.10.6</scala.version><spark.version>1.6.1</spark.version><hadoop.version>2.6.4</hadoop.version></properties><dependencies><dependency><groupId>org.scala-lang</groupId><artifactId>scala-library</artifactId><version>${scala.version}</version></dependency><dependency><groupId>org.apache.spark</groupId><artifactId>spark-core_2.10</artifactId><version>${spark.version}</version></dependency><dependency><groupId>org.apache.hadoop</groupId><artifactId>hadoop-client</artifactId><version>${hadoop.version}</version></dependency></dependencies><build><sourceDirectory>src/main/scala</sourceDirectory><testSourceDirectory>src/test/scala</testSourceDirectory><plugins><plugin><groupId>net.alchim31.maven</groupId><artifactId>scala-maven-plugin</artifactId><version>3.2.2</version><executions><execution><goals><goal>compile</goal><goal>testCompile</goal></goals><configuration><args><!-- scala2.11不支持這個參數,報錯<arg>-make:transitive</arg>--><arg>-make:transitive</arg><arg>-dependencyfile</arg><arg>${project.build.directory}/.scala_dependencies</arg></args></configuration></execution></executions></plugin><plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-shade-plugin</artifactId><version>2.4.3</version><executions><execution><phase>package</phase><goals><goal>shade</goal></goals><configuration><filters><filter><artifact>*:*</artifact><excludes><exclude>META-INF/*.SF</exclude><exclude>META-INF/*.DSA</exclude><exclude>META-INF/*.RSA</exclude></excludes></filter></filters><transformers><transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"><!--<mainClass>cn.my.spark.WordCount</mainClass>--><mainClass></mainClass></transformer></transformers></configuration></execution></executions></plugin></plugins></build></project>


Reposted from: https://www.cnblogs.com/rocky-AGE-24/p/7222004.html
