
Spark _26_Spark On Hive的配置

Published: 2024/2/28

Most of the configurations found online look like this:

Configuring Spark on Hive

  • Configure Spark on Hive on the Spark client.
  • In the Spark client installation, create the file hive-site.xml under spark-1.6.0/conf:

    Configure the URI of the Hive metastore:

    <configuration>
        <property>
            <name>hive.metastore.uris</name>
            <value>thrift://node1:9083</value>
        </property>
    </configuration>

  • Start the Hive metastore service:
  • hive --service metastore
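Run as above, the metastore command occupies the terminal in the foreground. A common pattern on the metastore host (node1 here, matching hive-site.xml; the log file name is just an illustration) is to run it in the background and then confirm the thrift port is listening:

```shell
# Start the metastore in the background, capturing its output to a log file.
nohup hive --service metastore > metastore.log 2>&1 &

# Verify that the thrift service is listening on the port from hive-site.xml.
netstat -tlnp | grep 9083
```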

  • Start the ZooKeeper cluster and the HDFS cluster.
  • Start spark-shell, read the row count of a Hive table, and compare the timing against the same count query run in Hive itself.
  • ./spark-shell --master spark://node1:7077,node2:7077 \
      --executor-cores 1 \
      --executor-memory 1g \
      --total-executor-cores 1

    import org.apache.spark.sql.hive.HiveContext

    val hc = new HiveContext(sc)

    hc.sql("show databases").show

    hc.sql("use default")

    hc.sql("select count(*) from jizhan").show
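The HiveContext API above matches Spark 1.6, but the log later in this post was produced by Spark 2.3.1, where HiveContext is deprecated. On Spark 2.x the equivalent goes through SparkSession with Hive support enabled — a sketch, reusing the jizhan table name from above:

```scala
import org.apache.spark.sql.SparkSession

// In spark-shell a session already exists as `spark`; in a standalone
// application you would build one explicitly like this.
val spark = SparkSession.builder()
  .appName("SparkOnHive")
  .enableHiveSupport() // picks up hive-site.xml from the conf directory
  .getOrCreate()

spark.sql("show databases").show()
spark.sql("use default")
spark.sql("select count(*) from jizhan").show()
```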

    However, I could not get that approach to work.

    There is another way, though:

    after the configuration above, connect directly with spark-sql:

    [root@henu1 bin]# ./spark-sql
    2019-10-28 21:15:12 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2019-10-28 21:15:14 INFO  metastore:376 - Trying to connect to metastore with URI thrift://henu2:9083
    2019-10-28 21:15:14 INFO  metastore:472 - Connected to metastore.
    2019-10-28 21:15:20 INFO  SparkContext:54 - Running Spark version 2.3.1
    2019-10-28 21:15:20 INFO  SparkContext:54 - Submitted application: SparkSQL::192.168.248.241
    2019-10-28 21:15:21 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 33507.
    2019-10-28 21:15:21 INFO  MemoryStore:54 - MemoryStore started with capacity 413.9 MB
    2019-10-28 21:15:22 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
    ... (Jetty ServletContextHandler and BlockManager registration lines omitted) ...
    2019-10-28 21:15:22 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://henu1:4040
    2019-10-28 21:15:23 INFO  SharedState:54 - loading hive config file: file:/opt/spark/conf/hive-site.xml
    2019-10-28 21:15:23 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/opt/spark/bin/spark-warehouse').
    2019-10-28 21:15:23 INFO  SharedState:54 - Warehouse path is 'file:/opt/spark/bin/spark-warehouse'.
    2019-10-28 21:15:23 INFO  HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
    2019-10-28 21:15:23 INFO  metastore:291 - Mestastore configuration hive.metastore.warehouse.dir changed from /user/hive/warehouse to file:/opt/spark/bin/spark-warehouse
    2019-10-28 21:15:23 INFO  metastore:376 - Trying to connect to metastore with URI thrift://henu2:9083
    2019-10-28 21:15:23 INFO  metastore:472 - Connected to metastore.
    2019-10-28 21:15:24 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
    spark-sql>
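One detail worth noticing in this log: spark.sql.warehouse.dir was unset, so Spark silently redirected the warehouse from /user/hive/warehouse to a local path under the bin directory. If the intent is to keep using the Hive warehouse on HDFS, setting the property explicitly avoids that; the path below is the usual Hive default and is an assumption about this cluster:

```properties
# in conf/spark-defaults.conf (use your cluster's actual Hive warehouse path)
spark.sql.warehouse.dir  hdfs:///user/hive/warehouse
```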

    Once I get the original approach working, I will come back and fill in the details.

    Summary

    That is the full content of Spark _26: configuring Spark on Hive. I hope this article helps you solve the problem you ran into.
