
Importing MySQL data into Hive with Sqoop

Published: 2023/12/31

Environment:

Software   Version                           Notes
Ubuntu     19.10
sqoop      1.4.7
mysql      8.0.20-0ubuntu0.19.10.1 (Ubuntu)
hadoop     3.1.2
hive       3.0.0                             involved because HCAT_HOME must be set in .bashrc
accumulo   2.0.0                             required alongside sqoop; set ACCUMULO_HOME in .bashrc
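For reference, the corresponding .bashrc entries might look like the following. This is a sketch: the Hive install path matches the one that appears in the logs later in this post, but the accumulo path is an assumption based on the same /home/appleyuchi/bigdata layout; adjust both to your machine.

```shell
# Hypothetical .bashrc additions; paths follow this machine's
# /home/appleyuchi/bigdata layout and may differ on yours.
export HIVE_HOME=/home/appleyuchi/bigdata/apache-hive-3.0.0-bin
export HCAT_HOME=$HIVE_HOME/hcatalog          # HCatalog ships inside the Hive distribution
export ACCUMULO_HOME=/home/appleyuchi/bigdata/accumulo-2.0.0
```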

Preparation:

① lsof -i:10000 — make sure the port is in use (HiveServer2 listens on 10000 by default)

② make sure the metastore is running

③ make sure hiveserver2 is running
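The port check above can be scripted; this is a sketch that assumes lsof is installed and HiveServer2 is on its default port 10000:

```shell
# Report whether anything is listening on a given port.
# HiveServer2 defaults to 10000, so "in use" suggests it is up.
check_port() {
  port="$1"
  if command -v lsof >/dev/null 2>&1 && lsof -i:"$port" >/dev/null 2>&1; then
    echo "port $port is in use"
  else
    echo "port $port looks free (or lsof is unavailable)"
  fi
}

check_port 10000
```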


Goal:

Import the table book from the MySQL database sqoop_hbase into Hive.


The dataset is set up as described in [1].


List all databases:

sqoop list-databases --connect jdbc:mysql://Desktop:3306/ --username appleyuchi --password appleyuchi

List all tables in the database:

sqoop list-tables --connect jdbc:mysql://Desktop:3306/sqoop_hbase --username appleyuchi --password appleyuchi


Run the import:

sqoop import --connect jdbc:mysql://Desktop:3306/sqoop_hbase --username appleyuchi --password appleyuchi --table book --hive-import --create-hive-table --fields-terminated-by "\t" -m 5
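Sqoop's own log below warns that --password on the command line is insecure (it suggests -P, which prompts instead) and that --direct enables a MySQL-specific fast path via mysqldump. A hedged variant of the same command incorporating both suggestions is sketched here; it is built as a string so the options are easy to inspect, and whether --direct plays well with --hive-import should be verified in your environment before relying on it:

```shell
# Variant of the import command above, following the hints in Sqoop's log:
# -P prompts for the password instead of exposing it in the process list,
# --direct uses mysqldump for a MySQL-specific fast path.
SQOOP_IMPORT="sqoop import \
  --connect jdbc:mysql://Desktop:3306/sqoop_hbase \
  --username appleyuchi -P \
  --direct \
  --table book \
  --hive-import --create-hive-table \
  --fields-terminated-by '\t' \
  -m 5"

echo "$SQOOP_IMPORT"
```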


log:

/home/appleyuchi/bigdata/hadoop-3.1.2/libexec/hadoop-functions.sh: line 2358: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/home/appleyuchi/bigdata/hadoop-3.1.2/libexec/hadoop-functions.sh: line 2453: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
/home/appleyuchi/bigdata/hadoop-3.1.2/libexec/hadoop-functions.sh: line 2358: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: invalid variable name
/home/appleyuchi/bigdata/hadoop-3.1.2/libexec/hadoop-functions.sh: line 2453: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: invalid variable name
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/hadoop-3.1.2/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/apache-hive-3.0.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/hbase-2.2.4/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-05-28 13:22:47,594 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
Note: /tmp/sqoop-appleyuchi/compile/af856630222c463eee34cb9210be25ff/book.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2020-05-28 13:22:50,084 WARN manager.MySQLManager: It looks like you are importing from mysql.
2020-05-28 13:22:50,084 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
2020-05-28 13:22:50,084 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
Hive Session ID = 5b44d7c9-8718-4936-996e-5eb4d9f3878a
Logging initialized using configuration in jar:file:/home/appleyuchi/bigdata/apache-hive-3.0.0-bin/lib/hive-common-3.0.0.jar!/hive-log4j2.properties Async: true
2020-05-28 13:23:26,576 WARN session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
OK
Time taken: 4.613 seconds
Loading data to table default.book
OK
Time taken: 1.131 seconds


Check the result in beeline:

0: jdbc:hive2://Desktop:10000> show tables;
INFO : Compiling command(queryId=appleyuchi_20200528133356_e09b4c8f-4d75-4c86-810f-a2ca8cb68921): show tables
INFO : Concurrency mode is disabled, not creating a lock manager
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
INFO : Completed compiling command(queryId=appleyuchi_20200528133356_e09b4c8f-4d75-4c86-810f-a2ca8cb68921); Time taken: 0.123 seconds
INFO : Concurrency mode is disabled, not creating a lock manager
INFO : Executing command(queryId=appleyuchi_20200528133356_e09b4c8f-4d75-4c86-810f-a2ca8cb68921): show tables
INFO : Starting task [Stage-0:DDL] in serial mode
INFO : Completed executing command(queryId=appleyuchi_20200528133356_e09b4c8f-4d75-4c86-810f-a2ca8cb68921); Time taken: 0.028 seconds
INFO : OK
INFO : Concurrency mode is disabled, not creating a lock manager
+-----------+
| tab_name  |
+-----------+
| book      |
| train     |
+-----------+
2 rows selected (0.339 seconds)

#--------------------------------------------------------------------------------------------------------------------------------------------------------

Problem: main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

Solution:

Add the following to $JAVA_HOME/jre/lib/security/java.policy:

grant {
    permission javax.management.MBeanTrustPermission "register";
};


References:

[1] Importing MySQL data into HBase with Sqoop — a complete walkthrough
