Detailed Guide to Installing Hive 2.0.0 in Cluster Mode on CentOS
The environment used in this article:
Operating system: CentOS 6, 32-bit
Hive version: 2.0.0
JDK version: 1.8.0_77, 32-bit
Hadoop version: 2.6.4
MySQL version: 5.6.30
1. Preparation
1.1 Hive
First complete the first three steps of the standalone-mode guide, "CentOS下Hive2.0.0单机模式安装详解".
Like Hadoop, Hive can run in three modes: standalone, pseudo-distributed, and distributed. This article describes how to install and deploy the distributed (cluster) mode.
1.2 MySQL
By default, Hive stores its metadata in an embedded Derby database, which is not suitable for production use. This article uses MySQL as the metadata store instead, so MySQL needs to be installed first.
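If MySQL is not installed yet, a minimal sketch for CentOS 6 is shown below. It assumes the MySQL 5.6 community Yum repository is already configured (the stock CentOS 6 repositories only ship MySQL 5.1), so package names may differ in your environment:
yum install -y mysql-community-server   # install the MySQL 5.6 server (assumes the community repo is configured)
service mysqld start                    # start the MySQL server
chkconfig mysqld on                     # start it automatically at boot
mysqladmin -u root password 'root'      # set an initial root password (choose your own)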
1.3 Hadoop
Hive depends on Hadoop, so Hadoop must be installed and running before continuing.
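A quick way to start and verify Hadoop is sketched below, assuming $HADOOP_HOME/sbin is on the PATH as in a typical Hadoop 2.6.4 setup:
start-dfs.sh     # start HDFS (NameNode, DataNode, SecondaryNameNode)
start-yarn.sh    # start YARN (ResourceManager, NodeManager)
jps              # the listed daemons should include NameNode and ResourceManager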
2. Database Setup
2.1 Create a MySQL user
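The statements below are executed in the MySQL command-line client as the root user. A minimal way to open that client (a sketch; it assumes the root password set when MySQL was installed):
mysql -u root -p    # enter the MySQL root password when prompted
Then run the following statements: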
create user 'hive' identified by 'hive';
grant all privileges on *.* to 'hive' with grant option;
flush privileges;
create database hive;
2.2 Copy the MySQL driver
Download the MySQL Connector/J driver from http://dev.mysql.com/downloads/connector/j/ , extract the archive, and copy mysql-connector-java-5.1.38-bin.jar into Hive's lib directory.
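For example (a sketch that assumes Hive is installed under /opt/hive-2.0.0 as in section 3, and that the .tar.gz distribution of Connector/J 5.1.38 was downloaded):
tar -zxvf mysql-connector-java-5.1.38.tar.gz                                              # unpack the downloaded archive
cp mysql-connector-java-5.1.38/mysql-connector-java-5.1.38-bin.jar /opt/hive-2.0.0/lib/   # copy the driver jar into Hive's lib directory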
3. Edit the configuration file
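If conf/hive-site.xml does not exist yet (the standalone guide's earlier steps may already have created it), it can be created from the template that ships with Hive, for example:
cp /opt/hive-2.0.0/conf/hive-default.xml.template /opt/hive-2.0.0/conf/hive-site.xml   # create hive-site.xml from the bundled template
Then open it for editing: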
cd /opt/hive-2.0.0/conf
vi hive-site.xml
Modify the following properties:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>
4. Initialize the metastore database
schematool -initSchema -dbType mysql
Output like the following indicates that the initialization succeeded:
Starting metastore schema initialization to 2.0.0
Initialization script hive-schema-2.0.0.mysql.sql
Initialization script completed
schemaTool completed
5. Start Hive
hive
If the hive> prompt appears, Hive started successfully.
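To confirm that Hive is really talking to MySQL rather than the embedded Derby database, you can check that the metastore tables were created in the hive database (a sketch using the user created in section 2.1):
mysql -u hive -phive -e "use hive; show tables;"   # should list metastore tables such as DBS and TBLS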
6. Test Hive by loading data
6.1 Create test data
vi /root/hive-test.txt
Enter the following lines (a single space between the number and the word):
1 hadoop
2 hive
3 hbase
4 hello
6.2 Load the data
Type hive to enter the Hive CLI, then run the following statements:
CREATE TABLE IF NOT EXISTS words (id INT, word STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY " "
LINES TERMINATED BY "\n";
LOAD DATA LOCAL INPATH '/root/hive-test.txt' OVERWRITE INTO TABLE words;
6.3 Query the data
select * from words;
If the rows are displayed correctly, the data was loaded successfully.
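The same query can also be run without entering the interactive CLI, which is convenient for scripting; hive -e runs a single statement from the shell:
hive -e "select * from words;"   # run one HiveQL statement and print the result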
6.4 Insert data
insert into words values(5, 'nihao');
At this point you can see that Hive launches a MapReduce job to execute the insert.
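After the job finishes, a quick check from the shell that the new row landed in the table (a sketch):
hive -e "select * from words where id = 5;"   # should return: 5  nihao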
7. Common errors
7.1 Running hive produces the following exception:
Exception in thread "main" java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1550)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
        ...
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.reflect.InvocationTargetException
        ... 15 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables: java.lang.reflect.InvocationTargetException
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:671)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:397)
        ... 20 more
Caused by: java.lang.reflect.InvocationTargetException
        ... 49 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:232)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:117)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
        ... 67 more
Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
        at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
        at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:213)
        ... 69 more
Cause of the error:
The MySQL JDBC driver cannot be found on Hive's classpath. Copy the driver jar into Hive's lib directory as described in section 2.2.
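A quick way to check whether the driver is in place (assuming Hive lives under /opt/hive-2.0.0):
ls /opt/hive-2.0.0/lib/ | grep mysql-connector   # should list mysql-connector-java-5.1.38-bin.jar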
7.2 Starting Hive produces the following exception:
Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Call From master/4.3.2.1 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:554)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:494)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:709)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
        ...
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.ConnectException: Call From master/4.3.2.1 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at org.apache.hadoop.ipc.Client.call(Client.java:1473)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:639)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:597)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
        ... 9 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
        at org.apache.hadoop.ipc.Client.call(Client.java:1439)
        ... 29 more
Cause of the error: this is actually a Hadoop problem rather than a Hive one; the NameNode is not running.
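The fix is to start HDFS and confirm that the NameNode is listening on the port named in fs.defaultFS (9000 here); a sketch, assuming Hadoop's sbin directory is on the PATH:
start-dfs.sh                  # start the HDFS daemons
jps | grep NameNode           # the NameNode process should be listed
netstat -tlnp | grep 9000     # port 9000 should be in the LISTEN state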