Reading Kerberos-Enabled HBase with Spark/Java


1. Grant the drguo user the appropriate HBase permissions, as sketched below.
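The article does not show this step. If HBase authorization (the AccessController coprocessor) is enabled on the cluster, a grant from the hbase shell along these lines should do it; the superuser principal and the exact permission string are assumptions, and the table name XMJZ comes from the code below.

kinit hbase/bigdata28@AISINO.COM    # authenticate as an HBase superuser first (principal name is an assumption)
hbase shell
grant 'drguo', 'R', 'XMJZ'          # read permission on the XMJZ table; widen to 'RWXCA' only if the job needs it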
2. Create the drguo principal in the KDC and export the corresponding keytab file.
[root@bigdata28 ~]# kadmin.local
Authenticating as principal drguo/admin@AISINO.COM with password.
kadmin.local:  addprinc drguo/bigdata28
WARNING: no policy specified for drguo/bigdata28@AISINO.COM; defaulting to no policy
Enter password for principal "drguo/bigdata28@AISINO.COM":
Re-enter password for principal "drguo/bigdata28@AISINO.COM":
Principal "drguo/bigdata28@AISINO.COM" created.
kadmin.local:  xst -norandkey -k /home/drguo/drguo_bigdata28.keytab drguo/bigdata28@AISINO.COM
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des3-cbc-sha1 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type arcfour-hmac added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des-hmac-sha1 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des-cbc-md5 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
kadmin.local:  q
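Before copying the keytab anywhere, it is worth confirming it was written correctly; the standard klist tool lists its entries:

[root@bigdata28 ~]# klist -kt /home/drguo/drguo_bigdata28.keytab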

3. Copy krb5.conf and the keytab file to the local machine for convenient testing.
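For example, with an scp client available on the Windows test machine (a sketch; it assumes krb5.conf lives at the default /etc/krb5.conf on the server, and the destinations match the paths hard-coded in the code below):

scp root@bigdata28:/etc/krb5.conf d:/krb5.conf
scp root@bigdata28:/home/drguo/drguo_bigdata28.keytab d:/drguo_bigdata28.keytab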
4. Read HBase with Spark.
package drguo.test

import java.io.IOException

import com.google.protobuf.ServiceException
import dguo.test.HBaseKerb
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by drguo on 2018/7/18.
  */
object SparkExecHBase {

  def main(args: Array[String]): Unit = {
//    HBaseKerb.getAllRows("XMJZ")
    // Point the JVM at the local krb5.conf before anything touches Kerberos
    System.setProperty("java.security.krb5.conf", "d:/krb5.conf")
    val sparkConf = new SparkConf().setAppName("SparkExecHBase").setMaster("local")
    val sc = new SparkContext(sparkConf)

    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "XMJZ")
    conf.set("hbase.zookeeper.quorum", "172.19.6.28,172.19.6.29,172.19.6.30")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set("hadoop.security.authentication", "Kerberos")

    // Log in from the keytab and verify HBase is reachable before building the RDD
    UserGroupInformation.setConfiguration(conf)
    try {
      UserGroupInformation.loginUserFromKeytab("drguo/bigdata28@AISINO.COM", "d:/drguo_bigdata28.keytab")
      HBaseAdmin.checkHBaseAvailable(conf)
    } catch {
      case e: IOException =>
        e.printStackTrace()
      case e: ServiceException =>
        e.printStackTrace()
    }

    val hbaseRdd = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])
//    println(hbaseRdd.toString())
    // Print each row key together with its Info:ADDTIME value
    hbaseRdd
      .map(_._2)
      .map(result => (result.getRow, result.getValue(Bytes.toBytes("Info"), Bytes.toBytes("ADDTIME"))))
      .map(row => (new String(row._1), new String(row._2)))
      .collect
      .foreach(r => println(r._1 + ":" + r._2))
  }
}
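The example above logs in programmatically and runs with setMaster("local"), which is convenient for IDE testing. When submitting to a YARN cluster, spark-submit can instead manage the Kerberos login (and ticket renewal for long-running jobs) itself through its --principal and --keytab options; a sketch, with a hypothetical jar name:

spark-submit \
  --class drguo.test.SparkExecHBase \
  --master yarn \
  --principal drguo/bigdata28@AISINO.COM \
  --keytab /home/drguo/drguo_bigdata28.keytab \
  spark-hbase-example.jar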

5. Read with plain Java (there are quite a few examples online, but most of them contain duplicated or redundant code).
package dguo.test;

import com.google.protobuf.ServiceException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;

/**
 * Created by drguo on 2018/7/18.
 */
public class HBaseKerb {

    private static Configuration conf = null;
    static {
        // Point the JVM at the local krb5.conf before anything touches Kerberos
        System.setProperty("java.security.krb5.conf", "d:/krb5.conf");
        // Instantiate the configuration via HBaseConfiguration's factory method
        conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "172.19.6.28,172.19.6.29,172.19.6.30");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("hadoop.security.authentication", "Kerberos");

        UserGroupInformation.setConfiguration(conf);

        try {
            // Log in from the keytab and verify HBase is reachable
            UserGroupInformation.loginUserFromKeytab("drguo/bigdata28@AISINO.COM", "d:/drguo_bigdata28.keytab");
            HBaseAdmin.checkHBaseAvailable(conf);
        } catch (IOException e) {
            e.printStackTrace();
        } catch (ServiceException e) {
            e.printStackTrace();
        }
    }

    public static void getAllRows(String tableName) throws IOException {
        HTable hTable = new HTable(conf, tableName);
        // Create the Scan object used to scan the table's regions
        Scan scan = new Scan();
        // Obtain a ResultScanner from the HTable
        ResultScanner resultScanner = hTable.getScanner(scan);
        for (Result result : resultScanner) {
            Cell[] cells = result.rawCells();
            for (Cell cell : cells) {
                System.out.println("Row key: " + Bytes.toString(CellUtil.cloneRow(cell)));
                System.out.println("Column family: " + Bytes.toString(CellUtil.cloneFamily(cell)));
                System.out.println("Column: " + Bytes.toString(CellUtil.cloneQualifier(cell)));
                System.out.println("Value: " + Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
    }

    public static void main(String[] args) throws IOException {
        getAllRows("XMJZ");
    }
}
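One caveat: the HTable constructor and HBaseAdmin.checkHBaseAvailable used above are deprecated in the HBase 1.x client and gone in 2.x. On those versions the same scan goes through ConnectionFactory instead; a minimal sketch of such a method for this class (reusing the Kerberos-enabled conf built in the static block, plus one extra import for TableName):

import org.apache.hadoop.hbase.TableName;

    public static void getAllRowsNewApi(String tableName) throws IOException {
        // try-with-resources closes the scanner, table, and connection in reverse order
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf(tableName));
             ResultScanner scanner = table.getScanner(new Scan())) {
            for (Result result : scanner) {
                System.out.println("Row key: " + Bytes.toString(result.getRow()));
            }
        }
    }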

PS:
The error below usually means that the krb5.conf file passed to System.setProperty("java.security.krb5.conf", "d:/krb5.conf") could not be found (for example, the path is wrong), or that the kdc/admin_server addresses configured inside it are wrong.

Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:319)
	at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:374)
	at drguo.test.SparkExecHBase$.main(SparkExecHBase.scala:32)
	at drguo.test.SparkExecHBase.main(SparkExecHBase.scala)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:84)
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
	... 4 more
Caused by: KrbException: Cannot locate default realm
	at sun.security.krb5.Config.getDefaultRealm(Config.java:1029)
	... 10 more
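For reference, the sections of krb5.conf that matter here look roughly like this (a sketch; using bigdata28 as the KDC/admin-server host is an assumption based on where kadmin.local was run above):

[libdefaults]
    default_realm = AISINO.COM

[realms]
    AISINO.COM = {
        kdc = bigdata28
        admin_server = bigdata28
    }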
