

hadoop hdfs (java api)

Published: 2023/12/2

A brief introduction to controlling the HDFS file system from Java.

I. Note the access permissions on the NameNode side: either modify hdfs-site.xml or change the permissions on the target directories.

For this test we modify hdfs-site.xml. Add the following inside the configuration node:

<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
</property>

II. This test uses a new project in the Eclipse environment.

Prepare the environment by manually adding the jar packages, which are located in the Hadoop extraction directory, as follows:


hadoop-2.7.3\share\hadoop\common\hadoop-common-2.7.3.jar
hadoop-2.7.3\share\hadoop\common\lib\*.jar
hadoop-2.7.3\share\hadoop\hdfs\hadoop-hdfs-2.7.3.jar

Once the jars are added, you can write code to connect to the HDFS file system.

Connecting to HDFS involves the following steps.

1. Create an org.apache.hadoop.conf.Configuration to hold the client configuration (the server address, plus settings used for uploads and downloads). This test configures it as follows:

package com.huaqin.hdfs.conf;

import org.apache.hadoop.conf.Configuration;

public class DeFaultDfsClientConfigration extends Configuration {
    public DeFaultDfsClientConfigration() {
        this.set("fs.defaultFS", "hdfs://*.*.*.*:9000");
        this.set("dfs.replication", "2");
    }
}

2. Write a utility class that wraps the common file operations.

This requires org.apache.hadoop.fs.FileSystem, created from the configuration class above:

FileSystem fileSystem = FileSystem.get(new DeFaultDfsClientConfigration());

Once the FileSystem is created, you can operate on HDFS. The wrapper code is as follows:

package com.huaqin.hdfs.utils;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Map;

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import com.huaqin.hdfs.conf.DeFaultDfsClientConfigration;

public class HDFSFileUtils {

    public double progressBar;

    // The underlying client handle
    private FileSystem fileSystem;

    public HDFSFileUtils() throws IOException {
        // Load with the default configuration class
        fileSystem = FileSystem.get(new DeFaultDfsClientConfigration());
    }

    public HDFSFileUtils(DeFaultDfsClientConfigration clientConfration) throws IOException {
        // Load with a caller-supplied configuration
        fileSystem = FileSystem.get(clientConfration);
    }

    public void reloadClientConfigration(DeFaultDfsClientConfigration clientConfration) {
        fileSystem.setConf(clientConfration);
    }

    // List the contents of a directory
    public FileStatus[] list(String fileName) throws FileNotFoundException, IllegalArgumentException, IOException {
        return this.fileSystem.listStatus(new Path(fileName));
    }

    // Print a file's contents to stdout
    public void text(String fileName) throws IllegalArgumentException, IOException {
        FSDataInputStream inputStream = this.fileSystem.open(new Path(fileName));
        IOUtils.copyBytes(inputStream, System.out, fileSystem.getConf());
    }

    // Upload a local file
    public void upload(String src, String dest) throws IOException {
        FileInputStream in = new FileInputStream(src);
        FSDataOutputStream os = this.fileSystem.create(new Path(dest), true);
        IOUtils.copyBytes(in, os, 4096, true);
    }

    // Delete a file or directory
    public boolean deleteFile(String dest) throws IllegalArgumentException, IOException {
        return this.fileSystem.delete(new Path(dest), true);
    }

    // Create a directory
    public boolean makeDir(String dest) throws IllegalArgumentException, IOException {
        return this.fileSystem.mkdirs(new Path(dest));
    }

    // Download while reporting progress through the shared map
    public void download2(String dest, Map<String, Integer> descript) throws IllegalArgumentException, IOException {
        FSDataInputStream in = fileSystem.open(new Path(dest));
        descript.put("byteSize", in.available());
        descript.put("current", 0);
        byte[] bs = new byte[1024];
        int n;
        while (-1 != (n = in.read(bs))) {
            // advance by the number of bytes actually read, not the buffer size
            descript.put("current", descript.get("current") + n);
        }
        in.close();
    }

    // Upload while reporting progress through the shared map
    public void upload2(String src, String dest, Map<String, Long> descript)
            throws IllegalArgumentException, IOException {
        File file = new File(src);
        FileInputStream in = new FileInputStream(file);
        FSDataOutputStream out = this.fileSystem.create(new Path(dest), true);
        descript.put("byteSize", file.length());
        descript.put("current", 0L);
        // 0.5 MB buffer
        byte[] bs = new byte[1024 * 1024 / 2];
        int n;
        while (-1 != (n = in.read(bs))) {
            // write only the bytes read; the last chunk is usually smaller than the buffer
            out.write(bs, 0, n);
            descript.put("current", descript.get("current") + n);
        }
        out.close();
        in.close();
    }
}
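A common pitfall in copy loops like download2/upload2 is advancing the progress counter by the buffer size (and writing the whole buffer) regardless of how many bytes read() actually returned; the safe pattern always uses the return value of read(). Here is a minimal, Hadoop-free sketch of that pattern using in-memory streams (the ProgressCopy class name is illustrative, not part of the original code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ProgressCopy {
    // Copies in to out in chunks and returns the total byte count;
    // a real caller would publish `total` to a shared progress map each iteration.
    static long copyWithProgress(InputStream in, OutputStream out, int bufSize) throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // write only the bytes actually read
            total += n;           // advance by n, not by the buffer size
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[2500]; // deliberately not a multiple of the buffer size
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long total = copyWithProgress(new ByteArrayInputStream(data), out, 1024);
        System.out.println(total);     // prints 2500
        System.out.println(out.size()); // prints 2500
    }
}
```

With the naive version, the same 2500-byte input would report 3072 bytes and write 3072 bytes, corrupting the tail of the destination file.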

III. The JUnit test environment is shown below.

import java.io.IOException;
import java.text.DecimalFormat;
import java.util.HashMap;
import java.util.Map;

import org.junit.Before;
import org.junit.Test;

import com.huaqin.hdfs.utils.HDFSFileUtils;

public class HDFSFileUtilsJUT {

    HDFSFileUtils fileUtils;

    @Before
    public void before() throws IOException {
        fileUtils = new HDFSFileUtils();
    }

    @Test
    public void testCreateNEWFile() throws IOException {
        // fileUtils.upload("D:\\temp\\helloworld.txt", "/tmp/helloworld.txt");
        fileUtils.upload("E:\\devtool\\hadoop-2.7.3.tar.gz", "/hadoop-2.7.3.tar.gz");
    }

    @Test
    public void testText() throws IllegalArgumentException, IOException {
        fileUtils.text("/hello.txt");
    }

    @Test
    public void testDeleteFile() throws IllegalArgumentException, IOException {
        boolean success = fileUtils.deleteFile("/CentOS-7-x86_64-DVD-1511.iso");
        System.out.println(success);
    }

    @Test
    public void testZMikdirs() throws IllegalArgumentException, IOException {
        boolean success = fileUtils.makeDir("/tmp");
        System.out.println(success);
    }

    @Test
    public void testdownload2() throws IllegalArgumentException, IOException {
        Map<String, Integer> desc = new HashMap<>();
        desc.put("current", 0);
        desc.put("byteSize", 0);
        // Poll the shared map every 500 ms and print progress
        new Thread(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    try {
                        Thread.sleep(500);
                        System.out.printf("max:%d\tcurrent:%d\tsurplus:%d\n",
                                desc.get("byteSize"),
                                desc.get("current"),
                                desc.get("byteSize") - desc.get("current"));
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            }
        }).start();
        fileUtils.download2("/hadoop-2.7.3.tar.gz", desc);
    }

    @Test
    public void testupload2() throws IllegalArgumentException, IOException {
        DecimalFormat df = new DecimalFormat("0.00%");
        Map<String, Long> desc = new HashMap<String, Long>();
        desc.put("current", 0L);
        desc.put("byteSize", 0L);
        // Poll the shared map every 500 ms and print progress with a percentage
        new Thread(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    try {
                        Thread.sleep(500);
                        System.out.printf("max:%d\tcurrent:%d\tsurplus:%d\tprogressBar:%s\n",
                                desc.get("byteSize"),
                                desc.get("current"),
                                desc.get("byteSize") - desc.get("current"),
                                df.format((desc.get("current") + 0.0) / desc.get("byteSize")));
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            }
        }).start();
        fileUtils.upload2("D:\\hadoop\\CentOS-7-x86_64-DVD-1511.iso", "/CentOS-7-x86_64-DVD-1511.iso", desc);
    }
}
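The upload test formats its progress ratio with DecimalFormat("0.00%"), which multiplies by 100 and appends a percent sign. A small self-contained illustration of that formatting (the ProgressFormat class and percent method are hypothetical names; Locale.US symbols are pinned so the output does not vary by system locale):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class ProgressFormat {
    // Formats current/byteSize as a percentage string such as "25.00%".
    static String percent(long current, long byteSize) {
        DecimalFormat df = new DecimalFormat("0.00%", DecimalFormatSymbols.getInstance(Locale.US));
        return df.format((current + 0.0) / byteSize);
    }

    public static void main(String[] args) {
        System.out.println(percent(512, 2048));  // prints 25.00%
        System.out.println(percent(2048, 2048)); // prints 100.00%
    }
}
```

Note that while byteSize is still 0 the ratio is NaN, so the polling thread in the test above can print a garbage percentage on its first few ticks, before upload2 has stored the real file size.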


Reposted from: https://www.cnblogs.com/black-/p/8677743.html
