
hadoop MultipleInputs fails with ClassCastException (get fileName)


Source: http://stackoverflow.com/questions/11130145/hadoop-multipleinputs-fails-with-classcastexception


Following up on my comment, the Javadoc for TaggedInputSplit confirms that you are probably wrongly casting the input split to a FileSplit:

/**
 * An {@link InputSplit} that tags another InputSplit with extra data for use
 * by {@link DelegatingInputFormat}s and {@link DelegatingMapper}s.
 */
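For context, this wrapping is a side effect of using MultipleInputs: it configures DelegatingInputFormat as the job's input format, and that format hands your mapper a TaggedInputSplit that wraps the real FileSplit. A minimal driver sketch, assuming the new (mapreduce) API; the paths, input format, and mapper classes below are placeholders, not taken from the original question:

    // Each call registers a path with its own input format and mapper class.
    // Behind the scenes MultipleInputs sets DelegatingInputFormat on the job,
    // which is why context.getInputSplit() later returns a TaggedInputSplit.
    MultipleInputs.addInputPath(job, new Path("/input/a"), TextInputFormat.class, FirstMapper.class);
    MultipleInputs.addInputPath(job, new Path("/input/b"), TextInputFormat.class, SecondMapper.class);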

My guess is your setup method looks something like this:

@Override
protected void setup(Context context) throws IOException, InterruptedException {
    FileSplit split = (FileSplit) context.getInputSplit();
}

Unfortunately TaggedInputSplit is not publicly visible, so you can't easily do an instanceof-style check followed by a cast and a call to TaggedInputSplit.getInputSplit() to get the actual underlying FileSplit. So either you'll need to update the source yourself and re-compile and deploy, file a JIRA ticket asking for this to be fixed in a future version (if it hasn't already been actioned in 2+), or perform some nasty, nasty reflection hackery to get at the underlying InputSplit.

This is completely untested:

@Override
protected void setup(Context context) throws IOException, InterruptedException {
    InputSplit split = context.getInputSplit();
    Class<? extends InputSplit> splitClass = split.getClass();

    FileSplit fileSplit = null;
    if (splitClass.equals(FileSplit.class)) {
        fileSplit = (FileSplit) split;
    } else if (splitClass.getName().equals(
            "org.apache.hadoop.mapreduce.lib.input.TaggedInputSplit")) {
        // begin reflection hackery...
        try {
            Method getInputSplitMethod = splitClass.getDeclaredMethod("getInputSplit");
            getInputSplitMethod.setAccessible(true);
            fileSplit = (FileSplit) getInputSplitMethod.invoke(split);
        } catch (Exception e) {
            // wrap and re-throw error
            throw new IOException(e);
        }
        // end reflection hackery
    }
}
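Since the question is ultimately about getting the file name, once fileSplit has been recovered (and checked for null) the name can be read from its path, for example:

    // Assuming fileSplit was obtained as in the setup method above and is non-null:
    String fileName = fileSplit.getPath().getName();   // just the file name
    String fullPath = fileSplit.getPath().toString();  // the full path, if needed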

Reflection Hackery Explained:

With TaggedInputSplit having default (package-private) visibility, it's not visible to classes outside the org.apache.hadoop.mapreduce.lib.input package, and therefore you cannot reference that class in your setup method. To get around this, we perform a number of reflection-based operations:

  • Inspecting the class name, we can test for the type TaggedInputSplit using its fully qualified name:

    splitClass.getName().equals("org.apache.hadoop.mapreduce.lib.input.TaggedInputSplit")

  • We know we want to call the TaggedInputSplit.getInputSplit() method to recover the wrapped input split, so we use the Class.getDeclaredMethod(..) reflection call to acquire a reference to the method:

    Method getInputSplitMethod = splitClass.getDeclaredMethod("getInputSplit");

  • The class still isn't publicly visible, so we call setAccessible(..) on the method to override this, stopping the JVM's access checks from throwing an exception when we invoke it:

    getInputSplitMethod.setAccessible(true);

  • Finally, we invoke the method on the input split and cast the result to a FileSplit (optimistically hoping it's an instance of this type!):

    fileSplit = (FileSplit) getInputSplitMethod.invoke(split);
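If more than one mapper needs this, the same untested reflection trick can be pulled out into a small helper; a sketch under the same assumptions (the class and method names here are illustrative, not part of Hadoop):

    import java.io.IOException;
    import java.lang.reflect.Method;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;

    public final class SplitUtil {

        private SplitUtil() {}

        /** Unwraps a (possibly tagged) InputSplit to the underlying FileSplit. */
        public static FileSplit toFileSplit(InputSplit split) throws IOException {
            if (split instanceof FileSplit) {
                return (FileSplit) split;
            }
            if (split.getClass().getName()
                    .equals("org.apache.hadoop.mapreduce.lib.input.TaggedInputSplit")) {
                try {
                    Method m = split.getClass().getDeclaredMethod("getInputSplit");
                    m.setAccessible(true);
                    return (FileSplit) m.invoke(split);
                } catch (Exception e) {
                    throw new IOException(e); // wrap and re-throw any reflection failure
                }
            }
            throw new IOException("Unexpected InputSplit type: " + split.getClass());
        }
    }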

Reposted from: https://www.cnblogs.com/sunxucool/p/3727200.html
