

None.org.apache.spark.api.python.PythonAccumulatorV2

Published: 2023/12/20

This article walks through the error None.org.apache.spark.api.python.PythonAccumulatorV2 and how to resolve it, for reference.

The full error output is as follows:

2019-05-21 15:19:00 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "WordCount.py", line 48, in <module>
    sc=CreateSparkContext()
  File "WordCount.py", line 37, in CreateSparkContext
    spark = SparkSession.builder.appName("Intro").config("spark.master", "local").getOrCreate()
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/pyspark/context.py", line 367, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/pyspark/context.py", line 136, in __init__
    conf, jsc, profiler_cls)
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/pyspark/context.py", line 207, in _do_init
    self._javaAccumulator = self._jvm.PythonAccumulatorV2(host, port, auth_token)
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 1525, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/home/appleyuchi/.virtualenvs/python2.7/local/lib/python2.7/site-packages/py4j/protocol.py", line 332, in get_return_value
    format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling None.org.apache.spark.api.python.PythonAccumulatorV2.
Trace:
py4j.Py4JException: Constructor org.apache.spark.api.python.PythonAccumulatorV2([class java.lang.String, class java.lang.Integer, class java.lang.String]) does not exist
	at py4j.reflection.ReflectionEngine.getConstructor(ReflectionEngine.java:179)
	at py4j.reflection.ReflectionEngine.getConstructor(ReflectionEngine.java:196)
	at py4j.Gateway.invoke(Gateway.java:237)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)

The "Constructor ... does not exist" message typically indicates a version mismatch: the pip-installed pyspark package is newer (or older) than the local Spark installation, so py4j asks the JVM for a constructor signature that Spark's jars do not provide. The solution is to add the following at the top of the .py file, before any pyspark import:

# -*- coding: UTF-8 -*-
import findspark
findspark.init()
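Why this works: findspark.init() locates the local Spark installation (via SPARK_HOME) and puts its bundled Python sources at the front of sys.path, so `import pyspark` resolves to the copy that matches the running Spark, not a mismatched pip package. A simplified sketch of that idea (the helper name and directory layout are illustrative, not findspark's actual implementation):

```python
import glob
import os
import sys

def init_spark_path(spark_home):
    """Rough sketch of what findspark.init() does: prepend Spark's own
    python/ directory and its bundled py4j zip to sys.path, so that
    `import pyspark` picks up the version shipped with the local Spark
    install instead of a potentially mismatched pip-installed copy."""
    python_dir = os.path.join(spark_home, "python")
    # Spark ships py4j as a zip under python/lib/, e.g. py4j-0.10.7-src.zip
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*.zip"))
    for path in py4j_zips + [python_dir]:
        if path not in sys.path:
            sys.path.insert(0, path)
    return sys.path[:2]
```

In practice you should simply call findspark.init() rather than hand-rolling this, but the sketch shows why the import order matters: the path fix must happen before pyspark is first imported.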


Summary

The above covers the None.org.apache.spark.api.python.PythonAccumulatorV2 error in full; hopefully it helps you resolve the problem.
