Cannot start the Spark shells: NoSuchMethodError: scala.Some.value

#python #scala #apache-spark #pyspark

Question:

I get an error when launching `spark-shell` or `pyspark` from the Anaconda PowerShell prompt. This is a local installation.

SPARK_HOME is set
HADOOP_HOME is set
JAVA_HOME is set

 (base) PS C:\Users\....> spark-shell
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
        at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:134)
        at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
        at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
        at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
(base) PS C:\Users\....>
 

When running pyspark:

 (base) PS C:\Users\thyago.carvalho> pyspark
Python 3.8.8 (default, Apr 13 2021, 15:08:03) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
        at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:134)
        at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
        at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
        at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "C:\spark\spark-3.1.2-bin-hadoop3.2\python\pyspark\shell.py", line 35, in <module>
    SparkContext._ensure_initialized()  # type: ignore
  File "C:\spark\spark-3.1.2-bin-hadoop3.2\python\pyspark\context.py", line 331, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\spark\spark-3.1.2-bin-hadoop3.2\python\pyspark\java_gateway.py", line 108, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
>>>  
 

Comments:

1. What versions of Java and Spark are you using?