Error when trying to run pySpark on my own machine

Huey

I want to try running Spark on my own Mac (version 10.11.6). I downloaded Spark 2.0.0 and then tried to run ./bin/pyspark.

However, I get the following error:

Python 2.7.12 |Anaconda custom (x86_64)| (default, Jul  2 2016, 17:43:17) 
[GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2336.11.00)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/09/13 15:27:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/13 15:27:47 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: huey: huey: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:846)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:839)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:839)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:896)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:388)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:211)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: huey: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
    ... 20 more
16/09/13 15:27:47 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor).  This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:526)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:236)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:211)
java.lang.Thread.run(Thread.java:745)
16/09/13 15:27:47 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: huey: huey: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:846)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:839)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:839)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:896)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:388)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:211)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: huey: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
    ... 20 more
Traceback (most recent call last):
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/shell.py", line 47, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/sql/session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py", line 294, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py", line 115, in __init__
    conf, jsc, profiler_cls)
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py", line 168, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py", line 233, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/java_gateway.py", line 1183, in __call__
  File "/Users/hkwik/Downloads/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/protocol.py", line 312, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: huey: huey: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:846)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:839)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:839)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:896)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:896)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:388)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:211)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: huey: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
    ... 20 more

I tried setting SPARK_LOCAL_IP to localhost, to no avail. Any ideas?
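
For reference, this is roughly how such an attempt would look; the exact shell session is an assumption, as the original post does not show it:

    # Assumed invocation: export the variable in the same terminal session,
    # then relaunch the PySpark shell from the Spark distribution directory.
    export SPARK_LOCAL_IP=127.0.0.1
    ./bin/pyspark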

Huey

It seems that adding a line to /etc/hosts that maps 127.0.0.1 to huey resolves the problem.
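
A minimal sketch of that fix, assuming the machine's hostname is huey (as reported in the stack trace); appending via sudo tee is just one way to edit /etc/hosts:

    # Check the hostname that Spark will try to resolve
    hostname
    # huey

    # Append a loopback mapping for that hostname (requires admin rights)
    echo "127.0.0.1   huey" | sudo tee -a /etc/hosts

    # Verify that the name now resolves, then relaunch PySpark
    ping -c 1 huey
    ./bin/pyspark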
