Apache Spark: error starting PySpark

On a CentOS machine with Python v2.6.6 and Apache Spark v1.2.1.

Getting the following error while trying to execute ./pyspark:

It seems to be a problem with Python, but I am unable to understand what is going wrong.

15/06/18 08:11:16 INFO spark.SparkContext: Successfully stopped SparkContext
Traceback (most recent call last):
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/context.py", line 105, in __init__
    conf, jsc)
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/context.py", line 157, in _do_init
    self._accumulatorServer = accumulators._start_update_server()
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/accumulators.py", line 269, in _start_update_server
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
  File "/usr/lib64/python2.6/SocketServer.py", line 402, in __init__
    self.server_bind()
  File "/usr/lib64/python2.6/SocketServer.py", line 413, in server_bind
    self.socket.bind(self.server_address)
  File "<string>", line 1, in bind
socket.gaierror: [Errno -2] Name or service not known
>>> 15/06/18 08:11:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/06/18 08:11:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
1 answer

From the logs, it seems that PySpark cannot resolve the localhost hostname. Please check the /etc/hosts file; if localhost is not present, adding it should solve this problem.

e.g.:

[IP address] [hostname] localhost
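
As a concrete illustration, a minimal /etc/hosts sketch (the IP address and the hostname node01 below are placeholders, not values from the original question):

127.0.0.1      localhost localhost.localdomain
192.168.1.25   node01.example.com   node01   localhost

Any line that maps the name localhost to a reachable address is enough for the accumulator server to bind.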

If you cannot change the server's hosts file, edit /python/pyspark/accumulators.py at line number 269 as below:

server = AccumulatorServer(("[server hostname from hosts file]", 0), _UpdateRequestHandler)
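
To confirm that name resolution is the culprit before touching Spark, you can run a quick check in the same Python interpreter that PySpark uses. This is only a diagnostic sketch, not part of Spark itself:

import socket
try:
    # A lookup similar to what the accumulator server triggers when binding to ("localhost", 0)
    print socket.gethostbyname("localhost")   # expect something like 127.0.0.1
except socket.gaierror as e:
    print "localhost does not resolve:", e    # matches the gaierror in the traceback above

If this lookup fails, fixing /etc/hosts is the cleaner solution; patching accumulators.py only works around the symptom.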
